
Technology is shaping learning in higher education – McKinsey

The COVID-19 pandemic forced a shift to remote learning overnight for most higher-education students, starting in the spring of 2020. To complement video lectures and engage students in the virtual classroom, educators adopted technologies that enabled more interactivity and hybrid models of online and in-person activities. These tools changed learning, teaching, and assessment in ways that may persist after the pandemic. Investors have taken note. Edtech start-ups raised record amounts of venture capital in 2020 and 2021, and market valuations for bigger players soared.

A study conducted by McKinsey in 2021 found that to engage most effectively with students, higher-education institutions can focus on eight dimensions of the learning experience. In this article, we describe the findings of a study of the learning technologies that can enable aspects of several of those eight dimensions (see sidebar "Eight dimensions of the online learning experience").

In November 2021, McKinsey surveyed 600 faculty members and 800 students from public and private nonprofit colleges and universities in the United States, including minority-serving institutions, about the use and impact of eight different classroom learning technologies (Exhibit 1). (For more on the learning technologies analyzed in this research, see sidebar "Descriptions of the eight learning technologies.") To supplement the survey, we interviewed industry experts and higher-education professionals who make decisions about classroom technology use. We discovered which learning tools and approaches have seen the highest uptake, how students and educators view them, the barriers to higher adoption, how institutions have successfully adopted innovative technologies, and the notable impacts on learning (for details about our methodology, see sidebar "About the research").

Exhibit 1

Survey respondents reported a 19 percent average increase in overall use of these learning technologies since the start of the COVID-19 pandemic. Technologies that enable connectivity and community building, such as social media-inspired discussion platforms and virtual study groups, saw the biggest uptick in use (49 percent), followed by group work tools, which grew by 29 percent (Exhibit 2). These technologies likely fill the void left by the lack of in-person experiences more effectively than individual-focused learning tools such as augmented reality and virtual reality (AR/VR). Classroom interaction technologies such as real-time chatting, polling, and breakout room discussions were the most widely used tools before the pandemic and remain so; 67 percent of survey respondents said they currently use these tools in the classroom.

Exhibit 2

The shift to more interactive and diverse learning models will likely continue. One industry expert told us, "The pandemic pushed the need for a new learning experience online. It recentered institutions to think about how they'll teach moving forward and has brought synchronous and hybrid learning into focus." Consequently, many US colleges and universities are actively investing to scale up their online and hybrid program offerings.

Some technologies lag behind in adoption. Tools enabling student progress monitoring, AR/VR, machine learning-powered teaching assistants (TAs), AI adaptive course delivery, and classroom exercises are currently used by less than half of survey respondents. Anecdotal evidence suggests that technologies such as AR/VR require a substantial investment in equipment and may be difficult to use at scale in classes with high enrollment. Our survey also revealed utilization disparities based on size. Small public institutions use machine learning-powered TAs, AR/VR, and technologies for monitoring student progress at double or more the rates of medium and large public institutions, perhaps because smaller, specialized schools can make more targeted and cost-effective investments. We also found that medium and large public institutions made greater use of connectivity and community-building tools than small public institutions (57 to 59 percent versus 45 percent). Although the uptake of AI-powered tools was slower, higher-education experts we interviewed predict their use will increase; these tools allow faculty to tailor courses to each student's progress, reduce faculty workload, and improve student engagement at scale (see sidebar "Differences in adoption by type of institution observed in the research").

While many colleges and universities are interested in using more technologies to support student learning, respondents cited three top barriers: lack of awareness, inadequate deployment capabilities, and cost (Exhibit 3).

Exhibit 3

More than 60 percent of students said that all the classroom learning technologies they've used since COVID-19 began had improved their learning and grades (Exhibit 4). However, two technologies earned higher marks than the rest for boosting academic performance: 80 percent of students cited classroom exercises, and 71 percent cited machine learning-powered teaching assistants.

Exhibit 4

Although AR/VR is not yet widely used, 37 percent of students said they are most excited about its potential in the classroom. While 88 percent of students believe AR/VR will make learning more entertaining, just 5 percent said they think it will improve their ability to learn or master content (Exhibit 5). Industry experts confirmed that while there is significant enthusiasm for AR/VR, its ability to improve learning outcomes is uncertain. Some data look promising. For example, in a recent pilot study, students who used a VR tool to complete coursework for an introductory biology class improved their subject mastery by an average of two letter grades.

Exhibit 5

Faculty gave learning tools even higher marks than students did for ease of use, engagement, access to course resources, and instructor connectivity. They also expressed greater excitement than students about the future use of these technologies. For example, while more than 30 percent of students expressed excitement for AR/VR and classroom interactions, more than 60 percent of faculty were excited about those, as well as machine learning-powered teaching assistants and AI adaptive technology.

Eighty-one percent or more of faculty said they feel the eight learning technology tools are a good investment of time and effort relative to the value they provide (Exhibit 6). Expert interviews suggest that employing learning technologies can be a strain on faculty members, but those we surveyed said this strain is worthwhile.

Exhibit 6

While faculty surveyed were enthusiastic about new technologies, experts we interviewed stressed some underlying challenges. For example, digital-literacy gaps have become more pronounced since the pandemic, because the near-universal adoption it forced deepened a divide that went unnoticed when adoption was sporadic. More tech-savvy instructors are comfortable with solutions focused on interaction and engagement, while staff who are less familiar with these tools prefer technologies focused on content display and delivery.

According to experts we interviewed, learning new tools and features can bring on general fatigue. An associate vice president of e-learning at one university told us that faculty there found designing and executing a pilot study of VR for a computer science class difficult. "It's a completely new way of instruction. . . . I imagine that the faculty using it now will not use it again in the spring." Technical support and training help. A chief academic officer of e-learning who oversaw the introduction of virtual simulations for nursing and radiography students said that faculty holdouts were permitted to opt out but not to delay the program. "We structured it in a 'we're doing this together' way. People who didn't want to do it left, but we got a lot of support from vendors and training, which made it easy to implement simulations."

Despite the growing pains of digitizing the classroom learning experience, faculty and students believe there is a lot more they can gain. Faculty members are optimistic about the benefits, and students expect learning to stay entertaining and efficient. While adoption levels saw double-digit growth during the pandemic, many classrooms have yet to experience all the technologies. For institutions considering the investment, or those that have already started, there are several takeaways to keep in mind.

In an earlier article, we looked at the broader changes in higher education that have been prompted by the pandemic. But perhaps none has advanced as quickly as the adoption of digital learning tools. Faculty and students see substantial benefits, and adoption rates are a long way from saturation, so we can expect uptake to continue. Institutions that want to know where they stand in learning-tech adoption can benchmark their rates against the averages in this article and use the comparison to decide where to catch up or get ahead.

Follow this link:
Technology is shaping learning in higher education - McKinsey


Data Science and Machine Learning Service Market Size And Forecast to 2028 | Mango Solutions, Fico, ZS, DataScience.com, Microsoft Designer Women -…

The Global Data Science and Machine Learning Service Market report provides in-depth analysis of emerging trends, market drivers, development opportunities, and market constraints that may affect the industry's market dynamics. Market Research Intellect examines each market sector in depth, covering goods, applications, and the competitive landscape.

The report was created in three research stages. The first stage involves extensive primary and secondary research on a wide range of topics. Next come approvals, evaluations, and findings based on data obtained from industry specialists. The research then derives an overall estimate of the market size using top-down methodologies. Finally, it evaluates the market for a number of segments and sub-segments using data triangulation and market-breakdown techniques.

The primary objective of the report is to educate business owners and assist them in making an astute investment in the market. The study highlights regional and sub-regional insights with corresponding factual and statistical analysis. The report draws on first-hand, up-to-date data obtained from company websites, annual reports, industry-recommended journals, and paid resources. The Data Science and Machine Learning Service report will help business owners understand the current trend of the market and make profitable decisions.

Market Leaders Profiled:

Report Analysis & Segments:

The Data Science and Machine Learning Service market is segmented by product type, application, and geography. All of the segments are carefully analyzed based on their market share, CAGR, value and volume growth, and other important factors. We have also provided Porter's Five Forces and PESTLE analyses for a deeper study of the Data Science and Machine Learning Service market. The report also covers recent developments undertaken by key players in the market, including new product launches, partnerships, mergers, acquisitions, and other latest developments.

Based on product type, the Data Science and Machine Learning Service market is segmented into:

Based on application, the Data Science and Machine Learning Service market is segmented into:

The report provides insights on the following pointers:

1 Market Penetration: Comprehensive information on the product portfolios of the top players in the Data Science and Machine Learning Service.

2 Product Development/Innovation: Detailed insights on the upcoming technologies, R&D activities, and product launches in the market.

3 Competitive Assessment: In-depth assessment of the market strategies, and geographic and business segments of the leading players in the market.

4 Market Development: Comprehensive information about emerging markets. This report analyzes the market for various segments across geographies.

5 Market Diversification: Exhaustive information about new products, untapped geographies, recent developments, and investments in the Data Science and Machine Learning Service.

Schedule a Consultation Call With Our Analysts / Industry Experts to Find a Solution For Your Business @ https://www.marketresearchintellect.com/ask-for-discount/?rid=351769

Various Analyses Covered:

Regional assessment of the Data Science and Machine Learning Service market has been carried out over six key regions: North America, Asia-Pacific, Europe, Latin America, the Middle East, and Africa. Moreover, the report delivers deep insights on ongoing research & development activities, revenue, innovative services, the actual status of demand and supply, and pricing strategy. In addition, this report delivers details on consumption figures, export/import supply, and gross margin by region. In short, this report provides a valuable source of guidance and clear direction for marketers and parties interested in the market.

North America (United States, Canada); Asia-Pacific (China, Japan, India, South Korea, Australia, Indonesia, others); Europe (Germany, France, United Kingdom, Italy, Spain, Russia, others); Latin America (Brazil, Mexico, others); the Middle East and Africa

Frequently Asked Questions:

About Us: Market Research Intellect

Market Research Intellect provides syndicated and customized research reports to clients from various industries and organizations with the aim of delivering functional expertise. We provide reports for all industries including Energy, Technology, Manufacturing and Construction, Chemicals and Materials, Food and Beverage, and more. These reports deliver an in-depth study of the market with industry analysis, the market value for regions and countries, and trends that are pertinent to the industry.

Contact Us:
Mr. Steven Fernandes
Market Research Intellect
New Jersey (USA)
Tel: +1-650-781-4080

Email: sales@marketresearchintellect.com

Website: https://www.marketresearchintellect.com/

View post:
Data Science and Machine Learning Service Market Size And Forecast to 2028 | Mango Solutions, Fico, ZS, DataScience.com, Microsoft Designer Women -...


How Artificial Intelligence Is Transforming Injection Molding – Plastics Today

The Industry 4.0 era of manufacturing depends so heavily on data-driven precision that artificial intelligence (AI) is playing an increasing role in harnessing that data to enhance the performance of machines, including injection molders.

AI in manufacturing encompasses an array of technologies that allow machines to perform with intelligence that emulates that of humans. Machine learning and natural language processing help machines approximate the human capacity to learn, make judgments, and solve problems. Data-enhanced efficiency keeps processes moving faster and more cost-effectively.

"AI is becoming increasingly important in mechanical engineering, not least because of the need to automate injection molding processes efficiently and flexibly despite ever smaller batch sizes and shorter product life cycles," said Werner Faulhaber, Director of Research and Development at Arburg. Application examples of AI include automatic programming of robotic systems, targeted malfunction remedying, and a spare-parts system with intelligent image processing. Arburg is working on making injection molding more intelligent step by step, ensuring that the machine continuously learns, keeps itself stable, and can even optimize itself in the future.

Arburg forms flexible and controllable production systems by combining machines, automation, and proprietary IT solutions. The company's Gestica control system, with its intelligent assistant functions, is integral to those systems. "All Kuka six-axis robots, for example, have been equipped with the new Gestica user interface as standard," Faulhaber noted. "This simplifies programming, as well as the monitoring, storage, and evaluation of process data."

One application Arburg is working on is the automatic programming of its Multilift linear robotic systems. The idea is that the operator simply enters the destination, as with a car navigation device, and the system automatically calculates the optimal route. For robotic systems, this means that the operator simply enters the desired start and end positions, and the control system takes care of the rest.

Wittmann Battenfeld, which has fully embraced Industry 4.0 connectivity across its portfolio of injection molding and auxiliary machines over the past several years, employs AI with its robots to monitor cycle times and control robot speeds outside the molding machine.

The company's machine-learning capabilities, HiQ Flow and CMS technology, will be on display at this year's K show, Oct. 19 to 26 in Düsseldorf, Germany. The speed of ROI can be as short as a few cycles with HiQ Flow, and the software can often be retrofitted to older injection molding machines equipped with a B8 machine control. A CMS Pro version will be available at a later date.

"The technology draws new conclusions from current parameters and, thus, becomes increasingly intelligent as it monitors performance," said Product Manager Christian Glueck. "We limit it to a methodical determination of parameters. Therefore, the time required to use the technology is minimal, as is the price."

Comparing AI and machine learning, Glueck said, "AI actually requires a much higher time investment and, correspondingly, a higher financial investment. A large number of parameters must be recorded from a running process, and the relevant parameters are determined on the basis of the deviations. These are compared with measurement data of the product."

Based on factors like changes in material, ambient temperature, machine wear, tool wear, and other influences, AI can determine which machine parameters need to be changed so that the product can be produced within its quality tolerances. This can take months, as errors must first occur in order to learn from them.

Wittmann co-funded such an assessment program with Austria's Montanuniversität Leoben, "but we found that the time needed to make it workable for production had to be questioned, because in addition to the long-term investigation of the process, you also need the manpower necessary to handle it."

The company's Eco-Mode saves wear and tear on the robot by ensuring it does not run faster than necessary, ultimately saving maintenance and energy costs. Offered standard on many Wittmann robots, Eco-Mode requires no special programming or interface with the injection molding machine (IMM) or operator/programmer, said Jason Long, National Sales Manager for robots and automation at Wittmann USA: "All the end user has to do is tell the robot how many seconds it should get back over the IMM before the mold opens."

Another Wittmann feature, Eco-Vac, conserves energy by setting a few parameters on the robot and allowing the robot to turn its vacuum circuits off and on. The robot monitors the vacuum level of the circuit used for picking the part out of the mold. If the robot senses the vacuum has dropped to a level where it could lose the part before it is told to release it, it turns the vacuum back on until the level is safe again, then shuts it off. This feature cuts the amount of compressed air each robot uses and could save customers hundreds of dollars a year per robot.
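That on/off behavior is essentially a hysteresis loop. The sketch below is purely illustrative, assuming invented thresholds and a simple sensor/pump interface; it is not Wittmann's implementation, just the shape of the logic being described:

```python
# Schematic hysteresis control for a vacuum circuit; all values are invented
# for illustration and are not Wittmann's actual parameters.
SAFE_LEVEL = 0.80      # below this fraction of full vacuum, the part could drop
RESTORE_LEVEL = 0.95   # once restored to this level, the pump can switch off

def next_pump_state(vacuum_level: float, pump_on: bool) -> bool:
    """Return the new pump state given the measured vacuum level."""
    if not pump_on and vacuum_level < SAFE_LEVEL:
        return True    # vacuum decayed: switch on before the part is lost
    if pump_on and vacuum_level >= RESTORE_LEVEL:
        return False   # safe level restored: switch off to save compressed air
    return pump_on     # otherwise hold the current state
```

Separating the two thresholds keeps the circuit from chattering around a single set point, which is where the compressed-air savings come from.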

As AI and machine learning are further leveraged to improve injection molding operations, simply gathering data is not enough to optimize processes, Faulhaber cautioned. "You also need the process expertise and domain knowledge. In the future, the evaluation of many data directly in the control unit will offer further added value."

Arburg uses AI to develop master models using experience and data collected over the years on process, material, and machinery, Faulhaber continued. "The customer could then sharpen the provided master model at the edge and optimize their processes. The in-house developed Gestica control system, the Arburg host computer system, and the arburgXworld customer portal give an advantage here."

One of Arburg's medium-term goals is to develop a system for digital twins of customized injection molding machines. "This will open up completely new possibilities for simulating the cycle and making energy predictions. In addition, 3D views and installation plans of the machine stored in the arburgXworld customer portal and in the control system support the operator," said Faulhaber.

View original post here:
How Artificial Intelligence Is Transforming Injection Molding - Plastics Today


What Is End-to-End Call Encryption? – UC Today

End-to-end call encryption (E2EE) is a secure communication method that prevents third parties from accessing data transferred via VoIP calls.

Most popular messaging and call service providers, including Facebook, WhatsApp, and Zoom, use this technology to prevent the exposure of user information. While data is transferred from one end system or device to another, it is encrypted on the sender's system or device and while in motion. Only the intended endpoint can decrypt the data; unauthorized third parties cannot listen in.

How Does End-to-end Call Encryption Work?

The technology keeps the content you share private and secure from one endpoint to another. The shared content will be unreadable if intercepted in transit. VoIP phones use digital, encrypted communication between your phone and the cellular telephone base station. Your voice is decrypted at the base station and sent over the telephone network.

End-to-end encrypted calls provide the gold standard for protecting communication.

The security behind end-to-end encryption comes from creating a public-private key pair. This approach, known as asymmetric cryptography, employs separate cryptographic keys for securing and decrypting data: public keys encrypt, private keys decrypt. Individual keys are generated for each person who joins. The public key is stored on a server, while the private key is stored on the device.
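To make the key-pair mechanics concrete, here is a minimal sketch in Python using the `cryptography` package. It illustrates the general pattern only, not any provider's actual protocol; real call apps layer authentication, key rotation, and media framing on top of this:

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each participant generates a key pair; private keys never leave the device.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Only the public halves are shared (e.g., stored on the provider's server).
alice_pub, bob_pub = alice_priv.public_key(), bob_priv.public_key()

def session_key(my_priv, peer_pub):
    """Combine my private key with the peer's public key into a call key."""
    shared = my_priv.exchange(peer_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"demo-call-session").derive(shared)

# Both endpoints derive the same symmetric key; the server never sees it.
key_a, key_b = session_key(alice_priv, bob_pub), session_key(bob_priv, alice_pub)
assert key_a == key_b

# Media frames are then encrypted symmetrically; a relay that sees only
# nonce + ciphertext cannot decrypt without one of the private keys.
nonce = os.urandom(12)
ciphertext = AESGCM(key_a).encrypt(nonce, b"one voice frame", None)
print(AESGCM(key_b).decrypt(nonce, ciphertext, None))
```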

In online communication there is an ISP, an intermediary, or various other organizations whose servers deliver data between the parties in an exchange. With end-to-end call encryption in place, these intermediaries cannot decrypt or eavesdrop on the data; only recipients holding the matching key can decrypt it.

Benefits of End-to-End Call Encryption

Personal data security and sensitive information are always an issue in online communication. E2EE completely encodes data, improving the security of calling services.

1. It keeps personal conversations private

It prevents unauthorized access to personal conversations. Although authorities may try to access personal or private exchanges, E2EE makes this impossible because the keys needed to decrypt them are missing. Digital signatures can also reveal whether content was manipulated during transmission and whether the recipient has authorized access.

2. It facilitates secure data exchange

A crucial advantage of end-to-end encryption is that unauthorized persons cannot access personal data. If a hacker intercepts the traffic, only unidentifiable strings of numbers and letters can be recognized. Private communication and other details are not easily read if intercepted by hackers or service providers.

3. It maintains data integrity

Data integrity is maintained because the key system prevents unauthorized devices from gaining access. Without E2EE, outside users can gain access to a piece of data and manipulate it before it reaches the recipient. End-to-end encryption denies them this access because they do not have the necessary key to access data in transit.

4. It makes calls tamper-proof

The decryption key is not transmitted; the recipient already has it. If encrypted data is tampered with in transit, the recipient will be unable to decrypt it, which exposes the tampering. End-to-end encryption helps organizations protect data by making it inaccessible to those who want to tamper with information.

Example of End-to-End Call Encryption: Microsoft Teams Calling

Teams supports both cell and landline calls, with built-in online meetings and audio and video calling for individuals and groups. It has a cloud-based phone system with advanced features, including call transfer, multilevel auto attendants, and a call queue. Teams secures the following during an end-to-end encrypted call: audio, video, and screen-sharing content. Call participants at both ends of the Teams session must turn on the E2EE setting, and the app will secure users' presence status.

Keep in mind that several advanced features are not available during end-to-end encrypted calls, such as live captions and transcription; call transfer, merge, and park; and call companion and transfer to another device.

For end-to-end encrypted calls on Teams, an administrator must first turn on the feature, and then device users must activate the settings locally. It is also possible to configure Teams E2EE using PowerShell.

Importance of End-to-end Call Encryption for Collaboration

E2EE is designed to protect users and their privacy by default. End-to-end call encryption keeps communication secure by ensuring that only those in the conversation can decrypt data, even if a server or network is compromised.

As VoIP calls become increasingly important for internal and external (i.e., customer-facing) communication, maintaining data privacy is essential. Sensitive information is often shared during these exchanges, and E2EE increases stakeholder confidence in communication systems while allowing flexible information sharing from any location. For these reasons, Zoom, too, launched end-to-end encrypted phone calls in September 2021 in addition to a Bring Your Own Key (BYOK) offering that allows users to choose their encryption keys.

Read more:
What Is End-to-End Call Encryption? - UC Today


A new vulnerability in Intel and AMD CPUs lets hackers steal encryption keys – Ars Technica

Microprocessors from Intel, AMD, and other companies contain a newly discovered weakness that remote attackers can exploit to obtain cryptographic keys and other secret data traveling through the hardware, researchers said on Tuesday.

Hardware manufacturers have long known that hackers can extract secret cryptographic data from a chip by measuring the power it consumes while processing those values. Fortunately, the means for exploiting power-analysis attacks against microprocessors is limited because the threat actor has few viable ways to remotely measure power consumption while processing the secret material. Now, a team of researchers has figured out how to turn power-analysis attacks into a different class of side-channel exploit that's considerably less demanding.

The team discovered that dynamic voltage and frequency scaling (DVFS), a power and thermal management feature added to every modern CPU, allows attackers to deduce changes in power consumption by monitoring the time it takes for a server to respond to specific, carefully made queries. The discovery greatly reduces what's required. With an understanding of how the DVFS feature works, power side-channel attacks become much simpler timing attacks that can be carried out remotely.

The researchers have dubbed their attack Hertzbleed because it uses the insights into DVFS to expose, or bleed out, data that's expected to remain private. The vulnerability is tracked as CVE-2022-24436 for Intel chips and CVE-2022-23823 for AMD CPUs. The researchers have already shown how the exploit technique they developed can be used to extract an encryption key from a server running SIKE, a cryptographic algorithm used to establish a secret key between two parties over an otherwise insecure communications channel.

The researchers said they successfully reproduced their attack on Intel CPUs from the 8th to the 11th generation of the Core microarchitecture. They also claimed that the technique would work on Intel Xeon CPUs and verified that AMD Ryzen processors are vulnerable to the same SIKE attack used against Intel chips. The researchers believe chips from other manufacturers may also be affected.

In a blog post explaining the finding, research team members wrote:

Hertzbleed is a new family of side-channel attacks: frequency side channels. In the worst case, these attacks can allow an attacker to extract cryptographic keys from remote servers that were previously believed to be secure.

Hertzbleed takes advantage of our experiments showing that, under certain circumstances, the dynamic frequency scaling of modern x86 processors depends on the data being processed. This means that, on modern processors, the same program can run at a different CPU frequency (and therefore take a different wall time) when computing, for example, 2022 + 23823 compared to 2022 + 24436.

Hertzbleed is a real, and practical, threat to the security of cryptographic software. We have demonstrated how a clever attacker can use a novel chosen-ciphertext attack against SIKE to perform full key extraction via remote timing, despite SIKE being implemented as constant time.
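To make the measurement side of this concrete, the toy sketch below (our illustration, not the researchers' code) shows the basic pattern behind a remote timing attack: run the same operation many times on different inputs and compare the timing distributions. A Python toy like this will almost certainly show no DVFS-induced gap in practice; Hertzbleed needed carefully chosen ciphertexts and very large sample counts, but the statistical approach is the same:

```python
import statistics
import time

def time_workload(workload, arg, iters=100_000):
    """Wall-clock time for many repetitions of the same computation."""
    start = time.perf_counter()
    for _ in range(iters):
        workload(arg)
    return time.perf_counter() - start

def work(x):
    # Stand-in for a server-side operation whose power draw (and, under
    # DVFS, its clock frequency) may depend on the data being processed.
    y = x
    for _ in range(64):
        y = (y * x) & 0xFFFFFFFFFFFFFFFF
    return y

# The attacker collects many samples per candidate input; a consistent
# wall-time gap between the medians is the "frequency side channel".
low = [time_workload(work, 0x1) for _ in range(15)]
high = [time_workload(work, 0xFFFFFFFFFFFFFFFF) for _ in range(15)]
print("median(low): ", statistics.median(low))
print("median(high):", statistics.median(high))
```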

Intel Senior Director of Security Communications and Incident Response Jerry Bryant, meanwhile, challenged the practicality of the technique. In a post, he wrote: "While this issue is interesting from a research perspective, we do not believe this attack to be practical outside of a lab environment. Also note that cryptographic implementations that are hardened against power side-channel attacks are not vulnerable to this issue." Intel has also released guidance here for hardware and software makers.

Neither Intel nor AMD is issuing microcode updates to change the behavior of the chips. Instead, they're endorsing changes that Microsoft and Cloudflare made to their PQCrypto-SIDH and CIRCL cryptographic code libraries, respectively. The researchers estimated that the mitigation adds a decapsulation performance overhead of 5 percent for CIRCL and 11 percent for PQCrypto-SIDH. The mitigations were proposed by a different team of researchers who independently discovered the same weakness.

AMD declined to comment ahead of the lifting of a coordinated disclosure embargo.

Visit link:
A new vulnerability in Intel and AMD CPUs lets hackers steal encryption keys - Ars Technica


HelloXD ransomware bulked up with better encryption, nastier payload – The Register

Windows and Linux systems are coming under attack by new variants of the HelloXD ransomware that includes stronger encryption, improved obfuscation and an additional payload that enables threat groups to modify compromised systems, exfiltrate files and execute commands.

The new capabilities make the ransomware, first detected in November 2021, and the developer behind it even more dangerous, according to researchers with Palo Alto Networks' Unit 42 threat intelligence group. Unit 42 said the HelloXD ransomware family is in its initial stages, but it's working to track down the author.

"While the ransomware functionality is nothing new, during our research, following the lines, we found out the ransomware is most likely developed by a threat actor named x4k," the researchers wrote in a blog post.

"This threat actor is well known on various hacking forums, and seems to be of Russian origin. Unit 42 was able to uncover additional x4kactivity being linked to malicious infrastructure, and additional malware besides the initial ransomware sample, going back to 2020."

The analysts wrote that the malware author, or authors, are "now expanding into the ransomware business to capitalize on some of the gains other ransomware groups are making."

This comes as both ransom demands and ransoms paid are increasing: demanded ransoms rose 144 percent year over year in 2021, reaching about $2.2 million, while the average ransom paid jumped 78 percent between 2020 and 2021, to $541,010, according to Unit 42's latest annual ransomware report. The incidence of stolen data being released publicly climbed 85 percent year over year, the report found.

The ransomware family is based on the Babuk (or Babyk) source code that was leaked on a Russian-language forum in September 2021. The group runs double-extortion campaigns, exfiltrating corporate data before encrypting it. Rather than threatening to release the files on a public leak site if the ransom isn't paid, the attackers instead direct victims to negotiate via the Tox chat service.

However, in the newer variants, the ransomware note also links to an onion domain for messaging. That said, the researchers wrote that as of now, the onion site is down, which could mean that it's currently under construction.

"The ransomware creates an ID for the victim which has to be sent to the threat actor to make it possible to identify the victim and provide a decryptor," they wrote. "The ransom note also instructs victims to download Toxand provides a Tox Chat ID to reach the threat actor. Tox is a peer-to-peer instant messaging protocol that offers end-to-end encryption."

Other ransomware groups, including those using LockBit 2.0, also use Tox Chat to communicate, they noted.

A key change in the latest version of HelloXD is the encryption algorithm. Unit 42 researchers wrote that they have seen two publicly available versions of HelloXD, an indication that the code is still under development. The first version uses Curve25519-Donna and a modified HC-128 algorithm to encrypt data in the files and is the less modified of the two versions relative to the original Babuk code.

In the most recent version, dubbed HelloXD version 2 by Unit 42, the developer changed the encryption algorithm, swapping the modified HC-128 for the high-speed Rabbit symmetric cipher, again paired with Curve25519-Donna. In addition, the developer changed the file marker from a coherent string to random bytes.

"Both versions have been compiled with the same compiler (believed to be GCC 3.x and above based on the mangling of export names), resulting in very similar exports between not only the ransomware variants, but also other malware that we have linked to the potential author," the researchers wrote.

The most significant change between the two versions was the introduction within version 2 of an additional payload: a variant of the open-source MicroBackdoor, encrypted with the WinCrypt API. The malware enables an attacker to browse the compromised file system, upload and download files, and execute code remotely (RCE). The malware can also remove itself from the system. The fact that the backdoor is delivered with the ransomware is also unusual.

"As the threat actor would normally have a foothold into the network prior to ransomware deployment, it raises the question of why this backdoor is part of the ransomware execution," they wrote. "One possibility is that it is used to monitor ransomed systems for blue team and incident response (IR) activity, though even in that case it is unusual to see offensive tools dropped at this point in the infection."

The researchers found a hardcoded IP address used as the command-and-control (C2) server, which accelerated their hunt for the probable bad actor behind HelloXD. Through the IP address, they found an email address that they linked to other domains, and they continued to follow the breadcrumbs through other malicious IPs, VirusTotal graphs, and additional infrastructure and malware hosted on other domains, many of which used the x4k name.

The path followed through various graphs to a GitHub account, Russian-language hacking forums, other sites that referred to x4k and other aliases such as uKn0wn seen in the HelloXD samples. That was followed by the discovery of other GitHub accounts, another alias (Ivan Topor) and a YouTube account with another alias (Vanya Topor) that linked to videos in which the miscreant showed how he performed particular actions.

"The videos found gave us insight into x4koperations before moving into ransomware activity specifically," the researchers wrote. "We learned how this threat actor leverages Cobalt Strike for his operations, including how to set up Beacons as well as how to send files to compromised systems. In one of the videos, we actually observed the threat actor performing a DNS leak test on his Android phone."

The bad actor also often alluded to a "ghost" theme, similar to what the researchers saw in some earlier HelloXD ransomware samples. Most of the videos and written content are in Russian. That, along with some mistakes he made, convinced Unit 42 that x4k is from Russia.

The rest is here:
HelloXD ransomware bulked up with better encryption, nastier payload - The Register


This tiny, encrypted drive can fit on your keyring – ZDNet

While having access to an encrypted SSD -- like the new Kingston IronKey Vault Privacy 80 -- is nice, sometimes you want something smaller and more convenient to carry around with you.

Enter the Kingston IronKey Vault Privacy 50.

Kingston IronKey Vault Privacy 50

While the IronKey Vault Privacy 80 is an SSD, the IronKey Vault Privacy 50 is a USB flash drive, and as such is a lot smaller and more suitable for smaller amounts of data that you want to have with you.

Packed into what looks like a standard yet FIPS 197 certified USB flash drive is an XTS-AES 256-bit hardware encryption engine. The business end features a regular USB-A connector compatible with USB 3.2 Gen 1, giving it broad compatibility (if you want to use it on a device with USB-C ports, you'll need a dongle or an adapter) and good performance.

Kingston IronKey Vault Privacy 50

The drive is compatible with Windows and Mac, and you have to run an application on the drive to unlock the drive and access your data. Unlike the IronKey Vault Privacy 80, this drive is not operating system independent.

Kingston IronKey Vault Privacy 50 features a cap retainer on the rear and a lanyard hole to allow you to put the drive on your keys

The drive offers built-in protection against attacks such as BadUSB, as well as brute-force attacks.

Speeds for the drive are rated at around 250MB/s read and 180MB/s write; in testing, I was able to get read speeds of 225MB/s and write speeds of 150MB/s.
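If you want to run a similar check yourself, a crude sequential-throughput test is easy to script. The sketch below assumes a hypothetical mount path and measures sequential writes and reads; note that the OS page cache will inflate the read figure unless the file is bigger than RAM or the cache is dropped first:

```python
import os
import time

PATH = "/mnt/ironkey/speedtest.bin"   # hypothetical mount point for the drive
SIZE_MB = 1024                        # total test size
CHUNK = 4 * 1024 * 1024               # 4 MiB per write
buf = os.urandom(CHUNK)

# Sequential write, fsync'd so the data actually reaches the drive.
t0 = time.perf_counter()
with open(PATH, "wb") as f:
    for _ in range(SIZE_MB * 1024 * 1024 // CHUNK):
        f.write(buf)
    f.flush()
    os.fsync(f.fileno())
print(f"write: {SIZE_MB / (time.perf_counter() - t0):.0f} MB/s")

# Sequential read; only meaningful if the page cache is cold.
t0 = time.perf_counter()
with open(PATH, "rb") as f:
    while f.read(CHUNK):
        pass
print(f"read:  {SIZE_MB / (time.perf_counter() - t0):.0f} MB/s")

os.remove(PATH)
```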

The drive features a lanyard hole that you can use to attach your drive to your keys, and the cap fits well and clips onto the rear of the drive when in use, giving it a fighting chance against loss.

Kingston IronKey Vault Privacy 50 (left) and Kingston IronKey Vault Privacy 80 (right)

So that you only have to buy the storage you need, the IronKey Vault Privacy 50 comes in a range of sizes, from 8GB all the way up to 256GB, which offers great flexibility.

I can't fault the Kingston IronKey Vault Privacy 50. It's an excellent way to secure your data when out and about, and comes highly recommended.

See more here:
This tiny, encrypted drive can fit on your keyring - ZDNet


Android Users are Getting the Thunderbird Email: Open Sourced with End-to-End Encryption – Tech Times

Urian B., Tech Times 14 June 2022, 11:06 am

An open-source email application will finally be making its way to Android. Users of the iOS alternative will finally gain access to the Thunderbird Android email app, an open-sourced application that supports end-to-end encryption.

According to the story by ZDNet, thanks to the K-9 Mail Android email app project, users will finally be able to use the Thunderbird app not just on the desktop but also on Android devices.

The Mozilla Foundation moved Thunderbird to its own subsidiary two years ago. The open-sourced email solution came under MZLA Technologies Corporation, much as Firefox sits under another Mozilla Foundation subsidiary, Mozilla Corporation.

Moving Thunderbird to its subsidiary allowed the project to create its own path. On top of this, the project added new features, including OpenPGP end-to-end encryption along with a mobile app that ZDNet described as "long-awaited."

The email service's team revealed that talks of a mobile version of the service started as far back as 2018. The talks were held between Ryan Lee Sipes, the product manager of Thunderbird, and Christian Ketterer, K-9's lead maintainer.

K-9 Mail is an already existing email app on the Google Play Store with five million downloads. Instead of building an app from scratch, both parties plan on merging Thunderbird's systems and features with the existing app.

Four years later, the best decision, according to the two, was to simply have K-9 join the open-sourced service instead of building an app from the start. As per Thunderbird, a lot of users have asked for a mobile experience for the open-sourced email service.


The team then announced that they plan to do this by helping K-9 provide an Android version of the Thunderbird email open-source service. K-9 is tasked with supplementing the open-sourced service and improving the email experience for mobile users.

As per the Thunderbird team, their commitment to the desktop version of the email service remains the same, and the team is committed to making the most of both worlds. As per ZDNet, this means that K-9 will eventually take on the name and branding of the original service.

Before this becomes possible, K-9 will have to align with the visual appearance and feature set of Thunderbird. To do this, the team says it is devoting funding and development effort to continually improving K-9 Mail.



The rest is here:
Android Users are Getting the Thunderbird Email: Open Sourced with End-to-End Encryption - Tech Times


For resiliency, the Army may look to rely more on commercial systems than SIPRNet, NIPRNet – FedScoop

Written by Mark Pomerleau Jun 15, 2022 | FEDSCOOP

The Army's top IT official on Wednesday questioned the utility of the service's current classified and unclassified network configurations and instead pointed to the possibility of relying on commercial systems that could be more resilient in future conflicts against sophisticated adversaries.

Adversaries will contest U.S. forces unlike ever before, straining the network and making it harder for data to be passed back and forth and accessed at the right time, said Army CIO Raj Iyer. As a result, forces must be more adaptable and take advantage of various means for communication and transport, such as commercial solutions.

"Our strategy again here is to get to greater resiliency, with commercial transport, using dark fiber, a heck of a lot more encryption when it comes to secret. The need for us to have physical separation of data and networks for SIPR, or SIPR to ride on NIPR, those days are gone," Iyer said during a presentation hosted by GovConWire.


Iyer was referencing SIPRNet, the Secure Internet Protocol Router Network, which is the Pentagon's network for handling secret classified information, and NIPRNet, the Non-classified Internet Protocol Router Network, which handles unclassified information.

"What we have been able to show, if you have the right encryption in place that's quantum-resistant, and we were able to use solutions like commercial solutions for classified, and we have shown that today and validated that. It really questions what do we need a SIPRNet for? Why do we need a whole separate network, that we can actually do pretty damn well with encryption," he said. "Then absolutely the same question on NIPRNet. If we move all of our data and applications to the cloud and if I can get to a virtual desktop in the cloud and I can use any open available internet to be able to access all of that through any device, then what do we really need the NIPRNet for?"

These questions arise as the Army is developing its unified network plan, part of its larger digital transformation strategy, which aims to synchronize and connect the service's enterprise and tactical networks.

Currently, silos exist between the two, creating barriers for troops who want to pass data across echelons or even theaters. This especially creates problems when troops move from one theater to another, as seen most recently in Afghanistan.

"I saw forces come into the theater that were not able to join the network right away. It was really, really cumbersome for everything that we needed to do while I was there," Brig. Gen. Jeth Rey, director of the Army Network Cross-Functional Team, said in October.

For Iyer, the Army needs to question the status quo to evolve and succeed in future battlefield environments.

"We're thinking out of the box. I'm not saying you have all the solutions, but we really, going back to the direction I have from my boss, this is how we're going to transform," Iyer said about the Army's modernization approach and the potential for using more commercial solutions in an attempt to be more resilient against adversary disruptions.

He added that if the Army doesn't question the status quo, it will be limited by aging technologies and architectures from the past.

One such example from the Ukraine-Russia conflict that Iyer and others have pointed to is SpaceX's Starlink satellite constellation, which provides internet coverage.

After Russian attempts to jam the system in Ukraine, Starlink reported the following day that it had added new lines of code that rendered the jamming ineffective.

"We saw how Starlink is actually tremendously helping establish a communications network in an environment that we thought would be degraded on day one," Iyer said in April.

Army forces must be able to communicate and pass data in denied and degraded environments in the future.

"As we get into more of a distributed command and control structure, what we really don't want is a massive command post that has all of this IT in one place, where we become a bullseye for our enemies," he said Wednesday. "Moving to the distributed C2 means that we're going to have to leave data in multiple places with greater resiliency, we're going to have to rely on all kinds of transport, not just MILSATCOM, but commercial SATCOM as well, and this is where the example I gave you with Starlink and how we're using that today in Europe is a great example. All of this coupled with compute at the edge is going to be absolutely critical in terms of supporting tactical operations."

Continued here:
For resiliency, the Army may look to rely more on commercial systems than SIPRNet, NIPRNet - FedScoop


French Data Protection Authority publishes Q&A regarding use of Google Analytics – JD Supra

Background

Following complaints from the NOYB association regarding the use of the Google Analytics audience measurement solution, the French Data Protection Authority (CNIL) issued several formal notices to French companies using this solution on their websites. These decisions were issued in the context of similar decisions from other European data protection authorities, such as the Austrian one, and follow the Schrems II ruling of the ECJ, which invalidated the Privacy Shield and imposed additional measures on top of Standard Contractual Clauses to cover transfers of personal data outside the EU.

The CNIL made only one of these decisions public, in February 2022, in anonymized form. In this decision, the CNIL considers that the use of the Google Analytics audience measurement solution is not GDPR compliant, because personal data collected through the solution's cookies are transferred to the United States without sufficient measures to prevent possible access by the authorities. Although Google has made efforts to deploy additional measures in light of the Schrems II ruling, the CNIL considers these still insufficient.

The CNIL recommends anonymizing personal data collected through audience measurement cookies. That way, the solution can benefit from the consent exemption applicable to audience measurement cookies in France. The consent exemption applies only to tools complying with a set of cumulative criteria published by the CNIL, one of them being that the tool produces only anonymous statistics. The controller must, however, still ensure that transfers outside the EU are compliant.

To provide more background on these decisions and offer possible solutions, the CNIL released a Q&A on June 7, 2022 on the use of Google Analytics, as well as guidance on the use of a compliant audience measurement solution.

The Q&A is short and does not provide much more information than already provided in the anonymized decision published in February 2022. All French companies among the 101 complaints of the NOYB association have now received a formal notice from the CNIL regarding the use of Google Analytics and they have 1 month (renewable) to comply.

The goal of this Q&A is for the CNIL to make clear that the prescriptions of the only published decision (February 2022, anonymized) must be understood as applying to all companies using the solution, not only to those that have received a formal notice.

The CNIL considers that any additional legal, organizational, and technical safeguards deployed by Google, such as Standard Contractual Clauses and additional measures, will still not be sufficient to prevent access by non-EU authorities, as Google remains subject to US jurisdiction.

The CNIL categorically refuses a risk-based approach and considers that the risks remain as long as access to the data is possible: according to the CNIL, even if access by US authorities to data collected through the Google Analytics solution is unlikely (i.e., in practice, authorities are not making such data access requests), as long as access is technically possible, technical measures are necessary to make such access impossible or ineffective.

Several options for a compliant use of the Google Analytics audience measurement solution are raised in the Q&A, but most are considered insufficient by the CNIL; it seems that only the proxy solution is acceptable:

Modifying the settings of the Google Analytics solution (e.g., changing how the IP address is processed, hosting personal data only within the EU, etc.) is not sufficient according to the CNIL, as long as access by non-EU authorities remains possible and the data make it possible to identify the user and track his or her navigation from one website to another.

The CNIL highlights that encryption is only an acceptable solution if the encryption keys are kept under the sole control of the data exporter or by other entities established within the EU or in adequate countries.

Regarding Google Analytics, the CNIL considers that encryption of data is not sufficient as in practice Google LLC is the entity that:

The CNIL concludes that since Google LLC still has the possibility to access the data in clear, the encryption measures cannot be considered effective in case of requests from the US authorities. The conclusion to be drawn is therefore that encryption would be an appropriate measure only if Google LLC had access neither to the data in clear nor to the encryption keys.
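The CNIL's reasoning implies a concrete design rule: encrypt in the EU, before export, with keys the non-EU processor never receives. Here is a minimal sketch of that pattern in Python (key handling and field names are illustrative assumptions, not a description of any Google product):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Key generated and held by the EU data exporter (in practice, in an
# EU-hosted KMS/HSM); it is never shared with the non-EU processor.
exporter_key = AESGCM.generate_key(bit_length=256)

def encrypt_before_export(record: bytes) -> bytes:
    """Client-side encryption: only ciphertext leaves the exporter's control."""
    nonce = os.urandom(12)
    return nonce + AESGCM(exporter_key).encrypt(nonce, record, None)

blob = encrypt_before_export(b'{"user": "abc", "page": "/checkout"}')
# `blob` can now sit on a non-EU provider's servers: a disclosure request
# against that provider yields only unintelligible bytes, because the key
# stayed in the EU.
```

The catch, and the reason this does not rescue Google Analytics, is that an analytics provider has to compute on the data; a service that must read the data in clear cannot be handed only ciphertext, which is why the CNIL steers controllers toward the proxy approach below.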

Collecting users' consent for data transfers is not sufficient: although this is one of the safeguards listed in Article 49 of the GDPR, the EDPB considers it applicable only to single and non-recurring transfers, so it cannot be used as a permanent solution for systematic transfers of personal data.

The CNIL seems to identify only one possible solution: the use of a proxy. Indeed, as per the CNIL, the main issue is the direct contact, through an HTTPS connection, between users' devices and Google's servers, which enables collection of users' IP addresses as well as much other information that can lead to re-identification of the user. Only solutions that break this contact between the device and the server, such as a proxy, can address this issue, as data would be pseudonymized before being transferred outside the EU.

The proxy, or similar solution, must comply with the EDPB criteria, and in particular:

In addition, in the guidance on the use of a compliant audience measurement solution published together with the Q&A, the CNIL also underlines that the use of a proxy requires specific measures (e.g., no transfer of the IP address to the servers of the measurement tool, replacement of the user identifier by the proxy server, no collection of cross-site identifiers, etc.) and that the proxy server must be hosted in conditions that guarantee that the data it processes will not be transferred outside the EU.
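As a rough illustration of those measures (the endpoint URL and field names are hypothetical; this is a sketch, not a CNIL-certified configuration), an EU-hosted proxy can strip the client IP, replace the user identifier with a salted pseudonym, and drop cross-site identifiers before forwarding the hit:

```python
import hashlib
import requests  # any HTTP client works; shown for illustration

MEASUREMENT_ENDPOINT = "https://analytics.example.com/collect"  # hypothetical
DAILY_SALT = "rotate-me-every-day"  # rotated so pseudonyms cannot be linked over time

def pseudonymize(user_id: str) -> str:
    return hashlib.sha256((DAILY_SALT + user_id).encode()).hexdigest()

def forward_hit(hit: dict) -> None:
    """Sanitize an incoming analytics hit, then relay it from the EU proxy.

    The proxy, not the visitor's browser, opens the connection to the
    measurement server, so the visitor's IP address never reaches it.
    """
    sanitized = {
        "page": hit.get("page"),
        "event": hit.get("event"),
        "uid": pseudonymize(hit.get("uid", "")),  # no stable identifier exported
        # deliberately omitted: client IP, cross-site identifiers, full referrer
    }
    requests.post(MEASUREMENT_ENDPOINT, json=sanitized, timeout=5)
```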

In practice, all these criteria make the proxy difficult to apply from a technical standpoint. The CNIL itself recognizes that this may be very costly and complex in practice, and it ultimately recommends using alternative solutions to Google Analytics.

The CNIL has published on its website a list of cookie solutions that are exempt from consent and that it considers compliant when properly configured. There are currently 18 certified solutions. The CNIL, however, indicates that these solutions have not been assessed on the issue of international transfers, which means that, although they are listed by the CNIL as compliant, they cannot be used as such: data transfers must first be verified and Schrems II safeguards applied.

The solutions offered by the CNIL remain difficult to apply in practice, and no workable solution is ultimately offered to companies. As a next step, this Q&A should be seen as a reminder for companies to assess their audience measurement solution and consider whether the measures put in place to limit access to data by authorities are sufficient.

See the original post:
French Data Protection Authority publishes Q&A regarding use of Google Analytics - JD Supra
