
Get in the Cloud: Why Thousands of Researchers are Joining … – National Institutes of Health (.gov)

Posted November 1, 2023

By Deborah Guadalupe Duran, Ph.D., Senior Advisor, Data Science, Analytics and Systems, National Institute on Minority Health and Health Disparities

Luca Calzoni, M.D., MS, Ph.D. Cand., Physician and Data Scientist, National Institute on Minority Health and Health Disparities

Since February 17, 2023, thousands of researchers, educators, students, and community members have gathered at ScHARe Think-a-Thons to join a paradigm shift in health disparities and health outcomes research.

These Think-a-Thons are part of the Science Collaborative for Health disparities and Artificial intelligence bias Reduction (ScHARe), a new NIH resource designed to expand access to large health disparities and health outcomes datasets and the data science skills required to analyze them. ScHARe Think-a-Thons offer free training that enables participants to link cloud-based datasets, access federated data, begin learning the Python programming language, and more.

And we're just getting started. As we prepare for exciting new phases in the Think-a-Thon series, we want to highlight what participants have gained so far and preview the many opportunities ahead.

Training Think-a-Thons: Successfully Upskilling to Advance Careers

People without access to data science tools have gained access for the first time through ScHARe's Training Think-a-Thons. More than two-thirds of participants report having little to no prior experience in cloud computing or programming languages, and many belong to populations that are historically underrepresented in these fields. In participant polls, nearly all respondents agree that our Training Think-a-Thons have taught them how to access and work with the large datasets hosted on ScHARe and on Terra, the web platform where ScHARe is housed. With these new skills and membership in the ScHARe learning community, participants are poised to make significant novel contributions to health disparities and health outcomes research.

(Anyone can benefit from Training Think-a-Thons: recordings are posted online two weeks after sessions conclude.)

Training Think-a-Thons: Asking New Questions to Get Better Answers

Think-a-Thon participants are enthusiastic about a paradigm shift in health disparities and health outcomes research. More than 90 percent report that they want to learn to use AI tools and cloud computing to conduct new Big Data-driven research in these areas. As one Think-a-Thon participant noted, researchers can use ScHARe's advanced computing tools and large datasets to figure out the basis of disparities (what mechanisms really drive them) and thus yield powerful new approaches to persistent public health challenges.

Research Collaboration Think-a-Thons: People from Many Disciplines and Career Levels Are Joining to Advance Health Disparities & Health Outcomes Research

The ScHARe community includes people from many backgrounds and career levels: Python programmers, social scientists, community health workers, and more. Two-thirds have expressed interest in forming cross-disciplinary, multi-level collaborations to generate publishable research using cloud computing and AI tools. Starting in 2024, ScHARe Think-a-Thons will begin directly supporting these research collaborations, with dedicated space for participants to form new research teams or to introduce their existing research teams to ScHARe.

The first research collaboration Think-a-Thons will focus on Individual Social Determinants of Health (SDOH), structural SDOH, and health outcomes. Think-a-Thon participants have a wide range of research interests: the intersection of the gastrointestinal microbiome, culture, and cognition; demographic impacts on gene expression; improving LGBTQ research through linked datasets; using informatics to improve community-based care; and more. All projects are welcome!

Coming in 2024: AI Bias Mitigation

Many Think-a-Thon participants want to tackle an important challenge: bias mitigation and ethical Artificial Intelligence strategies. Several ScHARe Think-a-Thons will focus on this goal.

AI is a key tool for analyzing large datasets and building new health-related systems, such as algorithms that assess patient risk and guide care. However, its use can also reproduce and amplify existing biases in data. For example, several populations are underrepresented in biomedical and demographic datasets, and when these datasets are biased, AI can also reflect these biases, ultimately leading to incomplete research, missed or delayed diagnoses, and worse health outcomes.

In 2024, we'll launch Think-a-Thons to address this challenge. By tapping the diverse expertise of members of the growing ScHARe community, we'll be able not only to share best-practice tools and workflows but also to develop innovative solutions.

Opportunities for Tailored Think-a-Thons

We've heard participants ask how to use ScHARe in unique settings, such as college-level research methods courses. In May 2023, we responded by launching special tailored Think-a-Thons.

At our first tailored Think-a-Thon, with more than 60 educators, we outlined how ScHARe's free tools could help low-resourced colleges and community colleges teach data science and connect their students to this important field. This August, we offered a second tailored Think-a-Thon highlighting how ScHARe can be useful to Tribal Colleges and Universities and Native American-serving institutions. All Think-a-Thons are free and open to the public.

We welcome suggestions: let us know if there's a tailored Think-a-Thon you would like to see.

Join Us!

In just their first few months, ScHARe Think-a-Thons have become a space for a diverse community of individuals interested in health disparities, health outcomes, and AI bias research. Regardless of your knowledge of data science or cloud computing, you can join Think-a-Thons to build new skills and meet new collaborators. Women and members of groups traditionally underrepresented in data science and health research are especially encouraged to participate. Join us in making data science work for everyone!

Original post:
Get in the Cloud: Why Thousands of Researchers are Joining ... - National Institutes of Health (.gov)


4 Fast-Growing Cloud Computing Companies to Power Your … – The Smart Investor

The last three years have seen a sharp surge in digitalisation by individuals and companies.

With the proliferation of smartphones and laptops, along with better connectivity with the advent of 4G and 5G networks, staying connected on the go is now the norm.

As investors, it is important to identify a growing, sustainable trend that you can base your investment ideas on.

Interest in generative artificial intelligence (AI) means that the cloud computing industry should see significant future growth.

Gartner, an independent research firm, estimated that worldwide spending on cloud services grew 20.4% year on year to US$494.7 billion in 2022, with spending set to hit US$600 billion this year.

We feature four promising and fast-growing cloud firms that could end up on your buy watchlist.

Microsoft is a vendor of computer hardware, software, and gaming systems.

The company is also a leading cloud service provider and is famous for its Office 365 suite of word processing and spreadsheet software that is used by millions of people and corporations.

Microsoft reported a mixed set of results for its fiscal 2023 (FY2023) ending 30 June 2023.

Revenue rose 6.9% year on year to US$211.9 billion while operating profit increased by 6.2% year on year to US$88.5 billion.

Net profit, however, slid by 0.5% year on year to US$72.4 billion because of higher income tax provisions.

Despite the lower profit, Microsoft reported a 21% year-on-year jump in cloud revenue, demonstrating continued demand for its cloud services.

For FY2023, the software company also generated US$59.5 billion of free cash flow.

This momentum has continued into the first quarter of fiscal 2024 (1Q FY2024).

Revenue increased by 12.8% year on year to US$56.5 billion with operating profit climbing 25% year on year to US$26.9 billion.

Net profit improved by 27% year on year to US$22.3 billion.

Microsoft cloud revenue for the quarter increased by 24% year on year.

The gross margin for the division also increased slightly year-on-year from 72% to 73%.

Alphabet is a technology holding company and the parent company of search engine Google.

The Californian company also offers cloud services (Google Cloud) and owns video-sharing site YouTube.

For its 2023 third quarter (3Q 2023) result, Alphabet saw revenue rise 11% year on year to US$76.7 billion.

Operating profit jumped 24.6% year on year to US$21.3 billion while net profit surged 41.5% year on year to US$19.7 billion.

Google Cloud performed well for the quarter, notching up a 22.5% year-on-year revenue increase to US$8.4 billion.

The division also chalked up an operating profit of US$266 million, its second consecutive quarter of profitability after 2Q 2023's operating profit of US$395 million.

Alphabet is also a very strong free cash flow generator.

For the first nine months of 2023 (9M 2023), the technology company generated US$61.6 billion of free cash flow, up 40% year on year from US$44 billion in 9M 2022.

Oracle develops and markets computer software applications for businesses and provides cloud services for its clients.

For its fiscal 2023 ending 31 May 2023, total revenue rose 18% year on year to US$50 billion, an all-time high.

Operating profit climbed 20% year on year to US$13.1 billion.

Net profit surged 27% year on year to US$8.5 billion.

The cloud company also generated a healthy positive free cash flow of US$8.5 billion for FY2023.

For its first quarter of fiscal 2024 (1Q FY2024) ending 31 August 2023, revenue increased 9% year on year to US$12.5 billion with cloud revenue shooting up 30% year on year to US$4.6 billion.

Operating profit climbed 26% year on year for the quarter to US$3.3 billion while net profit leapt 56% year on year to US$2.4 billion.

Amazon is one of the largest e-commerce players in the world and operates cloud services under Amazon Web Services (AWS).

The company reported an encouraging set of results for the first nine months of 2023 (9M 2023).

Total sales rose 11% year on year to US$404.8 billion with operating profit more than doubling year on year to US$23.6 billion.

Amazon reported a net profit of US$19.8 billion for 9M 2023, reversing last year's US$3 billion net loss.

Its AWS segment reported higher net sales of US$66.6 billion, up from US$58.7 billion a year ago.

However, operating income for the division dipped slightly from US$17.6 billion to US$17.4 billion for 9M 2023.


Disclosure: Royston Yang owns shares of Alphabet.

Original post:
4 Fast-Growing Cloud Computing Companies to Power Your ... - The Smart Investor


Outdated cryptographic protocols put vast amounts of network traffic … – Help Net Security

Cryptography is largely taken for granted, rarely evaluated or checked, a practice that could have devastating consequences for businesses as attack surfaces continue to expand, the cost of a data breach rises year over year, and the age of quantum computing nears, according to Quantum Xchange.

Examining more than 200 terabytes of network traffic (the total sum of all packets, for all connections, between all pairs), up to 80% was found to have some defeatable flaw in its encryption, and 61% of the traffic was unencrypted.

Of the single, bi-directional TCP or UDP connections analyzed, 56.5% are unencrypted, compared to 43.4% that are encrypted.

Old, outdated cryptographic protocols such as TLS 1.0 and SSL v3 are still in wide use today, with industries like healthcare and higher education slow to change. More alarming still, up to 92% of all traffic on a hospital network uses no encryption at all. This suggests a laissez-faire attitude and a general reluctance to update working systems that are in production.

Strong cryptography is a basic requirement for insurance coverage. It is frightening to see healthcare falling so far behind.

45% of host pairs communicate via an unencrypted channel, and 87% of encrypted, host-to-host relationships still use TLS 1.2, demonstrating that a large migration to TLS 1.3 is still forthcoming; it is not a trivial upgrade given the significant differences between versions.
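Checking which TLS version a given server actually negotiates is straightforward. Here is a minimal sketch using only Python's standard library (the hostname is a placeholder):

```python
import socket
import ssl

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Connect to a server and report the TLS version it negotiates."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. 'TLSv1.3' or 'TLSv1.2'

print(negotiated_tls_version("example.com"))
```

Note that a default context refuses the very old protocols discussed below, which is itself a useful signal when auditing servers.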

Industries such as healthcare have a significant long tail of TLS 1.1 and 1.0 usage, and even SSL v3 can be found at scarily high volumes. This suggests an "if it ain't broke, don't fix it" attitude and a general reluctance to update working, albeit outdated, systems that are in production.

"These findings serve as a snapshot of what's taking place within enterprise systems worldwide," said Vince Berk, Chief Strategist at Quantum Xchange. "Zero trust is meaningless if your encryption is not bulletproof. We're trying to bring awareness to the here-and-now problem with cryptography so that organizations can shore up these weaknesses and better protect their systems from everyday cybersecurity risks and yet-to-be-discovered threats."

See the original post here:
Outdated cryptographic protocols put vast amounts of network traffic ... - Help Net Security


What is Moore’s law, and how does it impact cryptography? – Cointelegraph

Moore's law, explained

A fundamental concept in the technology sector, Moore's law foretells the exponential rise in computing power over time and is named after Gordon Moore.

Gordon Moore, the co-founder and emeritus chairman of Intel Corporation, proposed Moore's law in 1965. According to him, the number of transistors on microchips (the fundamental building blocks of electronic devices) will double roughly every two years while their production costs stay the same or even go down. The consequences of this exponential rise in processing capacity for several facets of an individual's digital life are vast.

How does Moore's law predict the exponential growth of computing power?
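In quantitative terms, the prediction is simple exponential growth. Here is a minimal sketch (the starting figure of roughly 2,300 transistors for Intel's 1971 4004 is a well-known historical fact, not from this article):

```python
def projected_transistors(initial_count: float, years: float) -> float:
    """Moore's law as a growth model: one doubling every two years."""
    return initial_count * 2 ** (years / 2)

# Starting from Intel's 4004 (1971), which had about 2,300 transistors.
for year in (1971, 1991, 2011, 2021):
    count = projected_transistors(2300, year - 1971)
    print(f"{year}: ~{count:,.0f} transistors")
```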

Cryptography is a crucial field where Moore's law is applied. As transistor density doubles, the computational capacity available for encryption and decryption grows along with overall processing power.

For instance, as computers get more powerful, cryptographic techniques that were formerly thought to be extremely secure may become vulnerable to attacks. Therefore, stronger encryption techniques and longer key lengths are being developed to guarantee the security and privacy of digital communications.

Moore's law explains the computing industry's tremendous improvements, making it possible to produce smaller, more powerful, energy-efficient electronic products. This technological advancement significantly impacts several industries, including entertainment and healthcare.

Additionally, it stimulates economic growth through innovation, creates new markets for goods and services, and increases the effectiveness of existing ones. Maintaining a competitive edge in the market requires keeping up with Moore's law, as those who adopt the newest technology get the most benefits.

Moore's law's exponential increase in processing power has both beneficial and detrimental effects on the encryption industry.

Cryptographers can create more advanced and reliable encryption methods by taking advantage of rising processing capabilities. They can develop encryption algorithms with larger key lengths and more difficult mathematical operations to make it more difficult for potential attackers to decrypt data. Additionally, improvements in cryptography may result in better cyber threat defense and improved security for sensitive data.

On the negative side, potential foes benefit from the quick increase in processing capability, which can decrease the time needed to break encryption keys and weaken security. Data secrecy may be at risk because once-secure cryptographic techniques may become outdated more quickly. To maintain efficient data protection, the field of cryptography must keep up with technological developments in computers.

Moore's law influences blockchain technology by presenting prospects for scalability, security, and energy efficiency, but it also raises issues that need to be resolved to maintain blockchain networks' decentralization and integrity.

Moore's law's prediction of a constant doubling of computing power makes it possible for blockchain networks to grow successfully, supporting higher transaction volumes and larger data sets.

Moore's law encourages the development of more secure cryptographic methods, even though it can provide prospective attackers with more computing capacity to attempt attacks on blockchains. Blockchain data can be protected using more robust encryption techniques, increasing its threat resistance and preserving system confidence.

Furthermore, the increased energy efficiency of hardware, driven by Moore's law, can potentially reduce the environmental footprint of blockchain networks. The development of specialized hardware, such as ASICs, benefits blockchain stability. However, Moore's law presents challenges like the potential centralization of blockchain networks.

Blockchain networks can store more data, such as smart contracts and transaction histories, without dramatically raising costs thanks to the growth in storage capacity facilitated by Moore's law. This makes it possible for blockchain technology to be used in more complex and robust ways than merely for cryptocurrencies.

Moore's law has influenced the development and general accessibility of cloud computing.

Moore's law significantly impacts the capabilities and architecture of cloud computing services. The ever-expanding capabilities of server technology allow cloud providers to offer increasingly powerful virtual machines and data storage at affordable prices.

It also encourages the broad use of cloud computing for various applications, from data storage and processing to machine learning and artificial intelligence. This allows organizations and individuals to find more effective and affordable solutions.

However, this increase in processing power also highlights how crucial data security and privacy are since more potent hardware may result in more sophisticated cyber threats and the requirement for enhanced encryption and security measures in the cloud.

Moore's law continues to be a major force behind technological advancement, even with modifications to account for the rapidly changing semiconductor industry, and it has considerable effects on the security and scalability of cryptocurrencies and blockchain networks.

Moore's law continues to be debated and discussed in the technology sector. It's important to note that Moore's original theory has already undergone changes in practice, and some experts contend that it no longer accurately captures the rate of advancement in semiconductor technology.

The underlying principles of constant technical growth and innovation continue to drive progress in semiconductor technology, even though the exponential increase in transistor count on a microprocessor every two years might not be as steadfast as it used to be.

Moore's law served as a guide for the advancement of classical computing technology, but quantum computing represents a paradigm shift that has the potential to continue the trend of exponential growth in computational power for particular problem domains, albeit with its own set of difficulties and constraints.

However, Moore's law is still relevant from the perspective of cryptocurrencies and blockchain technology. Secure cryptographic algorithms are necessary for cryptocurrencies to protect transactions and uphold the blockchain's integrity. While Moore's law has improved processing power, strengthened encryption techniques, and improved blockchain security, it has also presented difficulties.

The continual rise in processing power could simplify the efforts of malicious individuals attempting cryptographic attacks and potentially compromise the security of blockchain networks, so cryptocurrencies have had to evolve and adapt their security measures. To counteract the advantage conferred by Moore's law, this has led to the development of more durable cryptographic algorithms like the Advanced Encryption Standard and a focus on longer key lengths.

Additionally, there's been a shift to longer key lengths, for example, in RSA encryption, with lengths such as 2048 or 3072 bits. Longer keys make cryptographic procedures substantially more complex, making it much more difficult for attackers to decrypt data without the right key.
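As an illustration of that shift, here is a minimal sketch that generates RSA keys at both lengths mentioned above, using the third-party Python 'cryptography' package:

```python
from cryptography.hazmat.primitives.asymmetric import rsa

# Longer moduli make factoring-based attacks far harder, at the cost
# of slower key generation, signing, and decryption.
for bits in (2048, 3072):
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
    print(f"generated an RSA-{private_key.key_size} private key")
```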

Therefore, the applicability of Moore's law depends on how one understands its original formulation. The industry's objectives are still very much centered on the development of more potent, energy-efficient, and inventive computing technology, even though the exact doubling of transistor count may have slowed.

Read more:
What is Moore's law, and how does it impact cryptography? - Cointelegraph


A Once-in-a-Generation Investment Opportunity: 2 Highly … – The Motley Fool

Artificial intelligence (AI) has been around for years, but recent advances have made the technology more powerful and more compelling than ever, spotlighting its revolutionary potential. Indeed, some experts are calling AI the fourth industrial revolution.

The first three industrial revolutions were brought on by steam power, electricity, and digital technologies like computers and the internet. Those innovations changed the very fabric of daily life, and AI promises to have a similar impact. That hints at immense value creation, putting investors in front of a once-in-a-generation opportunity.

The most prudent way to benefit is to build a basket of AI stocks, and Wall Street is particularly bullish on Amazon (AMZN) and Docebo (DCBO). Both stocks carry a consensus rating of buy and neither has a single sell recommendation at the present time.

Here's what investors should know about these highly recommended growth stocks.

Amazon supplanted Apple as the world's most valuable brand in 2023, according to consultancy Brand Finance. That recognition reflects its strong presence in three large markets. Amazon runs the most visited online marketplace in the world, and its unmatched ability to source shopper data has snowballed into a booming adtech business. Additionally, Amazon Web Services (AWS) is the market leader in cloud computing.

That last point is particularly relevant. Leadership in cloud computing positions Amazon as a major player in the burgeoning artificial intelligence (AI) market. CEO Andy Jassy explained why during the second-quarter earnings call: "People want to bring generative AI models to the data, not the other way around. AWS not only has the broadest array of storage, database, analytics, and data management services for customers, it also has more customers and data stores than anybody else."

Innovation at all three layers of the AI stack should help AWS reinforce its strong position. At the infrastructure layer, Amazon is designing its own chips for AI training and inference as a cheaper (and less powerful) alternative to Nvidia graphics processing units. At the services layer, Amazon recently launched its Bedrock suite of pretrained models that lets businesses build custom generative AI applications. At the software layer, Amazon recently launched its AI-enabled coding companion CodeWhisperer to help software developers work more productively.

Amazon delivered a solid financial performance in the third quarter. Revenue rose 13% to $143 billion on strong momentum in retail and advertising, and net income according to generally accepted accounting principles (GAAP) more than tripled to reach $9.9 billion as the company continued to improve its cost structure.

Amazon is well positioned to maintain that momentum. Through 2030, retail e-commerce sales are expected to grow at 8.1% annually, adtech spend is expected to grow at 13.7% annually, and cloud computing revenue is expected to grow at 14.1% annually. That points to low-double-digit revenue growth for Amazon and, indeed, Morningstar analyst Dan Romanoff is forecasting revenue growth of 11% annually through 2027.

It is possible that Amazon grows more quickly -- Jim Kelleher of Argus Research sees AWS as "uniquely positioned" to benefit from the AI-as-a-service market -- but even the baseline projection makes its current valuation of 2.6 times sales look quite reasonable. That's why investors should include this growth stock in their AI basket.

Docebo specializes in corporate learning. Its platform allows businesses to create, deliver, and measure the impact of learning content across internal (employees) and external (customers, partners, suppliers) audiences. One particularly innovative application is Docebo Shape, a generative AI product that automates content creation by transforming virtually any source material -- from documents and articles to case studies and presentations -- into learning content.

Docebo has long been at the forefront of the corporate learning market. It was among the first companies to bring AI to its learning management system, and analysts have consistently ranked Docebo as a major player in the industry. To quote a recent report from Morgan Stanley, "Docebo is not only disrupting the internal learning management system market, taking share from legacy vendors, but it is also leading the market in a greenfield external learning opportunity."

Docebo delivered a solid financial report in the second quarter. Its customer count climbed 16% to 3,591, including the addition of an unnamed customer that is almost certainly Alphabet's Google. In turn, revenue rose 25% year over year to $43.6 million and the company reported adjusted net income of $4.7 million, up from a small loss a year earlier.

Docebo values its addressable market at $38 billion in 2026 and the company is leaning into automation to capitalize on that opportunity. Of particular note, Docebo announced new features for its generative AI application (Shape) that will launch in 2024, including virtual role play that provides learners with solution-specific simulations and real-time feedback, and an integrated copilot that further simplifies the creation of learning content.

On that note, strategist Josh Baer of Morgan Stanley sees Docebo as one of the software companies best positioned to monetize generative AI, and he expects Docebo to grow revenue at 17% annually through 2033. That forecast makes its current valuation of 9.2 times sales look very reasonable, especially when the three-year average is 15.4 times sales. That's why investors should add a small position in Docebo to their basket of AI stocks.

Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Trevor Jennewine has positions in Amazon and Nvidia. The Motley Fool has positions in and recommends Alphabet, Amazon, Apple, Docebo, and Nvidia. The Motley Fool has a disclosure policy.

Follow this link:
A Once-in-a-Generation Investment Opportunity: 2 Highly ... - The Motley Fool


How to Secure the 5 Cloud Environment Types – eSecurity Planet


Organizations have a variety of options for cloud deployments, each with its own set of capabilities and security challenges. In this article, we will explore the key characteristics, security threats, and best security practices for five key cloud security environments: public cloud, private cloud, hybrid cloud, multi-cloud, and multi-tenant cloud.

A public cloud architecture is a shared infrastructure hosted by a cloud service provider. Public clouds enable multiple businesses to share resources from a shared pool over the internet. The provider hosts and manages the environment, allowing for scalability and cost-efficiency. The responsibility for protecting these cloud resources is shared, with the cloud provider responsible for infrastructure security and customers responsible for access, application security, and data management. Users have a large responsibility for maintaining the integrity of their cloud environments under this shared responsibility paradigm.

While public cloud systems offer scalability, flexibility, and cost-efficiency, they can also pose significant risks if not properly secured. All cloud (and IT) environments share common security issues and solutions, but for public cloud users, compliance, access control, and proper configuration practices are some of the most important.

How they occur: Unauthorized access to sensitive data can happen as a result of vulnerabilities and misconfigurations such as flawed access permissions or unprotected data and instances.

Prevention: Implement robust encryption, access restrictions, data categorization, secure connections, and an incident response strategy.

How they occur: Improperly configured permissions can allow unauthorized individuals to access applications and data, possibly leading to data leaks and breaches and other security risks.

Prevention: Apply the concept of least privilege or zero trust, conduct frequent access audits, and use Identity and Access Management (IAM) tools.
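As a concrete example of least privilege, the sketch below creates an AWS IAM policy granting read-only access to a single S3 bucket instead of blanket permissions. The bucket and policy names are illustrative, and valid AWS credentials are assumed:

```python
import json
import boto3

# Allow only the two S3 read actions, and only on one bucket.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-reports-bucket",   # hypothetical bucket
            "arn:aws:s3:::example-reports-bucket/*",
        ],
    }],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="ReportsReadOnly",                    # illustrative name
    PolicyDocument=json.dumps(policy_document),
)
```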

How they occur: Vulnerable APIs and inadequately protected cloud interfaces allow for exploitation, potentially resulting in data leakage and breaches.

Prevention: Apply API security practices and tools, perform regular vulnerability testing, and enforce strict access controls.

How it occurs: Attackers acquire unlawful access using stolen user credentials, which could result in unauthorized account and data access and misuse.

Prevention: Require multi-factor authentication (MFA), educate users on password security, and regularly monitor accounts for suspicious activities.
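Authenticator-app MFA is usually built on time-based one-time passwords (TOTP). A minimal server-side sketch with the third-party 'pyotp' package:

```python
import pyotp

# The secret is provisioned once per user and shared with their
# authenticator app (e.g., via a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

code = totp.now()                  # what the user's app would display
print("code verifies:", totp.verify(code))
```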

How it occurs: Without sufficient logging and monitoring, detecting security incidents in real time becomes difficult, leaving the cloud environment susceptible.

Prevention: Activate cloud logging and use SIEM systems to continually monitor network and system activity.
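In AWS, for instance, the audit trail a SIEM would ingest can be switched on with a few calls. A minimal sketch, assuming the trail and bucket names below are placeholders and the bucket already has a CloudTrail-friendly policy:

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Record management events from all regions into one S3 bucket.
cloudtrail.create_trail(
    Name="org-audit-trail",
    S3BucketName="example-audit-log-bucket",
    IsMultiRegionTrail=True,
)
cloudtrail.start_logging(Name="org-audit-trail")
```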

How they occur: Distributed Denial of Service (DDoS) attacks overload cloud and network systems, interrupting access and triggering service disruptions.

Prevention: DDoS attacks may be prevented and mitigated by using DDoS protection services, installing traffic filtering, and deploying content delivery networks (CDNs) to handle extra traffic.

How it occurs: Inadvertent data deletion, corruption or theft can result in irreversible data loss, disrupting operations and exposing sensitive data that could also violate data privacy regulations.

Prevention: Back up data on a regular basis, develop data classification and retention policies, utilize versioning features, use Data Loss Prevention (DLP) tools, and teach employees about data management and policy adherence.
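The versioning features mentioned above can often be enabled with a single call; on S3, for example, this keeps overwritten or deleted objects recoverable (bucket name is a placeholder):

```python
import boto3

s3 = boto3.client("s3")
s3.put_bucket_versioning(
    Bucket="example-data-bucket",
    VersioningConfiguration={"Status": "Enabled"},
)
```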

Consider the following methods for increased security in a public cloud setting:


A private cloud environment dedicates resources to a single business, allowing for greater control, privacy, and security. Private clouds offer the additional assurance of data, applications and assets being isolated inside a dedicated environment. Still, private cloud security requires many of the same measures as other cloud environments.

A mix of technology, processes, and strategic planning is required to handle these challenges of private cloud security.

How they occur: Private clouds still need to be configured properly, and misconfigurations can lead to exposed data, accounts, applications and other assets.

Prevention: Conduct frequent security audits and vulnerability assessments, and automate configurations wherever possible to reduce human error. Cloud Security Posture Management (CSPM) is one good tool for making sure that cloud environments are configured properly.

How it occurs: A lack of redundancy can cause system disruptions.

Prevention: Make sure your cloud environment includes redundancy, failover measures, and load balancing.

How they occur: Compliance issues can be somewhat easier in private clouds, particularly if they can avoid geographical data location issues, yet compliance challenges still exist.

Prevention: Keep up with compliance needs by utilizing Governance, Risk and Compliance (GRC) tools.

Consider the following ways to help ensure the security of private cloud systems.


A hybrid cloud architecture integrates both public and private clouds. It enables businesses to take advantage of the flexibility of public cloud resources while keeping sensitive data in a private cloud. Data exchange across the two environments is possible, providing a balance of cost-efficiency and security. That flexibility introduces complexity, however, and hybrid cloud security must combine on-premises and cloud security controls to protect data both within and between environments.

Hybrid clouds enable enterprises to benefit from the scalability and flexibility of public clouds while protecting more sensitive data within their own infrastructure. However, hybrid cloud security brings particular challenges.

How they occur: As identifying roles and responsibilities is critical in hybrid clouds, shared responsibility can lead to misunderstandings and unintended security weaknesses.

Prevention: Understand your responsibilities and manage data, access, and application security across all environments, including incident response.

How they occur: Managing application security across multiple environments requires consistent rules, controls, authentication, and monitoring in order to prevent possible vulnerabilities and ensure compliance throughout the hybrid configuration.

Prevention: Integrate security into early development (Shift Left) and track issues and fixes with DevSecOps tools.

How they occur: Because hybrid clouds disseminate data across multiple locations, the danger of illegal access or data exposure increases.

Prevention: The intricacies of data encryption, data classification, and access control require careful management. Use encryption techniques to safeguard data in transit and at rest and use DLP and access management tools to control risks.
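For data at rest, here is a minimal sketch of symmetric encryption using the third-party Python 'cryptography' package; key management, usually the hard part in hybrid setups, is out of scope here:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, keep this in a KMS or vault
fernet = Fernet(key)

token = fernet.encrypt(b"sensitive record")
print(fernet.decrypt(token))       # b'sensitive record'
```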

How they occur: Meeting compliance standards across hybrid settings with multiple vendors and architectures may be difficult.

Prevention: Activate cloud providers' built-in compliance capabilities, centralize compliance and auditing, and automate monitoring and reporting.

How it occurs: Integrating cloud systems can be difficult because of the variety of technologies, potential conflicts, and the need to ensure continuous data flow.

Prevention: Plan integration carefully, maintain seamless data flow, and use API and configuration best practices to secure data across all environments.

There are a number of ways to properly secure hybrid cloud environments while maintaining their advantages.


Multiple public and private clouds are used concurrently in multi-cloud environments. Their design is decentralized, with apps and data dispersed across several cloud providers. Redundancy, cost minimization, and flexibility are all advantages, but maintaining security across various providers may be complicated, requiring uniform security solutions, policies and practices for protection.

Enterprises confront a variety of difficulties in exchange for the flexibility and scalability benefits of multi-cloud environments, not the least of which is a significantly larger potential attack surface. These are some of the major multi-cloud security threats.

How it occurs: Attackers acquire unauthorized access to cloud accounts, which may result in data theft, resource manipulation, and other malicious actions.

Prevention: Even in the case of stolen credentials, strong authentication and access controls and proper configuration management can help secure cloud accounts.

How they occur: With data scattered across many cloud environments, the risk of unauthorized access, data leaks, and breaches rises.

Prevention: Implement strong access controls and authentication and make sure that each cloud instance is properly configured.

How they occur: With a greater cloud attack surface to defend, DDoS attacks can be harder to prevent.

Prevention: For continued service availability, implement DDoS prevention and mitigation methods such as traffic filtering, infrastructure hardening, and overprovisioning.

How they occur: Unsecured accounts and excessive permissions can allow unauthorized access, data disclosure, and resource exploitation.

Prevention: Preventive measures include appropriately configuring IAM policies, conducting regular audits, following the principle of least privilege, and securing privileged accounts.

How they occur: Using third-party suppliers and services in a multi-cloud system might introduce extra risks, and the risk extends to software dependencies in the software supply chain.

Prevention: To successfully manage these risks, third-party risk management (TPRM) tools are a good place to start.

How it occurs: Multi-cloud has many of the same challenges as other cloud computing approaches, only multiplied across more environments.

Prevention: Prioritize visibility and monitoring technologies that can track risks across cloud environments.


Securing multi-cloud setups requires thorough planning and a well-defined strategy. There are a number of considerations and approaches.


A multi-tenant cloud architecture is the most common public cloud architecture. It allows multiple customers, or tenants, to utilize the same environment while keeping their data separate. This architecture is frequently used in infrastructure as a service (IaaS) and platform as a service (PaaS) environments, where data exchange is carefully managed to maintain security and isolation. The degree of multi-tenancy varies based on the architecture of the cloud service provider and the individual needs of users or organizations.

While multi-tenancy provides considerable cost savings and resource efficiency, it also raises a number of security and privacy challenges. These issues must be addressed in order to ensure the safe coexistence of multiple uses inside shared cloud environments.

How they occur: Vulnerabilities, weak passwords, misconfigurations, and API and access control issues matter more than ever in multi-tenant environments.

Prevention: Strong access management, authentication, encryption, proper configuration, and employee training all play a role, and technologies like DLP can detect problems early.

How it occurs: Inadequate tenant isolation might lead to data contamination or illegal access.

Prevention: Improve tenant isolation by using virtualization, proper controls and configurations, and cloud network segmentation.

How they occur: Meeting regulatory criteria can be made more difficult due to shared resources, data commingling, and even the geographical location of cloud services.

Prevention: Make sure your cloud service provider can meet your specific compliance needs; DLP and automated data classification can help implement the right controls for the right data.

Access restrictions, data segregation, and compliance must all be prioritized when it comes to securing multi-tenant cloud settings. Consider the following strategies:


Every type of cloud environment public, private, hybrid, multi-cloud, and multi-tenant has its own set of risks and demands. From the shared responsibilities of public cloud to the tailored protection of private clouds, the strategic balance of hybrid cloud, and the challenges of multi-cloud and multi-tenant environments, adopting robust security measures is critical for protecting data and ensuring compliance and business continuity. The good news is that cloud service providers are generally pretty good at securing their environments. By doing their part and applying best practices for each environment, businesses may protect their data and resources while reaping the benefits of cloud computing.


Continue reading here:
How to Secure the 5 Cloud Environment Types - eSecurity Planet


Reconciling HDCP With AV-over-IP Open Standards – AV Network

Everyone in the AV community knows HDCP, the high-bandwidth digital content protection protocol. Love it or loathe it, HDCP is now part of everyday AV working life. It was designed by Intel and adopted by Hollywood as the method by which valued content is secured, primarily to prevent piracy in a real-time environment; in other words, whilst content is being played, as opposed to whilst moving a file. The focus is on the transmission channel, providing protection as content passes from source to sink.

HDCP addresses this with three interlinked systems: authentication, encryption, and key revocation.

Simply put, HDCP only allows content to move to endpoints that are legitimate and authenticated HDCP receivers and then protects the content whilst in motion. This uses a system of encryption keys that work together to secure the link.

The original scope perceived the signal distribution as being point-to-point, circuit-switched connections via connectors like DVI, HDMI, and DisplayPort, and across cables and matrices.

All security systems are challenges for people who try to break them. Each of the three HDCP sub-systems (authentication, encryption, and revocation) was a target. Devices that stripped the protection soon became available. A master key that could not be revoked found its way into circulation. And thus, despite legal action, the original HDCP system was defeated, but it nonetheless was still required for compliance.

The advent of increasing resolutions and the emergence of AV-over-IP raised both an additional challenge and an additional opportunity. UHD was the new value to be protected, and there would now be a much larger number of endpoints involved (i.e., multicast). Further, due to being transmitted via a network, the connections were now packet-switched, not a continuous connection.

HDCP evolved from v1 to v2, but this was not a backward-compatible step because the content transmission architecture was different. Several parameters were left as optional alternatives. One of the aspects that was not tied down for this multi-endpoint network environment was how to transfer HDCP messages between sources and sinks. The consequence is that for direct connections (e.g., HDMI to HDMI, from player to display) all brands do it the same way, but for network connections, it tends to be the same brand at both ends.

Enter IPMX. IPMX is, by design, about moving high-quality video (right up to and including uncompressed UHD60) across networks. But "the same brand at both ends" is not compatible with its interoperability ethos. So, a consistent method for exchanging keys needed to be defined.

Most people think that the protection provided by encryption is reinforced by not revealing the method used. The magic of modern encryption methods is that adversaries can know exactly how content is protected (the method can even be published in an open standard) and yet still cannot decrypt it with the resources they have. The HDCP protocol itself is built on exactly that principle: its method of using the keys is rigorous. To bring inter-brand HDCP to the IPMX multicast environment, the Video Services Forum (VSF) has made a significant contribution in proposing the HDCP Key Exchange Protocol (HKEP). This is a standardized approach, a method in common, of transferring the already defined messages between IPMX transmitters and receivers. Additionally, it defines the closing-down process for a single HDCP session in that same multicast network environment.

The methodology that VSF has recommended in TR-10-5, for use in IPMX, is compatible with everything that has been agreed to so far, yet is lightweight enough to be incorporated into devices that may be varied and probably limited in processing power (to keep their cost down). The extension to the already established HDCP specs covers the exchange of HDCP control messages over TCP/IP among NMOS-based transmitters and receivers. You can think of it as a preamble to the existing HDCP protocol v2.3 that prescribes how parameters left optional by DCP should be used in the IPMX environment.
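To illustrate only the transport pattern (control messages exchanged over a TCP connection), here is a generic, hypothetical sketch. The length-prefixed framing, port, and message bytes below are placeholders, not the TR-10-5/HKEP wire format:

```python
import socket
import struct

def send_msg(sock: socket.socket, payload: bytes) -> None:
    # Hypothetical framing: a 4-byte big-endian length, then the payload.
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def recv_msg(sock: socket.socket) -> bytes:
    (length,) = struct.unpack("!I", sock.recv(4))
    return sock.recv(length)       # sketch only: ignores partial reads

# Usage sketch (receiver address and message bytes are placeholders):
# with socket.create_connection(("ipmx-receiver.local", 9999)) as sock:
#     send_msg(sock, b"<HDCP AKE_Init message bytes>")
```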

Go here to see the original:
Reconciling HDCP With AV-over-IP Open Standards - AV Network


Chancellor May Wins Education Award from Engineering Society – University of California, Davis

Chancellor Gary S. May is this years recipient of the Education Award from the Institute of Electrical and Electronics Engineers Electron Devices Society, the organization announced last month.

The society calls the award its highest honor, given "to recognize distinguished contributions to education within [our] field of interest."

May's award honors his "dedicated leadership and mentorship that has diversified academic leaders in education," the society said.

May has made mentorship, especially of underrepresented groups, a key focus of his career, and has created nationally recognized programs to attract, mentor and retain people from those groups in the fields of science, technology, engineering and math, or STEM.

In 2011 he was honored with the Lifetime Mentor Award from the American Association for the Advancement of Science, and in 2015, President Barack Obama selected May for the Presidential Award for Excellence in STEM Mentoring.

Earlier this year, May received the Lifetime Member of the Year award from the National Society of Black Engineers.

His field of study is computer-aided manufacturing of integrated circuits, a topic in which he has authored more than 200 technical publications, contributed to 15 books and holds a patent.

The award will be presented at a meeting in December.

Original post:

Chancellor May Wins Education Award from Engineering Society - University of California, Davis


What impact does Telegram’s heavy encryption have on performance? – TickerTV News


Telegram, the popular messaging app known for its emphasis on privacy and security, has gained a reputation for its robust encryption protocols. While this level of security is undoubtedly a boon for users concerned about their privacy, it raises the question: what impact does Telegram's heavy encryption have on performance?

Encryption, defined: Encryption is the process of encoding information in such a way that only authorized parties can access it. In the context of messaging apps like Telegram, encryption ensures that messages and other data are protected from unauthorized access.

Telegram's encryption is based on the MTProto protocol, which employs end-to-end encryption to secure user communications. This means that messages are encrypted on the sender's device and can only be decrypted by the intended recipient. While this level of security is commendable, it does have implications for performance.

One of the primary impacts of Telegram's heavy encryption is increased data usage. The encryption process adds additional data to each message, resulting in larger file sizes. This can lead to higher data consumption, especially for users on limited data plans or in areas with slow internet connections.
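The overhead is easy to see with any authenticated cipher. The sketch below uses AES-GCM from the third-party Python 'cryptography' package purely as a generic illustration; it is not Telegram's actual MTProto construction:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)                                # sent with the message
plaintext = b"hello"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)   # payload + 16-byte tag

overhead = len(nonce) + len(ciphertext) - len(plaintext)
print(f"a {len(plaintext)}-byte message becomes "
      f"{len(nonce) + len(ciphertext)} bytes on the wire (+{overhead} bytes)")
```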

Furthermore, the encryption process itself requires additional computational resources. As a result, devices may experience a slight delay when sending or receiving messages, particularly when dealing with large files or in areas with poor network coverage. However, it's worth noting that these delays are generally minimal and may not be noticeable to most users.

FAQ:

Does Telegram's encryption impact battery life?
While encryption does require additional computational resources, the impact on battery life is generally negligible. Modern smartphones are equipped to handle the encryption process efficiently, and any potential drain on battery life is minimal.

Can I disable encryption on Telegram?
No, encryption is an integral part of Telegram's security measures and cannot be disabled. It ensures that your messages and data remain secure and private.

Is Telegram's encryption secure?
Telegram's encryption protocols have been praised by security experts for their robustness. However, it's important to note that no encryption system is entirely foolproof, and vulnerabilities can emerge over time. Telegram regularly updates its encryption protocols to address any potential weaknesses and maintain a high level of security.

In conclusion, while Telegram's heavy encryption does have some impact on performance, such as increased data usage and slight delays in message transmission, the trade-off for enhanced privacy and security is well worth it for most users. As technology continues to advance, it is likely that any minor performance issues will be further mitigated, ensuring a seamless and secure messaging experience for Telegram users.

Original post:
What impact does Telegram's heavy encryption have on performance? - TickerTV News


Ammonia fuel offers great benefits but demands careful action … – Engineering at Princeton University

Ammonia, a main component of many fertilizers, could play a key role in a carbon-free fuel system as a convenient way to transport and store clean hydrogen. The chemical, made of hydrogen and nitrogen (NH3), can also itself be burned as a zero-carbon fuel. However, new research led by Princeton University illustrates that even though it may not be a source of carbon pollution, ammonia's widespread use in the energy sector could pose a grave risk to the nitrogen cycle and climate without proper engineering precautions.

Publishing their findings November 6 in PNAS, the interdisciplinary team of 12 researchers found that a well-engineered ammonia economy could help the world achieve its decarbonization goals and secure a sustainable energy future. A mismanaged ammonia economy, on the other hand, could ramp up emissions of nitrous oxide (N2O), a long-lived greenhouse gas around 300 times more potent than CO2 and a major contributor to the thinning of the stratospheric ozone layer. It could lead to substantial emissions of nitrogen oxides (NOx), a class of pollutants that contribute to the formation of smog and acid rain. And it could directly leak fugitive ammonia emissions into the environment, also forming air pollutants, impacting water quality, and stressing ecosystems by disturbing the global nitrogen cycle.

Fortunately, the researchers found that the potential negative impacts of an ammonia economy can be minimized with proactive engineering practices. They argued that now is the time to start seriously preparing for an ammonia economy, tackling the potential sticking points of ammonia fuel before its widespread deployment.

"We know an ammonia economy of some scale is likely coming," said research leader Amilcare Porporato, the Thomas J. Wu '94 Professor of Civil and Environmental Engineering and the High Meadows Environmental Institute. "And if we are proactive and future-facing in our approach, an ammonia economy could be a great thing. But we cannot afford to take the risks of ammonia lightly. We cannot afford to be sloppy."

As interest in hydrogen as a zero-carbon fuel has grown, so too has an inconvenient reality: it is notoriously difficult to store and transport over long distances. The tiny molecule must be stored at either temperatures below -253 degrees Celsius or at pressures as high as 700 times atmospheric pressure, conditions that are infeasible for widespread transport and prone to leakage.

Ammonia, on the other hand, is much easier to liquify, transport, and store, capable of being moved around similarly to tanks of propane.

Moreover, an established process for converting hydrogen into ammonia has existed since the early 20th century. Known as the Haber-Bosch process, the reaction combines atmospheric nitrogen with hydrogen to form ammonia. While the process was originally developed as a cost-effective way to turn atmospheric nitrogen into ammonia for use in fertilizers, cleaning products, and even explosives, the energy sector has looked to the Haber-Bosch process as a way to store and transport hydrogen fuel in the form of ammonia.
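For reference, the balanced Haber-Bosch reaction (standard chemistry, not a detail specific to the new paper) can be written as:

```latex
\mathrm{N_2} + 3\,\mathrm{H_2} \rightleftharpoons 2\,\mathrm{NH_3}
```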

Ammonia synthesis is inherently energy-intensive, and fossil fuels without CO2 capture are currently used to meet almost all of its feedstock and energy demands. But as the researchers pointed out in their article, if new, electricity-driven processes that are currently under development can replace conventional fossil-fuel-derived ammonia synthesis, then the Haber-Bosch process or a different process altogether could be widely used to convert clean hydrogen into ammonia, which can itself be burned as a zero-carbon fuel.

"Ammonia is an easy way to transport hydrogen over long distances, and its widespread use in agriculture means there is already an established infrastructure for producing and moving ammonia," said Matteo Bertagni, a postdoctoral researcher at the High Meadows Environmental Institute working on the Carbon Mitigation Initiative. "You could therefore create hydrogen in a resource-rich area, transform it into ammonia, and then transport it anywhere it's needed around the globe."

Ammonia's transportability is especially attractive to industries reliant on long-distance transportation, such as maritime shipping, and countries with limited available space for renewable resources. Japan, for example, already has a national energy strategy in place that incorporates the use of ammonia as a clean fuel. Straightforward storage requirements mean that ammonia might also find use as a vessel for long-term energy storage, complementary to or even replacing batteries.

"At first glance, ammonia seems like an ideal cure for the problem of decarbonization," Porporato said. "But almost every medicine comes with a set of potential side effects."

In theory, burning ammonia should yield only harmless nitrogen gas (N2) and water as products. But in practice, Michael E. Mueller, associate chair and professor of mechanical and aerospace engineering, stated that ammonia combustion can release harmful NOx and N2O pollutants.

Most N2O emissions from ammonia combustion are the result of disruptions to the combustion process. "N2O is essentially an intermediate species in the combustion process," Mueller said. "If the combustion process is allowed to finish, then there will be essentially no N2O emissions."

Yet Mueller said that under certain conditions, such as when a turbine is ramping up or down or if the hot combustion gases impinge upon cold walls, the ammonia combustion process can become disrupted and N2O emissions can quickly accumulate.

For instance, the researchers found that if ammonia fuel achieves a market penetration equal to around 5% of the current global primary energy demand (which would require 1.6 billion metric tons of ammonia production, or ten times current production levels), and if 1% of the nitrogen in that ammonia is lost as N2O, then ammonia combustion could produce greenhouse gas emissions equivalent to 15% of today's emissions from fossil fuels. The greenhouse gas intensity of such a loss rate would mean that burning ammonia fuel would be more polluting than coal.
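That estimate can be reproduced with back-of-the-envelope arithmetic. The sketch below assumes an N2O warming potential of 300 (as quoted earlier in this article) and roughly 37 billion metric tons of fossil CO2 per year, so the result lands near the quoted 15% rather than exactly on it:

```python
# All masses in metric tons per year.
NH3_PRODUCTION = 1.6e9      # ammonia supplying ~5% of primary energy (from the study)
N_FRACTION = 14 / 17        # nitrogen mass fraction of NH3 (from molar masses)
N_LOSS_RATE = 0.01          # 1% of that nitrogen escaping as N2O
N2O_PER_N = 44 / 28         # tons of N2O formed per ton of nitrogen lost
GWP_N2O = 300               # CO2-equivalence factor quoted in the article
FOSSIL_CO2 = 37e9           # assumed current annual fossil CO2 emissions

n2o = NH3_PRODUCTION * N_FRACTION * N_LOSS_RATE * N2O_PER_N
co2e = n2o * GWP_N2O
print(f"~{co2e / 1e9:.1f} Gt CO2e per year, "
      f"about {co2e / FOSSIL_CO2:.0%} of fossil CO2 emissions")
```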

Beyond ammonia's N2O emissions, Robert Socolow, a professor of mechanical and aerospace engineering, emeritus, and senior scholar at Princeton, said that widespread usage of ammonia in the energy sector will add to all the other impacts that fertilizer has already had on the global nitrogen cycle.

In a seminal paper published in 1999, Socolow discussed the environmental impacts of the food system's widespread use of nitrogen-enriched fertilizers to promote crop growth, writing that "excess fixed nitrogen, in various guises, augments the greenhouse effect, contaminates drinking water, acidifies rain, and stresses ecosystems."

As the energy sector looks toward ammonia as a fuel, Socolow said that it can learn from agriculture's use of ammonia as a fertilizer. He urged those in the energy sector to consult the decades of work from ecologists and agricultural scientists to understand the role of excess nitrogen in disturbing natural systems.

"Ammonia fuel can be done, but it cannot be done in any way we wish," said Socolow, whose 2004 paper with Stephen Pacala, the Frederick D. Petrie Professor in Ecology and Evolutionary Biology, emeritus, on stabilization wedges has become a foundation of modern climate policy. "It's important that we look before we leap."

While the environmental consequences of an ammonia economy gone wrong are serious, the researchers emphasized that the potential stumbling blocks they identified are solvable through proactive engineering.

"I interpret this paper as a handbook for engineers," Mueller said. "By identifying the worst-case scenario for an ammonia economy, we're really identifying what we need to be aware of as we develop, design, and optimize new ammonia-based energy systems."

For instance, Mueller said there are alternative combustion strategies that could help to minimize unwanted NOx and N2O emissions. While each strategy has its own set of pros and cons, he said that taking the time now to evaluate candidate systems with an eye toward mitigating emissions will ensure that combustion systems are poised to operate optimally for ammonia fuel.

Another option for accessing the energy in ammonia involves partially or fully splitting ammonia back into hydrogen and atmospheric nitrogen through a process known as cracking. Ammonia cracking, a line of research being actively pursued by Emily A. Carter, could help to make the fuel composition more favorable for combustion or even bypass the environmental concerns of ammonia burning by regenerating hydrogen fuel at the point of use. Carter is the Gerhard R. Andlinger Professor of Energy and the Environment and senior strategic advisor and associate laboratory director for applied materials and sustainability sciences at the Princeton Plasma Physics Laboratory (PPPL).

Furthermore, several technologies already exist at the industrial scale to convert unwanted NOx emissions from combustion back into N2 through a process known as selective catalytic reduction. These technologies could be straightforward to transfer to ammonia-based fuel applications. And as a convenient bonus, many of them rely on ammonia as a feedstock to remove NOx, something that there would already be plenty of in an ammonia-based system.

Beyond the engineering practices that could be developed to minimize the environmental impacts of an ammonia economy, Porporato said future work will also look beyond engineering approaches to identify policies and regulatory strategies that would ensure the best-case scenario for ammonia fuel.

"Imagine the problems we could have avoided if we knew the risks and environmental impacts of burning fossil fuels before the Industrial Revolution began," Porporato said. "With the ammonia economy, we have the chance to learn from our carbon-emitting past. We have the opportunity to solve the challenges we've identified before they become an issue in the real world."

The paper, "Minimizing the Impacts of the Ammonia Economy on the Nitrogen Cycle and Climate," was published November 6 in PNAS. In addition to Porporato, Bertagni, Mueller, Socolow, and Carter, coauthors include J. Mark P. Martirez of the Princeton Plasma Physics Laboratory (PPPL); Chris Greig, Yiguang Ju, Sankaran Sundaresan, Mark Zondlo, and Rui Wang of Princeton University; and Tim Lieuwen of the Georgia Institute of Technology. The research was supported by the U.S. Department of Energy, the National Science Foundation, the BP-funded Carbon Mitigation Initiative at Princeton University, and the Moore Foundation.

Here is the original post:

Ammonia fuel offers great benefits but demands careful action ... - Engineering at Princeton University
