
ZenHub Joins the Cloud Native Computing Foundation – GlobeNewswire

VANCOUVER, British Columbia, March 22, 2022 (GLOBE NEWSWIRE) -- ZenHub, the leading productivity management solution for software teams, today announced that it has joined the Cloud Native Computing Foundation (CNCF), which builds sustainable ecosystems for cloud native software. This collaboration gives ZenHub the ability to increase its participation in and give back to the Kubernetes ecosystem that ZenHub's entire production and CI/CD infrastructure runs on. Additionally, this news reaffirms ZenHub's commitment to supporting the open source model and the communities that make it the most powerful engine for innovation in the world today.

"Supporting open source projects has been a core component of ZenHub and its mission since our inception," said Thabang Mashologu, ZenHub's VP, Marketing. "Our product runs on Kubernetes, so we're thrilled to join the CNCF to further enable us to support this growing community. Beyond simply helping our customers speed innovation based on commercial open source, we also are committed to working with organizations to ensure everyone has the opportunity to be a part of this world-changing movement."

ZenHub adds unique value to the open source ecosystem. Of the more than 7,000 next-generation software organizations that use our platform globally, at least 4,500 are open source projects and public entities. All of ZenHub's features that power high-growth startups and scaleups are available free of charge for open source and public repositories, including real-time roadmap visibility, automated sprints, and team productivity insights. As a result, ZenHub is unique in enabling its customers to transition to the open source model, simultaneously supporting the planning and tracking of projects in both public and private software repositories.

"Were excited to welcome ZenHub as a Silver member, said Chris Aniszczyk, CNCFs CTO. The CNCF is always interested in working with organizations that are committed not just to supporting our cloud native community, but who are interested in advancing the open source ecosystem as a whole.

Successful projects and open source-based companies such as Grafana, O3DE, OpenSSL, New Relic, Red Hat, Swagger, and many others use ZenHub to deliver software innovation faster. ZenHub helps strategic open source projects foster effective communication among team members, set goals and plan with more transparency and participation, and ship releases more predictably. More information on how ZenHub helps its customers and community manage both private and public software development efforts is available at https://www.zenhub.com/customer-stories/swagger-api-open-source-customer-story

About ZenHub
ZenHub enables software teams at high-growth organizations and open source projects to build better code more quickly by providing a developer-friendly productivity management platform. ZenHub connects the dots across all teams with automated agile features, real-time roadmap visibility, and team productivity insights. More than 7,000 disruptive teams worldwide rely on ZenHub to ship great code faster.

Third-party trademarks mentioned are the property of their respective owners.

About The Cloud Native Computing Foundation
Cloud native computing empowers organizations to build and run scalable applications with an open source software stack in public, private, and hybrid clouds. The Cloud Native Computing Foundation (CNCF) hosts critical components of the global technology infrastructure, including Kubernetes, Prometheus, and Envoy. CNCF brings together the industry's top developers, end users, and vendors and runs the largest open source developer conferences in the world. Supported by more than 500 members, including the world's largest cloud computing and software companies, as well as over 200 innovative startups, CNCF is part of the nonprofit Linux Foundation. For more information, please visit http://www.cncf.io.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page. Linux is a registered trademark of Linus Torvalds.

Media Contact:
Nichols Communications for ZenHub
Jay Nichols
+1 408 772 1551
jay@nicholscomm.com

Go here to read the rest:
ZenHub Joins the Cloud Native Computing Foundation - GlobeNewswire


INTELITECHS announces acquisition of Total Cloud IT – Utah Business

Salt Lake City -- INTELITECHS, a Utah-based managed IT services provider, is pleased to announce its acquisition of Total Cloud IT. The acquisition opens the door to expansion by INTELITECHS into serving a broader group of small to mid-sized businesses across the country with both managed IT and cloud computing services.

"We are pleased to be able to augment our proven model of delivering managed IT services to a larger group of clients across the country," says founder Jake Hiller. "It's a win/win: our regional clients now have access to expanded cloud services, and at the same time, previous cloud services-only clients can now benefit from the more comprehensive services INTELITECHS offers in the areas of managed IT and data security," he added.

"With today's home-based workforce, there is a higher level of concern with data access and end-user security. So many companies now have more employees working from home than in the office, using company and non-company devices to access sensitive data," says Eric Sessions, co-founder. "We see a significant increase in need for our services like Office 365 security, directory synchronization, multi-factor authentication, spam filtering, and DMARC/DKIM/SPF configuration and monitoring," he adds.

INTELITECHS provides managed IT services and IT department staff augmentation by serving as its customers' outsourced IT department. Its goal is to become clients' geek-speak translators and to use IT to drive business growth and profitability. The company serves clients across the country with managed IT services, managed backups, 24/7/365 monitoring, and a wide variety of other sophisticated security services. The company also assists clients with hardware sales, cloud-based computing, server migration, and Office 365 support. What sets INTELITECHS apart from other technology companies is its focus on partnership with clients versus just being an IT vendor. With its home office in Salt Lake City, UT, the company now serves small to medium-sized businesses across the country.

Read the rest here:
INTELITECHS announces acquisition of Total Cloud IT - Utah Business


Powering the next generation of trustworthy AI in a confidential cloud using NVIDIA GPUs – Microsoft

Cloud computing is powering a new age of data and AI by democratizing access to scalable compute, storage, and networking infrastructure and services. Thanks to the cloud, organizations can now collect data at an unprecedented scale and use it to train complex models and generate insights.

While this increasing demand for data has unlocked new possibilities, it also raises concerns about privacy and security, especially in regulated industries such as government, finance, and healthcare. One area where data privacy is crucial is patient records, which are used to train models to aid clinicians in diagnosis. Another example is in banking, where models that evaluate borrower creditworthiness are built from increasingly rich datasets, such as bank statements, tax returns, and even social media profiles. This data contains very personal information, and to ensure that it's kept private, governments and regulatory bodies are implementing strong privacy laws and regulations to govern the use and sharing of data for AI, such as the General Data Protection Regulation (GDPR) and the proposed EU AI Act. You can learn more about some of the industries where it's imperative to protect sensitive data in this Microsoft Azure Blog post.

Microsoft recognizes that trustworthy AI requires a trustworthy cloud, one in which security, privacy, and transparency are built into its core. A key component of this vision is confidential computing: a set of hardware and software capabilities that give data owners technical and verifiable control over how their data is shared and used. Confidential computing relies on a new hardware abstraction called trusted execution environments (TEEs). In TEEs, data remains encrypted not just at rest or during transit, but also during use. TEEs also support remote attestation, which enables data owners to remotely verify the configuration of the hardware and firmware supporting a TEE and grant specific algorithms access to their data.
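To make that pattern concrete, here is a minimal Python sketch of attestation-gated key release. The report fields, the APPROVED_ALGORITHMS set, and the 'valid' signature check are hypothetical placeholders for illustration, not any real attestation service's API.

```python
from dataclasses import dataclass

# Toy model of attestation-gated access: every name and check here is a
# placeholder, not a real attestation service's API.

@dataclass
class TEEReport:
    platform: str           # reported hardware/firmware configuration
    code_measurement: str   # hash of the algorithm loaded in the TEE
    signature: str          # signed by the hardware; stubbed below

APPROVED_ALGORITHMS = {"sha256:1f3a..."}  # algorithms the data owner has vetted

def release_key(report: TEEReport, dataset_key: bytes) -> bytes | None:
    """Release the dataset's decryption key only to a verified, approved TEE."""
    signature_ok = report.signature == "valid"  # stand-in for real crypto checks
    if signature_ok and report.code_measurement in APPROVED_ALGORITHMS:
        return dataset_key
    return None
```

The essential property is that the data owner's policy, not the cloud operator, decides which specific algorithms may ever see the key.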

At Microsoft, we are committed to providing a confidential cloud, where confidential computing is the default for all cloud services. Today, Azure offers a rich confidential computing platform comprising different kinds of confidential computing hardware (Intel SGX, AMD SEV-SNP), core confidential computing services like Azure Attestation and Azure Key Vault managed HSM, and application-level services such as Azure SQL Always Encrypted, Azure confidential ledger, and confidential containers on Azure. However, these offerings are limited to using CPUs. This poses a challenge for AI workloads, which rely heavily on AI accelerators like GPUs to provide the performance needed to process large amounts of data and train complex models.

The Confidential Computing group at Microsoft Research identified this problem and defined a vision for confidential AI powered by confidential GPUs, proposed in two papers, "Oblivious Multi-Party Machine Learning on Trusted Processors" and "Graviton: Trusted Execution Environments on GPUs." In this post, we share this vision. We also take a deep dive into the NVIDIA GPU technology that's helping us realize this vision, and we discuss the collaboration among NVIDIA, Microsoft Research, and Azure that enabled NVIDIA GPUs to become a part of the Azure confidential computing ecosystem.

Today, CPUs from companies like Intel and AMD allow the creation of TEEs, which can isolate a process or an entire guest virtual machine (VM), effectively eliminating the host operating system and the hypervisor from the trust boundary. Our vision is to extend this trust boundary to GPUs, allowing code running in the CPU TEE to securely offload computation and data to GPUs.

Unfortunately, extending the trust boundary is not straightforward. On the one hand, we must protect against a variety of attacks, such as man-in-the-middle attacks, where the attacker can observe or tamper with traffic on the PCIe bus or on an NVIDIA NVLink connecting multiple GPUs, as well as impersonation attacks, where the host assigns the guest VM an incorrectly configured GPU, a GPU running outdated or malicious firmware, or one without confidential computing support. At the same time, we must ensure that the Azure host operating system has enough control over the GPU to perform administrative tasks. Furthermore, the added protection must not introduce large performance overheads, increase thermal design power, or require significant changes to the GPU microarchitecture.

Our research shows that this vision can be realized by extending the GPU with capabilities for isolating sensitive state in GPU memory, establishing a hardware root of trust that supports attestation, and encrypting communication between the CPU and the GPU, as described in the following sections.

NVIDIA and Azure have taken a significant step toward realizing this vision with a new feature called Ampere Protected Memory (APM) in the NVIDIA A100 Tensor Core GPUs. In this section, we describe how APM supports confidential computing within the A100 GPU to achieve end-to-end data confidentiality.

APM introduces a new confidential mode of execution in the A100 GPU. When the GPU is initialized in this mode, the GPU designates a region in high-bandwidth memory (HBM) as protected and helps prevent leaks through memory-mapped I/O (MMIO) access into this region from the host and peer GPUs. Only authenticated and encrypted traffic is permitted to and from the region.

In confidential mode, the GPU can be paired with any external entity, such as a TEE on the host CPU. To enable this pairing, the GPU includes a hardware root-of-trust (HRoT). NVIDIA provisions the HRoT with a unique identity and a corresponding certificate created during manufacturing. The HRoT also implements authenticated and measured boot by measuring the firmware of the GPU as well as that of other microcontrollers on the GPU, including a security microcontroller called SEC2. SEC2, in turn, can generate attestation reports that include these measurements and that are signed by a fresh attestation key, which is endorsed by the unique device key. These reports can be used by any external entity to verify that the GPU is in confidential mode and running last known good firmware.
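The boot-time chain described here can be sketched roughly as follows. This is an illustrative model only: the class names, report fields, and the placeholder signature check are assumptions, not NVIDIA's actual report format or key hierarchy.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class MeasuredBoot:
    """Accumulates firmware hashes, as the HRoT does while booting the GPU."""
    measurements: dict = field(default_factory=dict)

    def measure(self, component: str, firmware: bytes) -> None:
        self.measurements[component] = hashlib.sha256(firmware).hexdigest()

@dataclass
class AttestationReport:
    measurements: dict      # covers the GPU firmware and SEC2, per the text
    confidential_mode: bool
    signature: str          # stand-in for a signature by the attestation key

def verify(report: AttestationReport, known_good: dict) -> bool:
    """An external verifier accepts the GPU only if the report is signed, the
    GPU is in confidential mode, and all firmware matches known-good hashes."""
    if report.signature != "valid":   # placeholder for real verification
        return False
    return report.confidential_mode and all(
        report.measurements.get(c) == h for c, h in known_good.items()
    )
```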

When the NVIDIA GPU driver in the CPU TEE loads, it checks whether the GPU is in confidential mode. If so, the driver requests an attestation report and checks that the GPU is a genuine NVIDIA GPU running known good firmware. Once confirmed, the driver establishes a secure channel with the SEC2 microcontroller on the GPU using the Security Protocol and Data Model (SPDM)-backed Diffie-Hellman-based key exchange protocol to establish a fresh session key. When that exchange completes, both the GPU driver and SEC2 hold the same symmetric session key.
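The key-agreement core of such a handshake can be illustrated with a short Python sketch using the cryptography library. The choice of ECDH over P-256, the HKDF parameters, and the omission of SPDM message framing and attestation binding are all simplifying assumptions.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates an ephemeral key pair (the driver in the CPU TEE, SEC2
# on the GPU) and exchanges public keys over the SPDM-managed channel.
driver_priv = ec.generate_private_key(ec.SECP256R1())
sec2_priv = ec.generate_private_key(ec.SECP256R1())

# Diffie-Hellman: each side combines its private key with the peer's public key.
driver_shared = driver_priv.exchange(ec.ECDH(), sec2_priv.public_key())
sec2_shared = sec2_priv.exchange(ec.ECDH(), driver_priv.public_key())

def derive_session_key(shared_secret: bytes) -> bytes:
    """Stretch the raw shared secret into a fresh 256-bit session key."""
    return HKDF(algorithm=hashes.SHA256(), length=32,
                salt=None, info=b"gpu-session").derive(shared_secret)

# Both ends now hold the same symmetric session key, as the article describes.
assert derive_session_key(driver_shared) == derive_session_key(sec2_shared)
```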

The GPU driver uses the shared session key to encrypt all subsequent data transfers to and from the GPU. Because pages allocated to the CPU TEE are encrypted in memory and not readable by the GPU DMA engines, the GPU driver allocates pages outside the CPU TEE and writes encrypted data to those pages. On the GPU side, the SEC2 microcontroller is responsible for decrypting the encrypted data transferred from the CPU and copying it to the protected region. Once the data is in high bandwidth memory (HBM) in cleartext, the GPU kernels can freely use it for computation.
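That transfer path can be modeled in a few lines, with AES-GCM standing in for whatever authenticated cipher the production stack actually uses; the function names and the bytes-in, bytes-out framing are assumptions for the sketch.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

session_key = AESGCM.generate_key(bit_length=256)  # from the exchange above

def driver_send(page: bytes, key: bytes) -> tuple[bytes, bytes]:
    """Encrypt a page into a staging buffer outside the CPU TEE, where the
    GPU DMA engines can read it (they cannot read TEE-encrypted memory)."""
    nonce = os.urandom(12)   # must be unique per transfer under the same key
    return nonce, AESGCM(key).encrypt(nonce, page, None)

def sec2_receive(nonce: bytes, staged: bytes, key: bytes) -> bytes:
    """SEC2 decrypts the staged data and copies the cleartext into the
    protected HBM region, where GPU kernels may use it freely."""
    return AESGCM(key).decrypt(nonce, staged, None)

nonce, staged = driver_send(b"training batch 0", session_key)
assert sec2_receive(nonce, staged, session_key) == b"training batch 0"
```

Authenticated encryption matters here: tampering with the staged buffer on the PCIe bus would make decryption fail outright rather than yield silently corrupted cleartext.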

The implementation of APM is an important milestone toward achieving broader adoption of confidential AI in the cloud and beyond. APM is the foundational building block of Azure Confidential GPU VMs, now in private preview. These VMs, designed in collaboration with NVIDIA, Azure, and Microsoft Research, feature up to four A100 GPUs with 80 GB of HBM and APM technology and enable users to host AI workloads on Azure with a new level of security.

But this is just the beginning. We look forward to taking our collaboration with NVIDIA to the next level with NVIDIA's Hopper architecture, which will enable customers to protect both the confidentiality and integrity of data and AI models in use. We believe that confidential GPUs can enable a confidential AI platform where multiple organizations can collaborate to train and deploy AI models by pooling together sensitive datasets while remaining in full control of their data and models. Such a platform can unlock the value of large amounts of data while preserving data privacy, giving organizations the opportunity to drive innovation.

A real-world example involves Bosch Research, the research and advanced engineering division of Bosch, which is developing an AI pipeline to train models for autonomous driving. Much of the data it uses includes personal identifiable information (PII), such as license plate numbers and people's faces. At the same time, it must comply with GDPR, which requires a legal basis for processing PII, namely, consent from data subjects or legitimate interest. The former is challenging because it is practically impossible to get consent from pedestrians and drivers recorded by test cars. Relying on legitimate interest is challenging too because, among other things, it requires showing that there is no less privacy-intrusive way of achieving the same result. This is where confidential AI shines: Using confidential computing can help reduce risks for data subjects and data controllers by limiting exposure of data (for example, to specific algorithms), while enabling organizations to train more accurate models.

At Microsoft Research, we are committed to working with the confidential computing ecosystem, including collaborators like NVIDIA and Bosch Research, to further strengthen security, enable seamless training and deployment of confidential AI models, and help power the next generation of technology.

The Confidential Computing team at Microsoft Research Cambridge conducts pioneering research in system design that aims to guarantee strong security and privacy properties to cloud users. We tackle problems around secure hardware design, cryptographic and security protocols, side channel resilience, and memory safety. We are also interested in new technologies and applications that security and privacy can uncover, such as blockchains and multiparty machine learning. Please visit our careers page to learn about opportunities for both researchers and engineers. We're hiring.

Read more:
Powering the next generation of trustworthy AI in a confidential cloud using NVIDIA GPUs - Microsoft


The Data Center Industry Begins to Feel the Supply Chain Pinch – Data Center Frontier

Data center operators have employed a variety of strategies to navigate supply chain pressures and line up equipment for construction projects. (Image: Shutterstock)

Supply chain disruptions are tough on everyone. But the digital infrastructure sector faces a particular challenge, as it must manage the supply chain crisis during a period of dramatic growth amid a pandemic-driven shift to digital service delivery and distributed computing. Pre-ordering and inventory management kept the data center industry on schedule in 2020 and 2021, but can this continue as supply chain disruptions persist?

Our panelists include Sean Farney of Kohler Power Systems, Michael Goh from Iron Mountain Data Centers, DartPoints' Brad Alexander, Amber Caramella of Netrality Data Centers and Infrastructure Masons, and Peter Panfil of Vertiv. The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier. Here's today's discussion:

Data Center Frontier: How would you assess the state of the data center supply chain? Are the global supply chain challenges impacting the delivery of data center capacity?

Brad Alexander, DartPoints: The data center supply chain is extremely strained currently. Transportation bottlenecks, massive labor and material shortages, and the increasing cost of critical components are causing roadblocks in both new construction and expansion. For example, we are currently going through an expansion project in one of our markets. The project is 14 percent over budget due to increased labor and component costs and is five months delayed because of generator shortages. Smaller components such as servers and storage are being delayed an additional eight weeks, and we have seen delays as long as seven months for various pieces of core networking equipment. Nearly every hardware vendor has increased costs by 7-12 percent since the beginning of 2022.

I feel the industry has built considerable capacity in the cloud and major data center markets over the last three years, and a critical capacity constraint has not yet fully been felt. However, customers may soon have to rely on more regions or multiple providers to meet capacity requirements. Smaller markets where the peak capacity was only a fraction of the larger markets are feeling the delays in expansion as these edge markets are gaining more and more traction.

Amber Caramella, Infrastructure Masons and Netrality: The supply chain continues to struggle, considering extensive global shortages of labor, materials, and equipment resulting in longer lead times. Shortages and impact vary across regions and types of equipment.

Broadly, I have seen supply chain shortages and shipping delays affect delivery time frames for new data center builds and for bringing new capacity online.

Capacity planning, and creating and executing strategies, is critical. Key strategies include ordering equipment ahead of time, not only when needed, and storing extra inventory. Expand your supplier list to provide optionality for the availability and delivery of equipment. Establishing a roadmap and capacity plan relative to your data center needs is paramount.

Sean Farney, Kohler Power Systems: Supply chains everywhere are feeling pressure right now. Everything from basic materials like lumber, steel, and copper to microchips is in short supply, with delivery times doubling in the last year. Some equipment providers are even selling manufacturing production slots on their assembly lines!

In the meantime, data centers have been facing a tidal wave of demand. This has impacted the priority of what we can do today. Companies need to closely evaluate their processes to ensure they are operating existing facilities in the most efficient way possible. For example, sustainability is a priority for all of us in the data center space; we all have goals to be net-zero within a few years.

With the constraints we're facing, many are having to double down on efforts to make efficiency upgrades and revisit operating procedures. Even in challenging circumstances, data center operators are finding ways to meet capacity demands, bring power usage down, and turn it all into a model that they can effectively sell.

Peter Panfil, Vertiv: The pandemic has created disruptions in the supply chain, as has the way data centers responded to it. Many are taking a "bounce forward" approach in which they are conducting major modernization and upgrades so they can come out of this period stronger and more resilient than they went into it. That has put further pressure on the supply chain.

Vendors across the value chain are working with their customers to get ahead of supply chain issues through proactive communication, longer term project planning, and enhanced maintenance practices that extend the lifecycle of existing equipment.

Michael Goh, Iron Mountain Data Centers: Supply chain challenges are visible in the data center industry. This is the case in other industries as well because of the shortages that occurred during COVID. We are coping well, but we see new capacity coming online at a slower pace.

In a fast growing and high-demand industry such as the data center industry, new capacity lead times are taking longer than before COVID. We also see this with any sort of equipment that needs a semiconductor to function.



Visit link:
The Data Center Industry Begins to Feel the Supply Chain Pinch - Data Center Frontier


Inside the Army’s distributed mission command experiments in, and over, the Pacific – Breaking Defense

U.S. Army Soldiers assigned to America's First Corps maneuver a Stryker combat vehicle off United States Naval Ship City of Bismarck while conducting roll on-roll off training at Naval Base Guam, Feb. 9, 2022. (U.S. Army/Jailene Bautista)

WASHINGTON: The US Army's I Corps is testing a new distributed mission command concept that could fundamentally change how the Army Corps functions across the vast distances of the Indo-Pacific.

Instead of the I Corps maintaining a single headquarters with several hundred staff shuffling around, the service is looking at breaking down the traditional headquarters infrastructure into functional nodes that would be distributed across the region but can remain in constant communication, Brig. Gen. Patrick Ellis, I Corps chief of staff, told Breaking Defense in a recent interview.

"If we're running the Corps and performing our command and control functions from six, three, five, six locations as opposed to just one, we're much more resilient, and honestly much more survivable, in the event that we're ever targeted," said Ellis. "The existing Corps structure, the kind of the doctrinal, the way the Corps are built, didn't necessarily make the most sense for us."

The Corps and Army formations from large to small want to move away from static command posts or headquarters marked by tents, trucks and generators, and fight in a distributed manner, meaning that a Corps can coordinate the battlefield using assets that are hundreds, if not thousands, of miles apart.

In a recent experiment in Guam using four Strykers loaded with advanced communication capabilities, the I Corps worked to prove that it can pass important battlefield data, including fires and targeting information, between platforms, even while they are in transit in the air or on a boat. For example, Ellis was able to send mission command information from a Stryker, in mid-air transit aboard a C-17 headed to Guam, back to Joint Base Lewis-McChord using the airplane's antenna. Additionally, the Stryker element succeeded in sending information while plugged into the network of the Navy's USNS City of Bismarck, a military sealift ship, though it remained portside.

U.S. Army Soldiers assigned to America's First Corps and service members assigned to Joint Communications Support Element establish communications onboard the U.S. Naval Ship City of Bismarck, Feb. 16, 2022, as part of a joint training operation to experiment and exercise distributed mission command in the Indo-Pacific. (U.S. Army/Jailene Bautista)

"Instead of losing situational awareness, like we do now either in flight or in transit when your stuff's on a vessel, we figured out that if we could use the transport that's on these vessels and perform mission command functions and stay situationally aware as we're transiting from one location to another," he said.


It's worth reiterating what happened here: from the back of the C-17, the Army conducted a video teleconference. While that may not sound impressive given the last two years of remote work, it's an important feat given the high bandwidth requirements to support live video, particularly on a military aircraft in the air. Ellis said that's not a function the Corps would always use, but it proved "that we could move large amounts of data."

Second, and perhaps a more germane function for the Corps, is that it was able to pass targeting data from the Army's Field Artillery Tactical Data System from the aircraft to the ground.

"It's a useful capability for us because it allows us to provide updates in-flight from mission commander back down to launchers," Ellis said. "Or in the event that we flipped it and we put some of our HIMARS launchers aboard the aircraft, we can actually update their targeting data from the ground while they're en route."

The ability to update targeting data while flying for several hours would prevent targeting data from going stale, he added.

The Corps' experiment will also help feed into Joint All-Domain Command and Control, the Pentagon's future warfighting construct in which sensors and shooters across the battlespace are connected to provide up-to-date information.

"The opportunity to operate with the Joint Forces is key in this organization," said Col. Elizabeth Casely, I Corps G6, or the network manager. "The ability to sense from a different service component and fire from a different service component is predominant for JADC2."

Enabling technologies

Central to the whole concept is cloud computing at the edge. Using the cloud, soldiers at the different nodes can access the data they need to accomplish their mission and operate more efficiently. However, soldiers will need to bring forward some network hardware to have immediate access to data in an environment where they are disconnected from the hyper-scale cloud hub thousands of miles away.

"So instead of having to bring a separate computer to do fires, a separate computer to do common operational picture and a separate computer to do logistics, we could actually access all those systems through one standard laptop," Ellis said. "You bring your little slice of the cloud forward with you when you come forward, so in the event that you are disconnected from the hyper-scale, you can continue to perform your mission command functions."
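A toy Python sketch of that "slice of the cloud" idea: reads prefer the distant hub but fall back to a forward-deployed local replica when the link drops. The class and method names here are illustrative only, not any actual Army system.

```python
class EdgeNode:
    """Forward-deployed node carrying a local replica of mission data."""

    def __init__(self, hub):
        self.hub = hub      # handle to the hyper-scale cloud; None if cut off
        self.local = {}     # the "slice of the cloud" brought forward

    def read(self, key):
        if self.hub is not None:
            value = self.hub.get(key)
            if value is not None:
                self.local[key] = value   # refresh the local slice
                return value
        return self.local.get(key)        # disconnected: serve the cache

    def write(self, key, value):
        self.local[key] = value           # always record locally first
        if self.hub is not None:
            self.hub[key] = value         # sync through when connected

hub = {"fires/target-list": "v1"}
node = EdgeNode(hub)
node.read("fires/target-list")            # cached while the link is up
node.hub = None                           # in transit: the link goes down
assert node.read("fires/target-list") == "v1"   # data is still available
```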

But the challenge that the Corps has to consider is how much data and what types of mission command data are vital for soldiers to bring forward with them, versus how much can remain in the main cloud hub. The I Corps is working through what information exchanges absolutely have to occur, Casely said.

For example, Ellis said its Corps-level fires personnel are being asked questions about how much data they need to do their job: do they need all of their targeting data or just imagery, and how long before a mission? That's a question similar to one from Project Convergence, where the Army grappled with what data has to be sent, and in what format, while not eating up the limited bandwidth in a conflict zone.

"It really, really challenges not just the folks in the G6, but the entire staff and the entire whole staff process to think about what information needs to be exchanged, in order for them to perform their mission," Casely said.

Another challenge is deconflicting updates in the cloud system, Ellis said. If one group of soldiers comes out of the disconnected environment and updates the broader cloud while someone else is updating the same type of information in the larger cloud, how do the soldiers sort out whose data is the most relevant?
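The article leaves that question open. One common baseline in distributed systems is last-writer-wins merging keyed on when each edit was made; the sketch below shows that policy purely as an illustration of the problem, not as the Army's chosen approach, and real systems often need richer schemes (version vectors, field-level merges) to avoid silently dropping concurrent edits.

```python
from dataclasses import dataclass

@dataclass
class Update:
    key: str
    value: str
    timestamp: float   # when the edit was made, even if it syncs much later

def merge(cloud: dict, incoming: list) -> dict:
    """Fold a reconnecting node's updates into the cloud copy,
    keeping whichever edit to each key is newest."""
    for u in incoming:
        current = cloud.get(u.key)
        if current is None or u.timestamp > current.timestamp:
            cloud[u.key] = u
    return cloud

cloud = {"enemy/position": Update("enemy/position", "grid 123456", 100.0)}
offline_edits = [Update("enemy/position", "grid 654321", 250.0)]
merge(cloud, offline_edits)
assert cloud["enemy/position"].value == "grid 654321"   # newer edit wins
```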

"We're in the very nascent stages of determining how some of the cloud computing works to support this manner of fighting," Casely said. "But I'm feeling really, really good about where we're headed."

More:
Inside the Army's distributed mission command experiments in, and over, the Pacific - Breaking Defense


Financial Sector and Cloud Security Providers Complete Initiative To Enhance Cybersecurity – Business Wire

WASHINGTON--(BUSINESS WIRE)--The Cyber Risk Institute (CRI), the Cloud Security Alliance (CSA), and the Bank Policy Institute-BITS announced today the release of a cloud extension for the CRI Profile version 1.2. The Cloud Profile represents the collaboration of over 50 financial institutions and major cloud service providers (CSPs) to extend the CRI Profile, which is a widely accepted cybersecurity compliance framework for the financial sector.

"Today's release marks an historic achievement," said CRI President Joshua Magri. "This is the first time that financial institutions, the major CSPs, and trade associations have come together to develop a set of baseline expectations related to cybersecurity and roles and responsibilities for cloud deployment. We are exceedingly proud of the work done here and what it may mean for future cloud usage in the financial services sector. We are pleased to be part of a collaborative solution to a longstanding challenge."

As more financial institutions move to the cloud, financial regulators globally have become increasingly focused on ensuring firms use sound risk management practices during cloud implementation. The Cloud Profile provides guidance to financial institutions and CSPs on commonly understood responsibilities related to cloud deployment across software-as-a-service, platform-as-a-service, and infrastructure-as-a-service delivery models.

"Financial regulators need clear, consistent, and timely information on firms' relationships with their third parties. The Cloud Profile helps clarify where a firm's responsibilities end and a cloud service provider's responsibilities begin," said Chris Feeney, Executive Vice President of BPI and President of BPI-BITS. "A common understanding of cybersecurity controls for cloud implementation that has been developed, vetted, and accepted by firms and CSPs is a sound approach in ensuring our financial sector is more secure."

This guidance is designed to enable financial institutions and CSPs to come to contractual understanding more easily and should also facilitate more streamlined and secure processes for deploying cloud services.

"We are very happy to work with a like-minded organization such as CRI, and we are excited about these initial results. The Cloud Profile extension brings together the CRI Profile with the security controls and security shared responsibility model of the CSA Cloud Controls Matrix v4.0. This represents a very powerful tool to support financial institutions in building a cloud security governance and compliance program that can meet their strict sectorial requirements," said Daniele Catteddu, Chief Technology Officer, Cloud Security Alliance.

CRI, CSA, and BPI will continue working on ways to leverage this joint framework and look forward to greater collaboration.

About Cyber Risk Institute.

The Cyber Risk Institute (CRI) is a not-for-profit coalition of financial institutions and trade associations. We're working to protect the global economy by enhancing cybersecurity and resiliency through standardization. https://cyberriskinstitute.org/ *The CRI Profile is the successor to the Financial Services Sector Coordinating Council (FSSCC) Cybersecurity Profile, a NIST- and IOSCO-based approach to assessing cybersecurity in the financial services industry.

About Bank Policy Institute.

The Bank Policy Institute (BPI) is a nonpartisan public policy, research and advocacy group, representing the nation's leading banks and their customers. Our members include universal banks, regional banks and the major foreign banks doing business in the United States. Collectively, they employ almost 2 million Americans, make nearly half of the nation's small business loans, and are an engine for financial innovation and economic growth.

About Cloud Security Alliance.

The Cloud Security Alliance (CSA) is the world's leading organization dedicated to defining and raising awareness of best practices to help ensure a secure cloud computing environment. CSA harnesses the subject matter expertise of industry practitioners, associations, governments, and its corporate and individual members to offer cloud security-specific research, education, training, certification, events, and products. CSA's activities, knowledge, and extensive network benefit the entire community impacted by cloud, from providers and customers to governments, entrepreneurs, and the assurance industry, and provide a forum through which different parties can work together to create and maintain a trusted cloud ecosystem. For further information, visit us at http://www.cloudsecurityalliance.org, and follow us on Twitter @cloudsa.

Read the original:
Financial Sector and Cloud Security Providers Complete Initiative To Enhance Cybersecurity - Business Wire


Cryptocurrency market down overall heading into Monday morning – Fox Business


Bitcoin was trading above $41,000 early Monday morning as the cryptocurrency market was down overall.

According to Coindesk, Bitcoin was trading at $41,218, down 1.64%, while Ethereum and Dogecoin were trading at $2,882 (-1.53%) and approximately 12 cents (-1.88%), respectively.


Bitcoin and other major cryptocurrencies fell slightly heading into the new week, but finished a mostly upbeat week higher than where they started, showing stamina to withstand the U.S. central bank's first interest rate hike in four years and Russia's escalating attacks on Ukraine.

Bitcoin was trading above $41,000 early Monday morning as the cryptocurrency market was down overall. (iStock)

Bitcoin was off about 1.3% over the past 24 hours. Bitcoin topped $42,000 late during U.S. trading hours Friday, a more than 7% increase from where it started the week as investors digested the long-expected Federal Reserve's 25-basis-point increase on Wednesday and global unrest tied to Russia's invasion.

Meanwhile, Ether, the second-largest crypto by market cap, was changing hands at a little under $2,900, a 1.8% drop over the same period, but well up from where it began the week. Most other major cryptos were in the red over the weekend. Trading volume fell over the past three days, Coindesk reported.

In other cryptocurrency news, an Austin, Texas, city council member last week announced a resolution that would explore possible uses of Bitcoin and other cryptocurrencies in the city.


The resolution, from Austin City Council Member Mackenzie Kelly (District 6), came ahead of South by Southwest's return to the city after two years of the COVID-19 pandemic.

Original post:
Cryptocurrency market down overall heading into Monday morning - Fox Business


Where the World Regulates Cryptocurrency – Statista

Some countries declare Bitcoin to be official legal tender while others announce outright bans on cryptocurrency. A world map based on data collected by the Law Library of the U.S. Congress shows where countries have been trying to stop the cryptocurrency hype and where crypto has been given more or less free rein.

One example of a country embracing cryptocurrency is El Salvador, where Bitcoin was declared an official currency in September of 2021 by populist president Nayib Bukele. The country also taxes and otherwise regulates cryptocurrency. El Salvador is in a special position because it does not have its own currency and instead relies on the U.S. dollar, like some other countries in the region.

Other countries which are applying laws to regulate digital currencies probably wouldn't go as far as El Salvador. Rather, these places - which are most typically developed countries - have been investing in projects to launch their own central bank digital currencies. This is arguably a very different approach to using blockchain technology than that of original cryptocurrencies, which are explicitly independent of any state control, but can be very volatile as a result. Among those exploring the concept are the U.S., European countries, Russia and Australia. India and Thailand, both of which are also broadly regulating cryptocurrency, already have more concrete plans to issue their own digital currencies.

Ukraine was also among the countries that have been regulating cryptocurrency, but the nation went one step further on Wednesday when it legislated a framework for the cryptocurrency industry in the country. The nation made the move after it received donations in crypto following its invasion by Russia. Rules like having to register or acquire a license for a crypto exchange also exist in the EU, the UK, Canada, the U.S., Mexico, Chile, Japan and Korea, among others.

China was the first major economy to start issuing its national currency on the blockchain in early 2021. The country has taken a more extreme approach to regulating cryptocurrency by issuing an absolute ban on it. According to the Law Library of Congress, nine countries had so far taken this measure, while many more were implicitly banning the use of cryptocurrency through their other laws. This practice was most common in Africa, the Middle East and Asia.

View original post here:
Where the World Regulates Cryptocurrency - Statista


Future predictions about Cryptocurrency after the 2021 breakthrough – Cyprus Mail


Go here to read the rest:
Future predictions about Cryptocurrency after the 2021 breakthrough - Cyprus Mail


Mass Adoption and Cryptocurrency Usage – Progressive Grocer

Most experts agree that paying at grocery with cryptocurrencies such as Bitcoin will only grow more popular as more and more consumers, the majority of them younger, start using this form of payment in all aspects of their lives. What happens, though, when an entire nation decides to go digital with its currency? As might be imagined, this move is beneficial to a cryptocurrency's stability and its ultimate use by merchants and shoppers.

"Mass adoption occurs when a country's government adopts an electronic version of the local currency: for the U.S., a digital version of the U.S. dollar," explains Peter Jensen, CEO of San Francisco-based RocketFuel, whose partnership with global payment solutions provider ACI enables grocers to accept a variety of crypto coins and accommodate various crypto wallets. "This is what China did more than one year ago, what India did [recently], and what many smaller countries have done, [such as] El Salvador. Because the digital version is tied to the physical version, there is no volatility, and the fact that the government is behind it validates the initiative, instills trust among consumers and businesses, and accelerates adoption. I was in El Salvador twice [lately], and it's amazing to see the adoption among businesses within the six months since the law was adopted."

Will the United States switch to crypto as its official currency? Probably not any time soon, but Jensen is still bullish on the possibilities of this method of payment, which he believes will eventually overtake credit card transactions in a few years, due to credit's high amount of fraud, as well as the [m]any different intermediaries that get a piece of the pie [and] contribute to [its] high costs.

As Jensen puts it: "Crypto is yet another example of a new technology that is more efficient and replaces and consolidates the intermediaries. A good comparison is how ridesharing technology with Uber and Lyft overtook the usage of regular taxis, drove costs down and increased efficiency."

For more expert advice about what grocers should consider when deciding to accept crypto payments, click here.

Read this article:
Mass Adoption and Cryptocurrency Usage - Progressive Grocer
