Category Archives: Cloud Servers

AI and Blockchain Integration for Preserving Privacy – Unite.AI

With the widespread attention on, and potential applications of, blockchain and artificial intelligence technologies, the privacy protection techniques that arise from integrating the two are gaining notable significance. These privacy protection techniques not only protect the privacy of individuals, but also guarantee the dependability and security of the data.

In this article, we will discuss how the collaboration between AI and blockchain gives rise to numerous privacy protection techniques, including de-identification, data encryption, k-anonymity, and multi-tier distributed ledger methods, and their application across different verticals. Furthermore, we will analyze the deficiencies of these approaches along with their root causes, and offer solutions accordingly.

Blockchain was first introduced to the world in 2008, when Nakamoto presented Bitcoin, a cryptocurrency built on a blockchain network. Ever since its introduction, blockchain has gained a lot of popularity, especially in the past few years. The price at which Bitcoin trades today, and its crossing of the trillion-dollar market cap mark, indicate that blockchain has the potential to generate substantial revenue and profits for the industry.

Blockchain technologies can be categorized primarily by the level of accessibility and control they offer, with Public, Private, and Federated being the three main types. Popular cryptocurrencies and blockchain architectures like Bitcoin and Ethereum are public blockchain offerings: they are decentralized in nature and allow nodes to enter or exit the network freely, thus promoting maximum decentralization.

The following figure depicts the structure of Ethereum as it utilizes a linked list to establish connections between different blocks. The header of the block stores the hash address of the preceding block in order to establish a linkage between the two successive blocks.
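
As a rough illustration of that linkage, the hypothetical Python sketch below chains blocks by storing each predecessor's header hash in the next header. Real Ethereum headers use Keccak-256 and RLP encoding rather than SHA-256 over JSON, so this is a simplification of the idea rather than the actual format.

```python
import hashlib
import json
import time

def block_hash(header: dict) -> str:
    """Hash a block header (SHA-256 here as a stand-in for Ethereum's Keccak-256)."""
    return hashlib.sha256(json.dumps(header, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash: str, transactions: list) -> dict:
    """Build a block whose header records the hash of the preceding block."""
    header = {
        "prev_hash": prev_hash,  # linkage to the previous block's header hash
        "timestamp": time.time(),
        "tx_root": hashlib.sha256(json.dumps(transactions).encode()).hexdigest(),
    }
    return {"header": header, "hash": block_hash(header), "transactions": transactions}

# A tiny three-block chain: each header points at its predecessor's hash.
genesis = make_block("0" * 64, ["genesis"])
block1 = make_block(genesis["hash"], ["tx_a", "tx_b"])
block2 = make_block(block1["hash"], ["tx_c"])
assert block2["header"]["prev_hash"] == block1["hash"]
```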

The development and implementation of blockchain technology are accompanied by legitimate security and privacy concerns in various fields that cannot be neglected. For example, a data breach in the financial industry can result in heavy losses, while a breach in military or healthcare systems can be disastrous. To prevent these scenarios, the protection of data, user assets, and identity information has been a major focus of the blockchain security research community, since maintaining security is essential to the continued development of the technology.

Ethereum is a decentralized blockchain platform that upholds a shared ledger of information collaboratively across multiple nodes. Each node in the Ethereum network runs the EVM, or Ethereum Virtual Machine, to execute smart contracts, and communication between nodes occurs over a P2P, or peer-to-peer, network. Each node on the Ethereum network is assigned its own functions and permissions, although all nodes can be used for gathering transactions and engaging in block mining. Furthermore, it is worth noting that, compared to Bitcoin, Ethereum generates blocks far faster, with a block time of roughly 15 seconds. This means miners have a better chance of acquiring rewards quicker, while the interval for verifying transactions is reduced significantly.

On the other hand, AI, or Artificial Intelligence, is a branch of modern science that focuses on developing machines capable of decision-making and of simulating autonomous thinking comparable to a human's ability. Artificial Intelligence is a vast field in itself, with numerous subfields including deep learning, computer vision, natural language processing, and more. NLP in particular has received heavy attention in the past few years, resulting in the development of top-notch LLMs like GPT and BERT. NLP is headed towards near perfection, and its final step is processing text transformations that make text understandable to computers; recent models like ChatGPT, built on GPT-4, indicate that the research is headed in the right direction.

Another subfield that is quite popular amongst AI developers is deep learning, an AI technique that works by imitating the structure of neurons. In a conventional deep learning framework, external input information is processed layer by layer by training hierarchical network structures, passing through hidden layers before producing a final representation. Deep learning frameworks can be classified into two categories: supervised learning and unsupervised learning.

The above image depicts the architecture of a deep learning perceptron, and as can be seen in the image, a deep learning framework employs a multi-level neural network architecture to learn the features in the data. The neural network consists of three types of layers: the input layer, the hidden layers, and the output layer. Each perceptron layer in the framework is connected to the next layer in order to form a deep learning framework.
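
To make the layer-by-layer structure concrete, here is a minimal Python sketch (using NumPy, with made-up layer sizes) of a forward pass through an input layer, two hidden layers, and an output layer. It illustrates the general idea only, not any specific framework discussed in the article.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def mlp_forward(x, weights, biases):
    """Pass an input through stacked fully connected layers: input -> hidden layers -> output."""
    activation = x
    for w, b in zip(weights[:-1], biases[:-1]):
        activation = relu(activation @ w + b)      # hidden layers with ReLU activations
    return activation @ weights[-1] + biases[-1]   # linear output layer

rng = np.random.default_rng(0)
layer_sizes = [4, 8, 8, 2]                         # input, two hidden layers, output (illustrative)
weights = [rng.normal(size=(m, n)) for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]
print(mlp_forward(rng.normal(size=4), weights, biases))
```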

Finally, we have the integration of blockchain and artificial intelligence technologies, as these two technologies are being applied across different industries and domains amid growing concern about cybersecurity, data security, and privacy protection. Applications that integrate blockchain and artificial intelligence manifest the integration in the following aspects.

In the current scenario, data trust systems have certain limitations that compromise the reliability of data transmission. To address these limitations, blockchain technologies can be deployed to establish a dependable and secure data sharing and storage solution that offers privacy protection and enhances data security. Some of the applications of blockchain in AI privacy protection are mentioned in the following table.

By enhancing the implementation & integration of these technologies, the protective capacity & security of current data trust systems can be boosted significantly.

Traditionally, data sharing and data storage methods have been vulnerable to security threats because they depend on centralized servers, which makes them an easily identifiable target for attackers. The vulnerability of these methods gives rise to serious complications such as data tampering and data leaks, and given current security requirements, encryption methods alone are not sufficient to ensure the safety and security of the data. This is the main reason behind the emergence of privacy protection technologies based on the integration of artificial intelligence and blockchain.

Let's have a look at a blockchain-based privacy-preserving federated learning scheme that improves the Multi-Krum technique and combines it with homomorphic encryption to achieve ciphertext-level model filtering and model aggregation, verifying local models while maintaining privacy protection. The Paillier homomorphic encryption technique is used in this method to encrypt model updates, thus providing additional privacy protection. The Paillier algorithm works as depicted.
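
The figure referenced above is not reproduced here, but the property the scheme relies on, Paillier's additive homomorphism, where multiplying two ciphertexts yields an encryption of the sum of the plaintexts, can be sketched in a few lines of Python. The parameters below are deliberately tiny and purely illustrative; this is not the paper's implementation, and real deployments use moduli of 2048 bits or more.

```python
import math
import secrets

# Toy Paillier keypair (tiny primes for illustration only).
p, q = 61, 53
n, n_sq = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
L = lambda u: (u - 1) // n
mu = pow(L(pow(g, lam, n_sq)), -1, n)

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    return (L(pow(c, lam, n_sq)) * mu) % n

# Additive homomorphism: the product of two ciphertexts decrypts to the sum of the plaintexts,
# which is what lets an aggregator combine encrypted model updates without seeing them.
c1, c2 = encrypt(12), encrypt(30)
assert decrypt((c1 * c2) % n_sq) == 42
```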

De-identification is a method commonly used to anonymize the personal identification information of a user by separating the data from its identifiers, thus reducing the risk of data tracking. There exists a decentralized AI framework built on permissioned blockchain technology that uses this approach. The framework separates personal identification information from non-personal information effectively, and then stores the hash values of the personal identification information in the blockchain network. The proposed AI framework can be utilized in the medical industry to share the medical records and information of a patient without revealing his or her true identity. As depicted in the following image, the framework uses two independent blockchains for data requests: one blockchain network stores the patient's information along with data access permissions, whereas the second blockchain network captures audit traces of any requests or queries made by requesters. As a result, patients retain complete authority and control over their medical records and sensitive information while enabling secure and safe data sharing among multiple entities on the network.
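
As a rough sketch of the separation step, the hypothetical Python snippet below splits a record into shareable data and a PII digest, keeping only the hash as the on-chain entry. The field names, the random pseudonym, and the use of SHA-256 are assumptions made for illustration, not details taken from the proposed framework.

```python
import hashlib
import json
import uuid

def de_identify(record: dict, pii_fields=("name", "national_id", "phone")):
    """Split a record into non-identifying data and a PII digest suitable for anchoring on-chain."""
    pii = {k: record[k] for k in pii_fields if k in record}
    clinical = {k: v for k, v in record.items() if k not in pii_fields}
    pseudonym = str(uuid.uuid4())                    # random pseudonym links the two halves off-chain
    pii_digest = hashlib.sha256(json.dumps(pii, sort_keys=True).encode()).hexdigest()
    clinical["patient_ref"] = pseudonym
    # Only the hash (not the PII itself) would be written to the permissioned blockchain.
    on_chain_entry = {"patient_ref": pseudonym, "pii_hash": pii_digest}
    return clinical, on_chain_entry

shareable, anchor = de_identify({"name": "A. Patient", "national_id": "X123", "diagnosis": "flu"})
print(shareable, anchor)
```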

A multi-layered distributed ledger is a decentralized data storage system with multiple hierarchical layers, designed to maximize efficiency and secure the data sharing process while enhancing privacy protection. DeepLinQ is a blockchain-based, multi-layered, decentralized distributed ledger that addresses users' concerns about data privacy and data sharing by enabling privacy-protected data sharing. DeepLinQ achieves the promised data privacy by employing techniques like on-demand querying, access control, proxy reservation, and smart contracts, leveraging blockchain characteristics including the consensus mechanism, complete decentralization, and anonymity.

The K-Anonymity method is a privacy protection method that groups individuals in a dataset so that every group has at least K individuals with identical attribute values, thereby protecting the identity and privacy of individual users. The K-Anonymity method has been the basis of a proposed reliable transactional model that facilitates transactions between energy nodes and electric vehicles. In this model, the K-Anonymity method serves two functions: first, it hides the location of the EVs by constructing a unified request using K-Anonymity techniques that conceal the location of the car's owner; second, it conceals user identifiers so that attackers cannot link users to their electric vehicles.
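
A minimal sketch of the grouping idea, using invented records and a single age quasi-identifier, might look like the following; the actual EV transaction scheme is more involved, so treat this purely as an illustration of the "at least K per group" constraint.

```python
from collections import Counter

def generalize_age(age: int, width: int) -> str:
    lo = (age // width) * width
    return f"{lo}-{lo + width - 1}"

def k_anonymize(records, k=3, widths=(5, 10, 20, 40, 80)):
    """Coarsen the age quasi-identifier until every (age_band, city) group holds at least k records."""
    for width in widths:
        table = [{**r, "age": generalize_age(r["age"], width)} for r in records]
        groups = Counter((r["age"], r["city"]) for r in table)
        if min(groups.values()) >= k:
            return table, width
    raise ValueError("could not reach k-anonymity with the given generalisation levels")

rows = [{"age": a, "city": "Surat"} for a in (21, 22, 24, 33, 35, 38)]
anon, width = k_anonymize(rows, k=3)
print(width, anon)   # age bands are widened until every group contains at least 3 records
```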

In this section, we present a comprehensive analysis and evaluation of ten privacy protection systems proposed in recent years that fuse blockchain and AI technologies. The evaluation focuses on five major characteristics of these proposed methods: authority management, data protection, access control, scalability, and network security, and also discusses their strengths, weaknesses, and potential areas of improvement. It is the unique features resulting from the integration of AI and blockchain technologies that have paved the way for new ideas and solutions for enhanced privacy protection. For reference, the image below shows the different evaluation metrics employed to derive the analytical results for the combined application of blockchain and AI technologies.

Access control is a security and privacy technology used to restrict a user's access to authorized resources on the basis of pre-defined rules and policies, safeguarding data integrity and system security. There exists an intelligent privacy parking management system that makes use of a Role-Based Access Control, or RBAC, model to manage permissions. In this framework, each user is assigned one or more roles and is then classified according to those roles, which allows the system to control attribute access permissions. Users on the network can use their blockchain address to verify their identity and obtain attribute authorization access.
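
A bare-bones sketch of the RBAC idea, with hypothetical roles and permissions for a parking system, could look like this; the real system layers it on top of blockchain-based identity verification, which is only hinted at by the address string here.

```python
class RBAC:
    """Minimal role-based access control: users hold roles, roles hold permissions."""

    def __init__(self):
        self.role_permissions = {}   # role -> set of permissions
        self.user_roles = {}         # blockchain address -> set of roles

    def grant(self, role: str, permission: str):
        self.role_permissions.setdefault(role, set()).add(permission)

    def assign(self, address: str, role: str):
        self.user_roles.setdefault(address, set()).add(role)

    def is_allowed(self, address: str, permission: str) -> bool:
        return any(permission in self.role_permissions.get(role, set())
                   for role in self.user_roles.get(address, ()))

rbac = RBAC()
rbac.grant("attendant", "read:occupancy")     # invented roles and permissions
rbac.grant("admin", "write:pricing")
rbac.assign("0xabc...", "attendant")          # address stands in for the verified blockchain identity
print(rbac.is_allowed("0xabc...", "read:occupancy"))   # True
print(rbac.is_allowed("0xabc...", "write:pricing"))    # False
```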

Access control is one of the key fundamentals of privacy protection, restricting access based on group membership and user identity to ensure that only authorized users can access the specific resources they are allowed to, thus protecting the system from unwanted or forced access. To ensure effective and efficient access control, the framework needs to consider multiple factors including authorization, user authentication, and access policies.

Digital Identity Technology is an emerging approach for IoT applications that can provide safe and secure access control and ensure data and device privacy. The method proposes a series of access control policies based on cryptographic primitives and digital identity technology, or DIT, to protect the security of communications between entities such as drones, cloud servers, and Ground Station Servers (GSS). Once the registration of an entity is completed, its credentials are stored in memory. The table included below summarizes the types of defects in the framework.

Data protection refers to measures including data encryption, access control, security auditing, and data backup that ensure a user's data is not accessed illegally, tampered with, or leaked. When it comes to data processing, technologies like data masking, anonymization, data isolation, and data encryption can be used to protect data from unauthorized access and leakage. Furthermore, encryption technologies such as homomorphic encryption, differential privacy protection, digital signature algorithms, asymmetric encryption algorithms, and hash algorithms can prevent illegal access by non-authorized users and ensure data confidentiality.
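
As one concrete example from that toolbox, a keyed hash (HMAC) lets a verifier detect tampering with a stored record. The snippet below is a generic Python illustration, not part of any specific system described above, and the record contents are invented.

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)                       # shared secret between data owner and verifier
record = b'{"patient_ref": "7f3a", "diagnosis": "flu"}'

tag = hmac.new(key, record, hashlib.sha256).hexdigest()   # stored or transmitted alongside the record

def verify(data: bytes, received_tag: str) -> bool:
    expected = hmac.new(key, data, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)     # constant-time comparison

print(verify(record, tag))                                  # True: record untouched
print(verify(record.replace(b"flu", b"covid"), tag))        # False: tampering detected
```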

Network security is a broad field that encompasses different aspects including ensuring data confidentiality & integrity, preventing network attacks, and protecting the system from network viruses & malicious software. To ensure the safety, reliability, and security of the system, a series of secure network architectures and protocols, and security measures need to be adopted. Furthermore, analyzing and assessing various network threats and coming up with corresponding defense mechanisms and security strategies are essential to improve the reliability & security of the system.

Scalability refers to a system's ability to handle larger amounts of data or an increasing number of users. When designing a scalable system, developers must consider system performance, data storage, node management, transmission, and several other factors. Furthermore, when ensuring the scalability of a framework or a system, developers must take system security into account to prevent data breaches, data leaks, and other security risks.

Developers have designed a system in compliance with the European General Data Protection Regulation, or GDPR, by storing privacy-related information and artwork metadata in a distributed file system that exists off the chain. Artwork metadata and digital tokens are stored in OrbitDB, a database that uses multiple nodes to store data, thus ensuring data security and privacy. The off-chain distributed system disperses data storage, and thus improves the scalability of the system.

The amalgamation of AI and blockchain technologies has resulted in systems that focus heavily on protecting the privacy, identity, and data of users. Although these privacy systems still face challenges such as network security, data protection, scalability, and access control, it is crucial to weigh these issues comprehensively, on the basis of practical considerations, during the design phase. As the technology develops and its applications expand, privacy protection systems built using AI and blockchain will draw more attention in the future. On the basis of research findings, technical approaches, and application scenarios, they can be classified into three categories.

The technologies belonging to the first category focus on the implementation of AI and blockchain technologies for privacy protection in the IoT industry. These methods use AI techniques to analyze high volumes of data while taking advantage of the decentralized and immutable features of the blockchain network to ensure the authenticity and security of the data.

The technologies in the second category focus on fusing AI and blockchain technologies for enhanced privacy protection by making use of blockchain's smart contracts and services. These methods combine data analysis and data processing with AI, and use blockchain technology alongside to reduce dependency on trusted third parties and to record transactions.

Finally, the technologies in the third category focus on harnessing the power of AI and blockchain to achieve enhanced privacy protection in large-scale data analytics. These methods exploit blockchain's decentralization and immutability to ensure the authenticity and security of data, while AI techniques ensure the accuracy of data analysis.

In this article, we have discussed how AI and blockchain technologies can be used in sync to enhance privacy protection applications, covering their related methodologies and evaluating the five primary characteristics of these privacy protection technologies. Furthermore, we have also discussed the existing limitations of the current systems. Certain challenges in the field of privacy protection technologies built upon blockchain and AI still need to be addressed, such as how to strike a balance between data sharing and privacy preservation. Research on how to effectively merge the capabilities of AI and blockchain techniques is ongoing, and there are several other techniques that can be integrated as well.

Edge computing aims to achieve decentralization by leveraging the power of edge and IoT devices to process private and sensitive user data. Because AI processing requires substantial computing resources, edge computing methods can distribute computational tasks to edge devices for processing instead of migrating the data to cloud services or data servers. Since the data is processed much closer to the device itself, latency is reduced significantly, and so is network congestion, which enhances the speed and performance of the system.

Multi-chain mechanisms have the potential to resolve the storage and performance issues of single-chain blockchains, thereby boosting the scalability of the system. The integration of multi-chain mechanisms facilitates data classification based on distinct attributes and privacy levels, improving the storage capabilities and security of privacy protection systems.

Read more:
AI and Blockchain Integration for Preserving Privacy - Unite.AI

New revelations from the Snowden archive surface – ComputerWeekly.com

He risked his neck. When Edward Snowden chose to expose the U.S. National Security Agency (NSA)'s mass surveillance Leviathan, and that of its British counterpart, GCHQ, 10 years ago, he put his life on the line. And he has always declared he has never regretted it.

But years after his act of extraordinary courage, the Snowden archive remains largely unpublished. He trusted journalists to decide what to publish. In an article published in June 2023, Guardian Pulitzer prize winner Ewen MacAskill - who flew to Hong Kong with Glenn Greenwald and Laura Poitras to meet Edward Snowden - confirmed that most of the archive has not been made public. "In the end, we published only about 1 percent of the document," he wrote.

What does the remaining 99 percent of the Snowden archive contain? A decade on, it remains shrouded in secrecy.

A doctoral thesis by American investigative journalist and post-doctoral researcher Jacob Appelbaum has now revealed unpublished information from the Snowden archive. These revelations go back a decade, but they remain of indisputable public interest:

These revelations have surfaced for the first time thanks to a doctoral thesis authored by Appelbaum towards earning a degree in applied cryptography from the Eindhoven University of Technology in the Netherlands.

Titled "Communication in a world of pervasive surveillance", it is a public document and has been downloaded over 18,000 times since March 2022 when it was first published.

Appelbaum's work, supervised by professors Tanja Lange and Daniel J. Bernstein, is among the top ten most popular Ph.D. theses at the Eindhoven University.

When we asked whether the U.S. authorities had contacted the Eindhoven University of Technology to object to the publication of some of the revelations from the Snowden files, a university spokesperson replied that they had not.

In 2013, Jacob Appelbaum published a remarkable scoop for Der Spiegel, revealing that the NSA had spied on Angela Merkel's mobile phone. This scoop won him the highest journalistic award in Germany, the Nannen prize (later known as the Stern Award).

Nevertheless, his work on the NSA revelations and his advocacy for Julian Assange, WikiLeaks, and high-profile whistleblowers has put him in a precarious position. As a result, he resettled in Berlin, where he has spent the last decade.

In June 2020, when the United States issued a second superseding indictment against Julian Assange, it was clear that Appelbaum's concerns were not a matter of paranoia; the indictment criminalizes political speeches given by Assange as well as by former WikiLeaks journalist Sarah Harrison and by Jacob Appelbaum himself, identified under the codename "WLA-3".

Public speeches made by Appelbaum, delivered in a humorous and provocative tone and with titles like "Sysadmins of the World, Unite!", were interpreted as an attempt to recruit sources and as incitement to steal classified documents. To this day, however, there are no publicly known charges against Appelbaum or Harrison.

We asked Jacob Appelbaum, currently a post-doctoral researcher at the Eindhoven University of Technology, why he chose to publish those revelations in a technically written thesis rather than a mass-circulation newspaper.

"As an academic", he replied, "I see that the details included are in the public interest, and highly relevant for the topic covered in my thesis, as it covers the topic of large-scale adversaries engaging in targeted and mass surveillance".

One of the most important unpublished revelations from the Snowden archive regards American semiconductor vendor Cavium. According to Appelbaum, the Snowden files list Cavium "as a successful SIGINT enabled CPUs vendor".

"The NSA's successful cryptographic enabling is by definition the introduction of intentional security vulnerabilities that they are then able to exploit, and they do exploit them often in an automated fashion to spy," he said.

"One such method", he added, "is sabotaging a secure random generator".

A random number generator that is unpredictable to everyone "is an essential requirement for meaningful cryptographic security. In most cases, the NSA sabotage happens in a way where the owners, developers, and users are unaware of the sabotage as a core goal".

The purpose of this sabotage is to allow the NSA to breach the security offered by a given company, device and/or other services.

At no point does Appelbaum write or even suggest that Cavium was complicit in these sabotage activities or was aware of them.

The Snowden documents date back to 2013. In 2018, Cavium was acquired by the U.S. company Marvell Technology, one of the two firms which, according to financial services giant J.P. Morgan, will dominate the custom-designed semiconductors market driven by Artificial Intelligence.

We contacted Marvell to ask a series of questions, including whether Cavium's CPUs have basically remained the same in the last decade, and whether it is certain that Cavium CPUs, which according to the 2013 Snowden files were backdoored, are no longer marketed and in use.

We also asked Marvell whether the company conducted any internal investigations after we informed them about Appelbaum's revelation. One of the co-founders of Cavium, Raghib Hussain, is currently one of the presidents of Marvell.

Marvell has not provided answers to our specific questions. Its vice president for Corporate Marketing, Stacey Keegan, said that it did not implement backdoors for any government.

Her statement reads in full:

"Marvell places the highest priority on the security of its products. Marvell does not implement backdoors for any government. Marvell supports a wide variety of protocols and standards including IPsec, SSL, TLS 1.x, DTLS and ECC Suite B.

Marvell also supports a wide variety of standard algorithms including several variants of AES, 3DES, SHA-2, SHA-3, RSA 2048, RSA 4096, RSA 8192, ECC p256/p384/p521, Kasumi, ZUC and SNOW 3G.

All Marvell implementations are based on published security algorithm standards [...]. Marvell's market-leading NITROX family delivers unprecedented performance for security in the enterprise and virtualized cloud data centers.

The NITROX product line is the industry leading security processor family designed into cloud data center servers and networking equipment, enterprise and service provider equipment including servers, Application Delivery Controllers, UTM Gateways, WAN Optimization Appliances, routers, and switches".

Appelbaum said that as the new owner of Cavium, Marvell "should conduct a serious and transparent technical security investigation into the matter and make the result available to the public".

He said that he wrote to the company, including to their security response email address, and set this forth in extreme detail, but has never heard back from them.

The two other important and yet unpublished revelations from the Snowden files concern the compromise of foreign government infrastructure by the NSA.

Appelbaum writes in his thesis that the Snowden archive includes largely unpublished internal NSA documents and presentations that discuss targeting and exploiting not only deployed, live interception infrastructure, but also the vendors of the hardware and software used to build that infrastructure.

"Primarily these documents remain unpublished because the journalists who hold them fear they will be considered disloyal or even that they will be legally punished," he writes.

Appelbaum adds that "Targeting lawful interception (LI) equipment is a known goal of the NSA".

"Unpublished NSA documents specifically list their compromise of the Russian SORM LI infrastructure as an NSA success story of compromising civilian telecommunications infrastructure to spy on targets within reach of the Russian SORM system, he says.

Though Appelbaum did not publish the NSA slides on SORM in his thesis, he reports that they show two Russian officers wearing jackets bearing the slogan: "you talk, we listen".

He says that it is not unreasonable to assume that parts, if not the entire American lawful interception system, known as CALEA, have been compromised.

In his doctoral thesis he says that key European lawful interception systems "have been compromised by NSA and/or GCHQ". Appelbaum said that the Snowden archive contained many named target systems, companies, and other countries that had been impacted.

According to Appelbaum, "compromise" means different things: sometimes it is a matter of technical hacking, at other times it is a matter of "willful complicity from inside the company by order of some executives after being approached by the NSA".

"Woe to those who do not comply immediately," he says.

Some of the most important revelations published from the Snowden archive concerned PRISM, a mass surveillance program which allowed the NSA to access emails, calls, chats, file transfers, and web search histories.

The NSA slides claimed that this collection was conducted from the servers of internet giants like Google, Apple, Facebook, Microsoft, AOL, Skype, PalTalk and YouTube, but when the existence of this program was exposed by Glenn Greenwald and Ewen MacAskill in The Guardian and by Laura Poitras and Barton Gellman in the Washington Post, the internet giants denied any knowledge of the program and denied that they had granted direct access to their servers.

Though PRISM was one of the very first revelations from the Snowden archive, Appelbaum reveals that "The PRISM slide deck was not published in full" and "several pages of the PRISM slide list targets and related surveillance data, and a majority of them appear to be a matter of political surveillance rather than defense against terrorism".

He explains that one such example of PRISM's targets being a matter of political surveillance rather than anti-terrorism "shows a suggestion for targeting the Tibetan Government in Exile through their primary domain name".

In 1950 the People's Republic of China took control of Tibet and met with considerable resistance from the Tibetan people. In 1959, the Fourteenth Dalai Lama left Tibet to seek political asylum in India, and there was a major exodus of Tibetans into India. The Dalai Lama set up the Tibetan Government in Exile in India, and exiled Tibetans have accused China of cruelty and repression for decades.

Appelbaum reveals that the main domain of the Tibetan Government in Exile (tibet.net) "is named as an unconventional example that analysts should be aware of as also falling under the purview of PRISM". He explains that the email domain was "hosted by Google Mail, a PRISM partner, at the time of the slide deck creation and it is currently hosted by Google Mail as of early 2022". At the time of this writing, it still is.

According to him, tibet.net exemplifies the political reality of accepting aid from the United States. The system administrators wanted to be protected from Chinese hacking and surveillance. To fight Chinese surveillance, the technical team opted to host their email with Google and their web hosting with Cloudflare. The reason Google appealed to the technical team behind tibet.net was the excellent reputation of Google's security team at that time.

"What was unknown at the time of this decision", Appelbaum explains, "was that Google would, willing or unwillingly, give up the data to the US government in secret. Thus in seeking to prevent surveillance by the Chinese government some of the time when the Chinese government successfully hack their servers, they unknowingly accepted aid that ensured their data will be under surveillance all of the time".

As a result, to fight the well-known devil of Chinese surveillance, the Tibetan Government in Exile put itself in the hands of the NSA.

How many important revelations like these do the unpublished documents still contain? It is impossible to say so long as the archive remains unpublished. It is also unclear how many copies of the full archive remain available and who has access to them.

Appelbaum says,"there was a discussion among many of the journalists who worked on the archive about opening access to the Snowden archive for academics to discuss, study, and of course to publish. This is a reasonable idea and it should happen, as it is clearly in the public interest".

He said it was a terrible day when The Guardian allowed GCHQ to destroy the copy of the archive in the United Kingdom. However, according to Ewen MacAskill's reporting in The Atlantic, "A copy of the Snowden documents remains locked in an office at the Times, as far as I know".

According to Jacob Appelbaum, The Intercept - the media outlet co-founded by Glenn Greenwald and Laura Poitras to publish the Snowden files - is no longer in possession of the documents. "I was informed that they destroyed their copy of the archive", Appelbaum tells us.

In 2013, the author of this article worked with Glenn Greenwald on the Snowden files regarding Italy, publishing all the documents that Greenwald shared with us in her newspaper at the time, the Italian newsmagazine l'Espresso.

After that journalistic work, we were contacted again to work on additional files, but unfortunately, after some preliminary contacts, we never heard from The Intercept staff again. All of our attempts to work on the files came to nothing, though we never learned what the problem was.

We asked The Intercept whether the publication is still in possession of the Snowden files. A spokesperson replied: "The Intercept does not discuss confidential newsgathering materials".

Appelbaum is highly critical of those who destroyed the Snowden files: "Even if the privacy violating intercepts are excluded from publication, there is an entire parallel history in that archive".

See the original post:
New revelations from the Snowden archive surface - ComputerWeekly.com

I-T unearths unaccounted trade worth Rs 2.5K crore from jewellery, bullion firms – The Indian Express

The Surat Income Tax department has unearthed alleged undisclosed transactions worth Rs 2,500 crore done by two major jewellery manufacturing firms, retail shops associated with them and a bullion company in a 133-hour-long search carried out in Surat and Rajkot.

The firms allegedly underreported their business and were doing three times more business without disclosing it, officials said.

The searches uncovered business links of Surat-based Parth Ornaments Private Limited, among the top jewellers in the state, and Tirth Gold, its sister company, with Akshar Jewellers, Kantilal Brothers Jewellery and Harikala Bullion of Surat city. A team of 150 I-T officials, led by Additional Director of Investigations Vibhor Badoni, carried out the searches at 35 locations in Surat and two in Rajkot, including secret rooms, residences and other places linked to the companies, beginning at 6.00 am on September 13 and continuing till Monday evening.

An official who was part of the searches, on condition of anonymity, said Parth and Tirth had hired a Rajkot-based software company to help them store a majority of their sales data on cloud servers in other cities to enable them to under-report the books of account.

The officials have recovered data on Parth and Tirth from the software company. The officials have also recovered several documents from the five firms. These are being scrutinised. They have also seized 10 bank lockers of these firms.

"We have not seized jewellery as it is part of their business. We suspect that cash might have been hidden in some secret place and we are trying to find it out. We have retrieved sales data of five years from Parth and Tirth, and found that they have done business transactions to the tune of Rs 2,500 crore without disclosing it on the books of account. The investigation is still in progress, but the searches have ended," an official said, on the condition of anonymity.

Parth Ornaments has two jewellery factories in Surat and has over 1,000 employees. The firm, which began around 10 years ago, has shops across the country. Tirth Gold, which is also into gold manufacturing, is owned by a cousin of the promoters of Parth Ornaments.

According to sources, Akshar Jewellers and Kantilal Brothers are into the retail jewellery business, and a majority of their sales transactions are done in cash and have not been shown in the book of accounts. The officials have also seized several documents that were kept hidden by the firms and not disclosed to the I-T department.

"The bridal jewellery range at Kantilal Brothers starts from Rs 10 lakh onwards, apart from its retail jewellery sales. We have also found unaccounted stock of Rs 100 crore from these five firms. The data recovered from the bullion firm is also being analysed. The exact figures will be revealed after that," the official added.

First published on: 18-09-2023 at 23:50 IST

Original post:
I-T unearths unaccounted trade worth Rs 2.5K crore from jewellery, bullion firms - The Indian Express

Microsoft AI Researchers Accidentally Exposed Terabytes of Internal … – Slashdot

Microsoft AI researchers accidentally exposed tens of terabytes of sensitive data, including private keys and passwords, while publishing a storage bucket of open source training data on GitHub. From a report: In research shared with TechCrunch, cloud security startup Wiz said it discovered a GitHub repository belonging to Microsoft's AI research division as part of its ongoing work into the accidental exposure of cloud-hosted data. Readers of the GitHub repository, which provided open source code and AI models for image recognition, were instructed to download the models from an Azure Storage URL. However, Wiz found that this URL was configured to grant permissions on the entire storage account, exposing additional private data by mistake. This data included 38 terabytes of sensitive information, including the personal backups of two Microsoft employees' personal computers. The data also contained other sensitive personal data, including passwords to Microsoft services, secret keys and more than 30,000 internal Microsoft Teams messages from hundreds of Microsoft employees.

Read the original here:
Microsoft AI Researchers Accidentally Exposed Terabytes of Internal ... - Slashdot

Cloud Computing: DOD Needs to Improve Tracking of Data User Fees – Government Accountability Office

What GAO Found

Data user fees (ingress and egress) are related to how users transfer and access data in a cloud environment. Data ingress is transferring data into the cloud and data egress is transferring data from the cloud. While data ingress is often free to users, cloud service providers generally charge data egress fees for transferring data out of storage (see figure).

Figure: Transfer of Data into and out of the Cloud

The Department of Defense (DOD) has begun to consider data egress fees when procuring and implementing cloud services. The department's recent contract negotiations with commercial providers resulted in discounts on data fees, including data egress fees. Vendor lock-in can happen in cloud computing when the cost of moving to a new provider is so high that a user stays with their incumbent provider. However, DOD officials stated that egress fees had not been a primary cause for vendor lock-in. These officials added that other factors could cause vendor lock-ins, including a lack of specific skills by government staff, or the reliance on cloud services unique to a specific cloud provider.

DOD has mechanisms that could mitigate the impact data egress fees could have on DOD as it procures and implements cloud services across the department. DOD officials reported that data egress fees account for less than 1 percent of known cloud expenditures. However, the department does not have the capability to track and report on these fees. In addition, DOD's contract-specific tools do not track cloud expenditures, including data egress fees, department-wide. DOD officials identified improved insight into cloud expenditures through recent department-wide contracts, such as the Joint Warfighting Cloud Capability (a cloud contract with four commercial service providers), and other tools. However, DOD does not yet have a plan or time frame for adopting a tool that tracks data egress fees. Until DOD acquires and implements such a tool, it will continue to lack full insight into the impact of egress fees.

Cloud computing enables agencies to have on-demand access to shared computing resources. The costs of doing so are often lower than if the agencies were maintaining the resources. In fiscal year 2022, major federal agencies obligated about $7 billion for cloud computing contracts, including approximately $3 billion by DOD. Cloud service providers charge users fees for transferring data out of the cloud, known as data egress fees. Committee reports from the Senate and House Armed Services Committees accompanying the James M. Inhofe National Defense Authorization Act for Fiscal Year 2023 include provisions for GAO to review cloud data egress fees at DOD, including their effects on vendor lock-in.

This report determines the extent to which DOD (1) considered data egress fees when procuring and implementing cloud services and their potential for vendor lock-in and (2) mitigated the impact of data egress fees and tracked and reported on them. To assess DOD's cloud data egress fees, GAO analyzed relevant department guidance on cloud services and the tracking and reporting of cloud expenditures. It also reviewed supporting department documentation on cost reporting and tracking. In addition, GAO interviewed DOD officials.

Originally posted here:
Cloud Computing: DOD Needs to Improve Tracking of Data User Fees - Government Accountability Office

Do backups belong in the cloud or on-premise? – IT-Online

Give a nod to marketers. When it comes to the cloud, they have helped shape how we think of this technology, almost as a magic solution to any server or data hosting scenario. Of course, the cloud often delivers on such claims. In many cases, it does provide a better alternative to traditional systems.

But cloud marketing did too good a job. "The cloud doesn't solve every IT problem," says Bryce Tatham, GM: business development at Sithabile Technology Services.

"The cloud era has been great for many reasons, but it's also enlightened us about the complex and bespoke side of technology. By that, I mean the principle that you have to look at business requirements first. If you do that wrong, technology becomes very expensive and underwhelming. For a while, the cloud looked like a way to sidestep those issues. But instead, it reminds us that nuance is very important, because otherwise your problems actually become bigger."

Backups are a prime example. On paper, you should throw all your backups into the cloud. It's cheaper, more accessible, and you don't need to own or run the underlying infrastructure.

But reality disagrees: cloud data costs can skyrocket, especially when moving data away from a public cloud; accessibility is dampened by latency; the higher reliance on external networks creates cybercrime risks; and, sometimes, you want to have a hand in the systems that run your data.

Of course, these risks can be managed. Cloud backups are great. But on-premises backups also have their advantages. It depends on what the business needs.

Cloud versus on-premise

Before comparing the two options, it's important to qualify them. Cloud backups typically mean using a public cloud provider, while on-premise backups can sit at the business itself or in a third-party data centre, often using private cloud technology controlled by the business.

"The substantial difference between cloud and on-premise is not about new versus traditional," says Tatham. "They tend to use the same modern backup technologies. The difference is about the backup strategy, cost, access, skills, and legislation."

These five topics provide the best way to grasp backup options.

* Backup strategy: Not all data is the same. Some can languish in archives while other data needs to be always available. This difference informs the backup strategy. In some circumstances, on-premise recoveries are faster, but the cloud is faster in others. For example, a remote site with its own servers is likely better served than one relying on the cloud. However, data used by employee SaaS applications works best with cloud backups.

* Cost: How much to pay for backups again depends on the type of data and its access requirements. Cloud storage can seem cheaper, but its costs can become complex and hard to control. Nor is it best practice to store cold archives of data in the public cloud; it's often cheaper to use local tape storage. Yet on-premise storage has additional skills and infrastructure costs, though those can be balanced against the value of access.

* Access: Data is not much use if there isn't timely access. Again, there is no clear winner on which option is faster. Moving large volumes of data to or from a public cloud data centre abroad can cause delays, yet using local public cloud servers is faster. That being said, one must ensure the correct data is on the right servers. On-premise and private cloud systems don't have this issue, though that flexibility often comes with higher infrastructure costs.

* Skills: Since public cloud systems are run by third parties, most companies that use them for backups work with partners and their skilled employees, a significant cost saving, though it also means heavy reliance on third parties. On-premise systems require in-house skills, which can be expensive. Again, it depends on the business needs. Smaller companies tend to prefer the cloud, while larger companies blend the two options.

* Legislation: Regulations govern some data types to protect personal information, safeguard tax records, or cover various other legal requirements based on a company's size, sector and geographic operations. On-premise data backups provide a level of control to manage legal risks. Public cloud services can cater for legislation, but it's a complicated exercise and requires close reading of service contracts covering liabilities.

Companies often opt for a hybrid backup strategy that blends different public cloud and on-premise options. This is the mature approach: those organisations looked at their requirements and chose appropriate backup locations for different data needs.

Original post:
Do backups belong in the cloud or on-premise? - IT-Online

Huawei Cloud in Token2049: Fueling Web3 Advances with Key … – PR Newswire

SINGAPORE, Sept. 13, 2023 /PRNewswire/ -- From September 11 to 12, Huawei Cloud showcased staking node engine, confidential computing, and ZK rollup at Token2049 in Singapore. In this event, Huawei Cloud discussed the future of the Web3 industry with top vendors, builders, and developers, and stated its commitment to accelerating Web3 innovation with technical breakthroughs.

Node Creation in Seconds, 99% Staking Effectiveness: A New Engine for Web3

Web3 is in the early stage of development and faces many technical challenges regarding the performance and security of on-chain transactions. On September 12, at the MetaEra Summit, Zhang Ziyi, Chief Architect of Huawei Cloud Blockchain, introduced Huawei Cloud's Ethereum staking node hosting service. With innovative algorithms, this service achieves up to 99% staking effectiveness and higher rewards. The QingTian Enclave security framework supports environment and identity authentication and full-link data encryption and decryption. With this security framework, applications and data can run secured on Huawei Cloud Elastic Cloud Servers (ECSs). In addition, Huawei Cloud accelerates ZK rollup hardware innovation. With Huawei-developed XPUs, architecture innovation, and algorithm optimization, the average confirmation time of ZK rollup transactions has plummeted from hours to minutes.

Huawei Cloud QingTian Enclave Safeguards Transactions

Security has been significant to Web3. Currently, Web3 wallet applications still face severe security challenges when it comes to the storage and attack defense of wallet private keys. Blockchain Security Alliance Meetup, initiated by the Blockchain Security Alliance, is one of the major activities of Token2049 and has become the focus of the global blockchain security and Web3 ecosystem. Jia Xiaoqiang, Director of Huawei Cloud Virtualization Products, introduced Huawei Cloud's confidential computing solution and Web3 solution aiming to secure systems, applications, and sensitive data in wallet scenarios.

Jia Xiaoqiang said, "Huawei Cloud QingTian Enclave confidential computing solution is just like a safe for wallet private keys. It supports environment and identity authentication and full-link data encryption and decryption. Applications and data can be run with protection on Huawei Cloud Elastic Cloud Servers (ECSs). Unauthorized users and third parties are isolated to maintain transaction security of private keys and wallets." Huawei Cloud is committed to building a highly secure and available solution for Web3, and all the efforts turned into QingTian Enclave, an end-to-end solution securing the execution environment for applications and sensitive data through software and hardware.

IPFS Cloud Data Ecosystem with Tenfold Rewards

Decentralized storage is another key infrastructure of Web3. However, the industry shares the same headaches in migrating data to the cloud, reducing costs, and improving effectiveness. At the Fil Dev Summit on September 12, Bai Tao, Chief Cloud Storage Solution Architect of Huawei Cloud, shared InterPlanetary File System (IPFS) data service and elastic cloud deployment solution jointly developed by Huawei Cloud and partners. Huawei Cloud's cloud data entry simplifies the process of storing data on IPFS, resulting in a 10-fold increase in data rewards and reduced data storage overhead for enterprises. Additionally, Huawei Cloud's large-ratio EC technologies enable 91% effective capacity usage of disks and scalable storage for data up to 10 EB-level. With the elastic cloud deployment solution, users can choose offline encapsulation, cloud verification, or end-to-end cloud deployment. Huawei Cloud, with over 18 years of experience in enterprise-level storage, aims to deliver cloud storage services that are not only cost-effective but also highly efficient.
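
Assuming "EC" here refers to erasure coding, the effective-capacity figure follows from the ratio of data blocks to total blocks in a stripe; the stripe widths in the sketch below are illustrative guesses, not Huawei Cloud's actual configuration.

```python
def usable_fraction(data_blocks: int, parity_blocks: int) -> float:
    """Share of raw disk capacity available for user data in a (data + parity) erasure-coding stripe."""
    return data_blocks / (data_blocks + parity_blocks)

# A wide (large-ratio) stripe such as 20 data + 2 parity yields roughly 91% effective capacity,
# compared with about 33% for plain 3-way replication.
print(f"{usable_fraction(20, 2):.0%}")   # ~91%
print(f"{usable_fraction(1, 2):.0%}")    # 3-way replication equivalent, ~33%
```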

Providing "Everything as a Service", Huawei Cloud aims to offer robust infrastructure that enables advanced, distributed computing power for Web3. This commitment extends to building an open, efficient, and secure blockchain platform that can foster innovation in the Web3 ecosystem.

SOURCE Huawei Cloud APAC

Read the original post:
Huawei Cloud in Token2049: Fueling Web3 Advances with Key ... - PR Newswire

AI and cloud computing: A symbiotic relationship or a hostile … – Lexology

Artificial intelligence (AI) and cloud computing are both popular buzzwords generally referred to as separate concepts without considering the potential for interplay and interconnectedness between the two technologies. Companies could benefit from incorporating cloud computing functionality into their AI operations. For example, AI algorithms can be trained and deployed on a far greater capacity using cloud computing rather than on local servers. Cloud-based AI can also process vast quantities of data through cloud computing infrastructure. This article explores the potential for synergy between cloud computing and AI algorithms while identifying the dissonance that might occur as a result of the collaboration.

The synergy between AI and cloud computing

Cloud computing and AI tools are individually relied on as technological developments that have provided convenience to users and, to some extent, have disrupted existing industries. When combining the two, a company may experience the following benefits:

The dissonance between AI and the Cloud

As with all technological developments and tools, there are various risks associated with the use of cloud-based AI, which include (without limitation) the following:

Cloud-based AI provides an opportunity to significantly enhance a company's AI capabilities, and companies that are first to embrace cloud-based AI could obtain a competitive advantage in doing so. However, these advantages may only be realised when a company has carefully considered the cloud marketplace, engaged in vendor due diligence processes, and entered favourable cloud service agreements. Failure to maintain proper vigilance over this process could lead to hidden expenses that may outweigh the benefits of cloud-based AI. It is essential to have legal support during this process to ensure that your risk exposure is reduced or mitigated.

More:
AI and cloud computing: A symbiotic relationship or a hostile ... - Lexology

Bringing your Microsoft workloads to AWS: Why and how you might … – The Stack

Microsoft workloads have been the backbone of organisations' IT strategies for decades: 70% of all enterprise applications are Microsoft-based and over three quarters of those are still running on-premise. The technology in use is likely to include legacy .NET versions running on Windows Server and SQL Server databases licensed with an Enterprise Agreement, writes Rhys Jacob, CTO at AWS consultancy D55.

But as cloud technology has evolved, and digital transformation continues to expand and reach new heights, Microsoft workloads run on-premises aren't fit for purpose. In fact, 48% of on-premise Microsoft workloads are forecast to migrate in the next two years, making this one of the largest market opportunities in core IT and a strategic focus for CIOs and CTOs.

Businesses that keep running these workloads on-prem face a range of challenges: the inability to scale their infrastructure for peaks in demand (unless they're willing to spend a lot to do it), typically high operational and maintenance costs for their growing on-premise technology, an inability to be agile without wholesale code changes, and not being able to digitally transform quickly enough, impacting their ability to remain competitive.

Despite these barriers to growth, the question of whether to migrate Microsoft workloads, particularly onto non-Microsoft cloud services like AWS, can be a difficult one for CTOs to answer. If teams are thinking about migrating Microsoft workloads to the cloud, Microsoft Azure might seem like the obvious choice, but in reality, AWS is running more Microsoft applications, and at a better price.

While the thought of migrating all your Microsoft workloads onto the cloud may seem daunting, it has become such common practice that providers like AWS have streamlined the journey to the point where application downtime is non-existent.

The first step is understanding what needs to be migrated and what workloads can be retired. Once this is clear, applications can begin to be reallocated to EC2 Windows Server instances and then ultimately to ECS/Lambda. This supports the lift-and-shift process, including SQL Server databases, which for most enterprises delivers immediate benefits, like reducing costs and increasing agility. The temptation for many here, however, is to pause the transformation journey once they have realised the immediate return on investment of a successful lift and shift.

As with any migration to the cloud, however, to gain the real benefits of it, including improved agility, speed to market, reduced cost, better scalability and reduced energy consumption, businesses must take a cloud native approach to their data systems. This means modernising and platform optimisation.

This can be done, to some extent, through re-platforming, whereby organisations migrate their applications without making wholesale changes to the architecture or code. This approach also means organisations can migrate their on-premise SQL Server databases to Amazon Relational Database Service (Amazon RDS). In doing so, businesses can continue using SQL Server but are no longer required to undertake time intensive tasks such as installation, patching, configuration and updates. However, the costly licensing fees attached to SQL Server will continue.

For the full benefits of cloud, re-factoring and re-platforming, which typically involve application changes and entire re-architecture strategies, are the steps which allow organisations to truly untap the potential of cloud technology.

Here, businesses that no longer want to continue paying the licensing costs for SQL Server can move their database across to Amazon Aurora, a fully managed database built for the cloud, and Babelfish for Aurora PostgreSQL, which allows Aurora to understand queries from applications written for SQL Server, completing the database modernisation.

Meanwhile, to re-platform a business's applications, converting them to .NET Core and running them on Linux on AWS instead of Windows not only saves on the Windows Server licence fee, but also allows further re-architecture for greater modernisation, and supports organisations in becoming fully cloud-native. This means breaking down existing and legacy monoliths into more maintainable microservices, allowing each microservice to adapt and grow independently. Having a clean separation between microservices allows developers to focus on individual services without impacting the broader system.

Crucially, microservices also allow applications to communicate with one another via events. Put simply, when one application or service emits an event, other applications can be notified and decide whether or not they need to do anything with that data.
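
A minimal in-process sketch of that publish/subscribe decoupling might look like the following; in an AWS deployment this role would typically be played by a managed broker such as Amazon EventBridge, SNS, or SQS, and the event names and services here are invented.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Tiny in-process stand-in for the event-driven messaging that decouples microservices."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]):
        self.subscribers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict):
        # Every interested service is notified; the publisher knows nothing about them.
        for handler in self.subscribers[event_type]:
            handler(payload)

bus = EventBus()
bus.subscribe("order.created", lambda e: print("billing service charges", e["order_id"]))
bus.subscribe("order.created", lambda e: print("warehouse service picks", e["order_id"]))
bus.publish("order.created", {"order_id": "A-1001"})
```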

The benefits of modernisation are far reaching. From enhanced security to increased flexibility and lower licensing and consumption costs, organisations can unlock huge growth potential once their cloud infrastructure is optimised to support business objectives. It's these benefits that have made 84% of AWS's customers prioritise application modernisation in the next two years.

Migrating Microsoft workloads to AWS may seem drastic, but the process of getting there has never been more streamlined thanks to AWS technology. In fact, AWS now has a 45% larger share of Microsoft workloads on the cloud than the next largest cloud provider, and it's why our industry has reported a 23% increase in CXO-level modernisation conversations. Migrating workloads is now just a matter of time for most organisations.

Read more:
Bringing your Microsoft workloads to AWS: Why and how you might ... - The Stack

What Is Cloud Networking? Definition, Types & Benefits – Forbes

Published: Sep 13, 2023, 12:00pm

Editorial Note: We earn a commission from partner links on Forbes Advisor. Commissions do not affect our editors' opinions or evaluations.

These days almost every business relies on the cloud in some capacity. Cloud networking is scalable and flexible, allowing organizations to expand their infrastructure according to changing demands. It also saves costs, as companies only pay for the services they use. This article covers the terms cloud networking and cloud computing, the various types of cloud networks and the benefits of this technology for small and medium businesses.

Cloud networking is an element of cloud computing and refers to the way the networking infrastructure works within it.

Cloud computing has revolutionized the way companies run, making it easier, faster and cheaper to complete functions that previously required a company to have its own data center. Almost every type of business today uses cloud computing for a wide range of purposes, including data backup, email and customer-facing web applications.

Cloud computing refers to the on-demand delivery of IT resources online, which enables businesses to access databases, computing power and storage through the cloud instead of through a physical data center. Microsoft Azure and Amazon Web Services (AWS) are the two largest cloud service providers.

Examples of cloud services include databases, data storage, computing power, data backup, email and customer-facing web applications.

Cloud networking capabilities are provided by the cloud service providers.

There are various types of cloud networking. Here are the main ones:

A public cloud means that the servers are shared with other users. You might think of a public cloud as similar to a public swimming pool. This type of cloud can be scaled to match the needs of a company's IT department. Multiple users may share a public cloud, but all are able to benefit from the service.

While a public cloud is like sharing a pool, a virtual private cloud (VPC) is more like putting a rope around the pool and creating a private area. Companies can choose to build their own private cloud within a public cloud. Increased security is the main reason a company would choose a VPC.
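
As a simple illustration, carving out a VPC on AWS can be done with a single boto3 call along the lines below; the CIDR range and name tag are placeholders.

```python
# Minimal VPC sketch: carve a private network out of the public AWS cloud.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

vpc = ec2.create_vpc(
    CidrBlock="10.0.0.0/16",  # illustrative address range
    TagSpecifications=[{
        "ResourceType": "vpc",
        "Tags": [{"Key": "Name", "Value": "example-private-cloud"}],
    }],
)
print(vpc["Vpc"]["VpcId"])
```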

Hybrid cloud refers to the combination of public clouds and VPCs. The term hybrid cloud network is also used to refer to the connection between a physical data center and a public cloud.

The term multicloud refers to using more than one cloud provider. For example, a company might choose to use services from both AWS and Microsoft Azure. While both offer cloud computing, there are differences between them, and a company may find those differences make it worthwhile to use both.

All sorts of businesses use cloud networking for a range of reasons. The capabilities are endless and the use of cloud networks is only expected to grow in the coming years. Essentially, the cloud gives companies capabilities, storage and infrastructure to build and develop in a way that was not previously possible.

Small and medium businesses from a wide range of industries are able to benefit from the use of cloud networking. Here are a few examples of how cloud networking can be used:

Cloud computing refers to the on-demand delivery of IT services over the internet, which enables businesses to access databases, computing power and storage through the cloud. Cloud networking refers to the connections between the different devices and resources required for cloud computing. Though these terms are distinct, they are often used interchangeably.

There are countless benefits to using cloud technology. The main benefits of cloud networking include:

The first steps to creating a cloud network include familiarizing yourself with cloud networking concepts, developing a network architecture plan and choosing a cloud provider. Amazon Web Services and Azure are the two most popular providers. You'll want to consult your cloud provider's documentation for full instructions on how to create a cloud network; Amazon Web Services, for example, offers extensive online video tutorials that can help with the process.
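
As a minimal follow-on to the VPC sketch shown earlier, the next steps for a simple cloud network are typically to add a subnet and an internet gateway, roughly as below; the VPC ID and CIDR range are placeholders.

```python
# Continuing the VPC sketch: add a subnet and an internet gateway, the usual
# next steps when standing up a simple cloud network. IDs are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

vpc_id = "vpc-0123456789abcdef0"  # placeholder VPC from the earlier step

subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
igw = ec2.create_internet_gateway()
ec2.attach_internet_gateway(
    InternetGatewayId=igw["InternetGateway"]["InternetGatewayId"],
    VpcId=vpc_id,
)
print(subnet["Subnet"]["SubnetId"])
```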

Cloud technology has revolutionized the way companies run. The cloud has opened up endless possibilities for small and medium businesses, making it faster and more affordable to fulfill tasks, scale, keep track of large sets of data and communicate and collaborate remotely. Almost every type of business uses cloud computing today, from working on shared documents that are stored on the cloud, to hosting customer-facing web applications. There are many benefits of cloud networking, including scalability, flexibility, mobility and reduced operating costs.

There are plenty of reasons to use cloud networking, and these days almost every business relies on the cloud in some capacity. Cloud networking is scalable and flexible, allowing organizations to expand their infrastructure according to changing demands. It also saves costs, as companies only pay for the services they use. In addition, cloud networking can offer increased security for businesses that don't have the capacity to run an entire cybersecurity team in-house.

The Virtual Private Cloud (VPC) offering from Amazon Web Services (AWS) is an example of a cloud network. AWS VPC enables users to create a private cloud within the AWS cloud, which increases security.

Because almost every company today uses cloud computing, expertise in cloud technologies is in high demand. There are plenty of career paths to choose from in this field such as cloud architect, cloud engineer, cloud consultant and DevOps engineer. These jobs are intellectually challenging and offer high earning potential in a field with growing demand.

Forbes Advisor adheres to strict editorial integrity standards. To the best of our knowledge, all content is accurate as of the date posted, though offers contained herein may no longer be available. The opinions expressed are the author's alone and have not been provided, approved, or otherwise endorsed by our partners.

Leeron is a New York-based writer with experience covering technology and politics. Her work has appeared in publications such as Quartz, the Village Voice, Gothamist, and Slate.

For over 15 years, Kiran has served as an editor, writer and reporter for publications covering fields including advertising, technology, business, entertainment and new media. He has served as a reporter for AdAge/Creativity and spent several years as an editor and writer at Adweek. Along the way, he has also served in managing editor roles at the likes of PSFK and Ladders, worked in PR as a director of content, and most recently served as a Senior Editor at Dotdash Meredith for personal finance brand The Balance and then Entertainment Weekly. At Forbes Advisor, Kiran brings his experience and expertise to reinforce the brand's reputation as the most informative, accessible and trusted resource in small business.

Matt is a proven leader in IT, combining a master's degree in Management Information Systems with a solid track record of leading business initiatives that help organizations meet their goals. He has led the security practices at two different MSPs and has been a Health IT Director, a project manager, business analyst, system administrator and systems architect...if it has to do with IT, he's probably done it. He helped author the CMMC Certified Professional and CMMC Certified Assessor field guides and has spoken at conferences all over the country on CMMC, IT security and risk. Matt has worked with Fortune 500 companies and small businesses, in areas ranging from engineering to marketing and supply chain to health care.

View original post here:
What Is Cloud Networking? Definition, Types & Benefits - Forbes