
3 critical stops on the back-end developer roadmap – TechTarget

Those seeking a career in back-end development and enterprise architecture will find occupational roadmaps contain a somewhat predictable list of required skills. These skills typically revolve around a proficiency in one or more high-profile programming languages, an understanding of both relational and NoSQL database operations, the ability to work with major back-end development frameworks and experience with container orchestration.

While knowledge of relational databases and RESTful APIs is essential, back-end developers shouldn't overlook other equally important development concepts.

A good roadmap will include certain overlooked skills that are just as important as Node.js runtimes and RESTful API builds.

To help new back-end developers get a step ahead on their journey, let's review three essential topics: messaging, cloud-based services and the modern design patterns that make microservices and cloud-native deployments scalable and productive.

New developers often see topics, queues and messaging as advanced areas. As a result, there is a lack of familiarity with this important back-end concept, along with a reluctance to incorporate messaging into an enterprise architecture.

Back-end developers need a strong understanding of how to incorporate message-based, publish and subscribe systems into their networks. The benefits of these architectures include the following:

In a traditional, synchronous system, the client makes a request to the server and waits for a response. In blocking, process-per-request architectures, each request triggers the creation of a new process (or thread) on the server. This limits the number of concurrent requests to the maximum number of processes or threads the server can create.

With traditional architectures, the server handles requests in the order it receives them. This can result in situations where simple actions stall and fail because the server is bogged down with complex queries that arrived earlier. By introducing topics, queues and message handling into an enterprise architecture, back-end developers can enable asynchronous interactions.

With a message-based system, developers place requests in a topic or queue. Subscribers, which might be SOA-based components or lightweight microservices, will read messages off the queue and reliably handle incoming requests when resources are available. This makes architectures more resilient, as they can spread out peak workloads over an extended period.

Queues can also categorize messages they receive. A publish-subscribe system can call on a server with more power to handle complex requests, while other machines handle the rest.

In modern environments, back-end developers will create subscribers as lightweight microservices that can be easily containerized and managed through an orchestration tool such as Kubernetes. As such, message-based systems are easily integrated into modern, cloud-native environments.
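
As a rough illustration of this flow, the sketch below (in Python, using the pika client against a hypothetical local RabbitMQ broker; the queue name and payload are invented for the example) shows a producer placing a request on a queue and a worker acknowledging it only after the work is done:

    # producer.py - a hypothetical client placing requests on a queue
    import json
    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="orders", durable=True)  # survive broker restarts

    request = {"order_id": 42, "action": "create"}
    channel.basic_publish(
        exchange="",
        routing_key="orders",
        body=json.dumps(request),
        properties=pika.BasicProperties(delivery_mode=2),  # persist the message
    )
    connection.close()

    # worker.py - a subscriber that handles requests when resources are available
    import json
    import pika

    def handle(ch, method, properties, body):
        request = json.loads(body)
        # ... do the actual work here ...
        ch.basic_ack(delivery_tag=method.delivery_tag)  # acknowledge only after success

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="orders", durable=True)
    channel.basic_qos(prefetch_count=1)  # take one message at a time
    channel.basic_consume(queue="orders", on_message_callback=handle)
    channel.start_consuming()

Because the producer and worker only share the queue, either side can be scaled, paused or replaced independently, which is what allows peak workloads to be spread out over time.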

Back-end developers should introduce messaging and queues into an enterprise system whenever the following applies:

A traditional back-end architecture involves an application server that will interact with a relational database or a NoSQL system. Back-end developers are typically well-versed and comfortable with these types of system designs.

The problem with the inclusion of topics and queues is that they require back-end developers or system architects to introduce a new component into the enterprise architecture. Systems that include delayed processing, publish-subscribe systems and asynchronous communication are not typically part of an initial system design. As a result, back-end developers who want to use these types of systems must introduce a new server-side technology into the mix.

A reluctance to change and an excessive aversion to risk can often be a barrier to the inclusion of messaging systems in modern enterprise architectures.

Back-end developers trained and experienced with on-premises data centers sometimes overlook the benefits cloud computing can deliver. To be fully qualified to work in a modern enterprise, back-end developers must know how to create and deploy lambda expressions and how to provision and deploy managed services in the cloud.

Serverless computing allows programmers to develop business logic as lambda functions and deploy code directly into the cloud without the need to provision underlying servers or manage services at runtime.

The cloud vendor hosts lambda expressions on reliable, fault-tolerant infrastructure that can scale up or down to handle invocations as they happen. Without any infrastructure to manage, lambda expressions and the serverless computing architecture that supports them can greatly simplify the deployment stack and make the continuous delivery of code to the cloud easier.

Not only does serverless computing reduce the runtime management overhead, it can also be a cheaper deployment model. The pay-per-invocation serverless computing model has the capacity to reduce an organization's cloud spending, which is always an important nonfunctional aspect of enterprise deployments.
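
For readers who have not written one, a serverless function is typically nothing more than a handler; the following minimal sketch assumes the AWS Lambda Python runtime, and the event fields are invented for the example:

    # handler.py - a minimal AWS Lambda handler (Python runtime)
    import json

    def lambda_handler(event, context):
        # 'event' carries the invocation payload, e.g. from an API Gateway request
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}"}),
        }

Everything below the handler - servers, scaling and per-invocation billing - is the cloud vendor's concern, which is exactly the management overhead the pay-per-invocation model removes.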

Back-end developers must be aware of the array of managed services the cloud makes available.

In the past, organizations would think about cloud computing as a reliable location for data storage and remotely hosted VMs. Today, with managed services, a cloud vendor handles the complexities of installation, provisioning and runtime management.

For example, in the past, to deploy container-based microservices into the cloud, the client would need to provision multiple VMs, such as EC2 instances in AWS, and install software to support the master Kubernetes node, master node replicas, multiple worker nodes and networking between master and worker nodes.

Runtime management, software updates, logs, upgrades and audits would be the client's responsibility. With a managed Kubernetes service, such as Amazon EKS -- which can also run pods serverlessly on AWS Fargate -- these complexities are hidden from the client.

With a managed Kubernetes service, microservices can be deployed directly into the cloud -- without the need to configure the environment. Logging, auditing and change tracking are provided by the cloud vendor.

A complete roadmap for back-end developers must include an ability to build and deploy serverless applications, along with an understanding of the types of fully managed services cloud vendors make available to their clients.

Singletons, factories, bridges and flyweights are design patterns widely known by developers. Unfortunately, these classic patterns are so familiar that the newer patterns that have emerged from the continuous delivery of cloud-native software hosted in orchestrated containers don't always get the same recognition.

Every back-end developer must know the standard Gang of Four design patterns and their categories: creational, behavioral and structural. They must also be familiar with modern, cloud-native design patterns, such as the API gateway, the circuit breaker and the log aggregator.

The API gateway is now commonplace in cloud-native deployments. It provides a single interface for clients that might need to access multiple microservices. Development, integration and testing are easier when API makers deliver their clients a single, uniform interface to use.

Additionally, an API gateway can translate the data exchange format used by microservices into a format that is consumable by devices that use a nonstandard format, such as IoT devices.
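
As a rough sketch of the single-interface idea (not a production gateway; the service names, ports and routes are assumptions), a small Flask application can front several microservices behind one uniform route:

    # gateway.py - a minimal API gateway sketch using Flask and requests
    import requests
    from flask import Flask, Response, request

    app = Flask(__name__)

    # Hypothetical internal service addresses hidden behind the single interface
    SERVICES = {
        "orders": "http://orders-svc:8080",
        "users": "http://users-svc:8080",
    }

    @app.route("/api/<service>/<path:path>", methods=["GET", "POST"])
    def proxy(service, path):
        base = SERVICES.get(service)
        if base is None:
            return {"error": "unknown service"}, 404
        upstream = requests.request(
            method=request.method,
            url=f"{base}/{path}",
            params=request.args,
            data=request.get_data(),
            headers={"Content-Type": request.headers.get("Content-Type", "")},
            timeout=5,
        )
        return Response(upstream.content, status=upstream.status_code,
                        content_type=upstream.headers.get("Content-Type"))

    if __name__ == "__main__":
        app.run(port=8000)

Clients only ever see the /api/... routes; the gateway decides which microservice actually serves each request and could translate formats at the same point.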

A cloud-native request-response cycle might include multiple downstream calls before a roundtrip to a back-end resource is complete. However, if one of those microservices at the end of an invocation chain fails, then the failed microservices pipeline has wasted a great deal of processing power.

To stop a flood of microservices calls that will inevitably lead to failure, a common cloud-native design pattern is to include a circuit breaker in the invocation flow. A circuit breaker will recognize when calls to a microservice have either failed or taken an unreasonable length of time to be fulfilled.

When an error trips the circuit breaker, the client gets an immediate error response, and the circuit breaker will stop any downstream calls. This allows the microservice to continue to function while the failed call is worked out and saves the client time. It relieves the back-end system from consuming unnecessary resources.

When a predetermined amount of time transpires, the circuit breaker will send requests downstream again. If those requests are returned successfully, the circuit breaker resets and clients proceed as normal.
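
A bare-bones version of this behavior can be expressed in a few lines of Python; the failure threshold and cool-down period below are arbitrary values chosen for illustration:

    # circuit_breaker.py - a simplified circuit breaker sketch
    import time

    class CircuitBreaker:
        def __init__(self, max_failures=5, reset_timeout=30.0):
            self.max_failures = max_failures
            self.reset_timeout = reset_timeout
            self.failures = 0
            self.opened_at = None  # None means the circuit is closed

        def call(self, fn, *args, **kwargs):
            if self.opened_at is not None:
                if time.monotonic() - self.opened_at < self.reset_timeout:
                    # Fail fast: no downstream call is attempted
                    raise RuntimeError("circuit open - downstream call skipped")
                # Cool-down elapsed: allow a trial request through
                self.opened_at = None
                self.failures = 0
            try:
                result = fn(*args, **kwargs)
            except Exception:
                self.failures += 1
                if self.failures >= self.max_failures:
                    self.opened_at = time.monotonic()  # trip the breaker
                raise
            self.failures = 0  # a success resets the count
            return result

Wrapping each downstream call in breaker.call(...) gives the client an immediate error while the circuit is open, instead of a slow, resource-consuming failure.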

Administrators can deploy stateless, cloud-native applications to any compute node participating in a distributed cluster. However, by default, container-based applications log all events to their node's local hard drive, not to a shared folder or central repository.

As a result, every cloud-native deployment needs a mechanism to push log files from each worker node to a central data store. The logs are then managed within a log aggregator.

The aggregator will not only store the logs for auditing and troubleshooting purposes, it will also standardize the logs in a format that will make it possible to trace user sessions and distributed transactions that touched multiple nodes in the cluster.
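
A prerequisite for useful aggregation is that every node emits logs in a consistent, machine-readable shape carrying a correlation identifier; the following Python sketch shows the idea (the field names are assumptions, not a standard):

    # structured_log.py - emit JSON log lines an aggregator can index and
    # correlate across nodes; field names are illustrative
    import json
    import logging
    import sys

    class JsonFormatter(logging.Formatter):
        def format(self, record):
            return json.dumps({
                "ts": self.formatTime(record),
                "level": record.levelname,
                "service": getattr(record, "service", "unknown"),
                "trace_id": getattr(record, "trace_id", None),
                "message": record.getMessage(),
            })

    handler = logging.StreamHandler(sys.stdout)  # collected from the node by a log shipper
    handler.setFormatter(JsonFormatter())
    log = logging.getLogger("orders")
    log.addHandler(handler)
    log.setLevel(logging.INFO)

    log.info("order accepted", extra={"service": "orders", "trace_id": "abc-123"})

With a shared trace_id stamped on every line, the aggregator can stitch together a user session or distributed transaction that touched several nodes.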

Along with knowledge of important microservices design patterns, a cloud-native developer must also be familiar with the tenets of the 12-factor app, which provides guidance on how to configure several things, including the following:

While there is no official standard on how to develop and deploy a cloud-native application, back-end developers who stick to the tenets of the 12-factor app should encounter fewer problems with development, deployment and runtime management of microservices-based systems.

Read the original here:
3 critical stops on the back-end developer roadmap - TechTarget


Quantum International Corp. (QUAN) Provides Information on Their … – Marketscreener.com

Denver, Colorado--(Newsfile Corp. - September 18, 2023) - Quantum International Corp. (OTC Pink: QUAN), an app development company employing a group of highly specialized developers from around the world, announces that it is providing information on its upcoming crowdfunding app.

Quantum International Corp. (OTC Pink: QUAN), an app development company, is finalizing a new crowdfunding app called QFund.AI. QFund.AI is a for-profit platform that allows people to raise money for events ranging from life events to challenging circumstances. The main difference between QFund.AI and other crowdfunding platforms is that it uses cryptocurrency instead of native currency.

To use QFund.AI, users will create a campaign page and describe their fundraising cause. They can then share their page on social media and email. People who want to donate can do so using any cryptocurrency account or wallet. QFund.AI will charge a small fee, which can be paid in either cryptocurrency or native currency.

QFund.AI is unique in the crowdfunding space because it is based on crypto, moderated by AI and not incentive-based. This means that users cannot offer rewards or incentives in exchange for donations. However, QFund.AI does allow projects that are meant to fund other projects, such as those by musicians or inventors.

QFund.AI is scheduled to have a special section dedicated to users who are trying to raise money for tuition. For example, a user could try to raise funds for out-of-state tuition to a law program, or a homeless high school valedictorian could try to raise funds to attend college and help their family.

QFund.AI is similar to other crowd funding platforms in that it targets social media platforms to create awareness for campaigns. However, QFund.AI takes a different approach to social media promotion. Instead of encouraging users to share their campaign pages on social media, QFund.AI will use AI to identify potential donors and reach out to them directly.

QFund.AI will be a decentralized platform that uses the LGCY Blockchain to store client funds and deposits. This makes it different from other centralized platforms, which are more vulnerable to security breaches. QFund.AI prioritizes the security and safety of its users' funds, and the use of blockchain technology helps to ensure this.

QFund.AI is still in its final development stage, but it has the potential to disrupt the crowdfunding space. The platform's use of cryptocurrency, AI-based social media promotion, and focus on personal causes could make it a more attractive option for users than traditional crowdfunding platforms.

"This platform will be the next step in Crowd Funding and we are looking forward to completing the App. We are striving to have the App available soon to try and help the fire survivors in Lahaina, Maui, My Home Town," stated CEO Justin Waiau.

Follow us on (@QuantumintlCorp) and Instagram (@QuantumintlCorp).

Welcome to a New Era of App development.

Statements in this press release that are not historical fact may be deemed forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. Although Quantum International Corp believes the expectations reflected in any forward-looking statements are based on reasonable assumptions, Quantum International Corp is unable to give any assurance that its expectations will be attained. Factors or events that could cause our actual results to differ may emerge, and it is not possible for us to predict all of them. The Company undertakes no obligation to publicly update any forward-looking statement, whether as a result of new information, future developments or otherwise, except as may be required by law.

Justin Waiau quantumintlcorp@gmail.com

To view the source version of this press release, please visit https://www.newsfilecorp.com/release/180949

Read the rest here:
Quantum International Corp. (QUAN) Provides Information on Their ... - Marketscreener.com


Loadshedding Cannot Win Against the Cloud – IT News Africa

As load shedding continues to challenge businesses and individuals alike, there emerges a beacon of hope and efficiency: cloud computing. This revolutionary solution is not just the future; it's the present, ensuring that load shedding is but a minor glitch in our digital lives.

"Load shedding has posed significant challenges to businesses, particularly in South Africa where, according to a 2019 study by the South African Department of Energy, it cost the economy approximately R59 billion, hindering economic growth," says Graeme Millar, managing director of SevenC Computing.

Millar adds, "Beyond the immediate loss of productivity due to power outages, businesses also suffer from disrupted communication channels, potential data losses, compromised security systems, and a general decrease in consumer confidence. This sporadic power supply not only stalls daily operations but erodes the competitive edge of South African businesses in the global market."

Millar continues to say that while load shedding may challenge operations, with the power of cloud computing, businesses can transform these hurdles into mere inconveniences, ensuring data remains accessible and protected in the grand digital scheme of things.

At its core, cloud computing is about storing and accessing data and applications over the internet, rather than relying on a local server or personal computer. This internet-based computing method offers a variety of services including storage, management, and processing of data.

What gives the cloud its edge, especially in combatting load shedding, is its decentralised nature. Instead of data being stored in a single location, cloud services use a network of remote servers hosted on the internet.

This means that even if one server in one location faces power issues, the others can take the load, ensuring continuous data access.

While the uninterrupted access provided by cloud services is a significant advantage during load shedding, the benefits of cloud computing don't end there:

Power interruptions don't just cut off access; they can cause data corruption or loss. Cloud storage offers a safeguard against such losses. With real-time data backup, the cloud ensures that data remains intact and uncompromised even during sudden power outages.

The cloud reduces the need for businesses to invest in expensive infrastructure. Instead of maintaining their own servers (which are susceptible to local power issues), companies can leverage the power of robust cloud servers.

Cloud solutions can easily be scaled up or down based on the needs of a business. This dynamic adjustability ensures businesses only pay for what they use and can adapt swiftly to changing demands.

While cloud computing offers a solution for the present, its potential for the future is limitless.

As technology continues to evolve, the capabilities of the cloud will expand, offering even more robust solutions against challenges like load shedding. It's not just about having a backup plan; it's about adopting a system that offers efficiency, protection, and scalability.

While load shedding might be a hurdle, it's not insurmountable. With cloud power on our side, businesses and individuals can look forward to a future where operations remain unhindered, where data is continuously accessible and protected, and where load shedding is merely a minor inconvenience in the grand digital scheme of things.

Go here to read the rest:
Loadshedding Cannot Win Against the Cloud - IT News Africa


Building a Cyber Resilient Business: The Recover Layer – MSSP Alert

The security landscape is changing rapidly and attacks are becoming more sophisticated and complex. At the same time, businesses worldwide are digitizing their workflows and relying on cloud platforms to carry out their operations.

While these tools are great for storing data and interacting with customers, they can also make businesses more vulnerable to cybercrime. In 2022, one in three businesses experienced data loss from these types of SaaS platforms.

Now more than ever businesses need to become cyber resilient by incorporating a layered approach to security that includes steps for prevention, protection, and recovery. When a breach or other disaster happens, every business should have systems in place to minimize loss and resume business as soon as possible.

Data security and cyberattack prevention often get the most focus, but a comprehensive approach to cyber resiliency must consider business continuity and recovery too. Any business downtime can put your customers, your reputation, your digital infrastructure, and your business model at risk.

There are a number of things that can cause disruption, business downtime, and data and device vulnerability. An attack or breach is only one threat you need to be prepared to recover from. Data loss can also occur from:

Humans make mistakes all the time. An employee could accidentally wipe out data that you need.

Hurricanes, earthquakes, floods, tornados, and fire can all pose a threat to on-site servers.

Hardware and network failures prevent access to your applications and data, which disrupts your business operations.

When these events happen, you want the right tools in place to restore your data quickly.

In order to be cyber resilient, consider implementing the following strategies:

It's easy to believe that using a cloud-based application means your data is backed up and secure, but it's not. SaaS vendors explicitly state that data protection and backup is the responsibility of the customer, yet many businesses rely on the native recovery options.

Those options are limited and ineffective. Deleted files are only stored in an application's trash bin for a matter of weeks, while the average time to detect a breach is ten months. By the time most companies notice an attack has happened, it's too late to restore from the application alone.

Having a separate backup is not just necessary to ensure business continuity, but it's also required to remain regulation compliant. GDPR, HIPAA, Sarbanes-Oxley, New York's SHIELD Act, and California's CCPA all require that a business prove it can recover information after a loss.

OpenText Cybersecurity's Carbonite Cloud-to-Cloud Backup protects critical data stored on SaaS platforms by automating backups, encrypting the information, and keeping it secure so that you can restore data quickly.

Companies throughout the world are transitioning to hybrid operations where they use a mix of local servers and cloud system storage for storing data and doing business. As they expand their digital footprint, they're also expanding their risk of cybercrime and data loss between systems.

In the event of a breach or other disaster, you need a cohesive and continual backup that will allow a full restoration.

OpenText Cybersecurity's Carbonite Server Backup provides a secure and continuously updated backup for critical local and cloud servers. With this tool, you can manage your backups and recover specific data or entire systems when needed.

You can have every tool in place for preventing and protecting against cyber attacks and still be susceptible to data loss from natural disasters, hardware failures or power outages. Many of these events give us little to no warning, so being prepared to restore data at any time is necessary.

OpenText Cybersecurity's Carbonite Recover is a continuously updated, running backup of critical servers and systems that lets you fail over to an up-to-date copy with just a few clicks and within minutes.

All of these tools are part of OpenText Cybersecurity, providing you with all three aspects of recovery protection through a single provider in addition to tools that address the prevention and protection layers of a cyber resilient approach.

The team at My Father's World, a Christian homeschool curriculum provider located in tornado country, constantly worried about system backups and disaster recovery.

Initially, they ran a parallel physical server system located eight miles from their primary servers, using software to back up between the two locations. That approach became too expensive and time consuming as the business grew. That's when My Father's World began using cloud-hosted versions of Carbonite Server Backup and Carbonite Cloud Disaster Recovery by OpenText.

Carbonite Server Backup stores copies locally as well as in the Carbonite cloud and provides business continuity and data protection while meeting compliance regulations. Carbonite Cloud Disaster Recovery provides a fully managed service with a remote team of continuity and disaster recovery specialists that ensure the recovery of the business's critical systems in the cloud.

My Father's World signed up for the 48-hour recovery service, but in yearly tests, a full recovery has been achieved in just nine hours, while recovery of a specific file only takes minutes.

Guest blog courtesy of OpenText Cybersecurity. Regularly contributed guest blogs are part of MSSP Alert's sponsorship program.

Continued here:
Building a Cyber Resilient Business: The Recover Layer - MSSP Alert


Solid air: building secure clouds – Software applications – ERP Today

Observability before fragility - this is how to build secure clouds.

Building contemporary software applications is a complex process. Despite the rise of low-code/no-code platforms and a whole firmament of software application development and data analytics automation channels, we still exist in a world where systems are created with instabilities, fragilities and incompatibilities.

With ERP systems very typically supporting mission-critical use cases - which in some scenarios straddle and support life-critical software deployments - the need to assess where the pressure and pain points are in the modern IT stack has never been more pressing.

But where do we start? Is it a question of the surface-level interactions at the presentation layer and the way we manage what users are capable of doing inside their chosen application's user interface? Is it the middle-tier networking layer and all the application programming interface (API) connections that now form neural joins between data, services, operating systems and applications themselves? Could it be the lower substrate base of our technology infrastructures and the way these are now engineered to perform in essentially cloud-native environments? Or, inevitably, is it perhaps all of these tiers and all the internal connection points within them?

Spoiler alert! No prizes for guessing that it's obviously everywhere. Code fragility manifests itself in a seemingly infinite variety of forms and functional formulations, from syntax structures to higher-level system semantics. Given the Sisyphean challenge ahead then, shall we at least start at the foundation level with infrastructure?

"We need to build enterprise technologies based upon infrastructures that work as dynamically orchestrated entities" - Sumedh Thakar, Qualys

Qualys CEO Sumedh Thakar says he has looked at the use case models being showcased across his firm's customer base and beyond. As an IT, security and compliance solutions company, Qualys has advocated a concerted move to Infrastructure-as-Code (or IaC) as a core capability for secured operations.

Thakar has described the need to now build enterprise technologies based upon infrastructures that work as dynamically orchestrated entities, with fine-grained engineering controls.

As it sounds, IaC is a cloud provisioning method that delivers a descriptive model to define and subsequently provision a cloud infrastructure layer. Just as a physical infrastructure would include data storage capabilities, server capacities and properties, lower system-level relationships and network management control tools such as load balancers, an IaC layer does the same, but is defined by code, for the cloud.
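
As a hedged illustration of what "defined by code" looks like in practice (the article itself discusses Qualys tooling rather than any particular framework), here is a minimal sketch using the AWS CDK for Python, where a storage layer and its security-relevant settings are declared, reviewed and versioned as code:

    # stack.py - a minimal Infrastructure-as-Code sketch with the AWS CDK for
    # Python; the construct names and settings are illustrative assumptions
    from aws_cdk import App, Stack, RemovalPolicy
    from aws_cdk import aws_s3 as s3
    from constructs import Construct

    class StorageStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs):
            super().__init__(scope, construct_id, **kwargs)
            # Security posture is declared alongside capacity, not bolted on later
            s3.Bucket(
                self, "AppData",
                encryption=s3.BucketEncryption.S3_MANAGED,
                block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
                versioned=True,
                removal_policy=RemovalPolicy.RETAIN,
            )

    app = App()
    StorageStack(app, "storage")
    app.synth()

Because the template is plain code, it can be linted, peer reviewed and scanned for misconfigurations before anything is provisioned, which is the "shift left" idea discussed below.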

Among the technologies on offer here is Qualys CloudView, an IaC-level management product designed to enable firms to assess what is now being called their level of cloud security posture management (CSPM). CEO Thakar points to his firm's ability to "shift left" (i.e. start earlier) an enterprise's approach to cloud security via a combination of integrated application services designed to insert security automation into the entire application lifecycle. Qualys says this ensures visibility into both pre-deployment application build-time and post-deployment live operational runtime environments to check for misconfiguration and more, all via a single unified dashboard.

But why does so much misconfiguration happen and can we do more to stop it? Largely, it appears to be a natural byproduct of diversity in both cloud-native and terrestrial enterprise software platform environments that may eventually migrate to cloud.

"Misconfigurations happen for a range of reasons, the most blatant being insecure-by-default settings, where security or hardening is an added control rather than a default state," says Martin Jartelius, CSO at Outpost24. "The next challenge is that configurations are often evaluated in test environments where insecure configurations may occur, such as use of invalid certificates or ignoring signatures and validations at the point of testing. This means that once a transition to production is made, insecurities remain just as they were tested."
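
To make the test-time shortcut Jartelius describes concrete, here is a small, hypothetical Python example using the requests library; the URLs and certificate path are assumptions for illustration:

    # A common test-time shortcut that becomes a production misconfiguration if
    # it survives the transition to production
    import requests

    # Test environment: certificate validation switched off "just to get it working"
    resp = requests.get("https://internal.example.test/health", verify=False)

    # Production intent: validation left on (the default), pinned to the internal CA
    resp = requests.get("https://internal.example.com/health",
                        verify="/etc/ssl/certs/internal-ca.pem")

If the first form is what actually gets promoted, the system ships with TLS validation silently disabled - exactly the kind of insecurity that "remains just as it was tested".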

"Years ago, the silver bullet was called gold builds for workstations and servers - today it's the Infrastructure-as-Code" - Martin Jartelius, Outpost24

Further, notes Jartelius, new functionality is added over time and unless organizations are attentive, this may introduce new and once again default insecure options. He agrees that an IaC approach offers the benefits of reusing trusted and well-audited templates and thereby reduces the room for human errors. It does not address the root causes of misconfigurations, but it does allow consistency.

"Without consistency, maintenance and keeping up to date become even harder, and keeps getting harder over time," clarifies Jartelius.

"Years ago, the silver bullet was called gold builds for workstations and servers - today it's the Infrastructure-as-Code. It will not solve all issues, but it will provide a remedy to a degree and most importantly it will allow those proactive who utilize its full power to reap its benefits.

"For those who are not versed in what they are doing, it however creates the opportunity to do a low-quality job faster, so at the end of the day, just as any other tool, its usefulness depends entirely on in whose hands it is put."

With so many differences in syntax, format, structure, code dependencies and other delineating factors across every development environment, every application toolset and indeed every cloud platform, misconfiguration is an inevitability throughout the modern IT stack all the way to the presentation layer. But that said, we can dive into cloud security more specifically to understand what's happening here.

"Misconfigurations in the cloud can cause data breaches" - James Hastings, eSentire

Because of the complexities described thus far, the overarching issue many organizations see when working with live cloud environments is a lack of visibility. This manifests in multiple ways, such as complications when more than one cloud account or platform is leveraged, issues where the chosen technology necessitates new tooling for security monitoring purposes, or a lack of understanding of what is deployed or how it's configured. This is the opinion of James Hastings in his position as senior product manager at eSentire, a cloud software and security specialist focused on managed detection and response.

Misconfigurations in the cloud, which occur due to improper settings being used when architecting and deploying to cloud platforms, can cause data breaches that have a business-wide impact. According to a recent Cyber Security Hub study on the future of cloud security, almost half of the respondents (44 percent) said their primary challenge with cloud security was a reduced ability to detect and prevent cloud misconfigurations.

"This lack of visibility usually stems from improper tooling that either can't pull the needed data from a cloud account or workload, or where the tool isn't designed to scale in cloud environments," said Hastings.

These issues impact both ends of the cloud adoption model; users early in their cloud journey struggle from a lack of knowledge and experience, while cloud-native customers tend to run into issues establishing visibility and monitoring for services like serverless functions and other shared or ephemeral technologies. Outside of visibility, the eSentire team reports that cloud customers experience some common security pain points like alert fatigue and fear of (or the inability to detect) unknown threats.

Looking at the way cloud-centric IT departments are run today, can we ask whether IT security teams and developers really collaborate with each other effectively even in the so-called age of DevOps, when it should arguably be taken as a given? Or is there still a need to improve this aspect of operations?

"It really depends on the teams in question, but in my experience, yes, some do and it's becoming easier than ever," says Hastings. "Newer cloud security tools take a more holistic approach to security. These solutions usually feature multiple modules that are all intertwined to offer native multi-signal correlation out of the box and are increasingly targeting the shift of security into the development process."

Tooling such as code analysis focuses on hardening application code before it's deployed to server-based or serverless workloads; the hardening of this code reduces the attack surface of the eventual workload and also cuts down on the patching, investigation and response that might otherwise be necessary.

The previously noted CSPM-style checks found in IaC are helpful when it comes to evaluating cloud infrastructure for misconfigurations but, notes the eSentire engineering team, this process happens as a fundamental part of the automation template. So this enables organizations to create secure infrastructure from the get-go and spend less time on remediating platform misconfigurations.

"The last tool that we see making a significant impact on this collaboration is the idea of integrating vulnerability assessment into a continuous integration and continuous deployment (CI/CD) pipeline," explains Hastings. "Here, before any code or a container can be published, it must have a vulnerability assessment run against it. Organizations are able to set their own bar for security compliance and even go as far as blocking a build that doesn't meet their security standards."
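
A gate of the kind Hastings describes often boils down to a small script in the pipeline that refuses to pass a build when a scanner report crosses the organization's bar; the sketch below is illustrative only, and the report format and thresholds are assumptions:

    # gate.py - a sketch of a CI gate that blocks a build when a scanner report
    # exceeds the organization's severity bar; the report format is hypothetical
    import json
    import sys

    MAX_ALLOWED = {"CRITICAL": 0, "HIGH": 0}  # the "bar" set by the organization

    def main(report_path):
        with open(report_path) as f:
            findings = json.load(f)  # e.g. [{"id": "...", "severity": "HIGH"}, ...]
        counts = {}
        for finding in findings:
            sev = finding.get("severity", "UNKNOWN")
            counts[sev] = counts.get(sev, 0) + 1
        for severity, limit in MAX_ALLOWED.items():
            if counts.get(severity, 0) > limit:
                print(f"Build blocked: {counts[severity]} {severity} findings")
                return 1
        return 0

    if __name__ == "__main__":
        sys.exit(main(sys.argv[1]))

A non-zero exit code is all the pipeline needs to stop the build before the container or code is published.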

All said and done then, doesn't where we stand now in cloud security (and system robustness as a whole) beg a wider question? Are modern IT approaches built to be secure by design, or are we missing out on embedding security into these processes from the start? It comes back to a comment and sentiment spoken many times when customers move to the cloud: "cloud is more secure." While the statement is true in broad terms, it really needs qualification; perhaps we should instead say "cloud can be more secure", but it's up to each and every organization to lock it down and make it so.

The problem with cloud security - and perhaps system security in general - is that it's all too often bolted on and implemented as an afterthought. Hastings speaks from a position of experience and reminds us that most cloud practitioners (and certainly all cloud-native practitioners) realize this inconvenient truth.

"This has ultimately spawned the idea of shifting security left - i.e. starting it earlier - and/or pushing a more embedded approach to security into engineering and DevOps practices," says Hastings. "Doing so embeds security throughout the organization's operational fabric and means that code is written and infrastructure is created to a secure and locked-down narrative. It reduces the number of times that teams need to circle back to change code, implement patches or make other changes that likely have change control and approval processes in place."

"The combination of security and development streamlines both processes, reduces the organization's risk, and enables velocity."

Cloud computing's evolution has been nothing if not flaky from the start. We know that AWS CEO Adam Selipsky talks of the very early stages of cloud as having been a somewhat embryonic phase, when the virtualization planets were still aligning. It's for sure that we have spent the last decade and more shoring up security, consolidating cloud tool sprawl and looking for key avenues through which we can automate many of the management tasks that can lead to cloud fragility in the first place.

If we had the chance to do cloud all over again, we might use a different and more considered approach, but perhaps we wouldnt. This might just be a hefty symptomatic nuance of the way new technology platforms rapidly escalate and eventually germinate, oscillate, occasionally fluctuate and finally become part of our operational substrate.

Go here to read the rest:
Solid air: building secure clouds - Software applications - ERP Today


Apple admits that data should be kept off its cloud – Fudzilla

Much safer that way

Fruity cargo cult Apple has effectively admitted that if you want to keep your data safe, it is probably better that you don't put it on its cloud.

While it was banging on about data security at its Wonderlust event, Apple seemed to be telling all who listened that keeping your data safe sometimes means keeping it out of the cloud.

Apple said that the risk was always there that someone, somewhere, may get access to your personal information in the cloud.

To prove Apple's point, it said that all Siri Health data is processed on the Apple Watch S9 instead of the cloud, and that the iPhone 15 Pro's A17 Pro chip Neural Engine uses machine learning on the device without sending your personal data to the cloud.

The Tame Apple Press, along with those speaking at the event seem to be pushing edge security rather than cloud. However, what they appear to be ignoring is that Apple is admitting that its cloud security is not up to snuff.

If only companies which administered cloud servers took their security a little more seriously.

See the rest here:
Apple admits that data should be kept off its cloud - Fudzilla


AI and Blockchain Integration for Preserving Privacy – Unite.AI

With the widespread attention, and potential applications of blockchain and artificial intelligence technologies, the privacy protection techniques that arise as a direct result of integration of the two technologies is gaining notable significance. These privacy protection techniques not only protect the privacy of individuals, but they also guarantee the dependability and security of the data.

In this article, we will be talking about how the collaboration between AI and blockchain gives birth to numerous privacy protection techniques, and their application in different verticals including de-identification, data encryption, k-anonymity, and multi-tier distributed ledger methods. Furthermore, we will also try to analyze the deficiencies along with their actual cause, and offer solutions accordingly.

Blockchain was first introduced to the world in 2008, when Nakamoto introduced Bitcoin, a cryptocurrency built on a blockchain network. Ever since its introduction, blockchain has gained a lot of popularity, especially in the past few years. The value at which Bitcoin is trading today, and its crossing of the trillion-dollar market cap mark, indicate that blockchain has the potential to generate substantial revenue and profits for the industry.

Blockchain technology can be categorized primarily on the basis of the level of accessibility and control it offers, with Public, Private, and Federated being the three main types of blockchain technologies. Popular cryptocurrencies and blockchain architectures like Bitcoin and Ethereum are public blockchain offerings as they are decentralized in nature and allow nodes to enter or exit the network freely, thus promoting maximum decentralization.

Structurally, Ethereum utilizes a linked list to establish connections between different blocks: the header of each block stores the hash address of the preceding block in order to establish a linkage between two successive blocks.
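
A toy Python sketch makes the linkage concrete; real Ethereum headers contain many more fields, so only the parent-hash chaining is shown:

    # chain.py - a toy illustration of hash-linked block headers
    import hashlib
    import json

    def block_hash(header):
        return hashlib.sha256(json.dumps(header, sort_keys=True).encode()).hexdigest()

    genesis = {"number": 0, "parent_hash": "0" * 64, "data": "genesis"}
    block_1 = {"number": 1, "parent_hash": block_hash(genesis), "data": "tx batch 1"}
    block_2 = {"number": 2, "parent_hash": block_hash(block_1), "data": "tx batch 2"}

    # Tampering with an earlier block breaks every later parent_hash link
    assert block_2["parent_hash"] == block_hash(block_1)

Because each header commits to the hash of its predecessor, altering any historical block invalidates every block that follows it, which is what gives the ledger its tamper-evidence.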

The development and implementation of blockchain technology is accompanied by legitimate security and privacy concerns in various fields that cannot be neglected. For example, a data breach in the financial industry can result in heavy losses, while a breach in military or healthcare systems can be disastrous. To prevent these scenarios, protection of data, user assets, and identity information has been a major focus of the blockchain security research community, as to ensure the development of blockchain technology, it is essential to maintain its security.

Ethereum is a decentralized blockchain platform that upholds a shared ledger of information collaboratively using multiple nodes. Each node in the Ethereum network makes use of the EVM, or Ethereum Virtual Machine, to execute smart contracts and facilitate the communication between nodes, which occurs via a P2P, or peer-to-peer, network. Each node on the Ethereum network is provided with unique functions and permissions, although all the nodes can be used for gathering transactions and engaging in block mining. Furthermore, it is worth noting that when compared to Bitcoin, Ethereum displays much faster block generation, with a block time of roughly 15 seconds. It means that crypto miners have a better chance at acquiring rewards quicker, while the interval time for verifying transactions is reduced significantly.

On the other hand, AI, or Artificial Intelligence, is a branch of modern science that focuses on developing machines that are capable of decision-making and can simulate autonomous thinking comparable to a human's ability. Artificial Intelligence is a very vast branch in itself, with numerous subfields including deep learning, computer vision, natural language processing, and more. NLP in particular is a subfield that has received heavy focus in the past few years, resulting in the development of some top-notch LLMs like GPT and BERT. NLP is headed towards near perfection, and the final step of NLP is processing text transformations that make text understandable to computers; recent models like ChatGPT, built on GPT-4, indicate that the research is headed in the right direction.

Another subfield that is quite popular amongst AI developers is deep learning, an AI technique that works by imitating the structure of neurons. In a conventional deep learning framework, the external input information is processed layer by layer by training hierarchical network structures, and it is then passed on to a hidden layer for final representation. Deep learning frameworks can be classified into two categories: Supervised learning, and Unsupervised learning.

A deep learning perceptron employs a multiple-level neural network architecture to learn the features in the data. The neural network consists of three types of layers: the hidden layer, the input layer, and the output layer. Each perceptron layer in the framework is connected to the next layer in order to form a deep learning framework.
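
For readers unfamiliar with the structure, a minimal forward pass through these three layer types can be written in a few lines of NumPy; the layer sizes and activation are arbitrary choices for illustration:

    # mlp.py - a minimal forward pass through input, hidden and output layers
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=(1, 4))                      # input layer: 4 features

    W1 = rng.normal(size=(4, 8)); b1 = np.zeros(8)   # hidden layer weights
    W2 = rng.normal(size=(8, 2)); b2 = np.zeros(2)   # output layer weights

    hidden = np.maximum(0, x @ W1 + b1)              # ReLU activation in the hidden layer
    logits = hidden @ W2 + b2                        # final representation at the output layer
    print(logits.shape)                              # (1, 2)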

Finally, we have the integration of blockchain and artificial intelligence technologies, as these two technologies are being applied across different industries and domains amid increasing concern regarding cybersecurity, data security, and privacy protection. Applications that aim to integrate blockchain and artificial intelligence manifest the integration in several aspects.

In the current scenario, data trust systems have certain limitations that compromise the reliability of data transmission. To address these limitations, blockchain technologies can be deployed to establish a dependable and secure data sharing & storage solution that offers privacy protection and enhances data security.

By enhancing the implementation & integration of these technologies, the protective capacity & security of current data trust systems can be boosted significantly.

Traditionally, data sharing and data storing methods have been vulnerable to security threats because they depend on centralized servers, which makes them an easily identifiable target for attackers. The vulnerability of these methods gives rise to serious complications such as data tampering and data leaks, and given the current security requirements, encryption methods alone are not sufficient to ensure the safety & security of the data, which is the main reason behind the emergence of privacy protection technologies based on the integration of artificial intelligence & blockchain.

Let's have a look at a blockchain-based privacy-preserving federated learning scheme that aims to improve the Multi-Krum technique and combine it with homomorphic encryption to achieve ciphertext-level model filtering and model aggregation that can verify local models while maintaining privacy protection. The Paillier homomorphic encryption technique is used in this method to encrypt model updates, thus providing additional privacy protection.
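
The additive property that makes Paillier useful here can be demonstrated with the python-paillier (phe) library; the values below are toy numbers, not real model updates:

    # paillier_demo.py - additive homomorphism with the python-paillier (phe) library
    from phe import paillier

    public_key, private_key = paillier.generate_paillier_keypair()

    # Two parties encrypt their local model updates (toy scalars here)
    enc_a = public_key.encrypt(0.25)
    enc_b = public_key.encrypt(0.40)

    # The aggregator adds ciphertexts without ever seeing the plaintext updates
    enc_sum = enc_a + enc_b

    # Only the key holder can decrypt the aggregated result
    assert abs(private_key.decrypt(enc_sum) - 0.65) < 1e-9

This is what allows an aggregator to combine encrypted model updates without learning any individual participant's contribution.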

De-identification is a method commonly used to anonymize the personal identification information of a user in the data by separating the data from the data identifiers, thus reducing the risk of data tracking. There exists a decentralized AI framework built on permissioned blockchain technology that uses the above-mentioned approach. The AI framework separates personal identification information from non-personal information effectively, and then stores the hash values of the personal identification information in the blockchain network. The proposed AI framework can be utilized in the medical industry to share medical records & information of a patient without revealing his/her true identity. The proposed AI framework uses two independent blockchains for data requests, with one blockchain network storing the patient's information along with data access permissions, whereas the second blockchain network captures audit traces of any requests or queries made by requesters. As a result, patients still have complete authority and control over their medical records & sensitive information while enabling secure & safe data sharing within multiple entities on the network.
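
A simplified sketch of the separation step might look like the following Python; the field names, salt handling and ledger entry are assumptions for illustration, not the framework's actual design:

    # deidentify.py - separate identifiers from record data and keep only a
    # salted hash of the identifiers for the on-chain entry
    import hashlib
    import os

    record = {"patient_name": "Jane Doe", "national_id": "123-45-6789",
              "diagnosis": "hypertension", "visit": "2023-09-01"}

    # Pull the personal identification information out of the record
    identifiers = {k: record.pop(k) for k in ("patient_name", "national_id")}

    salt = os.urandom(16)
    digest = hashlib.sha256(salt + "|".join(identifiers.values()).encode()).hexdigest()

    on_chain_entry = {"pii_hash": digest}   # what the ledger would store
    off_chain_payload = record              # de-identified data shared with others
    # The salt and the identifier-to-hash mapping stay with the data custodian

The ledger can then prove that a given record relates to a registered identity without ever exposing the identity itself.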

A multi-layered distributed ledger is a data storage system with a decentralization property and multiple hierarchical layers that are designed to maximize efficiency and secure the data sharing process, along with enhanced privacy protection. DeepLinQ is a blockchain-based multi-layered decentralized distributed ledger that addresses a user's concern regarding data privacy & data sharing by enabling privacy-protected data sharing. DeepLinQ achieves the promised data privacy by employing various techniques like on-demand querying, access control, proxy reservation, and smart contracts to leverage blockchain network characteristics, including consensus mechanism, complete decentralization, and anonymity, to protect data privacy.

The K-Anonymity method is a privacy protection method that groups individuals in a dataset in such a way that every group has at least K individuals with identical attribute values, therefore protecting the identity & privacy of individual users. The K-Anonymity method has been the basis of a proposed reliable transactional model that facilitates transactions between energy nodes and electric vehicles. In this model, the K-Anonymity method serves two functions: first, it hides the location of the EVs by constructing a unified request using K-Anonymity techniques that conceal the location of the owner of the car; second, the K-Anonymity method conceals user identifiers so that attackers are not left with the option to link users to their electric vehicles.
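
Checking whether a table satisfies k-anonymity over a chosen set of quasi-identifiers is straightforward to express; the following pandas sketch uses invented columns and k=2 purely for illustration:

    # k_anonymity.py - check k-anonymity over a set of quasi-identifiers
    import pandas as pd

    df = pd.DataFrame({
        "zip": ["8001", "8001", "8002", "8002", "8002"],
        "age_band": ["30-39", "30-39", "40-49", "40-49", "40-49"],
        "charge_kwh": [12.0, 7.5, 9.1, 11.3, 8.8],
    })

    quasi_identifiers = ["zip", "age_band"]
    k = 2
    group_sizes = df.groupby(quasi_identifiers).size()
    print(group_sizes.min() >= k)   # True: every group contains at least k records

If any group falls below k, records must be further generalized or suppressed before release, otherwise an attacker could single out an individual from the quasi-identifiers alone.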

In this section, we will be talking about a comprehensive analysis and evaluation of ten privacy protection systems using the fusion of blockchain and AI technologies that have been proposed in recent years. The evaluation focuses on five major characteristics of these proposed methods: authority management, data protection, access control, scalability and network security, and also discusses the strengths, weaknesses, and potential areas of improvement. It is the unique features resulting from the integration of AI and blockchain technologies that have paved the way for new ideas and solutions for enhanced privacy protection.

Access control is a security & privacy technology used to restrict a user's access to authorized resources on the basis of pre-defined rules, sets of instructions and policies, safeguarding data integrity and system security. There exists an intelligent privacy parking management system that makes use of a Role-Based Access Control, or RBAC, model to manage permissions. In the framework, each user is assigned one or more roles and is then classified according to those roles, which allows the system to control attribute access permissions. Users on the network can make use of their blockchain address to verify their identity and get attribute authorization access.

Access control is one of the key fundamentals of privacy protection, restricting access based on group membership & user identity to ensure that only authorized users can access the specific resources they are allowed to access, thus protecting the system from unwanted or forced access. To ensure effective and efficient access control, the framework needs to consider multiple factors including authorization, user authentication, and access policies.

Digital Identity Technology is an emerging approach for IoT applications that can provide safe & secure access control and ensure data & device privacy. The method proposes to use a series of access control policies that are based on cryptographic primitives and digital identity technology, or DIT, to protect the security of communications between entities such as drones, cloud servers, and Ground Station Servers (GSS). Once the registration of an entity is completed, its credentials are stored in memory.

Data protection is used to refer to measures including data encryption, access control, security auditing, and data backup to ensure that the data of a user is not accessed illegally, tampered with, or leaked. When it comes to data processing, technologies like data masking, anonymization, data isolation, and data encryption can be used to protect data from unauthorized access, and leakage. Furthermore, encryption technologies such as homomorphic encryption, differential privacy protection, digital signature algorithms, asymmetric encryption algorithms, and hash algorithms, can prevent unauthorized & illegal access by non-authorized users and ensure data confidentiality.

Network security is a broad field that encompasses different aspects including ensuring data confidentiality & integrity, preventing network attacks, and protecting the system from network viruses & malicious software. To ensure the safety, reliability, and security of the system, a series of secure network architectures and protocols, and security measures need to be adopted. Furthermore, analyzing and assessing various network threats and coming up with corresponding defense mechanisms and security strategies are essential to improve the reliability & security of the system.

Scalability refers to a system's ability to handle larger amounts of data or an increasing number of users. When designing a scalable system, developers must consider system performance, data storage, node management, transmission, and several other factors. Furthermore, when ensuring the scalability of a framework or a system, developers must take into account the system's security to prevent data breaches, data leaks, and other security risks.

Developers have designed a system in compliance with the European General Data Protection Regulation, or GDPR, by storing privacy-related information and artwork metadata in a distributed file system that exists off the chain. Artwork metadata and digital tokens are stored in OrbitDB, a database storage system that uses multiple nodes to store the data and thus ensures data security & privacy. The off-chain distributed system disperses data storage and thereby improves the scalability of the system.

The amalgamation of AI and blockchain technologies has resulted in systems that focus heavily on protecting the privacy, identity, and data of users. Although AI data privacy systems still face some challenges like network security, data protection, scalability, and access control, it is crucial to consider and weigh these issues comprehensively, on the basis of practical considerations, during the design phase. As the technology develops and progresses further and the applications expand, privacy protection systems built using AI & blockchain will draw more attention in the future. On the basis of research findings, technical approaches, and application scenarios, they can be classified into three categories.

The technologies belonging to the first category focus on the implementation of AI and blockchain technologies for privacy protection in the IoT industry. These methods use AI techniques to analyze high volumes of data while taking advantage of decentralized & immutable features of the blockchain network to ensure authenticity and security of the data.

The technologies falling in the second category focus on fusing AI & blockchain technologies for enhanced privacy protection by making use of blockchain's smart contracts & services. These methods combine data analysis and data processing with AI and use blockchain technology alongside to reduce dependency on trusted third parties, and record transactions.

Finally, the technologies falling in the third category focus on harnessing the power of AI and blockchain technology to achieve enhanced privacy protection in large-scale data analytics. These methods aim to exploit blockchain's decentralization and immutability properties that ensure the authenticity & security of data, while AI techniques ensure the accuracy of data analysis.

In this article, we have talked about how AI and blockchain technologies can be used in sync with each other to enhance the applications of privacy protection technologies, by discussing their related methodologies and evaluating the five primary characteristics of these privacy protection technologies. Furthermore, we have also talked about the existing limitations of the current systems. There are certain challenges in the field of privacy protection technologies built upon blockchain and AI that still need to be addressed, such as how to strike a balance between data sharing and privacy preservation. Research on how to effectively merge the capabilities of AI and blockchain techniques is ongoing, and here are several other ways that can be used to integrate other techniques.

Edge computing aims to achieve decentralization by leveraging the power of edge & IoT devices to process private & sensitive user data. Because AI processing makes it mandatory to use substantial computing resources, using edge computing methods can enable the distribution of computational tasks to edge devices for processing instead of migrating the data to cloud services, or data servers. Since the data is processed much nearer the edge device itself, the latency time is reduced significantly, and so is the network congestion that enhances the speed & performance of the system.

Multi-chain mechanisms have the potential to resolve single-chain blockchain storage and performance issues, thereby boosting the scalability of the system. The integration of multi-chain mechanisms facilitates data classification based on distinct attributes & privacy levels, therefore improving the storage capabilities and security of privacy protection systems.

Read more:
AI and Blockchain Integration for Preserving Privacy - Unite.AI


New revelations from the Snowden archive surface – ComputerWeekly.com

He risked his neck. When Edward Snowden chose to expose the U.S. National Security Agency (NSA)'s mass surveillance Leviathan, and that of its British counterpart, GCHQ, 10 years ago, he put his life on the line. And he has always declared he has never regretted it.

But years after his act of extraordinary courage, the Snowden archive remains largely unpublished. He trusted journalists to decide what to publish. In an article published in June 2023, Guardian Pulitzer prize winner Ewen MacAskill - who flew to Hong Kong with Glenn Greenwald and Laura Poitras to meet Edward Snowden - confirmed that most of the archive has not been made public. "In the end, we published only about 1 percent of the document," he wrote.

What does the 99 percent of the Snowden archive contain? A decade on, it remains shrouded in secrecy.

A doctoral thesis by American investigative journalist and post-doctoral researcher Jacob Appelbaum has now revealed unpublished information from the Snowden archive. These revelations go back a decade, but they remain of indisputable public interest.

These revelations have surfaced for the first time thanks to a doctoral thesis authored by Appelbaum towards earning a degree in applied cryptography from the Eindhoven University of Technology in the Netherlands.

Titled "Communication in a world of pervasive surveillance", it is a public document and has been downloaded over 18,000 times since March 2022 when it was first published.

Appelbaum's work, supervised by professors Tanja Lange and Daniel J. Bernstein, is among the top ten most popular Ph.D. theses at the Eindhoven University.

When we asked whether the U.S. authorities had contacted the Eindhoven University of Technology to object to the publication of some of the revelations from the Snowden files, a university spokesperson replied that they had not.

In 2013, Jacob Appelbaum published a remarkable scoop for Der Spiegel, revealing that the NSA had spied on Angela Merkel's mobile phone. This scoop won him the highest journalistic award in Germany, the Nannen prize (later known as the Stern Award).

Nevertheless, his work on the NSA revelations and his advocacy for Julian Assange, WikiLeaks and high-profile whistleblowers have put him in a precarious position. As a result, he has resettled in Berlin, where he has spent the last decade.

In June 2020, when the United States issued a second superseding indictment against Julian Assange, it was clear that Appelbaum's concerns were not a matter of paranoia; the indictment criminalizes political speeches given by Assange as well as by former WikiLeaks journalist Sarah Harrison and by Jacob Appelbaum himself, identified under the codename "WLA-3".

Public speeches by Appelbaum, delivered in a humorous and provocative tone and with titles like "Sysadmins of the World, Unite!", were interpreted as an attempt to recruit sources and as incitement to steal classified documents. To this day, however, there are no publicly known charges against Appelbaum or Harrison.

We asked Jacob Appelbaum, currently a post-doctoral researcher at the Eindhoven University of Technology, why he chose to publish those revelations in a technically written thesis rather than a mass-circulation newspaper.

"As an academic", he replied, "I see that the details included are in the public interest, and highly relevant for the topic covered in my thesis, as it covers the topic of large-scale adversaries engaging in targeted and mass surveillance".

One of the most important unpublished revelations from the Snowden archive regards American semiconductor vendor Cavium. According to Appelbaum, the Snowden files list Cavium "as a successful SIGINT enabled CPUs vendor".

"The NSA's successful cryptographic enabling is by definition the introduction of intentional security vulnerabilities that they are then able to exploit, and they do exploit them often in an automated fashion to spy," he said.

"One such method", he added, "is sabotaging a secure random generator".

A random number generator that is unpredictable to everyone "is an essential requirement for meaningful cryptographic security. In most cases, the NSA sabotage happens in a way where the owners, developers, and users are unaware of the sabotage as a core goal".

The purpose of this sabotage is to allow the NSA to breach the security offered by a given company, device and/or other services.
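
As general cryptographic background rather than anything drawn from the Snowden documents themselves, the short Python sketch below illustrates why an unpredictable generator matters: a key derived from a guessable seed can be reproduced by anyone who guesses the seed, whereas a key drawn from the operating system's cryptographically secure generator (via the standard-library secrets module) cannot.

```python
# Illustrative only: why predictable randomness breaks key generation.
import random
import secrets
import time

# Weak: seeding a general-purpose PRNG with a guessable value (here, the
# current hour) lets anyone who can guess the seed reproduce the "secret" key.
guessable_seed = int(time.time()) // 3600
weak_rng = random.Random(guessable_seed)
weak_key = weak_rng.getrandbits(128).to_bytes(16, "big")

# An attacker trying the same guessable seed recovers the identical key.
attacker_rng = random.Random(guessable_seed)
attacker_key = attacker_rng.getrandbits(128).to_bytes(16, "big")
print("attacker recovered weak key:", attacker_key == weak_key)  # True

# Strong: the secrets module draws from the OS CSPRNG, which is designed
# to be unpredictable to everyone, including the code's author.
strong_key = secrets.token_bytes(16)
print("strong key (hex):", strong_key.hex())
```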

At no point does Appelbaum write or even suggest that Cavium was complicit in these sabotage activities or was aware of them.

The Snowden documents date back to 2013. In 2018, Cavium was acquired by the U.S. company Marvell Technology, one of the two firms which, according to financial services giant J.P. Morgan, will dominate the custom-designed semiconductors market driven by Artificial Intelligence.

We contacted Marvell to ask a series of questions, including whether Cavium's CPUs have basically remained the same in the last decade, and whether it is certain that Cavium CPUs, which according to the 2013 Snowden files were backdoored, are no longer marketed and in use.

We also asked Marvell whether the company conducted any internal investigations after we informed them about Appelbaum's revelation. One of the co-founders of Cavium, Raghib Hussain, is currently one of the presidents of Marvell.

Marvell has not provided answers to our specific questions. Its vice president for Corporate Marketing, Stacey Keegan, said that it did not implement backdoors for any government.

Her statement reads in full:

"Marvell places the highest priority on the security of its products. Marvell does not implement backdoors for any government. Marvell supports a wide variety of protocols and standards including IPsec, SSL, TLS 1.x, DTLS and ECC Suite B.

Marvell also supports a wide variety of standard algorithms including several variants of AES, 3DES, SHA-2, SHA-3, RSA 2048, RSA 4096, RSA 8192, ECC p256/p384/p521, Kasumi, ZUC and SNOW 3G.

All Marvell implementations are based on published security algorithm standards [...] Marvell's market-leading NITROX family delivers unprecedented performance for security in the enterprise and virtualized cloud data centers.

The NITROX product line is the industry-leading security processor family designed into cloud data center servers and networking equipment, enterprise and service provider equipment including servers, Application Delivery Controllers, UTM Gateways, WAN Optimization Appliances, routers, and switches".

Appelbaum said that as the new owner of Cavium, Marvell "should conduct a serious and transparent technical security investigation into the matter and make the result available to the public".

He said that he wrote to the company, including to their security response email address, and set this forth in extreme detail, but has never heard back from them.

The two other important and yet unpublished revelations from the Snowden files concern the compromise of foreign government infrastructure by the NSA.

Appelbaum writes in his thesis that the Snowden archive includes largely unpublished internal NSA documents and presentations that discuss targeting and exploiting not only deployed, live interception infrastructure, but also the vendors of the hardware and software used to build that infrastructure.

"Primarily these documents remain unpublished because the journalists who hold them fear they will be considered disloyal or even that they will be legally punished," he writes.

Appelbaum adds that "Targeting lawful interception (LI) equipment is a known goal of the NSA".

"Unpublished NSA documents specifically list their compromise of the Russian SORM LI infrastructure as an NSA success story of compromising civilian telecommunications infrastructure to spy on targets within reach of the Russian SORM system, he says.

Though Appelbaum did not publish the NSA slides on SORM in his thesis, he reports that they show two Russian officers wearing jackets bearing the slogan: "you talk, we listen".

He says that it is not unreasonable to assume that parts, if not all, of the American lawful interception system, known as CALEA, have been compromised.

In his doctoral thesis he says that key European lawful interception systems "have been compromised by NSA and/or GCHQ". Appelbaum said that the Snowden archive contained many named target systems, companies, and other countries that had been impacted.

According to Appelbaum, "compromise" means different things: sometimes it is a matter of technical hacking, at other times a matter of "willful complicity from inside the company by order of some executives after being approached by the NSA".

"Woe to those who do not comply immediately," he says.

Some of the most important revelations published from the Snowden archive concerned PRISM, a mass surveillance program which allowed the NSA to access emails, calls, chats, file transfers and web search histories.

The NSA slides claimed that this collection was conducted from the servers of internet giants like Google, Apple, Facebook, Microsoft, AOL, Skype, PalTalk and YouTube, but when the existence of this program was exposed by Glenn Greenwald and Ewen MacAskill in The Guardian and by Laura Poitras and Barton Gellman in the Washington Post, the internet giants denied any knowledge of the program and denied that they had granted direct access to their servers.

Though PRISM was one of the very first revelations from the Snowden archive, Appelbaum reveals that "The PRISM slide deck was not published in full" and "several pages of the PRISM slide list targets and related surveillance data, and a majority of them appear to be a matter of political surveillance rather than defense against terrorism".

He explains that one such example of PRISM's targets being a matter of political surveillance rather than anti-terrorism "shows a suggestion for targeting the Tibetan Government in Exile through their primary domain name".

In 1950 the People's Republic of China took control of Tibet and met with considerable resistance from the Tibetan people. In 1959, the Fourteenth Dalai Lama left Tibet to seek political asylum in India, and there was a major exodus of Tibetans into India. The Dalai Lama set up the Tibetan Government in Exile in India and exiled Tibetans have accused China of cruelty and repression for decades.

Appelbaum reveals that the main domain of the Tibetan Government in Exile (tibet.net) "is named as an unconventional example that analysts should be aware of as also falling under the purview of PRISM". He explains that the email domain was "hosted by Google Mail, a PRISM partner, at the time of the slide deck creation and it is currently hosted by Google Mail as of early 2022". At the time of this writing, it still is.

According to him, tibet.net exemplifies the political reality of accepting aid from the United States. The system administrators wanted to be protected from Chinese hacking and surveillance, so the technical team opted to host email with Google and web hosting with Cloudflare. Google appealed to the technical team behind tibet.net because of the excellent reputation of Google's security team at the time.

"What was unknown at the time of this decision", Appelbaum explains, "was that Google would, willing or unwillingly, give up the data to the US government in secret. Thus in seeking to prevent surveillance by the Chinese government some of the time when the Chinese government successfully hack their servers, they unknowingly accepted aid that ensured their data will be under surveillance all of the time".

As a result, to fight the well-known devil of Chinese surveillance, the Tibetan Government in Exile put itself in the hands of the NSA.

How many important revelations like these do the unpublished documents still contain? It is impossible to say so long as the archive remains unpublished. It is also unclear how many copies of the full archive remain available and who has access to them.

Appelbaum says,"there was a discussion among many of the journalists who worked on the archive about opening access to the Snowden archive for academics to discuss, study, and of course to publish. This is a reasonable idea and it should happen, as it is clearly in the public interest".

He said it was a terrible day when The Guardian allowed GCHQ to destroy the copy of the archive in the United Kingdom. However, according to Ewen MacAskill's reporting in The Atlantic, "A copy of the Snowden documents remains locked in an office at the Times, as far as I know".

According to Jacob Appelbaum, The Intercept - the media outlet co-founded by Glenn Greenwald and Laura Poitras to publish the Snowden files - is no longer in possession of the documents. "I was informed that they destroyed their copy of the archive", Appelbaum tells us.

In 2013, the author of this article worked with Glenn Greenwald on the Snowden files regarding Italy, publishing all the documents that Greenwald shared with us in her newspaper at the time, the Italian newsmagazine l'Espresso.

After that journalistic work, we were contacted again to work on additional files, but unfortunately after some preliminary contacts, we never heard from The Intercept staff again. All of our attempts to work on the files came to nothing, though we never learned what the problem was.

We asked The Intercept whether the publication is still in possession of the Snowden files. A spokesperson replied: "The Intercept does not discuss confidential newsgathering materials".

Appelbaum is highly critical of those who destroyed the Snowden files: "Even if the privacy violating intercepts are excluded from publication, there is an entire parallel history in that archive".

See the original post:
New revelations from the Snowden archive surface - ComputerWeekly.com

Read More..

Firm advocates crowdfunding to meet nation's housing needs – Guardian Nigeria

Photo: The Lagos HOMS, launched by the state government under the scheme

A real estate development firm, Tobykemsworth Investment Limited, has called on Nigerians to embrace crowdfunding initiatives to meet their housing needs.

Managing Director of Tobykemsworth Investment Limited, Dr. Adekunle Raphael-Monehin, made the call in Lagos recently at the first anniversary of Rabanaire, a crowdfunding initiative of the firm aimed at making housing accessible to Nigerians.

According to him, Rabanaire was established because crowdfunding is the alternative source of funding for real estate.

He said: "We are sure that affordability for housing is key right now. If you want to get control of the housing deficit, with the experience we have had over the years, since 2009, we have found out that even the government has not been able to make housing affordable. They say this house is affordable.

"How can it be when it's above N15 million and the premium mortgage bank can only give a loan of N15 million? So, what we are doing is to create a pool where affordable housing can be picked from. This firm has been in the industry; we have had our experience, and we know that if we go in this direction of crowdfunding, it will help the real estate industry, where people can own their homes."

Raphael-Monehin described the firm's experience over the last year as a success story. Stating that Rabanaire has achieved acceptance, he noted that Nigerians, irrespective of class, can invest a minimum of N55,000 to own a home.

Also, the Country Director of Rabanaire, Mrs. Pricilia Irabor, said the aim was to reach all parts of the country as they are currently in Abuja and Lagos.

She said: "Housing deficit is not only in Nigeria. We are presently in Benin Republic, and we are going to be launching in Cameroon and Ghana. We are also going to be looking into how to launch in Ethiopia. So, we found out that housing deficit is a global issue.

"Since we launched Rabanaire on September 3, 2022, we have seen acceptance, and a lot of people are coming on board."

Earlier, the Permanent Secretary of the Lagos State Ministry of Housing, Kamar Olowoshago, applauded Tobykemsworth Investment Limited for the crowdfunding initiative.

According to him, there was a need for private organisations to assist the government in bridging the housing gap estimated at about three million.

Represented by Saheed Omotosho, the Director of the Lagos State Real Estate Regulatory Authority (LASRERA), an agency under the Ministry of Housing, Olowoshago said: "Housing deficit is something that is predominantly mentioned in the present government's THEMES agenda, and government cannot do it alone.

"They need to go into partnership with the real estate practitioners to get people to support them for the provision of housing. Government is doing its best, but they have to do more because they are closer to the people than even the government.

"Rabanaire is international, not only in Lagos. I believe by the time other key players in the industry come, they will be able to complement the effort of Rabanaire."

Also, the Permanent Secretary, Ministry of Commerce, Industry and Cooperatives, Mrs. Adetutu Oluremi Ososanya, represented by the Director of the Agribusiness Support Unit in the ministry, Mrs. Gbemisola Muninat Atitebi Osi-efa, described the firm's crowdfunding initiative as a laudable project that needs to be extended to the average person on the streets of Lagos.

She pledged to partner with the company to ensure that many Lagosians enter the housing net without much hassle.

Original post:
Firm advocates crowdfunding to meet nation's housing needs - Guardian Nigeria

Read More..

I-T unearths unaccounted trade worth Rs 2.5K crore from jewellery, bullion firms – The Indian Express

The Surat Income Tax department has unearthed alleged undisclosed transactions worth Rs 2,500 crore done by two major jewellery manufacturing firms, retail shops associated with them and a bullion company in a 133-hour-long search carried out in Surat and Rajkot.

The firms allegedly underreported their business and were doing three times more business without disclosing it, officials said.

The searches uncovered business links of Surat-based Parth Ornaments Private Limited, among the top jewellers in the state, and Tirth Gold, its sister company, with Akshar Jewellers, Kantilal Brothers Jewellery and Harikala Bullion of Surat city. A team of 150 I-T officials, led by Additional Director of Investigations Vibhor Badoni, carried out the searches at 35 locations in Surat and two in Rajkot - including secret rooms, residences and other premises linked to the companies - beginning at 6.00 am on September 13 and continuing until Monday evening.

An official who was part of the searches said, on condition of anonymity, that Parth and Tirth had hired a Rajkot-based software company to help them store the majority of their sales data on cloud servers in other cities, enabling them to under-report sales in their books of account.

The officials have recovered data on Parth and Tirth from the software company, along with several documents from the five firms, which are being scrutinised. They have also seized 10 bank lockers belonging to these firms.

"We have not seized jewellery as it is part of their business. We suspect that cash might have been hidden in some secret place and we are trying to find it. We have retrieved sales data of five years from Parth and Tirth, and found that they have done business transactions to the tune of Rs 2,500 crore without disclosing it in the books of account. The investigation is still in progress, but the searches have ended," an official said on condition of anonymity.

Parth Ornaments has two jewellery factories in Surat and has over 1,000 employees. The firm, which began around 10 years ago, has shops across the country. Tirth Gold, which is also into gold manufacturing, is owned by a cousin of the promoters of Parth Ornaments.

According to sources, Akshar Jewellers and Kantilal Brothers are in the retail jewellery business, and a majority of their sales transactions are done in cash and have not been shown in the books of account. The officials have also seized several documents that the firms had kept hidden and not disclosed to the I-T department.

"The bridal jewellery range at Kantilal Brothers starts from Rs 10 lakh onwards, apart from its retail jewellery sales. We have also found unaccounted stock of Rs 100 crore from these five firms. The data recovered from the bullion firm is also being analysed. The exact figures will be revealed after that," the official added.


Original post:
I-T unearths unaccounted trade worth Rs 2.5K crore from jewellery, bullion firms - The Indian Express

Read More..