
Five ways to avoid common pitfalls that lead to breaches in the cloud – SC Media

Many cloud breaches are preventable, often the result of poor visibility and an inability to act on discovered risks. It's often common vulnerabilities, not novel attacks or zero-days, that lead to major, headline-grabbing breaches.

These incidents underscore the importance of proactive security practices, comprehensive threat detection, timely patching, and continuous monitoring to mitigate potential risks and safeguard valuable data. To avoid becoming another casualty in the battle against cyber threats, organizations have made data security a top priority.

Here are five common pitfalls that security teams face and how to avoid them:

Misconfigurations are a common cause of data breaches, often starting with free and open access to sensitive information stored in Amazon S3, Azure, or Google Cloud Platform buckets. Take the European Volleyball Confederation breach: a publicly exposed cloud storage bucket allowed unauthorized access to hundreds of passports and identity documents. Had the organization been aware of the security posture of its storage, it would have known where data was exposed.
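As a concrete illustration of the visibility problem, here is a minimal sketch (with made-up sample data) of the kind of check a posture tool runs: inspect an S3-style ACL, whose grant structure mirrors the AWS GetBucketAcl response, and flag grants that expose the bucket to everyone.

```python
# Sketch: flag S3-style ACL grants that expose a bucket to the public.
# The grant format mirrors the AWS GetBucketAcl response; sample data is hypothetical.

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(grants):
    """Return the subset of ACL grants that apply to the public at large."""
    return [
        g for g in grants
        if g.get("Grantee", {}).get("Type") == "Group"
        and g["Grantee"].get("URI") in PUBLIC_GRANTEES
    ]
```

In practice a posture tool would feed this the `Grants` list returned by `get_bucket_acl`, bucket by bucket, alongside bucket policy and public-access-block checks.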

The LastPass incident serves as another stark reminder of the dangers posed by data misconfigurations. On two distinct occasions, lapses in security protocol led to compromised S3 credentials and subsequent customer data theft. Initially, cyber attackers exploited a known vulnerability to access a developer's account, pilfering the LastPass source code among other valuable assets. Later, they leveraged stolen data and decryption keys to infiltrate LastPass's AWS storage.

To safeguard against such vulnerabilities, businesses must stay proactive, rooting their approach in data-centric security. Before potential breaches take place, it's vital to use tooling such as data security posture management (DSPM) to fortify cloud data security. This involves cataloging sensitive data, identifying vulnerabilities, and managing access protocols. During an active breach, data detection and response (DDR) capabilities can detect threats by pinpointing unauthorized or anomalous activity. After containing the breach, organizations must actively evaluate compromised data, identify the exploited vulnerabilities, and take steps to prevent future occurrences.
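The cataloging step a DSPM tool automates can be sketched crudely: scan content for patterns that look like sensitive data. The two regexes below (email addresses and US-SSN-shaped strings) are illustrative stand-ins only, not production-grade classifiers.

```python
# Crude sketch of DSPM-style sensitive-data cataloging: scan text for
# patterns that look like sensitive data. Patterns are illustrative only.
import re

PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text):
    """Return the set of sensitive-data categories detected in `text`."""
    return {name for name, rx in PATTERNS.items() if rx.search(text)}
```

A real DSPM product applies far richer classifiers (ML models, validators, context rules) across every discovered data store, but the catalog-then-flag loop is the same shape.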

Security teams often overlook backups. While an organization may have robust security measures in place for its live systems, security teams don't always give the same level of attention to backups, leaving them vulnerable to unauthorized access. This oversight can prove disastrous, as demonstrated by the Uber breach. The breach stemmed from inadequate protection for backups that contained sensitive data and had no controls to limit visibility once someone accessed them. Despite Uber's otherwise strong security practices, this overlooked vulnerability let attackers steal personal information from millions of passengers and drivers.

It serves as another reminder that organizations must apply the same level of security to backups as to live systems. Implementing encryption, access controls, and regular vulnerability assessments for backup data can help reduce the attack surface, safeguard sensitive information, and prevent unauthorized access. Visibility and classification are crucial, and security teams must also keep their data inventory current so that protective steps are taken before a breach occurs.

Security teams also have to think about data access governance (DAG), which prevents data breaches by diligently controlling and monitoring access to sensitive information, ensuring that only authorized personnel can interact with it. For instance, we can attribute the Uber breach, where the personal details of millions of passengers and drivers were compromised, to insufficient DAG on its GitHub accounts. These accounts housed crucial AWS credentials that attackers leveraged to access Uber's data stores.

DAG operates on the foundational principle of least privilege, ensuring that users are only granted the access required for their specific roles. By adhering to this principle, organizations can diminish the potential for unintentional data breaches and associated insider threats. However, DAG doesn't stop at mere access restrictions. It underscores the importance of continuous auditing and monitoring. Systematic checks identify and rectify anomalies like over-granted permissions and broad allowances such as the wildcard "*". By monitoring user-access patterns, DAG can also spotlight dormant permissions (those untouched for extended periods, such as 90 days), hinting that they are potentially superfluous and candidates for removal. This approach employs a comprehensive threat model, meticulously examining how users engage with data, empowering organizations to detect and deter potential data breaches preemptively.
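A minimal sketch of the dormancy audit described above, assuming a hypothetical inventory that maps each permission to the date it was last exercised: anything unused for more than 90 days, or granted as a bare wildcard, is flagged.

```python
# Sketch: flag dormant and wildcard permissions, per the least-privilege
# auditing described above. The 90-day threshold and data shape are illustrative.
from datetime import date, timedelta

DORMANCY = timedelta(days=90)

def audit(grants, today):
    """grants: {permission: last_used date or None}. Returns (perm, reason) findings."""
    findings = []
    for perm, last_used in grants.items():
        if perm == "*":
            findings.append((perm, "wildcard grant; scope it down"))
        elif last_used is None or today - last_used > DORMANCY:
            findings.append((perm, "dormant; candidate for removal"))
    return findings
```

A real DAG tool would pull last-used data from the provider (e.g., cloud IAM access reports) rather than a hand-built dict, but the review loop is the same.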

Furthermore, managing access in fragmented data environments becomes crucial with the rise of multi-cloud and hybrid architectures. DAG tools often cater to unstructured data, ensuring that even non-database content, such as documents, gets strictly controlled. Organizations bolster a layered defense strategy against potential data breaches through comprehensive DAG implementation.

Organizations must learn not to ignore data security red flags and take proactive steps to protect their valuable data, especially when they are aware of existing issues. Ignoring warnings from external entities like the FBI or internal reports from security engineers can expose organizations to significant risks.

In the January 2023 Twitter breach, internal engineers had warned about the collection of excessive data and poor controls for limiting access. Despite being well aware of its exposures and receiving alerts from the FBI about a potential spy within its ranks, Twitter overlooked these red flags, leading to a preventable breach.

Early warnings are opportunities for improvement, and teams should consider them an urgent call to strengthen security measures. Organizations must foster a culture of proactive risk mitigation and ensure that red flags are thoroughly investigated, addressed, and integrated into their data security strategies.

Assessing an organization's data security posture has emerged as the crux of preventing many data breaches. By comprehensively understanding and assessing data security posture, security teams can more proactively identify vulnerabilities and potential weak points. Many of these headline breaches could have been avoided if the organizations had had a clear view of their data security landscape and taken timely remedial action.

So, whether the cause is misconfiguration, insufficient backup security, or inadequate data access governance, recognizing these vulnerabilities early and addressing them may have prevented many costly breaches. As the digital landscape evolves, organizations must prioritize regular checks of their data security posture and take steps to rectify identified issues.

Dan Benjamin, co-founder and CEO, Dig Security


Storj is Adobe’s Premiere Pro storage cloud – Blocks & Files

Decentralized storage provider Storj has announced a new integration with Adobe Premiere Pro, and says its revenues are growing 226 percent year on year.

Premiere Pro is professional video editing software used by artists and editors worldwide. They need to work collectively on media projects, such as special effects and video sequencing, and share project work with other team members. In other words, media files need to be stored and moved between teams. Storj supplies a worldwide cloud storage facility based on spare datacenter capacity, organized into a single virtual data store with massive file and object sharding across multiple datacenters, providing erasure-coded protection and parallel access for file and object read requests. Adobe has selected Storj as its globally distributed cloud storage for Premiere Pro.
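In miniature, that sharded, parallel-read pattern can be sketched as below. Real Storj objects are erasure-coded across many nodes; here plain chunks in a dict stand in for the network and a thread pool stands in for parallel node reads, so this shows only the parallel-reassembly idea, not the Storj API.

```python
# Miniature sketch of parallel sharded retrieval. Real Storj shards are
# erasure-coded across many nodes; here a dict of plain chunks stands in
# for the network, and a thread pool stands in for parallel node reads.
from concurrent.futures import ThreadPoolExecutor

def shard(data: bytes, size: int) -> dict:
    """Split `data` into numbered chunks of at most `size` bytes."""
    return {i: data[off:off + size]
            for i, off in enumerate(range(0, len(data), size))}

def fetch_parallel(shards: dict) -> bytes:
    """Fetch every shard concurrently and reassemble in index order."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        chunks = pool.map(lambda i: shards[i], sorted(shards))
        return b"".join(chunks)
```

The performance win Professor Portelli describes comes from exactly this shape: many small fetches proceed concurrently, so aggregate throughput grows with the number of nodes holding shards.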

Storj's 226 percent growth has brought it customers and partners such as Toysmith, Acronis, Ad Signal, Valdi, Cribl, Livepeer, Bytenite, Cloudvice, Amove, and Kubecost. The growth has been helped by a University of Edinburgh report showing a 2-4x improvement in Storj's transfer performance over the prior year. It found speeds of up to 800 Mbps when retrieving large quantum physics datasets from Storj's network.

The university's Professor Antonin Portelli stated: "You can really reach very fast transfer rates thanks to parallelism. And this is something which is built in natively in the Storj network, which is really nice, because the data from many nodes are scattered all over the world. So you can really expect a good buildup of performances."

Storj says it reduces cloud costs by up to 90 percent compared to traditional cloud providers, and also cuts carbon emissions by using existing, unused hard drive capacity. However, these hard drives have embodied carbon costs, so-called Scope 3 emissions accumulated during their manufacture, which will be shared between the drive owner and the drive renter (Storj).

It also cites a Forrester Object Storage Landscape report which suggests decentralization of data storage will disrupt centralized object storage providers as computing shifts to the edge. Storj is listed as one of 26 object storage vendors covered by the report, which costs $3,000 to access.

Scope 1, 2, and 3 emissions were first defined in the Greenhouse Gas Protocol of 2001.


Comprehensive Report on Cloud Storage Market Research Report 2023 | Trends, Growth Demand, Opportunities &a… – SeeDance News

The global Cloud Storage market report is an in-depth study of present market dynamics and the factors that affect them. It consists of a detailed study of current market trends along with past statistics, with past years used as a reference to derive predictions for the forecast period. Important factors such as market trends, revenue growth patterns, market shares, and supply and demand are included, as in almost all market research reports. The Cloud Storage market has recorded significant development in the past few years and is expected to grow further.

Get sample copy of Cloud Storage Market report@ https://www.thebrainyinsights.com/enquiry/sample-request/12592

The Cloud Storage market report also analyzes growth rates and the threat of new entrants, which are used to determine the market's growth over the forecast period. Increased demand for the factors influencing that growth is likewise covered in depth. Statistical tools are used to estimate the market's growth over the forecast period, and SWOT and PESTEL analyses are applied to the global Cloud Storage market and to identify its major players.

Top Leading Key Players are: AWS, IBM, Microsoft, Google, Oracle, HPE, Dell EMC, VMware, Rackspace, Dropbox

The Cloud Storage market report sheds light on the industry's characteristics, progress and size, country and geographical breakdowns, market shares, segmentation, strategies, trends, and the competitive background of the global Cloud Storage industry. It also outlines the driving and restraining factors that are aiding and hampering the development of the Cloud Storage market. In addition, the study delivers historical and estimated market size on the basis of geographical analysis, with comprehensive information about key established regions as well as major developing markets.

Read complete report with TOC at: https://www.thebrainyinsights.com/report/cloud-storage-market-12592

The report renders an in-depth assessment of crucial Cloud Storage market segments, including types, applications, and regions, along with the competitive scenario, industry environment, market projections, growth-constraining factors, limitations, entry barriers, provincial regulatory frameworks, and upcoming investment and business opportunities and challenges. It thoroughly investigates driving forces, changing market dynamics, upcoming business opportunities and challenges, market threats, risks, and constraining factors in the global Cloud Storage market considered most influential for future market development.

Global Cloud Storage market is segmented based by type, application and region.

Based on Type, the market has been segmented into:

Based on application, the market has been segmented into:

This report is intended to equip readers with a decisive understanding of the factors that propel growth in the global Cloud Storage market, drawing on thorough research into industry forerunners and their business decisions, aligned with market-specific factors such as threats, challenges, and growth-shaping opportunities. In addition, the report provides inclusive SWOT and PEST analyses for all the major regions: North America, Europe, Asia Pacific, and the Middle East and Africa. It covers the regional expansion of the industry with product analysis, market share, and brand specifications.

Besides this, the report analyzes factors affecting the Cloud Storage market from both the demand and supply sides, and evaluates market dynamics over the forecast period: drivers, restraints, opportunities, and future trends. It also provides an exhaustive PEST analysis for all five regions considered in the global Cloud Storage market report. The study provides key statistics on industry status through tables and figures to help analyze the market, offering useful guidance and direction for companies and individuals interested in it.

Furthermore, readers will get a clear perspective on the most significant driving and restraining forces in the Cloud Storage market and their impact on the global market. The report predicts the market's future outlook, helping readers decide which market segments to focus on in the upcoming years. In conclusion, it provides a quick outlook on deals, partnerships, and product launches of all key players for 2023 to 2032, and sheds light on the competitive landscape by elaborating on recent mergers and acquisitions (M&A), venture funding, and product developments in the Cloud Storage market.

Do You Have Any Query Or Specific Requirement? Ask to Our Industry Expert @ https://www.thebrainyinsights.com/enquiry/request-customization/12592

About The Brainy Insights:

The Brainy Insights is a market research company aimed at providing actionable insights through data analytics to help companies improve their business acumen. We have a robust forecasting and estimation model to meet clients' objectives of high-quality output within a short span of time. We provide both customized (client-specific) and syndicated reports. Our repository of syndicated reports is diverse across all categories and subcategories across domains. Our customized solutions are tailored to meet clients' requirements, whether they are looking to expand or planning to launch a new product in the global market.

Media Contact

Avinash D
Organization: The Brainy Insights
Phone: +1-315-215-1633
Email: sales@thebrainyinsights.com
Web: http://www.thebrainyinsights.com


Carbon Copy Cloner Backs Up Cloud-Only Content – TidBITS

When I wrote Apple's File Provider Forces Mac Cloud Storage Changes (10 March 2023), I closed the article by noting that backing up data in cloud storage services was potentially fraught because online-only files wouldn't be backed up. I recommended downloading your entire cloud storage data store, at least temporarily, so it would all be included in your backup. New files you subsequently created would be local and thus backed up, but there are likely situations where files created or modified by collaborators would not be properly reflected in your backup.

That is, unless you're using the most recent version of Carbon Copy Cloner, which can temporarily download cloud-only content, back it up locally, and evict it again to avoid consuming too much local space. Agen Schmitz mentioned this feature in his Watchlist item (see Carbon Copy Cloner 6.1.7, 11 September 2023), but I wanted to call it out more explicitly because it's unusual and clever. Bombich Software explains:

When a file stored by one of these storage services is flagged to reside only online, the local copy of the file is deleted from your Mac and replaced with a 0-byte placeholder file. While this is a convenient feature that allows you to free up some space on your Mac, this feature imposes a logistical challenge to creating a local backup of those files. If you want to have a local backup of these cloud-only files, CCC must temporarily download these files to your startup disk. CCC can do this, but because this involves downloading a potentially large amount of data from the Internet, this functionality is disabled by default. Likewise, allowing this data to co-mingle with your startup disk's backup could lead to a situation where it is impossible to restore your entire backup to the original disk due to space constraints. To avoid that, we recommend making backups of your cloud-only storage to a separate volume on your backup disk.
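The 0-byte placeholder behavior described in that excerpt is easy to observe yourself. This small sketch walks a folder and lists zero-length files as likely evicted placeholders; it is only a heuristic (genuinely empty files match too, and it skips the dataless-file attributes a real backup tool would check).

```python
# Heuristic sketch: list 0-byte files, which cloud storage services leave
# behind as placeholders for evicted (online-only) content. A real tool
# would also check filesystem dataless/cloud flags, which this skips.
import os

def likely_placeholders(root):
    """Return sorted paths of zero-length files under `root`."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getsize(path) == 0:
                hits.append(path)
    return sorted(hits)
```

Running something like this over a synced folder gives a quick sense of how much content exists only in the cloud and would be missing from a naive local backup.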

The page goes on to provide complete instructions and explain why some iCloud-only content still won't be temporarily downloaded (because Apple isn't yet using its own File Provider technology for iCloud, ironically), among other interesting technical details. It's worth a read if you're interested in quirky backup topics or the integration of local and cloud storage.

And, of course, if you're highly concerned about maintaining local backups of cloud data, add Carbon Copy Cloner to your backup strategy.


Cloudian does a fast object storage speedrun with AMD CPUs and … – Blocks & Files

Object storage supplier Cloudian has managed to wring 17.7GBps writes and 25GBps reads from a six-node all-flash cluster in a recent benchmark.

Cloudian said these are real-world results, generated with an industry-standard benchmark, GOSBENCH, that simulates real-life workloads; it is not an in-house benchmark tool. The servers used were single-processor nodes, each with a single, non-bonded 100Gbps Ethernet network card and four Micron 6500 ION NVMe drives.

The company supplies HyperStore object storage software, and this speed run was done with servers using AMD's EPYC 9454 CPUs and the upcoming v8 HyperStore software.

Cloudian CEO Michael Tso said in a statement: "Our customers need storage solutions that deliver extreme throughput and efficiency as they deploy Cloudian's cloud-native object storage software in mission-critical, performance-sensitive use cases. This collaboration with AMD and Micron demonstrates that we can push the boundaries."

AMD corporate VP for strategic business development Kumaran Siva backed him up: "Our 4th Gen AMD EPYC processors are designed to power the most demanding workloads, and this collaboration showcases their capabilities in the context of object storage."

CMO Jon Toor told us: "Most of our customers today are talking with us about all-flash for object storage, if they're not already there. Increased performance is a driver, especially as we move into more primary storage use cases. Efficiency is a driver also. With these results we showed a 74 percent power efficiency improvement vs an HDD-based platform, as measured by power consumed per GB transferred."

HyperStore 8.0 incorporates multi-threading technology and kernel optimizations to capitalize on the EPYC 9454 processor, with its 48 cores and 128 PCIe lanes. This combination was then optimized for Micron's 6500 ION 232-layer TLC SSDs, which deliver 1 million 4KB random write IOPS.

Object storage tends to be linearly scalable as nodes are added to a cluster, so great speeds are possible. Cloudian's per-node performance was 2.95GBps write and 4.15GBps read.

In October 2019, OpenIO achieved 1.372Tbps throughput (171.5GBps) using an object storage grid running on 350 commodity servers. That's 0.49GBps per server.

A month later, MinIO went past 1.4Tbps for reads using 32 AWS i3en.24xlarge instances, each with 8 NVMe drives, for a total of 256 NVMe drives. That means 175GBps overall and 5.5GBps per AWS instance, outperforming Cloudian. We don't know the NVMe drive performance numbers, but MinIO used two more of them per instance than Cloudian used per node. Object storage performance benchmarks are bedevilled with apples-and-oranges comparison difficulties.
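The per-node figures in these comparisons are simple arithmetic on the published totals, reproduced below: bit-rate totals are divided by 8 to get bytes per second, then by node count.

```python
# Reproduce the per-node throughput arithmetic from the published totals.

def tbit_to_gbyte(tbit_s: float) -> float:
    """Convert Tbit/s to GByte/s (x1000 to Gbit/s, /8 to GByte/s)."""
    return tbit_s * 1000 / 8

def per_node(total_gbyte_s: float, nodes: int) -> float:
    """Per-node GByte/s, rounded to two decimals."""
    return round(total_gbyte_s / nodes, 2)

cloudian_write = per_node(17.7, 6)                 # 2.95 GBps per node
openio_read = per_node(tbit_to_gbyte(1.372), 350)  # 0.49 GBps per server
minio_read = per_node(tbit_to_gbyte(1.4), 32)      # ~5.47 GBps per instance
```

Note the MinIO figure rounds to roughly 5.47GBps per instance, which the comparison above reports as 5.5GBps.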

Check out a Cloudian speed run Solution Brief here.


AvePoint Launches AvePoint Opus, AI Powered Information … – GlobeNewswire

JERSEY CITY, N.J., Oct. 10, 2023 (GLOBE NEWSWIRE) -- AvePoint (Nasdaq: AVPT), the most advanced platform to optimize SaaS operations and secure collaboration, today announced the launch of AvePoint Opus, its AI-powered information lifecycle management solution, at the #shifthappens Conference 2023. As part of the AvePoint Confidence Platform's Resilience Suite, AvePoint Opus is a comprehensive solution that enables organizations to discover, classify, protect and manage their data across Microsoft 365 accurately and at scale.

AvePoint Opus is a robust information lifecycle management solution that ensures organizations can manage all stages of the data lifecycle, and represents the next generation of AvePoint's Resilience solutions. A key component of AvePoint Opus is AvePoint Maestro, which uses AI models powered by Azure Machine Learning to analyze content and metadata and assign appropriate policies to documents.

With AvePoint Opus, organizations can now achieve the following:

"Organizations today are excited about the power of AI and machine learning to transform business, but to truly unlock this technology, they need a comprehensive data strategy that will accurately analyze, govern and classify their data," said Dr. Tianyi Jiang (TJ), co-founder and CEO, AvePoint. "AvePoint Opus provides a solution that is automated and capable of learning over time, allowing organizations to manage the troves of data they produce today to build that data foundation, maintain compliance and reduce storage costs."

AvePoint Opus uses an AI model that rapidly identifies and classifies content in weeks, as opposed to years if done manually. The Australian Transport Safety Bureau, for example, has already benefitted from this efficiency.

"AvePoint has always had a modern information management solution, which allows the Australian Transport Safety Bureau team to seamlessly integrate recordkeeping and compliance as part of their responsibilities," said Angelo Santosuosso, IT, Property & Security at the Australian Transport Safety Bureau. "New capabilities within AvePoint Opus will enhance our organization and we look forward to continuing to work together."

In addition, by integrating more AI capabilities, AvePoint's channel partner ecosystem will be equipped with smarter, more automated tools to manage and protect their clients' data and collaboration environments.

"With the rapid growth of cloud data, our customers are experiencing a host of information management challenges," said Jacqueline Stockwell, CEO and founder, Leadership through Data Limited. "AvePoint Opus and its industry-leading AI-powered data classification capabilities help us empower our customers to better manage their information, minimize cloud storage costs, improve efficiencies and truly thrive in the digital workplace."

AvePoint has a track record of innovation spanning more than 20 years, aimed at providing solutions that democratize insights and provide recommendations and comprehensive protection for customers and partners. Built upon a robust data management strategy, AvePoint Opus is one of many AI-powered solutions the company plans to introduce.

For more information on AvePoint Opus, visit https://www.avepoint.com/products/cloud/avepoint-opus

About AvePoint

Collaborate with Confidence. AvePoint provides the most advanced platform to optimize SaaS operations and secure collaboration. Over 17,000 customers worldwide rely on our solutions to modernize the digital workplace across Microsoft, Google, Salesforce and other collaboration environments. AvePoint's global channel partner program includes over 3,500 managed service providers, value added resellers and systems integrators, with our solutions available in more than 100 cloud marketplaces. To learn more, visit http://www.avepoint.com.

Disclosure Information

AvePoint uses the https://ir.avepoint.com/ website as a means of disclosing material non-public information and for complying with its disclosure obligations under Regulation FD.

Forward-Looking Statements

This press release contains certain forward-looking statements within the meaning of the safe harbor provisions of the United States Private Securities Litigation Reform Act of 1995 and other federal securities laws, including statements regarding the future performance of and market opportunities for AvePoint. These forward-looking statements generally are identified by the words believe, project, expect, anticipate, estimate, intend, strategy, future, opportunity, plan, may, should, will, would, will be, will continue, will likely result, and similar expressions. Forward-looking statements are predictions, projections and other statements about future events that are based on current expectations and assumptions and, as a result, are subject to risks and uncertainties. Many factors could cause actual future events to differ materially from the forward-looking statements in this press release, including but not limited to: changes in the competitive and regulated industries in which AvePoint operates, variations in operating performance across competitors, changes in laws and regulations affecting AvePoint's business, changes in AvePoint's ability to implement business plans and forecasts and to identify and realize additional opportunities, and the risk of downturns in the market and the technology industry. You should carefully consider the foregoing factors and the other risks and uncertainties described in the Risk Factors section of AvePoint's most recent Quarterly Report on Form 10-Q and its registration statement on Form S-1 and related prospectus and prospectus supplements filed with the SEC. Copies of these and other documents filed by AvePoint from time to time are available on the SEC's website, http://www.sec.gov. These filings identify and address other important risks and uncertainties that could cause actual events and results to differ materially from those contained in the forward-looking statements.
Forward-looking statements speak only as of the date they are made. Readers are cautioned not to put undue reliance on forward-looking statements, and AvePoint does not assume any obligation and does not intend to update or revise these forward-looking statements after the date of this release, whether as a result of new information, future events, or otherwise, except as required by law. AvePoint does not give any assurance that it will achieve its expectations.

Investor Contact
AvePoint
Jamie Arestia
ir@avepoint.com
(551) 220-5654

Media Contact
AvePoint
Nicole Caci
pr@avepoint.com
(201) 201-8143


Deta's Space OS aims to build the first personal cloud computer – The Verge

Here's how your computer should work, according to Mustafa Abdelhai, the co-founder and CEO of a startup called Deta. Instead of a big empty screen full of icons, your desktop should be an infinite canvas on which you can take notes or watch movies or run full apps just by drawing a rectangle on the screen. Instead of logging in to a bunch of cloud services over which you ultimately have no control, you should be able to download software like PC users did 20 years ago, and the stuff you download should be completely yours. All your apps should talk to each other, so you can move data between them or even use multiple apps' features simultaneously. You should be able to use AI to accomplish almost anything.

And it should all happen in a browser tab.

For the last couple of years, the Berlin-based Deta has been building what it calls the personal cloud computer. The product Deta is launching today is called Space OS, and the way Abdelhai explains it, it's the first step in putting the personal back in the personal computer. Personal computing took a dive at the turn of the century, he says, when cloud computing became the big thing: "We all moved to the cloud, moved our data, and we don't own it anymore. It's just somebody else's computer." Deta wants to give it back.

The term personal cloud computer sounds like an oxymoron, and to a large extent, it is: a computer you control, that no one else can see or access or shut down, that runs on someone else's servers behind a username and password. After all, Deta stores your stuff via AWS just like everybody else. In a certain light, Deta is building the exact kind of thing it's trying to break away from; the company's just hoping it can build a big enough platform, with user-friendly enough rules, to make it worthwhile. But also, you know, it is what it is. "It is in the cloud," he says. "So it is managed by somebody."


The idea of a streaming computer, where all your data lives online and you can interact with your PC from anything with a screen and a web connection, is not new. That's more or less Google's long-term vision for Chromebooks, for one thing. Companies like Shadow tried streaming entire Windows computers, only to find that it's a wildly complicated and expensive thing to get right. (Streaming lag when you're moving your mouse? Bad times.) In 2021, Microsoft announced the creation of the Cloud PC and predicted it might change the way we work forever. This year, we found out the company is still very committed to that idea.

That's all good, Abdelhai says, but he thinks that Windows on the internet is too small a vision. "We wanted to bring personal computers to the cloud," Abdelhai says. "That doesn't mean we reinvented Windows; it means we really needed a new way of thinking." Deta wants to take this shift as an opportunity to rethink apps, to change the way we approach privacy and data storage, and to turn our devices from a series of siloed apps into something more fluid and interactive.

The easiest way to describe the way Space OS works is probably just to tell you how to use it. When you first create an account, you're dropped into what's called Horizon, which is an infinite canvas with a dot grid. You can add all kinds of things to that grid just by drawing rectangles with your mouse. Add a text box and take notes; add icons to launch the apps you care about; embed YouTube videos or link to websites; prompt Teletype, Space OS's built-in AI chatbot, to make you an app. Setting up Horizon is similar to setting up your phone's homescreen, only Space OS is much more flexible. And the whole thing is made of interactive widgets.

Space's Discovery section is like an app store for web apps. Image: Deta / David Pierce

The first place you go after that is Discovery, which is Deta's app store. Here, you can download a bunch of the first Space OS apps: WebCrate, a simple bookmarking service; Minima, a note-taking tool; Filebox, a file-storage system; Black Hole, a photo host; Temper, a simple website builder. You click "Install App," and Space OS adds that app to your personal cloud. But it doesn't create an account on someone else's server; it downloads the code and installs it in your Deta cloud so it can run even if the app eventually disappears.

Right now, you can access Space OS just by navigating to your cloud's URL. (Every user gets an ugly-looking alphanumeric Deta URL, but there are plans to fix that.) Abdelhai says Deta is planning to make heavy use of progressive web apps in order to run apps locally and even offline, too; that's the company's future more than building a browser or a native app or even its own hardware. "We're working on making it feel like your personal computer," he says. "In the next five years, we want to move to this computer. And then just use Chromebooks or something."

Your Deta cloud is essentially two things: an encrypted place to store your data and a series of virtual machines that spin up to run your various apps. "Every app completely runs on your personal cloud," Abdelhai says. "Right now, if you use somebody else's software, they will have access to your data. We ensure that these apps are running on your personal cloud." You get the basics for free, Abdelhai says, but you'll have to pay for more capabilities and storage.

The Space Canvas is like your app library, and the Teletype AI bot lives at the bottom. Image: Deta / David Pierce

When I asked Abdelhai why people should trust Deta instead of their other cloud services, he cited two things: the business model and the tools. "You can delete your app, you can export your data, you can delete your space as well." All your data is encrypted, you can download all the source code of your apps, and Deta doesn't sell ads or make money from your data. "We have a lot of incentives that protect you and deliver on our promise." It's a very modern conundrum, really: using the internet requires trusting somebody, so all you can do is pick the one whose incentives match your own.

Right now, everything in Space OS is extremely primitive. There are no ultra-compelling new apps, no immediate reason to throw away your computer and start living the Space life. (Though I will say, the canvas-as-homescreen idea is a terrific one.) Much of the interface looks like it was designed as a proof of concept, not a consumer product. Abdelhai agrees with all that. Deta is still early, he says, and there's a tremendous amount of work to do. But he's confident that the overarching idea is the right one: a personal, interoperable, AI-powered operating system is what the world needs next.

The most immediate thing for Deta to get right is the developer platform. The company is trying to make it ludicrously easy to build and sell apps, which is part of how it hopes to convince developers to jump to a new platform. Developers can integrate with the Teletype AI, build apps that just work across the web, and use Deta's App Actions system to interoperate with everything else in Space OS. Abdelhai says 70,000 developers are already building on Deta and hopes to have many more soon.


The first thing Abdelhai says he thinks Deta will be great for is productivity tools. Imagine, he says, if your Google Docs, Notion, Figma, and Slack accounts could all share data and talk to one another; that's the Space OS dream. (Deta has already gotten a few well-known productivity tools running on its platform, too.) "Software is stiff," Abdelhai says. "We like breaking down apps into smaller chunks so they can work together." Right now, every app we use is a universe mostly unto itself; Deta wants to turn software into tools in a toolset. Once that works, he thinks it could be a gaming platform, a creative tool, and much more.

Along the way, the company has to sort out the rest of the open questions about Space OS. What's the killer app? How do you convince users to run their whole lives from your servers when part of what people are trying to do is get away from that dynamic? What happens to users' clouds if Deta goes down or goes out of business altogether?

Deta's idea is both a very new one and a very old one. It harkens back to the early days of computers, when you bought software in a box at a store and installed it on your computer. The cloud era, of course, made computing vastly easier and more powerful, but it also systematically ate away at the idea that you could control anything on your devices. It's an interesting thought experiment, actually: if every cloud service shut down tomorrow, what would be left on your phone or your laptop? Odds are, not much. Deta's trying to undo that a bit, to embrace the cloud and the expansive universe of apps while giving you back the feeling that your computer and everything on it is yours and no one else's. Because your computer should be yours, even if it's on somebody's server.

Read more here:
Deta's Space OS aims to build the first personal cloud computer - The Verge


Pliops Collaborates with Innovators at Cloud Expo Asia to Address … – GlobeNewswire

SINGAPORE, Oct. 10, 2023 (GLOBE NEWSWIRE) -- Pliops, a leading provider of data processors for cloud and enterprise data centers, will be on hand this week at Cloud Expo Asia to deliver insights surrounding data infrastructure optimization, modern workload acceleration and flash investments. At the Expo, Pliops will deliver a keynote presentation with enterprise SSD provider DapuStor, in which the power of the Pliops XDP Data Services platform will be highlighted with DapuStors high-capacity SSDs.

Pliops empowers data centers to thrive in the AI era by addressing vital challenges in sustainability, performance and scalability. With cutting-edge technology, Pliops illuminates the path for businesses to conquer these obstacles. Through pioneering solutions and a focus on sustainable practices, Pliops equips data centers with the necessary tools for success in today's dynamic landscape.

The Pliops XDP Data Services platform takes a seamless-to-deploy, transformational approach to optimizing data infrastructure and accelerating modern workloads, while in tandem reducing TCO by 50%. Running on the Pliops Extreme Data Processor (XDP), the portfolio of XDP Data Services includes XDP-RAIDplus, XDP-AccelDB and XDP-AccelKV. These Data Services are designed to maximize data center infrastructure investments by exponentially increasing application performance, data reliability, storage capacity, and overall stack efficiency.

Pliops is collaborating with digital solutions leader H3C, which recently conducted extensive performance, availability and recoverability tests using Pliops XDP-RAIDplus and its Uniserver R4900 G5 server. H3C's Product Manager Qiang Yu had this to say: "By utilizing the H3C Uniserver in combination with Pliops XDP-RAIDplus, enterprises can attain elevated performance levels and ensure high reliability. This enables organizations to effectively meet customer service level agreement requirements, ensuring a satisfactory level of service delivery."

At Cloud Expo Asia, a technical presentation titled "Overcome Blast Radius Anxiety: Ultrafast Rebuilds and Performance for Hi-Cap SSDs" will be jointly delivered by Pliops and DapuStor on October 11 from 4:10 to 4:30 p.m. The session will look at blast radius concerns encountered by organizations looking to keep up with surging data storage requirements.

"Traditional RAID solutions reduce both the performance and endurance of SSDs, and Pliops has stepped in to solve this issue by enabling performance acceleration and scalability for databases," said DapuStor's VP of International Business Alfred Chase Hui. "Pliops XDP-RAIDplus brings higher endurance, usable life and unlocked capacity to our enterprise SSDs."

"Cloud Expo Asia presents an ideal venue to highlight some of the critical issues we solve in the data center, and to showcase the more efficient and reliable infrastructures our storage solutions enable," said Ido Bukspan, Pliops CEO. "This is a great opportunity to continue to expand our business and partnerships in Asia, and we are delighted to be here this week."


About Pliops: Pliops is a technology innovator focused on making data centers run faster and more efficiently. The company's Extreme Data Processor (XDP) radically simplifies the way data is processed and flash is managed. Pliops overcomes storage inefficiencies to massively accelerate performance and dramatically reduce overall infrastructure costs for data-hungry applications. Founded in 2017, Pliops is a winner of the Flash Storage Solution of the Year Award in the Data Breakthrough Awards program and has several times been named one of the 10 hottest semiconductor startups. The company has raised over $200 million to date from leading investors including Koch Disruptive Technologies, State of Mind Ventures Momentum, Intel Capital, Viola Ventures, SoftBank Ventures Asia, Expon Capital, NVIDIA, AMD, Western Digital, SK hynix and Alicorn. For more information, visit http://www.pliops.com.

Media Contact: Stephanie Olsen, Lages & Associates, (949) 453-8080, stephanie@lages.com

A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/6a55b1c1-2d25-4a11-8dff-f678f5b54a15

Continued here:
Pliops Collaborates with Innovators at Cloud Expo Asia to Address ... - GlobeNewswire


New Report: Child Sexual Abuse Content and Online Risks to … – The Hacker News

Oct 10, 2023 | The Hacker News | Cybersecurity / Online Security

Certain online risks to children are on the rise, according to a recent report from Thorn, a technology nonprofit whose mission is to build technology to defend children from sexual abuse. Research shared in the Emerging Online Trends in Child Sexual Abuse 2023 report indicates that minors are increasingly taking and sharing sexual images of themselves. This activity may occur consensually or coercively, as youth also report an increase in risky online interactions with adults.

"In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives," said John Starr, VP of Strategic Impact at Thorn. "Harmful interactions between youth and adults are not isolated to the dark corners of the web. As fast as the digital community builds innovative platforms, predators are co-opting these spaces to exploit children and share this egregious content."

These trends and others shared in the Emerging Online Trends report align with what other child safety organizations are reporting. The National Center for Missing and Exploited Children's (NCMEC) CyberTipline has seen a 329% increase in child sexual abuse material (CSAM) files reported in the last five years. In 2022 alone, NCMEC received more than 88.3 million CSAM files.

Several factors may be contributing to the increase in reports.

This content is a potential risk for every platform that hosts user-generated content, whether a profile picture or expansive cloud storage space.

Hashing and matching is one of the most important technologies companies can use to keep users and platforms protected from the risks of hosting this content, while also helping to disrupt the viral spread of CSAM and the cycles of revictimization.

Millions of CSAM files are shared online every year. A large portion of these files are of previously reported and verified CSAM. Because the content is known and has been previously added to an NGO hash list, it can be detected using hashing and matching.

Put simply, hashing and matching is a programmatic way to detect CSAM and disrupt its spread online. Two types of hashing are commonly used: perceptual and cryptographic hashing. Both technologies convert a file into a unique string of numbers called a hash value. It's like a digital fingerprint for each piece of content.

To detect CSAM, content is hashed, and the resulting hash values are compared against hash lists of known CSAM. This methodology allows tech companies to identify, block, or remove this illicit content from their platforms.
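The exact-match half of that workflow can be sketched in a few lines. This is a minimal illustration only, not any vendor's implementation: the function names and the hash list here are hypothetical, and real deployments match against NGO-maintained databases with tens of millions of entries.

```python
import hashlib

# Hypothetical hash list of previously reported and verified content.
# In practice this is an NGO-maintained database of known hash values.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def hash_file_bytes(data: bytes) -> str:
    """Cryptographic hash: identical bytes always yield the identical digest."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Compare an upload's hash value against the known-content hash list."""
    return hash_file_bytes(data) in KNOWN_HASHES
```

The limitation is also visible here: change a single byte and a cryptographic digest changes completely, which is why platforms pair this with perceptual hashing, whose hash values stay close for visually similar images, to catch re-encoded or resized copies.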

Hashing and matching is the foundation of CSAM detection. Because it relies upon matching against hash lists of previously reported and verified content, the number of known CSAM hash values in the database that a company matches against is critical.

Safer, a tool for proactive CSAM detection built by Thorn, offers access to a large database aggregating 29+ million known CSAM hash values. Safer also enables technology companies to share hash lists with each other (either named or anonymously), further expanding the corpus of known CSAM, which helps to disrupt its viral spread.

To eliminate CSAM from the internet, tech companies and NGOs each have a role to play. "Content-hosting platforms are key partners, and Thorn is committed to empowering the tech industry with tools and resources to combat child sexual abuse at scale," Starr added. "This is about safeguarding our children. It's also about helping tech platforms protect their users and themselves from the risks of hosting this content. With the right tools, the internet can be safer."

In 2022, Safer hashed more than 42.1 billion images and videos for their customers. That resulted in 520,000 files of known CSAM being found on their platforms. To date, Safer has helped its customers identify more than two million pieces of CSAM on their platforms.

The more platforms that utilize CSAM detection tools, the better chance there is that the alarming rise of child sexual abuse material online can be reversed.

See the rest here:
New Report: Child Sexual Abuse Content and Online Risks to ... - The Hacker News


The Google Pixel 8 Pro didn’t get this upgrade and it’s a problem – Tom’s Guide

I honestly had low expectations for the Pixel 8 Pro, so much so that I largely didn't pay much attention to all the leaks in the build-up to the phone's release this week. However, my mind quickly changed once the Pixel 8 Pro was officially announced at the Made by Google event and I got to check it out in person during my hands-on time.

It's an impressive phone, but there was one odd thing that flew under the radar and got under my skin. The Pixel 8 Pro features a godawful 128GB of storage in its base model.

For the last few years, 128GB has been the industry standard for most high-end smartphones. But as we've seen this year, that's all changed. Google spent a lot of time talking about all of the Pixel 8 Pro's cool new features, like the enhanced Call Screen capabilities that actually make Google Assistant sound and act more like a real assistant. Then there are also the new Pro Controls and Magic Audio Eraser with the cameras, but while they all add greater depth to the phone's utility, I'm still shocked that Google neglected to up the storage of the device.

Making 128GB the starting storage option is unacceptable for a high-end flagship phone in 2023. Here's why.

As I mentioned, the Pixel 8 Pro's two biggest rivals both start off at 256GB of storage. What's even more astounding is that Apple decided to give the iPhone 15 Pro Max the upgrade treatment, going from the previous 128GB starting storage of the iPhone 14 Pro Max to 256GB. I didn't expect that to happen because Apple usually waits longer than other phone makers to increase capacity on its phones.

Then again, the Samsung Galaxy S23 Ultra arrived earlier in the year, and Apple probably didn't want its main rival to take the storage spotlight. That's why it really doesn't benefit the Pixel 8 Pro to start off with 128GB. I understand that it makes sense for the $699 Pixel 8 to go that route, but the extra $300 Google charges for the Pixel 8 Pro demands greater capacity.

Google talked big about the Pixel 8 Pro's cameras. All of the new Google AI experiences powered by the Tensor G3 are motivating people to use their phone's cameras more than ever before. The Pixel 8 Pro's main 50MP camera is a strong contender for making it one of the best camera phones this year, especially when it can stitch together the perfect family photo with Google's new Best Take feature.

Pixel 8 Pro owners would be inclined to shoot a lot more photos due to these new features, along with editing them to perfection, which leads me to my point that 128GB is yet again insufficient storage. I understand that Google Photos should help offset some of the load by backing up media to the cloud. However, that also comes with an added cost that could add up quickly.

Take, for example, one of the Pixel 8 Pro's exclusive features, Video Boost, which will come to the phone later this year. Video Boost lets you upload RAW videos to the cloud, where you can enhance them with the processing muscle of Google's data centers. The ideal result is an even better-looking video for you to download later.

These videos have the potential to add up and take up storage space, especially when they're captured at 4K 60fps. I'm a videographer and know how much these files can crush your storage.

My frustration about a 128GB Pixel 8 Pro could've easily been remedied with an expandable storage option. But you know what? It's been years since a microSD slot was included in a Google-made smartphone, so there's no practical method of expanding its paltry starting storage unless you're willing to pay more.

Pixel 8 Pro models with 256GB, 512GB, and 1TB capacities cost $1,059, $1,179, and $1,399, respectively. At least these options actually give you better storage bang for the buck than the 1TB iPhone 15 Pro Max, which sets you back $1,599.

These are great alternatives to solving my unquenchable storage thirst, but it's still mind-boggling that Google thinks it's appropriate to start out at 128GB. I've been using the Galaxy Z Fold 4 for more than a year now, and when I take a quick peek at its storage tally, I've already used up 81% of the 256GB of internal storage, so there's no way I could properly transfer everything to the Pixel 8 Pro.


The rest is here:
The Google Pixel 8 Pro didn't get this upgrade and it's a problem - Tom's Guide
