
Mehran Sahami on AI and safeguarding society – Stanford Report – Stanford University News

Image credit: Claire Scully

As engineers and computer scientists make rapid advances in machine learning and artificial intelligence, they are being compared to the physicists of the mid-20th century. It's a parallel Stanford computer scientist Mehran Sahami makes explicit in his introduction to students taking his CS 182: Ethics, Public Policy, and Technological Change course, when he shows them a photo of the billowing mushroom cloud from the nuclear bomb dropped on Nagasaki, Japan, in 1945.

"In the 20th century, unleashing the power of the atom was a physical power, but today we have an informational power, and it's just as powerful, if not more so, because information is what impacts people's decision-making processes," said Sahami, the Tencent Chair of the Computer Science Department and the James and Ellenor Chesebrough Professor in the School of Engineering. "There's a tremendous amount of responsibility there."

For Sahami, it is crucial in 2024 that society, business leaders, and policymakers safeguard the future from the unintended consequences of AI.

When OpenAI launched ChatGPT to the public on Nov. 30, 2022, it prompted sensationalism and controversy. Anyone can now ask the large language model to perform any number of text-based tasks, and in seconds a personalized response is given.

Sahami described ChatGPT as an awakening. "This was one of the first big applications where AI was put in people's hands, and they were given an opportunity to see what it can do," he said. "People were blown away by what the technology was capable of."

Sahami thinks that one of the exciting areas where generative AI could be applied is in personalized services like tutoring, coaching, and even therapy, an industry that is thinly stretched.

But AI is expensive to build and services like these can come with hefty fees, Sahami pointed out.

Of concern is whether these services will be accessible to vulnerable and hard-to-reach populations, the groups that stand to benefit from them the most.

"One of the places I really worry a lot about is who is getting the gains of AI," Sahami said. "Are those gains being concentrated in people who were already advantaged before, or can it actually level the playing field? To level the playing field requires conscious choices to allocate resources to allow that to happen. By no means will it just happen naturally by itself."

In the coming year, Sahami also expects to see AI impact the workforce, whether through labor displacement or augmentation.

Sahami points out that the labor market shift will be the result of choices made by people, not technology. "AI by itself is not going to cause anything," he said. "People make the decisions as to what's going to happen."

"As AI evolves, what sorts of things do we put in place so we don't get big shocks to the system?"

Some measures could include retraining programs or educational opportunities to show people how to use these tools in their lives and careers.

"I think one of the things that will be front and center this coming year is how we think about guardrails on this technology, in lots of different dimensions," Sahami said.

In 2023, President Biden issued the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence that urged the government, private sector, academia, and civil society to consider some of those safeguards.

"The White House's executive order has shined a spotlight on the fact that the government wants to act and should act," Sahami said.

While Sahami is heartened by the order, he also has concerns.

"The real question is what will happen with that in the coming year and how much will be followed up on by agencies," he said.

One worry Sahami has is whether people in both government and private sectors have the right skill set to ensure the order is being carried out effectively.

"Some of these issues have a lot of subtleties, and you want to make sure the right expertise is in the room," Sahami said. "You need people with deep technical expertise to make sure that policy is actually well guided," he added, pointing out there is a risk that one can come up with a policy that seems well intentioned, but whose details don't mesh with how the technology works.

Over the past few months, OpenAI made headlines again; this time, reports focused on the company's co-founder and Sahami's former student, Sam Altman. Over a chaotic few days, Altman was ousted from the company, then quickly brought back in, along with a restructured board.

"What happened to OpenAI has generated a spotlight on thinking about the fragility of some of the governance structures," Sahami said.

Debated across the media was OpenAI's unique business model. OpenAI started as a mission-driven nonprofit but later established a for-profit subsidiary to expand its work when it felt the public sector could no longer support its goals. It was reported that disagreements emerged between Altman and the board about the company's direction.

"I don't think this is going to be the first or last time we're going to see these tensions between what we want and what is realistic," Sahami said. "I think those kinds of things will continue, and those kinds of debates are healthy."

A topic of ongoing debate is whether AI should be open access, and it's an issue the National Telecommunications and Information Administration will examine in 2024 as part of President Biden's executive order on AI.

Open access was also a topic that came up when Altman was a guest speaker with the class Sahami co-taught this fall with the philosopher Rob Reich and the social scientist and policy expert Jeremy Weinstein, Ethics, Technology + Public Policy for Practitioners.

Sahami asked Altman, who spoke a week before the shake-up at OpenAI, about some of the market pressures he experienced as CEO, as well as the pros and cons of making these models open source, a direction Altman advocated for.

A benefit of open source is greater transparency into how a model works. People are also able to use and extend the code, which can lead to new innovations and arguably makes the field more collaborative and competitive.

However, the democratization of AI also poses a number of risks. For example, it could be used for nefarious purposes. Some fear it could aid bioterrorism and therefore needs to be kept guarded.

"The question is what works when there are guardrails in place versus the benefits you get from transparency," Sahami said.

But there are also solutions in the middle.

"Models could be made available in a transparent way in escrow for researchers to evaluate," Sahami said. "That way, you get some level of transparency, but you don't necessarily make the whole model available to the general public."

Sahami sees the tools of democracy as a way to come to a collective decision about how to manage the risks and opportunities of AI and technology: "It's the best tool we have [to] take the broader opinion of the public and the different value systems that people have into that decision-making process."

Reich is the McGregor-Girand Professor of Social Ethics of Science and Technology and professor of political science in the School of Humanities and Sciences.

Weinstein is the Kleinheinz Family Professor in International Studies and professor of political science in the School of Humanities and Sciences as well as a senior fellow at the Freeman Spogli Institute for International Studies and at the Stanford Institute for Economic Policy Research.


Dean Weber Cause of Death: A Tribute to the Computer Scientist and Inventor – Expo Times Online

Dean Weber, the founder and CEO of Quantum AI Health, a company that develops conversational-AI solutions for the digital health sector, passed away on February 15, 2024, at the age of 61. His family announced his death on his official website, stating that he died of a heart attack.

Weber was a pioneer and innovator of artificial intelligence, machine learning, and intelligent personal assistants. He was credited with the commercial launch of the first virtual assistant, IVAN, in 1999, and with selling his patent portfolio to Apple in 2010, prior to Apple's launch of Siri. He also worked on various projects, such as the NASA space suit, the B-2 Stealth Bomber, and the connected car. Here is a look at his life, career, and legacy.

Weber was born on August 12, 1962, in Buffalo, New York, the son of a furniture maker and a pianist.

He moved to Connecticut with his family when he was six years old, and attended Central Connecticut State University, where he graduated in 1984 with a degree in computer science and a minor in mathematics. He was interested in compiler design, queueing theory, and operating systems, and wrote several applications for disk operating systems and virtual memory optimizations.

Webers first full-time job was as an assistant athletics trainer at the University of North Carolina, where he worked for five years. He then moved to Los Angeles, California, where he joined Northrop Grumman, and developed software for the Northrop B-2 Spirit Stealth Bomber, with a DoD Top Secret clearance. He was on the original public launch team in 1998.

Weber's career as an entrepreneur and inventor began in 1990, when he founded EditPro, a software development company that built and sold the industry's first fully integrated development environment (IDE) with embedded syntax color coding and support for multiple programming languages. He sold the company to Kubota of Japan in 1992 to launch advanced software development tools in the Japanese market.

In 1998, Weber founded One Voice Technologies, an artificial intelligence company that developed and launched the first virtual assistant, IVAN, in 1999.

IVAN was voice-activated software that could perform tasks such as sending emails, making phone calls, booking flights, and searching the web, using natural language processing and speech recognition. Weber was described by many as the father of the intelligent personal assistant and received several awards and patents for his innovation.

In 2017 and 2018, Weber showcased advanced conversational-AI solutions for connected cars with Mitsubishi and Faurecia at auto shows in Detroit, Paris, and Shanghai, and at the Consumer Electronics Show (CES) in Las Vegas. He demonstrated how drivers and passengers could interact with their vehicles using voice commands and gestures, and access various features and services, such as navigation, entertainment, climate control, and safety.

In 2019, Weber founded Quantum AI Health, a company that develops conversational-AI solutions for the digital health sector.

Quantum AI Health is focused on providing artificial intelligence solutions to improve patient care and physician access to electronic health records with their AI-based Virtual Medical Scribe platform. The platform uses natural language understanding and generation to transcribe and document patient-physician conversations, and to provide clinical decision support and recommendations.

Weber died on February 15, 2024, at his home in San Diego, California. His family said that he died of a heart attack. He had suffered from diabetes and kidney failure in the past, and had undergone a kidney transplant in 2018.

Weber's death was mourned by many fans and colleagues, who paid tribute to him on social media and in the press. Tim Cook, the CEO of Apple, called him "a visionary and a pioneer of artificial intelligence" and "a valued partner and friend."

Elon Musk, the founder and CEO of Tesla and SpaceX, said that he was "a brilliant and creative mind" and "a leader in the field of conversational AI." Sundar Pichai, the CEO of Google, said that he was "a trailblazer and an inspiration" and "a legend in the industry." Other people who expressed their admiration and condolences included Mark Zuckerberg, Jeff Bezos, Bill Gates, and Barack Obama.

Weber was widely regarded as one of the most influential computer scientists and inventors of his generation. He was praised for his vision, innovation, and passion, and for his ability to create and commercialize cutting-edge technologies that changed the way people interact with machines and information.

He was also admired for his generosity, humility, and mentorship, and for his dedication to his family and his faith. He left behind a rich and diverse body of work, which continues to inspire and benefit millions of users around the world.

Dean Weber was a computer scientist and inventor, who left a lasting mark on the field of artificial intelligence and the world. He was a master of conversational-AI, a creator of the first virtual assistant, and a founder of several successful companies. He was also a kind and humble person, who loved his family and his faith. He died at the age of 61, but his work and spirit live on. He was a genius and a hero.


Rice University Researchers Uncover Bias in Machine Learning Tools for Immunotherapy – Datanami

HOUSTON, Feb. 16, 2024. Rice University computer science researchers have found bias in machine learning tools widely used in immunotherapy research.

Ph.D. students Anja Conev, Romanos Fasoulis and Sarah Hall-Swan, working with computer science faculty members Rodrigo Ferreira and Lydia Kavraki, reviewed publicly available peptide-HLA (pHLA) binding prediction data and found it to be skewed toward higher-income communities. Their paper examines the way that biased data input affects the algorithmic recommendations being used in important immunotherapy research.

Peptide-HLA Binding Prediction, Machine Learning and Immunotherapy

HLA refers to a family of genes present in all humans that encode proteins working as part of our immune response. Those proteins bind with protein fragments called peptides in our cells and mark infected cells for the body's immune system, so it can respond and, ideally, eliminate the threat.

Different people carry slightly different variants of these genes, called alleles. Current immunotherapy research is exploring ways to identify peptides that bind more effectively with the HLA alleles of a given patient.

The end result, eventually, could be custom and highly effective immunotherapies. That is why one of the most critical steps is to accurately predict which peptides will bind with which alleles. The greater the accuracy, the better the potential efficacy of the therapy.

But calculating how effectively a peptide will bind to an HLA allele takes a lot of work, which is why machine learning tools are being used to predict binding. This is where Rice's team found a problem: the data used to train those models appears to geographically favor higher-income communities.

Why is this an issue? Without being able to account for genetic data from lower-income communities, future immunotherapies developed for them may not be as effective.

"Each and every one of us has different HLAs that they express, and those HLAs vary between different populations," Fasoulis said. "Given that machine learning is used to identify potential peptide candidates for immunotherapies, if you basically have biased machine models, then those therapeutics won't work equally for everyone in every population."

Redefining Pan-Allele Binding Predictors

Regardless of the application, machine learning models are only as good as the data you feed them. A bias in the data, even an unconscious one, can affect the conclusions made by the algorithm.

Machine learning models currently used for pHLA binding prediction assert that they can extrapolate to allele data not present in the dataset those models were trained on, calling themselves "pan-allele" or "all-allele." The Rice team's findings call that into question.

"What we are trying to show here, and kind of debunk, is the idea of the pan-allele machine learning predictors," Conev said. "We wanted to see if they really work for the data that is not in the datasets, which is the data from lower-income populations."

Fasoulis and Conev's group tested publicly available data on pHLA binding prediction, and their findings supported their hypothesis that a bias in the data was creating an accompanying bias in the algorithm. The team hopes that by bringing this discrepancy to the attention of the research community, a truly pan-allele method of predicting pHLA binding can be developed.
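The kind of stratified check the team describes can be illustrated with a toy script: compare a predictor's accuracy on alleles that appeared in its training data against alleles that did not. Everything below is synthetic illustration (the allele names merely follow standard HLA nomenclature), not the team's actual evaluation code or data.

```python
# Hypothetical sketch: does a "pan-allele" predictor's accuracy hold up on
# alleles absent from its training set? Records here are invented; a real
# evaluation would use measured pHLA binding affinities.

def accuracy(records):
    """Fraction of records where the predicted binding label matches the measured one."""
    return sum(1 for r in records if r["predicted"] == r["measured"]) / len(records)

# Each record: allele name, whether it was in the training data, and labels.
results = [
    {"allele": "A*02:01", "in_training": True,  "predicted": "binder",     "measured": "binder"},
    {"allele": "A*02:01", "in_training": True,  "predicted": "non-binder", "measured": "non-binder"},
    {"allele": "B*07:02", "in_training": True,  "predicted": "binder",     "measured": "binder"},
    {"allele": "C*17:01", "in_training": False, "predicted": "binder",     "measured": "non-binder"},
    {"allele": "C*17:01", "in_training": False, "predicted": "non-binder", "measured": "binder"},
    {"allele": "B*83:01", "in_training": False, "predicted": "binder",     "measured": "binder"},
]

seen   = [r for r in results if r["in_training"]]
unseen = [r for r in results if not r["in_training"]]

print(f"accuracy on training alleles: {accuracy(seen):.2f}")    # 1.00 on this toy data
print(f"accuracy on unseen alleles:   {accuracy(unseen):.2f}")  # 0.33 on this toy data
```

A gap like this between the two groups is what would undermine a model's "pan-allele" claim, especially when the unseen alleles cluster in underrepresented populations.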

Ferreira, faculty advisor and paper co-author, explained that the problem of bias in machine learning can't be addressed unless researchers think about their data in a social context. From a certain perspective, datasets may appear simply incomplete, but making connections between what is or is not represented in the dataset and the underlying historical and economic factors affecting the populations from which the data was collected is key to identifying bias.

"Researchers using machine learning models sometimes innocently assume that these models may appropriately represent a global population," Ferreira said, "but our research points to the significance of when this is not the case." He added that even though "the databases we studied contain information from people in multiple regions of the world, that does not make them universal. What our research found was a correlation between the socioeconomic standing of certain populations and how well they were represented in the databases or not."

Professor Kavraki echoed this sentiment, emphasizing how important it is that tools used in clinical work be accurate and honest about any shortcomings they may have.

"Our study of pHLA binding is in the context of personalized immunotherapies for cancer, a project done in collaboration with MD Anderson," Kavraki said. "The tools developed eventually make their way into clinical pipelines. We need to understand the biases that may exist in these tools. Our work also aims to alert the research community to the difficulties of obtaining unbiased datasets."

Conev noted that, though biased, the fact that the data was publicly available for her team to review was a good start. The team hopes its findings will lead new research in a positive direction, one that includes and helps people across demographic lines.

Ferreira is an assistant teaching professor of computer science. Kavraki is the Noah Harding Professor of Computer Science, a professor of bioengineering and of electrical and computer engineering, and director of the Ken Kennedy Institute for Information Technology.

The research was supported by the National Institutes of Health (U01CA258512) and Rice University.

Source: John Bogna, Rice University


NTU Singapore launches new college of computing and data science to propel AI ambitions – BSA bureau

Nanyang Technological University, Singapore (NTU Singapore) is launching a new college to deepen the University's investment and efforts in artificial intelligence (AI), computing, and data science.

The new College of Computing and Data Science will serve as a platform to deliver industry-relevant degree programmes that train students to be not just comfortable but fluent in AI. It will also accelerate interdisciplinary collaboration between computing and other disciplines at NTU Singapore.

Leading these efforts at the new College is Vice President (Research) and Distinguished University Professor Luke Ong. From 1 May, the eminent computer scientist will step down from his current role as Vice President (Research) to take on the newly created position of NTU's Vice President (AI & Digital Economy).

NTU's new College of Computing and Data Science will combine the strengths of the University's School of Computer Science and Engineering (SCSE) with other related disciplines at NTU to form the University's sixth academic college.

The new college is expected to be home to more than 4,800 students in the new academic year that begins in August 2024. In line with the University's commitment to lifelong learning, the college will ramp up its continuing education and training (CET) offerings in AI and computing by 30 per cent every year. It will do this by introducing new industry-relevant CET courses in AI, data science, and computing, including a Master of Science in AI with a new specialisation in Generative AI, a CET Professional Certificate in Ethical Data Science, and a CET Professional Certificate in Generative AI.


It’s time to upgrade OneDrive’s paid storage, Microsoft – PCWorld

It's been literally a decade since Microsoft raised the cap on its paid Microsoft 365 storage plans to 1 terabyte of OneDrive cloud storage. It's time for an upgrade, don't you think?

Last week, Google renamed its Bard AI assistant to Gemini, launched Gemini Advanced, and, in the most important change of all, made 2TB of cloud storage the foundation of its new premium pricing tier. Okay, that last point is a lie: Google offered a free upgrade to its Google One plan six years ago, granting 2TB of cloud storage for $9.99 per month, the same price Microsoft charges for 1TB of storage.

Okay, even that's not totally accurate, since the same 2TB Google One deal is available to new users for $2.49 per month for the first three months, before returning to $9.99 per month.

The reason Microsoft loves subscriptions is that they're sticky: you sign up, set it to autopay on your credit card, and forget about it. But as consumers start re-evaluating their streaming plans (do I really want to pay for Netflix if they cancel everything after two seasons?), it's worth asking the same hard questions about Microsoft 365.

Microsoft 365, in both the $69.99 annual Microsoft 365 Personal plan and the $99.99 annual Microsoft 365 Family plan, gives you Microsoft Word, Excel, and PowerPoint, the main reason most people sign up. But from there, it gets iffy. Outlook? I'd much rather have Mail, thanks. Microsoft Defender? That comes with Windows, and there are other free antivirus solutions available. Clipchamp? I love it, true, but Microsoft tried to make that a $19 monthly subscription, too.

Mark Hachman / IDG

A consumer version of Teams, Access, Publisher, Forms, and Skype? I'm going to argue that the value there is next to nil for many people, especially in a world with Zoom, Canva, Microsoft Designer, and other alternatives. Yes, there are hidden reasons to subscribe to Microsoft 365, but they don't seem as potent as they once were.

Storage, though, matters, and it's a travesty that Microsoft hasn't kept up with the times. In 2014, phones like the Samsung Galaxy Note 4 already offered 16-megapixel cameras. Today, a phone like the Samsung Galaxy S23 or S24 lets you shoot 200-megapixel still shots, plus 4K video and more. If you set your phone to automatically upload your photos and movies to OneDrive, the available storage space gets sucked up quickly. And, of course, a Microsoft 365 subscription covers all of the other files you use daily, from Word to PowerPoint. In all, what you might call "storage inflation" is increasing, and it will only continue.
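A rough back-of-envelope calculation shows how quickly that inflation eats a fixed quota. The per-file sizes and daily shooting habits below are assumptions for illustration, not measurements:

```python
# How fast can a modern phone with auto-upload fill a 1 TB OneDrive quota?
# File sizes are rough assumptions (they vary widely with codec and scene).

TB = 1_000_000_000_000  # decimal terabyte, as cloud providers count it

photo_mb = 25        # assumed size of one high-resolution photo
video_min_mb = 350   # assumed 4K video footage, per minute

# Assumed habit: 20 photos and 5 minutes of video per day.
daily_mb = 20 * photo_mb + 5 * video_min_mb
daily_bytes = daily_mb * 1_000_000

days_to_fill = TB / daily_bytes
print(f"~{daily_mb} MB/day -> 1 TB full in ~{days_to_fill:.0f} days "
      f"(~{days_to_fill / 365:.1f} years)")
```

Under these assumptions the quota lasts barely over a year, before counting any Office documents at all.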

Mark Hachman / IDG

One of the reasons Google's own storage policy change received such attention was that it felt unfair. Google used to offer unlimited photo storage, albeit compressed, which, to be fair, it didn't have to do. But doing away with all that was just the first step in making all of the photos you uploaded count against your free 15GB storage cap, prompting you to pay for a Google One subscription, or at least additional storage. It was anxiety-inducing, especially during a pandemic when people didn't need extra anxiety.

Now both Microsoft and Google are asking consumers and businesses to pay an additional fee to access the top tier of their respective AI LLMs: $20 per month for the new Google One AI Premium subscription (Google Workspace, 2TB of storage, and Gemini Pro, among other benefits) versus $26.99 for Microsoft ($6.99 per month for Microsoft 365 Personal with 1TB of storage, plus an additional $20 per month for Copilot Pro). Google simply offers a substantially better deal.

Microsoft has historically jerked consumers around when it comes to cloud storage, enough that I had to consult OneDrive's Wikipedia page to get it all straight. Remember, in 2014, Microsoft upgraded Office 365 subscribers to 1TB of OneDrive storage, giddily pushed it to unlimited OneDrive storage, then reneged on the deal and settled back on the 1TB limit a year later. It's remained 1TB ever since. Even the Microsoft 365 Family plan (which offers up to 6TB of storage) doesn't pool the storage; it's 1TB per user, with up to six users.

It's also worth pointing out, incidentally, that Dropbox's low-end Plus plan is, yep, $9.99 per month for 2TB.

If Microsoft executives want to make consumers pay through the nose to satisfy their shareholders, there are plenty of AI-related services to justify the cost. Enterprises can afford those premium services, too. But jerking consumers around with Microsoft Rewards points while failing to upgrade the OneDrive tier with additional storage feels a little slimy. It's time to get with the times, Microsoft, and offer more for our money.


Sony’s Creator Cloud now allows photos & videos to be directly uploaded to cloud storage from the FX3 & FX30 – Newsshooter

Sony's Creators' Cloud now allows photos and videos taken with cameras such as the FX3 and FX30 to be directly uploaded to cloud storage.

After setting it up in the Creators' App or on the Creators' Cloud web service, you can connect your camera to the cloud storage and upload photos and videos directly, without needing to go via your smartphone. You can quickly check your captured images and videos from various devices and get started on editing.

Large files, such as high-quality photos and videos, can be transferred to a smartphone via both Wi-Fi and USB cable. A wired connection provides a more stable transfer.

From March 2024 onwards, the α1, α9 III, α7S III, and α7 IV will also be compatible with this function. Please note that to use this feature you will need to perform a software update on the camera you are using.

Sony has four different plans you can choose from. If you own an eligible Sony camera, you can start with a free 25 GB plan. A 500 GB plan is also available for heavy users.

By linking your smartphone to cameras, you can control them remotely from the smartphone. This enables remote shooting, which may be useful for group shots or keeping cameras still in night scenes. You can also check camera battery and media information, set the date, time, or camera name, and more from a smartphone.

LUT files stored in the cloud storage (Creators' Cloud) can also be imported to the camera via a smartphone, without using a PC or SD card.

Matthew Allard is a multi-award-winning, ACS-accredited freelance Director of Photography with over 30 years of experience working in more than 50 countries around the world.

He is the Editor of Newsshooter.com and has been writing on the site since 2010.

Matthew has won 49 ACS Awards, including five prestigious Golden Tripods. In 2016 he won the Award for Best Cinematography at the 21st Asian Television Awards.

Matthew is available to hire as a DP in Japan or for work anywhere else in the world.


Cloud DVR levels up with video expertise and operational agility from Ateme and AWS | Amazon Web Services – AWS Blog

This blog is co-authored by Francois Guilleautot, Director of cloud solutions, Ateme.

Since its introduction at the 1999 CES show in Las Vegas, digital video recording (DVR) has existed in various forms. While the technology is not new (Ateme has offered DVR solutions since 2016), the transition to over-the-top (OTT) streaming has accelerated adoption, with the DVR market expected to grow from $6.4 billion in 2023 to $16.4 billion by 2030. Consumers now record more content than ever, and expect recordings to be available on any device, from anywhere. In fact, recording capability is now the most desired functionality for a streaming service.

Although demand for recording capabilities has increased with streaming, the concept itself is not new. Technology to record television for later viewing has been available for more than 50 years. In the age of analog TV, VHS and Betamax tape recorders were available to save copies of your favorite programs. Prior to that, television viewers may remember G-Code, VideoPlus+, or ShowView codes in the TV Guide to ease the pain of setting a recording time.

In the era of digital television, hard drives were integrated into set-top boxes (STBs) to allow for recording capabilities. A large number of legacy STBs are still in use today. However, TV operators ultimately realized that STB DVR capabilities were inefficient and expensive. Video requires substantial storage capacity and high performance. Providing millions of users with hundreds of hours of storage means managing fleets of STBs with pricey, failure-prone, and rapidly aging hard drives.

To reduce management and service costs, network digital video recording (nDVR) platforms, such as Ateme's NEA DVR, emerged. The concept is simple: rather than dispersing recordings over countless hard drives, providers merge their storage on a single unified platform. This improves efficiency and reduces overhead, leading to a better total cost of ownership for the operator. With nDVR, drives are part of an owned infrastructure with better protection and improved durability, while customers access content via the operator's closed network or via the internet.

As operators improved operations, the viewer's experience improved too, with recordings available from any device connected to the nDVR. Consumers began to record more content, in increasingly higher resolutions, including 4K. One of Ateme's customers scaled up to 300 racks of video storage for its nDVR system, retaining content more than 15 years old.

Expanding video recording systems comes with a distinct set of difficulties. The first challenge is managing massive scalability. Modern streaming services generate anywhere from a few hundred gigabytes to tens of terabytes of recordings each day, depending on the legal regime of the country hosting the platform. With shared-copy recording, permitted by most nations, if multiple users record the same content (e.g., a football match or a movie), a single copy is stored and accessed by all who record it. This shrinks the amount of raw storage space required by a factor of 10 or even 100. Even with shared-copy techniques, network DVR platforms can expand to enormous scale, reaching hundreds of petabytes. Managing storage volumes of this magnitude requires substantial compute, network, and storage resources, up to tens of thousands of HDDs or SSDs.
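The shared-copy saving is easy to sketch. Assuming an illustrative catalog of one heavily recorded event plus a long tail of less popular shows (all numbers invented), the deduplication factor lands in the 10x to 100x range cited above:

```python
# Shared-copy vs. private-copy storage: if many subscribers record the same
# popular asset, the shared-copy model stores one copy per asset rather than
# one per (user, asset) pair. All figures are illustrative assumptions.

# (asset_id, size_gb) pairs: one blockbuster recorded by 40,000 users,
# plus 5,000 long-tail shows recorded 12 times each.
recordings = [("match_final", 8.0)] * 40_000
recordings += [(f"show_{i}", 2.0) for i in range(5_000) for _ in range(12)]

raw_gb = sum(size for _, size in recordings)                   # private copy per user
shared_gb = sum(size for _, size in dict(recordings).items())  # one copy per asset

print(f"private copies: {raw_gb:,.0f} GB")
print(f"shared copies:  {shared_gb:,.0f} GB")
print(f"savings factor: {raw_gb / shared_gb:.0f}x")
```

The more viewing concentrates on popular content, the closer the factor climbs toward the top of that range.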

DVR places an incredible amount of pressure on storage. To ensure an optimal viewing experience (start time, resolution, re-buffering), the storage must sustain high input/output operations per second (IOPS) to support the throughput of the traffic. Commodity high-capacity hard drives do not have the IOPS required to deliver content to thousands or tens of thousands of concurrent viewers requesting high-resolution content simultaneously. Using more drives in parallel improves storage performance, but requires additional CPU capacity to fully use the extra storage, which increases cost.
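A rough sizing calculation shows why commodity drives fall short. The viewer count, bitrate, segment size, and per-drive figures below are all assumptions for illustration, not vendor specifications:

```python
# Back-of-envelope nDVR read-path sizing: how many commodity HDDs would a
# platform need to serve a given concurrent audience? All inputs are
# assumed round numbers, not measurements.

viewers = 20_000
bitrate_mbps = 15      # assumed 4K adaptive-bitrate stream
segment_mb = 4         # assumed media segment size
hdd_seq_MBps = 180     # assumed sustained throughput of one commodity HDD
hdd_iops = 150         # assumed random-read IOPS of one commodity HDD

total_MBps = viewers * bitrate_mbps / 8
# Each viewer fetches roughly one segment every (segment bits / bitrate) seconds.
seg_interval_s = segment_mb * 8 / bitrate_mbps
reads_per_s = viewers / seg_interval_s

drives_for_throughput = total_MBps / hdd_seq_MBps
drives_for_iops = reads_per_s / hdd_iops
drives_needed = max(drives_for_throughput, drives_for_iops)

print(f"aggregate read load: {total_MBps:,.0f} MB/s, {reads_per_s:,.0f} segment reads/s")
print(f"commodity HDDs needed (before any redundancy): ~{drives_needed:.0f}")
```

Even this simplified model lands at hundreds of drives for a modest audience, before adding RAID or erasure-coding overhead, which is why throughput rather than raw capacity often dictates drive counts.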

Beyond the sheer size and performance requirements for storage, availability (and therefore redundancy) is an important consideration. Subscribers expect their content to be available instantly, even months or years after the recording. To allow such a high level of availability, nDVR vendors use high-performance Network-Attached Storage (NAS) with large redundant arrays of independent disks (RAID), or a distributed storage system with a dedicated erasure coding mechanism for better performance and density. However, these systems have limited scalability. The additional storage requires matching compute power and networking to ensure smooth operation, compounding already difficult storage capacity requirements.

Such a large, high-performance video storage platform comes with high maintenance costs. Large recording platforms require hundreds to thousands of MWh per year in power and cooling, as well as engineers dedicated to hardware management who may encounter frequent disk failures and other hardware issues across hundreds of servers. Engineers must regularly roll out operating system and software updates to fix bugs and fend off ransomware attacks; a recent example is the 2021 Log4j zero-day vulnerability.

Migrating DVR platforms to the cloud provides an elegant solution to the problems previously listed, namely scalability, operational complexity, cost, and security. For example, Amazon Simple Storage Service (S3) offers virtually unlimited storage capacity with high-speed, built-in data transfers, and data redundancy by writing across multiple Availability Zones within a region. This provides 99.99% availability and 11-9s of durability for recordings. Adding a geo-redundancy dimension, not economically viable with on-premises platforms, improves recording availability. On top of offering better reliability, Amazon Web Services (AWS) also handles the undifferentiated heavy lifting of hardware and OS security. AWS offers additional security features such as encryption at rest and in transit, and integration with AWS Identity and Access Management (IAM) for fine-grained access control of video content.
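To make the "11 nines" durability figure concrete, the arithmetic is straightforward. The library size below is an assumed figure for illustration, not from the article:

```python
# "11 nines" (99.999999999%) durability corresponds to a designed annual
# loss rate of about 1e-11 per object. For an illustrative library of
# ten billion stored segments:
objects = 10_000_000_000
annual_loss_rate = 1e-11
expected_annual_loss = objects * annual_loss_rate
print(expected_annual_loss)  # roughly 0.1, i.e. ~1 object lost per decade
```

Even at nDVR scale, the expected loss is a fraction of an object per year, which is why this redundancy level is not economically reachable with on-premises RAID arrays.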

The major benefit of migrating to Amazon S3 cloud storage is native storage tiering. Storage tiering is a feature that Ateme offers with its legacy nDVR system and that AWS offers natively. This allows you to choose from different storage classes based on data access patterns, with less frequently accessed video files moved to lower-cost storage tiers, optimizing costs without compromising accessibility. This is especially relevant for video recordings, as usage patterns change over time and older recordings are requested less often than recent ones.

Ateme uses four Amazon storage services to optimize cost over the course of an asset's lifecycle. In an end-to-end cloud-native OTT platform, an asset starts its lifecycle on premises or on a gp2 Amazon Elastic Block Store (EBS) volume as part of a live channel's rolling buffer for time-shifted TV (TSTV), for maximum performance (IOPS). The asset then moves to an S3 Standard bucket to enable catch-up or backward EPG services. As the asset ages out of the backward EPG, its popularity and number of requests decrease. With a decreasing number of requests, it is moved to S3 Infrequent Access and then to Glacier Instant Retrieval (GIR). GIR is the lowest-cost tier available that still allows instant playback. S3 Intelligent-Tiering allows assets to be moved programmatically between storage classes based on access patterns.
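As a sketch of how such tiering can be expressed, an S3 lifecycle configuration of the shape accepted by boto3's put_bucket_lifecycle_configuration might look like the following. The bucket prefix and day thresholds are illustrative assumptions, not Ateme's actual settings:

```python
# Illustrative S3 lifecycle rule: demote aging recordings from S3 Standard
# to Standard-IA, then to Glacier Instant Retrieval.
lifecycle = {
    "Rules": [
        {
            "ID": "ndvr-recording-tiering",
            "Status": "Enabled",
            "Filter": {"Prefix": "recordings/"},
            "Transitions": [
                # After the assumed catch-up / backward-EPG window.
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                # Older, rarely watched assets still allow instant playback.
                {"Days": 90, "StorageClass": "GLACIER_IR"},
            ],
        }
    ]
}

# Applying it would require AWS credentials, e.g.:
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="my-ndvr-bucket", LifecycleConfiguration=lifecycle)
print([t["StorageClass"] for t in lifecycle["Rules"][0]["Transitions"]])
```

The alternative mentioned in the text, S3 Intelligent-Tiering, moves objects automatically by observed access pattern instead of fixed day counts, trading a small monitoring fee for not having to guess the thresholds.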

To best leverage the benefits of cloud storage, Ateme rearchitected its entire recording pipeline. In 2023, it launched a new cloud-native recording platform, NEA Genesis. Ateme designed NEA Genesis with a microservices architecture to operate natively on AWS. This move away from a monolithic architecture allows Ateme to scale out logical blocks, such as ingest, storage, and playout, independently.

Ateme NEA Genesis High Level Architecture

Scaling logical blocks independently allows streaming platforms to optimize their non-linear video operations with a flexible and unified solution for VOD and recordings. With NEA Genesis, not only is it possible to grow storage independently of compute, but it is now possible to scale up ingest and playout dynamically. This provides flexibility to add large batches of VOD and event-based channels, or scale up playout capacity when demand surges. On-premises infrastructure scales these resources together, and typically only allows scaling up, which leads to over-provisioning. AWS infrastructure supports Atemes ability to scale ingest, storage, and egress independently, a major advantage over the limitation of on-premises hardware.

NEA Genesis includes additional modules for advanced functionality. For example, it supports encrypted or clear asset storage, with a dedicated packager/re-packager and encryption service to ensure platform compatibility with future standards. This means that assets can be ingested in a single format (e.g., DASH or CMAF) and redistributed in whatever combination of streaming protocol and DRM is relevant in the future. NEA Genesis customers also benefit from both the advanced features of Ateme's in-house packaging engine and Ateme's extensive integration library of CMS and DRM partners for frictionless onboarding.

To learn more about the benefits of cloud DVR solutions from Ateme, visit the company's website.

See the original post:
Cloud DVR levels up with video expertise and operational agility from Ateme and AWS | Amazon Web Services - AWS Blog

Read More..

What is Green Cloud Storage & Which Providers Offer It in 2024? – Cloudwards

Cloud storage is a type of service that can supplement your hard drive capacity and increase productivity. It does that by offering you a simple way to store your data in the cloud while providing quick access to it. That's pretty convenient, but these services might not be eco-friendly, so in this article we'll see which ones get you green cloud storage.

First, though, we have to add that cloud storage services can do more than just store your data. Their key features, such as file sharing and device synchronization ("sync" for short), help improve collaboration, making these tools ideal for businesses, especially those that rely on remote work.

Many cloud storage services also let you take notes, recover deleted files, chat, share calendars, assign tasks and edit documents in real time with collaborators.

If you're looking to create an effective online environment for collaboration, you should start by consulting our list of the best EFSS (enterprise file sync and share) services.

Now that we've defined what cloud storage is, we're going to see what makes these services environmentally friendly.


Cloud storage data centers are designed to store data efficiently. The main difference between them and on-premises servers is their level of resource utilization.

On-premises servers operate, on average, at 12 to 18 percent of capacity, while cloud data center servers can reach a maximum of 40 to 70 percent utilization, with the average between 10 and 50 percent, according to a 2014 energy efficiency report from the Natural Resources Defense Council, a nonprofit U.S. organization.

Increased server utilization in cloud storage data centers stems from the fact that these data centers use virtualization technology.

Virtualization enables multiple applications, each in its own virtual environment, to run on a single physical server. This greatly reduces the number of servers, and thus the amount of energy, required to run a given number of applications.

The amount of power used is a major factor in evaluating efficiency, and the most common metric for that is power usage effectiveness, or PUE. The average PUE for the global data center industry in 2019, according to the Uptime Institute, was 1.67 (lower is better), an increase from 2018's 1.58.
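PUE is simply total facility energy divided by the energy consumed by the IT equipment itself, so the figures quoted here can be reproduced with a one-line calculation (the absolute energy values below are assumed for illustration):

```python
# PUE = total facility energy / IT equipment energy; 1.0 is the ideal
# (every watt goes to computing, none to cooling or power conversion).

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,670 kWh to power 1,000 kWh of IT load matches the
# 2019 industry average of 1.67; an efficient hyperscale site approaches
# Google's reported 1.11.
print(round(pue(1670, 1000), 2))  # prints: 1.67
print(round(pue(1110, 1000), 2))  # prints: 1.11
```

Put differently, the average 2019 data center spent about 67% extra energy on overhead for every unit of useful compute, while the best large providers spent around 11%.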

However, having many servers requires a lot of power, which in turn increases their cost. This motivates some large-scale cloud providers to be more energy efficient than that global average.

Thanks to that, some providers, such as Google, report a PUE of just 1.11. Plus, large data centers have more effective cooling, which again helps reduce power usage.

To recap, cloud storage data centers are inherently greener than on-premises data centers. Companies can reduce their carbon footprint simply by moving their data to a cloud provider's data center, where it will consume fewer resources and run more efficiently.

In this section, were going to list all the services that fit the factors weve mentioned in the previous section. The first one, in no particular order, is Google Drive.

As we mentioned, Google's data centers are among the most energy-efficient out there, and that includes its cloud storage service, Google Drive. Specifically, Google achieves great power-usage results thanks to custom, highly efficient servers that waste less power.

On top of that, Google uses inexpensive ways to manage cooling, reuses old hardware and recycles what it cant reuse.

Google Drive is a good choice if you need to work online on office files and share them with co-workers. It also has one of the best customer support services on the market.

Google has many plans, and most of them provide a great value. Its web client and mobile apps are streamlined and easy to use. Plus, Google Drive has a network of servers that spans the globe, which helps it reach fast speeds. You can learn more about what Google Drive offers in our Google Drive review.

Another big name in the tech industry, Amazon, is far from a stranger to green technology. Its PUE is less than 1.2, and Amazon claims more than 50 percent of its energy is derived from renewable sources. It's easy to see how it can accomplish that, because it uses six solar farms and three wind farms. Plus, Amazon has announced four new wind farm projects and one solar farm.

Microsoft achieved carbon neutrality in 2012 and has an average PUE of 1.125 for any new data center. Microsoft is also dedicated to sustainability, so it reuses and recycles products and makes its products sustainable. It also reuses waste and responsibly disposes of what it cant use.

Microsoft is also working on Project Natick, which would eventually see micro data centers powered by renewable energy placed on the seafloor.

Microsoft's environmentally friendly practices include its Microsoft Azure data centers, which power Microsoft OneDrive, its cloud storage service.

OneDrive gives you access to Office Online, Skype, Outlook, OneNote (see our OneNote review to learn more about the note-taking app) and many other features from the Redmond-based giant. You won't have to empty your wallet to use it, either, because OneDrive's plans have fair prices. The 1TB plan will set you back only $6 per month.

Plus, we like how attractive its desktop, web and mobile clients are. They're straightforward and won't give you issues. You can also share and collaborate on files while having capable file-sharing security. For more information about OneDrive's sharing, read our OneDrive review.

Another cloud storage staple that is making moves toward more environmentally sustainable practices is Dropbox (check out our Dropbox review). Although we couldn't find any current numbers on its PUE, Dropbox announced just last year that it has set new sustainability goals for 2030.

This initiative contains four core goals, which are quoted in full in the original article.

As you can see, the goals are mostly pretty vague. Goals number three and four especially could amount to just about anything. The real meat of the commitment lies in goal number two, as shifting the entirety of Dropbox's user base (estimated at around 600 million users) would be a pretty big deal.

Although Dropbox doesn't publish PUE numbers like the other companies we've discussed here, a study on the differences between distributed and centralized cloud storage estimated that Dropbox's data centers use roughly the same amount of electricity as the country of Luxembourg. We won't get into the precise math; check out the actual study for that.

Obviously, it remains to be seen whether or not Dropbox can actually succeed at these goals, and more transparency on its end will be crucial for assessing its progress.

Placing your data in the cloud not only helps you be more productive and takes up less space on your computers, but also reduces your carbon footprint. You don't have to search hard for a green service, because most of the big names in the tech (and cloud) industry have already reduced their carbon footprint.

We've outlined several services in this article, and most of them are featured on our cloud storage comparison list. If you're still not sure what's the best service for you, though, consult our guide to the best cloud storage services. We also have an article that details the best cloud storage for nonprofits.


Large-scale data centers are more energy efficient, but according to The Independent, the expanding data center industry will consume three times as much energy in the next decade, so an increased effort to meet the new requirements will be necessary.


Here is the original post:
What is Green Cloud Storage & Which Providers Offer It in 2024? - Cloudwards


Amove and Storj Simplify Instant Global File Access and Remote Collaboration – PR Newswire

RANCHO MIRAGE, Calif., Feb. 20, 2024 /PRNewswire/ -- Storj and Amove today launch their partnership at the HPA (Hollywood Professional Association) Tech Retreat, where global leaders in engineering, technology, creativity and business engage with the most compelling topics around the creation, management, and dissemination of content. Joining forces, Storj and Amove will advance their shared goal of rapidly delivering affordable storage solutions with industry-leading performance to organizations in the ever-growing file management, media and AI markets.

Storj provides cost-effective, sustainable cloud object storage through a groundbreaking architecture that distributes data to unused storage space in existing drives and data centers around the world, taking advantage of the fact that the average rate of server utilization is only 12-18% of capacity. This provides an extremely scalable and environmentally sustainable solution, thanks to eliminating the need to build, maintain and cool new data centers to meet surging demand. Storj's distributed model is also inherently more secure than traditional approaches, with no single point of failure, no geographic vulnerability to regional or local disasters, and a design based on zero-trust principles.

Patrick Kennedy, Amove CEO stated, "Amove is storage agnostic, so we support every provider. After years of development and testing over 45 services, we chose Storj as the ideal partner to deliver our users instant capacity from Amove Drives with incredible speed, cost efficiency and performance within an innovative architecture that supports remote streaming and access from anywhere. Their price model is also strongly aligned with Amove's mission to deliver solutions to organizations of every size and industry."

Amove offers instant access to any cloud storage provider as a collaboration drive from the desktop, plus snapshots, backups and no-cost unlimited migrations from AWS, Azure, Wasabi and 30 other providers into Storj. With a focus on simplicity, the Amove Drive allows users to mount their storage buckets directly from the desktop, providing a true multi-cloud management tool that delivers immediate access to the largest files from any cloud or on-premises storage.

Amove also offers up to 3 terabytes of Storj at no cost for each user when purchasing the Amove premium plan, $20/month per user with unlimited access. Other features include syncs between providers, file sharing, cloud-to-cloud migrations, backups, and AI-powered deduplication. Storj is ideally suited for handling the large file sizes that media and AI workflows require.

Ben Golub, CEO of Storj said "Our partnership with Amove extends their commitment to delivering intuitive, flexible, connected and easy to use products to manage and protect the world's data. We are also pleased that Amove appreciates the growing urgency to embrace solutions that provide affordable performance and environmental sustainability in light of the massive growth in data and a critical need to manage and store it while reducing environmental impact."

About Amove
Amove is the new cloud storage and file management SaaS, delivering a range of powerful features for organizations and remote teams of all sizes. Amove provides an entirely new way of accessing and managing any cloud storage with ease, cost efficiency and flexibility. Our public APIs are available for an ecosystem that looks to democratize data wherever it lives.

About Storj
Storj is revolutionizing cloud object storage by securely and efficiently distributing data across underutilized drives. Users experience enterprise-grade durability and globally superior performance at the edge. Customers are realizing an 80% reduction in costs and carbon emissions. Make the world your data center at storj.io

SOURCE Storj

Continued here:
Amove and Storj Simplify Instant Global File Access and Remote Collaboration - PR Newswire


Best Dropbox alternative of 2024 – TechRadar

Best Dropbox alternative: Quick menu

The best Dropbox alternatives make it simple and easy to set up and manage cloud storage services, without subscribing to Dropbox.

It's hard to imagine how it was ever possible to run a business properly without cloud storage, and Dropbox is among the most well known and most popular cloud storage providers on the market. Launched in 2008, it has longevity on its side, as well as the fact that it is a versatile service that is available on a variety of platforms.

However, it isn't perfect and it has been the subject of criticism around some of its performance and security issues. And given that the number of cloud storage and file sharing services has grown rapidly in recent years, it's always worth considering the other top options before deciding which cloud storage to go for.

With that in mind, we've looked at alternative cloud storage services that won't let you down, taking into consideration key factors like cost, cross-platform compatibility, third-party app integrations, and robustness and reliability in our analysis.

Below, we'll list the best Dropbox alternatives currently available.

Also check out our roundup of the best business cloud storage.


Best for those already using Google Workspace

Syncs desktop-to-desktop

Cross-platform capability

Store, save and backup files in real time

Third party app integrations

Requires a Google account for best usage

When you think of cloud storage, this is probably the first service that comes to mind, and with good reason.

Google Drive lets you store your files in the cloud and, importantly, sync those files and settings across multiple devices. It lets you back up your files, too, so they will never get lost. Google Drive allows users to collaborate, sync and share data easily, and everyone can edit files and perform tasks simultaneously, without any hassle. Google Drive's generous storage allotment lets you save as many pertinent documents and important files as you need. It starts with 15GB for free, and can be expanded to 100GB for $1.99/month as part of the Google One Basic plan, with an annual plan offered at a discount.

Much like Dropbox, Google Drive lets you store, share, sync and access your files, photos, videos, songs, etc. from any computer, as well as from other platforms through smartphone apps. It's super easy to use and is highly reliable in keeping your files safe and accessible. Google Docs and Google Sheets handle word processing and spreadsheets, respectively, and can open Microsoft Word and Excel files instantly.

Read our full Google Drive review.

Best option for users of Microsoft Office

Integrates with Microsoft Office

OneDrive Personal Vault protection

Scan and store documents

Requires a Microsoft account for best usage

Microsoft's OneDrive has no shortage of features that are more than likely to draw users in, including collaboration, backing up of data and protection, all in 1TB of cloud storage space.

The service also has a neat feature to let you scan in your document using your phone and store it directly into your OneDrive account, keeping it secure.

Furthermore, the Personal Vault essentially provides an additional layer of protection, as well as the ability to set an expiration date for shared files, giving all collaborators access for a limited time. Additionally, even OneDrive's free version allows you to save folders that you can access offline, as well as quickly search your database for files simply by using keywords.

Read our full Microsoft OneDrive review.

The trusted choice for big business

Enterprise-grade security with password-protected files

Auto-expiration

Real-time collaboration

Can send large files

Lacking some features found in competitors

Box provides you with many of the same convenient features as Dropbox, and takes things up a notch. It's designed to be as easy as uploading your files to the secure server and accessing them from any device you can log in to. With a Box account, your files are kept secure, with a backup always ready.

Box's enterprise-grade security remains an important reason why many Fortune 500 companies use the service to keep their files secure. Real-time collaboration, permission control, and access restriction are among Box's more advanced features that offer the convenience you need.

Box also lets you store and share large files. Its integrated apps also let you access your work from any device and use other platforms such as Microsoft Office 365, Google Workspace and Okta.

Similar to working on Google Docs, Box has Autosave for your work, so you can revert anytime in case you need to undo or redo anything.

Read our full Box review.

Perfect pick for users prioritising collaboration

Encrypted storage

Can send files to non-Sync users

In-house support

Centralized storage

Lower storage option

Lack of some key features in competitors

Sync does allow you to store, share and access your files from any device anytime, keeping in mind some differences with other services on this list. For example, Sync only gives you 5GB of free storage (less than some other free cloud storage platforms), but your files are kept secure and private thanks to its end-to-end encryption, which prevents unauthorized access.

A unique feature of Sync is the ability to share files with anyone, as long as they have access to the internet, with or without a Sync account. Collaboration is facilitated by letting you set access requests, password protection, expiration dates, and even notifications that your file is being accessed. Sync also has apps for Mac, Windows, iOS and Android, enabling access from any device when needed.

Sync also offers premium plans for personal (individual) or company needs. The plans, which start at only $8, can provide up to 6TB of storage and unlimited file sharing. You'll also get file recovery, password protection, two-factor authentication, real-time backup and sync, advanced share controls, and a lot more.

Read our full Sync review.

Perfect for multi-platform usage

Enterprise level security

Compliant with regulations

3GB data storage for free accounts

Multiplatform support

Lacking high profile and features of bigger names

Lower storage limit

Tresorit supports the full gamut of platforms: Windows, Mac, Linux, Android and iOS devices, adding to the convenience it offers in cloud collaboration. With its enterprise-level security, your sensitive files and data are kept secure from unwanted prying eyes, whether you're working on whitepapers, data sheets or anything in between.

On other platforms, when you're editing a file you may not be aware that someone else is making changes simultaneously. Tresorit aims to avoid this confusion through an editing badge that lets collaborators know when someone is working on something. Sharing rights can also be set so your organization is safe from those who are not on your team.

Tresorit's free starter tier, the Basic plan, provides 3GB of encrypted cloud storage across up to two devices. For those looking for additional storage, you have the option to upgrade to the premium plans, starting at $18 per month. All paid plans can give you 1TB of cloud storage, which is usually enough for small businesses and enterprises. Premium plans also let you access files across 10 devices.

Read our full Tresorit review.

A top pick for video and audio creatives

Strong security options

Unlimited file size

Access latest version of files

Password protection

Can be cumbersome to find files

pCloud provides secure storage in the cloud for your work files, videos, music, documents, and photos. As soon as a file is transferred from your device to your secure pCloud account's storage, it goes through TLS/SSL encryption. Files are stored in three (or more) server locations because security is taken very seriously, and that's even for the free version, on top of the 10GB of free storage space.

If 10GB just doesn't cut it for the amount of storage you'll be needing, pCloud's premium accounts can give you 500GB or 1TB. Of course, you'll need these large storage spaces when you're regularly sharing and collaborating on large files within your organization. pCloud also offers the convenience of automatically uploading your photos directly from your camera roll, and even keeps older versions of your files for up to 30 days, just in case you need to revert.

pCloud boasts a built-in video player for easy file access (video sharing seems to be a common problem), and a built-in audio player and playlists for when you're on the go and want the company of your favorite tunes. Sharing your links is also made a little more personalized, as pCloud allows you to add a headline, a title and even a description, so files are easy to find and identify.

Read our full pCloud review.

Best choice for a focus on privacy

Fast transfer ability

End-to-end encryption

Cheap plans

Lacking high profile of bigger rivals

Just for signing up, you get a free 20GB of storage. Because why not? Mega makes cloud storage convenient and super simple to use without sacrificing security, especially that of sensitive data and files. The files you upload, and even your chats, are kept safe with user-controlled end-to-end encryption, protected by your very own password.

Security isn't Mega's only strength, though. Mega files can be accessed from any device that's connected to the internet, and you can password-protect your links and even set an expiry date for public and sensitive data. Uploading your files to Mega's secure server is done quickly and efficiently, and the same goes for syncing. This is thanks to Mega's user-friendly interface, which also makes sharing folders and files as easy as a single click.

Communicating with your team is made easy with Mega's MegaChat feature, which also encrypts your messages, because security is taken very seriously at Mega. Paid plans start at $5.08/month and give you massive storage space and no transfer limits. With Mega, you can upload your files, sync them, back them up and access them from just about any device, all in a series of steps that start with a single click.

Read our full Mega review.

The best cloud storage for iOS and MacOS users

Good mobile and desktop integration

Sharing options are improving

Polished and slick user interfaces

No Android support

Still lacks some advanced features

Lacks options for power users

For those who live entirely in the Apple ecosystem, Apple's iCloud Drive is an easy choice when it comes to online cloud storage. Taking a step back, however, it is not the best choice outside that ecosystem, as there are some issues, such as no Android support (with the painful workaround of logging into the iCloud website via the mobile browser). Another problem with iCloud Drive is that it is just a little less polished than its competition, such as Google Drive and Microsoft OneDrive, both of which have undergone multiple cycles of development at this point.

Still, iCloud Drive can back up a full variety of files at this point, everything from PDFs to the info needed for a user's iPhone apps. It also conveniently has its own app for both iPhones and iPads, for mobile access to data. Users get 5GB of storage to start (less than some competing services' free tiers), and can then upgrade to iCloud+, where 50GB of storage starts at an affordable $0.99/month, on up to 2TB of storage for $9.99/month.

Another obstacle is that iCloud Drive is tightly integrated with macOS to back up data, with no analogous process for Windows users. There are some notable features, such as encryption of data for privacy, and the iCloud Private Relay, which hides IP addresses to keep users anonymous online. However, Apple iCloud remains a much more viable option for users of Apple devices than for others.

Read our full iCloud review.

Forgiving price plans and fantastic security features

Read more:
Best Dropbox alternative of 2024 - TechRadar
