
The Arbitrum Foundation Announces DAO Governance for the … – PR Newswire

The launch of DAO governance marks a significant milestone in the decentralization of the Arbitrum One and Arbitrum Nova networks, making Arbitrum the first EVM rollup technology to achieve Stage 1 decentralization

NEW YORK, March 16, 2023 /PRNewswire/ -- The Arbitrum Foundation today announced the launch of DAO governance for the Arbitrum One and Arbitrum Nova networks, a massive leap forward in the decentralization of the two networks. Alongside the DAO governance structure, The Arbitrum Foundation also announced an upcoming drop of $ARB to users of the Arbitrum ecosystem on Thursday, March 23.

Late last year, Vitalik Buterin proposed a three-stage schema for decentralizing rollups, and with today's announcement Arbitrum has become the first EVM rollup ever to achieve Stage 1. The milestone signifies an important achievement for both Arbitrum networks and for the state of Ethereum scaling more broadly.

The $ARB token will facilitate the decentralization of the Arbitrum network, and the $ARB airdrop will place the governance token in the hands of the users who are actively participating in the Arbitrum ecosystem. Users can visit gov.arbitrum.foundation and follow the prompts for eligibility details and to claim their share in governance. The majority of the $ARB supply will be under the control of the Arbitrum community via The Arbitrum Foundation, accelerating growth of the ecosystem organically. $ARB token holders will govern The Arbitrum Foundation through the Arbitrum DAO.

Steven Goldfeder, CEO and Co-Founder of Offchain Labs, commented: "We are extraordinarily excited for the official launch of The Arbitrum Foundation and DAO governance and to see Arbitrum One become the first EVM rollup to advance to Stage 1 decentralization, a tremendous milestone for both Arbitrum and Ethereum. Through the community airdrop, the delegation process, and the introduction of the Security Council, community participation and control is at the forefront of today's announcement, and the requirements for receiving a share of Arbitrum governance have been crafted meticulously, optimizing for the longevity of the ecosystem and community. Looking ahead, we're moving closer and closer toward a decentralized financial system, with the Arbitrum technology at the very forefront of that."

To facilitate effective community governance, users will be able to delegate voting power to individuals they view as effective stewards of their values. Delegates will be expected to vote on proposals that pass through the Arbitrum DAO in a way that represents the token-holders who have assigned their voting power to them. The Arbitrum DAO will have the power to control key decisions at the core protocol level, from how the chain's technology is upgraded to how the revenue from the chain can be used to support the ecosystem. Those interested in becoming a delegate are encouraged to visit the governance forum and apply.

Crucially, Arbitrum's governance will be self-executing, meaning that the DAO's votes will directly have the power to effect and execute its on-chain decisions, and not rely on an intermediary to carry out those decisions. Self-executing governance is a critical milestone for decentralization and giving the community the power to govern the chain, and Arbitrum is leading the way as the first L2 to launch self-executing governance.

The Arbitrum Foundation also announced the creation of the Arbitrum Security Council, a 12-member multisig of highly regarded community members designed to ensure the security of the chains and be able to act quickly in the event of a security vulnerability. The decision-making powers of the Security Council are determined by a smart contract that will require multiple secure signatures by its members in order to implement any changes to the protocol. In case of emergency, the Arbitrum Security Council will be able to act quickly but this will require participation from 9 of the 12 members. The Arbitrum DAO will be the ultimate governing body over the Arbitrum Security Council, with elections for the Council being held twice annually.

The introduction of the Security Council further reinforces Arbitrum's focus on decentralization by giving the community the ability to play a more active role in Arbitrum governance and have a say over what occurs within the ecosystem.

Arbitrum is the leading Layer 2 (L2) scaling solution for Ethereum, boasting the highest Total Value Locked (TVL) across all L2 networks with approximately $3.61B, 55% market share across all rollups, and the Arbitrum One network recently surpassed Ethereum daily transactions on two occasions.

For more information, please visit the Arbitrum blog: http://arbitrumfoundation.medium.com/

About Offchain Labs
Offchain Labs is a venture-backed and Princeton-founded company that was the initial developer of Arbitrum, a suite of secure scaling solutions for Ethereum. Arbitrum's technologies instantly scale dApps, significantly reducing costs and increasing speed, without sacrificing Ethereum's security. Porting contracts to Arbitrum requires no code changes or downloads as Arbitrum is fully EVM compatible. Offchain Labs also maintains Prysm, the leading Ethereum consensus client.

About The Arbitrum Foundation
The Arbitrum Foundation has a mission to help support and grow the Arbitrum network and its community while remaining at the forefront of blockchain adoption. The Foundation oversees the $ARB token and governance structure as well as the Arbitrum Security Council, a 12-member multisig of well-regarded community members designed to ensure the security of the chains.

Media contact: Dillon Arace, [emailprotected]

SOURCE Arbitrum Foundation

More:

The Arbitrum Foundation Announces DAO Governance for the ... - PR Newswire


Decentralization: In Togo, Yoto III municipality adopts a CFA4 billion … – Togo First

REFORMS OVERVIEW

STARTING A BUSINESS (more info)

Ranked fifteenth worldwide and first in Africa on the Starting a Business index of the 2020 Doing Business report, Togo is sustaining its reform momentum with further measures.

ENFORCING CONTRACTS (more info)

Compared to a few years ago, when it was one of the lowest-ranked countries under the Doing Business Enforcing Contracts indicator, Togo, leveraging many efforts to improve its business climate, has jumped significantly on this index in recent years.

CONTRACT EXECUTION (more info)

- Creation of special commercial chambers for small debts
- Creation of commercial chambers at the Court of Appeal
- Civil and commercial cases now handled by distinct clerks
- Establishment of commercial courts in Lomé and Kara
- Lawyers and bailiffs now have access to the FORSETI COMMERCIAL platform
- A maximum period of 100 days was fixed to settle a commercial dispute

TRADING ACROSS BORDERS (more info)

In comparison to previous years, Togo has significantly improved its ranking on the Trading Across Borders indicator by adopting multiple reforms that focus mainly on digitization and on reducing delays in import and export procedures.

CONSTRUCTION PERMIT (more info)

After moving from 133rd to 127th place on the 2020 Doing Business construction permit index, Togo intends to repeat this feat in the coming edition of the global ranking. To this end, it has introduced multiple reforms this year.

GETTING ELECTRICITY (more info)

Over the past two years, Togo's ranking under the Doing Business Getting Electricity indicator has risen consistently. The country owes this performance to multiple reforms aimed at making it easier for businesses to access power and water, and Lomé plans to introduce even more reforms this year to keep up its improvements.

REGISTERING A PROPERTY (more info)

Out of all the Doing Business indicators, property registration is where Togo has improved the most since 2018. Indeed, after spending years in the lowest part of this ranking, the country now seeks to beat Rwanda, the best performer on this index in Africa. To do so, Lomé has been introducing many reforms, with the latest batch implemented this year.

PUBLIC PROCUREMENT (more info)

From professionalization to digitization, through legislative regulations, Togo's public procurement framework is constantly being modernized. Several reforms have been implemented to improve the sector, much to the benefit of the private sector, which is the focus of the National Development Plan.

PAYING TAXES AND DUTIES (more info)

To improve its business environment, Togo introduced some important reforms related to the payment of tax and duties. From the replacement of some taxes to the cancellation of others through exemptions, the country has only one objective: offer the most attractive tax framework to investors and economic operators. To achieve this, the authorities relied on digitization.

Continued here:

Decentralization: In Togo, Yoto III municipality adopts a CFA4 billion ... - Togo First


At SXSW 2023, Web3ers Pitch Their Decentralized Museum and Meet Skepticism – ARTnews

A panel at the 2023 edition of SXSW titled "The Decentralization of Art As We Know It" began with a scene that bordered on self-parody. Alex Scull, the moderator and the executive director for arts and collections at the University of Texas at Austin, asked 2023's favorite interlocutor, ChatGPT, the question of the day: What is the decentralization of art?

"The decentralization of art refers to the movement away from traditional centers of power and authority in the art world," the world's favorite AI said, in an answer only mildly edited for brevity. "However, with the advent of digital technology and the internet, there's been a shift towards decentralization, which has also led to the democratization of art, allowing for a wider range of voices and perspectives to be heard. This challenges the traditional power structures and hierarchies of the art world as artists are no longer reliant on a few gatekeepers to reach a wider audience."

While the talk was ostensibly about this decentralization, the discourse mostly revolved around Arkive, a Web3 startup that raised $9.7 million last year around this idea. That may have been due to the fact that the panel comprised the company's executive director Tom McLeod, chief curator and former Gagosian director Kelly Huang, and Gabby Goldberg of TCG Crypto, an investment company that funds the startup. All were, perhaps unsurprisingly, impressed by ChatGPT's answer.

Arkive has said it wants to decentralize the museum through a member-based organization similar to the decentralized autonomous organization (DAO) that tried, and failed, to buy a copy of the United States Constitution last year. Arkive allows the members to propose works to the community, vote on what should be acquired, and build a museum collection from the bottom up, according to its website. In other words, to replace a museum's curatorial board with the collective voice.

Early in the panel, McLeod neatly summed up what he believed ChatGPT's answer was missing, and what just so happened to be Arkive's raison d'être: broad-based provenance. McLeod said the value and cultural significance of a piece of art is often judged by parameters like who has acquired it, what curators vouch for it, and what institution puts it on display.

"And if you start to look at the masses getting to be a part of that, [something] that hundreds of thousands of people think is important, as opposed to just ten key people at specific museums or institutions, that actually carries weight," McLeod went on. "It may not still have the overwhelming weight of academic research or historical significance. But it matters."

Arkive, he said, is taking the same academic-first research perspective as a traditional institution, with the added input of its community. The goal is to support existing museums by lending out its acquisitions and essentially becoming a really attractive private collector that works to support the overall network of public art and the public good generally.

In one of his more inspired analogies, McLeod compared Arkive to the Spotify Top 100: the more traction a song gets on Twitter and TikTok, the higher it is on the list.

That choice of comparison was interesting given that, minutes later, McLeod said, looking back, social media might be a net negative.

"I think in a long view we are starting to reassess; things that we thought might be great might not all be there," he said.

Given the growing skepticism around Web3, it is perhaps unsurprising that some attendees were unconvinced. Sarah Wambold, an executive producer at the Metropolitan Museum of Art, told the panel that she couldn't help but interpret Arkive's mission as the further commodification of art.

"Museums are non-profits, they are mission-based organizations set up for the public good, which is why they have tax-exempt status. So, aside from exposure, what are you doing to directly support artists?" Wambold asked the panel.

McLeod responded that Arkive has started a 501(c)(3) organization that would specifically grant funds back into the program for artist-in-residence programs that focus on underrepresented artists, an editorial team, and an independent magazine, as well as videos and text that would contextualize why they chose to acquire a given artist. And, of course, exposure.

"Do those efforts ultimately support your financial gain?" Wambold asked.

"I don't think the value of the portfolio is how we'll make money," McLeod said, adding that the business model relies more on institutions using Arkive's protocol to verify items, "but if the question is, does that make the artists more well known, which then increases the value of the art? Maybe, I actually don't know if that's a one-to-one correlation, but probably."

The session was nearly over at this point, but Huang had time to chime in. "Our goal is to create space in the same way that The Met creates space for audiences to engage, you know, it's through our public programs. We use the term museum to signify the fact that we are actually trying to be as artist-centric and as research-based as possible, that it is just about creating a platform for people to engage. And our focus is really there. It's really not like, how do we make money, you know, especially off of the artists' backs."

Another audience member, who works closely with art institutions and is knowledgeable about Web3, was skeptical about not only the panel's purpose, but Arkive as a whole.

"They're a non-museum who says they are a museum. At the same time, they're a DAO that claims not to be a DAO," he told ARTnews. "Ultimately, I think Arkive suffers from what so many of these crypto-communities suffer from: opacity. It seems they are essentially trying to replace one elite with another elite while not being completely transparent about how their community functions."

Listen to the full panel here.

Read this article:

At SXSW 2023, Web3ers Pitch Their Decentralized Museum and Meet Skepticism - ARTnews


The Creators of the First NFT Talk New Collection, Web3 Future – nft now

Jennifer and Kevin McCoy need little introduction. Ten years ago, the acclaimed artists introduced the world to Quantum, the first art NFT ever created. Quantum found its home on the Namecoin blockchain in 2014, when Kevin McCoy decided to mint it as an NFT to establish the provenance of the digital image.

The Web3 world might see minting an NFT for such a reason as a banality these days, but in 2014, the idea was revolutionary. Provenance documents for digital art didn't exist at the time, and Quantum showed the world that blockchain technology could solve the problem of ownership in the digital age.

Its landmark million-dollar sale at a Sotheby's auction in 2021 brought it the recognition it finally deserved.

Having created only a handful of NFT artworks since then, the McCoys are now releasing their first NFT collection, Land Sea and Sky. The project, which launches on Artwrld on April 5, is a collection of 310 collages that combine elements of AI-generated landscapes. nft now caught up with Jennifer and Kevin to talk about their upcoming release, how the NFT ecosystem has evolved since 2014, and the ethical quandaries that come with using AI art tools.

Given their position in the annals of Web3, Jennifer and Kevin are uniquely positioned to offer a bird's-eye view of how the crypto and NFT space has evolved since they minted Quantum on Namecoin in 2014.

Calling back to comments he made alongside Anil Dash during a 2014 conference, in which the two argued that blockchain-based tech had far more interesting things to offer the world of culture than it did to finance, Kevin recounted a particularly poignant observation he had during the 2021 bull run.


"I had this experience walking around Manhattan, looking out and [thinking], everybody here has heard of NFTs, and how weird that was," Kevin recalled. "In a very real sense, NFTs became [bigger] in the popular imagination than the money aspect of cryptocurrencies. And I think that that's because NFTs are about media, they're about ideas in a direct way. It's culture versus money. And culture wins out every time."

Web3 trends tend to be cyclical. The recent ASCII art meta that the Owls NFT project kicked off at the beginning of March put this fact on full display. The project's text-based aesthetic reignited interest in some of the earliest NFTs on the blockchain, many of which were just simple characters associated with a domain on Namecoin. They may not have been the first art-intentional NFTs Web3 witnessed (that would come with 2014's Quantum), but they nonetheless form an integral part of the space's history.

As individuals who have been creating art for decades, the McCoys unquestionably advocate for people to learn and honor the value of historical developments in art and technology. But they also lament that they've seen history forgotten time and again. The pair even go so far as to say that history slows down the market. When a new development occurs, few people are interested in shining a light on the things that came before it.

This is one of the reasons Quantum's 2021 Sotheby's auction was such an important moment for both the McCoys and the broader Web3 community: a vital piece of NFT history finally received its due.

Web3's future is as unpredictable as ever, but the McCoys think the dynamic of decentralization is still finding its way. Pointing to traditional art institutions' role in guiding conversations and cultural movements, Jennifer noted that, while admittedly centralized, such organizations provided some clarity in separating the signal from the noise.

"Everyone is excited about peer-to-peer disintermediation, the idea of direct connection," she explained. "And yet, in the absence of institutions to gather us together and point towards an ongoing conversation, it becomes really overwhelming and kind of frightening to have to sort through everything. And I think that's what good institutions can do."


The two remain big believers in Web3's egalitarian potential, however. Kevin was quick to point out the recent resurgence of discussion surrounding faith in cryptocurrency and DeFi to free people from the disastrous mismanagement of the financial sector.

"The fraud in [the NFT space in] 2022 was pretty intense," Kevin said, "and it rocked a lot of people. But then, out of the blue, we have a banking crisis [with SVB], and, all of a sudden, the core argument for crypto dating back to Satoshi is once again brought up. So, you never know what's going to happen. We are believers in this technology [and] decentralization. The idea of digital scarcity, uniqueness, and property isn't going to go away."

While that's basically a guarantee at this point in Web3's history, it doesn't hurt to hear it from one of the people who helped ignite the digital revolution in the first place. And nothing speaks to their belief in the importance of digital scarcity and uniqueness more than their new work.

Collage work has long appealed to the McCoys, who deeply appreciate the idea of combining disparate visual and cultural elements into something new. To create the collages for Land Sea and Sky, the duo analyzed Ansel Adams' body of work, highlighting certain landscape sections of his photographs that caught their eye.

Taking those outlines and using them as a basis with which to structure each hand-made piece, the pair then used the AI art tool Stable Diffusion to generate hundreds of landscape images. They then sorted the images into five descriptive databases: land, sea, mountains, trees, and sky. Custom scripts then searched these databases to produce recombinant landscapes, and the areas in each collage that come from shapes taken from Ansel Adams' photographs were then filled in with one of the five types of generated AI landscapes that the McCoys produced.
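As a rough illustration of that recombination step, a minimal sketch in Java might look like the following. This is not the McCoys' or Artwrld's actual script; the category names mirror the five databases described above, but the outline IDs, file names, and selection logic are assumptions made purely for illustration.

```java
// Hypothetical sketch of the recombination step: each outlined region in a collage
// is tagged with one of the five categories, and an AI-generated image is drawn at
// random from that category's pool. Category names mirror the databases mentioned
// above; the outline IDs and file names are invented for illustration.
import java.util.List;
import java.util.Map;
import java.util.Random;

public class RecombinantLandscapeSketch {

    enum Category { LAND, SEA, MOUNTAINS, TREES, SKY }

    // A region cut from an Adams-derived outline, tagged with the scenery type
    // it should be filled with.
    record Region(String outlineId, Category category) {}

    public static void main(String[] args) {
        // Placeholder pools; in practice these would point at the generated images.
        Map<Category, List<String>> pools = Map.of(
                Category.LAND, List.of("land_001.png", "land_002.png"),
                Category.SEA, List.of("sea_001.png", "sea_002.png"),
                Category.MOUNTAINS, List.of("mountains_001.png"),
                Category.TREES, List.of("trees_001.png", "trees_002.png"),
                Category.SKY, List.of("sky_001.png"));

        List<Region> collage = List.of(
                new Region("outline-17-a", Category.SKY),
                new Region("outline-17-b", Category.MOUNTAINS),
                new Region("outline-17-c", Category.TREES));

        Random rng = new Random(310); // fixed seed keeps a given edition reproducible
        for (Region region : collage) {
            List<String> pool = pools.get(region.category());
            System.out.printf("Fill %s with %s%n",
                    region.outlineId(), pool.get(rng.nextInt(pool.size())));
        }
    }
}
```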

The result is a fusion of 20th-century landscapes as filtered through 21st-century technology.

"We knew it would be a landscape project," Jennifer McCoy said of Land Sea and Sky's origins. "We had done experiments with jagged cutouts and juxtaposing ill-fitting kind of elements of landscape before, but with Stable Diffusion and AI gaining ground, we [thought] this might be a really interesting way to go straight from imagination to landscape without intervening photographic sources, which was really exciting."


And while tools like Stable Diffusion are at the center of a heated debate about the ethical use and existential implications of ever-more capable AI systems, the McCoys were curious to explore how they could use them in their artistic practice. They explained that one of the most appealing aspects of these tools is how they evoke the idea of a visual statistical average.

Elaborating on what appeals to them about AI's summative nature, the pair pointed to Russian duo Vitaly Komar and Aleksandr Melamid's 1994 work, The People's Choice, as a parallel. Komar and Melamid were Russian émigrés who were fascinated by the idea of the American people and how they would express their collective preferences for art and culture.

Komar and Melamid commissioned a public-research polling firm to survey U.S. citizens accordingly; the survey included questions on lines and curves, colors, size and shape, content, and even what figures they liked to see depicted. Komar and Melamid then created paintings featuring the most and least popular elements, forming a damning satire of both creation-by-committee and the idea that artistic expression is an elitist endeavor.

"What is the idea of a California coast? 110 ideas of a California coast? What is 40 images of Vermont trees? These were the kinds of questions we were looking at," Jennifer offered as a perspective on how the pair approached Land Sea and Sky. "And the scripting that we did was essentially combining those into specific formats based on Adams' [work] from the Sierra and Yosemite. We're kind of driving the American East through the American West."

The collages include AI-generated depictions of all 50 U.S. states. The couple was interested in depicting various environments through Adams' lens, especially scenes that are entirely disparate from the photographer's rugged depiction of grand-scaled nature in the west of the country.


An artist's job, Kevin emphasized, is to look out at the world and report back what they see, and landscape is one of the primary genres where that happens. Only now, the landscape is a digital one, an algorithmic landscape that, thanks to new AI technologies, can be used to report back in an entirely new way.

The concerns that prompt-based AI art tools have given rise to are nonetheless not lost on the couple. Kevin believes that critics who point to their social aspect (the fact that these programs are trained on the works of others) are absolutely right to ask who benefits from these tools' use and who is owed what as a result.

"These tools are going to be pretty weird," Kevin acknowledged. "The impacts are unknown, and I think that people are right to be nervous."

Land Sea and Sky is imbued with a deep sense of nostalgia. The project's Artwrld page notes that the AI-generated images in the collection are meant to evoke something closer to memory than fantasy, with hints of cross-country road trips, Ektachrome film, and the strange, hazy yellow-green tint of [] childhood snapshots spread throughout.

But that sense of memory, Kevin says, stems not from the McCoys' own childhoods but from their interpretation of the anxiety people feel toward the real world as they are increasingly thrown into the virtual one.

"There's this kind of question and desire for, or remembrance of, that physical world. Nostalgia for the real," he explained.

While speaking to nft now about the project, Artwrld's Artistic Director and founding partner Nato Thompson drew attention to the unique dynamic the McCoys have in being a couple that produces art together.

"They often work with these kinds of juxtaposing technologies," Thompson explained. "They're thinking about both cinema and landscape, but they're also a couple, so they inevitably are almost a collage themselves. There's a certain kind of pushing up against things that's in that work, that I think perhaps is the result also of two people making work together."

The 310 pieces in Land Sea and Sky are split into two parts: 300 short-form collages and 10 longer-form pieces accompanied by audio. And while there's nothing particularly unique about landing on 310 as a supply for the collection, the McCoys observed that utilizing NFTs allows them to maintain serial uniqueness at a scale that might otherwise be impossible in more traditional art forms.

The McCoys have chosen the non-profit Rhizome, a born-digital art platform, to receive a portion of the proceeds from primary and secondary sales. Those looking to gain priority access to the drop can do so by minting an NFT from one of Artwrld's previous collections by March 31. Minting for Land Sea and Sky for the general public goes live on April 6.

Go here to read the rest:

The Creators of the First NFT Talk New Collection, Web3 Future - nft now


White House to Regulate Cloud Security: Good Luck With That – Security Boulevard

Biden administration wants new regulations for cloud providers. But we're not sure it'll help.

Old people in suits propose new bureaucracy in an attempt to make IaaS, PaaS and SaaS more secure. Amid much tut-tutting about SolarWinds, they seem convinced they can make a difference.

The internet disagrees. In today's SB Blogwatch, we unpick the arguments.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Uptown Car.

What's the craic? John Sakellariadis reports Biden administration is embarking on the nation's first comprehensive plan to regulate the security practices of cloud providers:

Cloud providers haven't done enough: Governments and businesses have spent two decades rushing to the cloud, trusting some of their most sensitive data to tech giants that promised near-limitless storage, powerful software and the knowhow to keep it safe. Now the White House worries that the cloud is becoming a huge security vulnerability. If the government fails to find a way to ensure the resilience of the cloud, it fears the fallout could be devastating.

For all their security expertise, the cloud giants offer concentrated targets that hackers could use to compromise or disable a wide range of victims all at once. And cloud servers haven't proved to be as secure as government officials had hoped. Hackers from nations such as Russia have used cloud servers from companies like Amazon and Microsoft as a springboard to launch attacks. Cybercriminal groups also regularly rent infrastructure from U.S. cloud providers.

Cloud providers haven't done enough to prevent criminal and nation-state hackers from abusing their services, officials argued, pointing in particular to the 2020 SolarWinds espionage campaign. [And they] express significant frustration that cloud providers often up-charge customers to add security protections: Agencies that fell victim to the Russian hacking campaign had not paid extra for Microsoft's enhanced data-logging features.

Maybe more from Matt Milano? Biden Administration Prepares to Regulate Cloud Security:

Cloud security lapses: There's hardly any aspect of daily life that isn't touched by the cloud in some way. That ubiquity is a source of concern. [So] the Biden Administration now views the cloud industry as too big to fail.

Unfortunately, while companies have raced to deploy cloud platforms and services, cloud security has often lagged behind, leaving organizations and individuals vulnerable. Even worse, critical infrastructure has come under attack as a result of cloud security lapses.

Will it work? Stephen E. Arnold observes thuswise, Big Tech, Lobbyists, and the US Government:

Armies of attorneys

Here's what stood out to rdevsrex:

The Biden administration will require cloud providers to verify the identity of their users to prevent foreign hackers from renting space on U.S. cloud servers.

Wait. Pause. Joe'll do what now? Here's a slightly sarcastic u/ryosen:

Oh good. A bunch of septuagenarians that have demonstrated, time and again, that they lack even the most fundamental understanding of how technology works, are going to legislate how technology should work. I'm sure this will be just fine.

And this Anonymous Coward is nonplussed:

Ignoring the "hackers" scare-wording, actual foreign spies have no problem getting US identity cards. So this is zero protection.

I don't buy for a moment that the POTUS with the best advisors US government dollars can buy don't know this. So it's for another reason. And that reason is the same as why China demands every citizen register for online services with their government identity: To keep tabs on political adversaries.

This is fine. u/sometimesanengineer sips coffee amid the conflagration:

The US government doesn't understand cloud enough to properly regulate it. I've seen enough stuff get past C3PAO to anticipate a meaningless designation getting applied that customers think absolves them of their piece of the Shared Responsibility Model. Same as we've seen with Azure Government or AWS GovCloud.

Information has a tendency to be left off architecture and design documentation. Policies / procedures / practices claimed in controls compliance are not necessarily followed. Layers of the system or components of the system are often left out. And changes are made for expediency's sake, often to fix something else that's broken, which in complex systems is a quick way to screw things up.

Lawmakers gonna lawmake. techno-vampire predicts pointlessness:

Let me guess: At least 75% of any new regulations will require cloud providers either to do things or stop doing things that are already covered by existing regulations. And most of the remaining 25% will either be useless, or so ambiguous that nobody will be able to tell if any company is following them or not. That's because the only point of creating these new regulations will be so that the Administration can claim that they did something.

Meanwhile, u/fractalfocuser laughs and laughs and laughs:

Ohhhh lord this is too funny. Quick everybody! Put the cat back in the bag!

Funk Wash!

Previously in And Finally

You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites so you don't have to. Hate mail may be directed to @RiCHi or sbbw@richi.uk. Ask your doctor before reading. Your mileage may vary. Past performance is no guarantee of future results. Do not stare into laser with remaining eye. E&OE. 30.

Image sauce: DinkeyHotey (cc:by-sa; leveled and cropped)


Read the original:
White House to Regulate Cloud Security: Good Luck With That - Security Boulevard


Devoli bucks cloud migration trend with hybrid on-prem move to HPE – Reseller News

Craig Murphy (Hewlett Packard Enterprise)

Network automation vendor Devoli has bucked the trend of cloud migrations, instead opting for a move to a hybrid on-prem solution powered by Hewlett Packard Enterprise (HPE).

Three years ago, Devoli had three core platforms that supported all voice applications and network management systems. With Dell servers and many of its monitoring and backend systems sitting with AWS, the systems and equipment were nearing end of life.

"We run telco applications that require very high bandwidth and found that AWS couldn't keep up with the necessary processing power," said Ken Nicod, networking director at Devoli.

Also of concern was the increasing costs involved with a voice network that makes lots of small processing transactions. With public cloud no longer a long-term viable option, Devoli looked to create its own.

Searching for a solution that would allow Devoli to continue to support its customers seamlessly, Nicod was in the market for a full five-year engagement and a partnership with end-to-end service.

Devoli had an existing relationship with Ingram Micro, which put forward HPE's Alletra dHCI product, which includes HPE Storage and HPE ProLiant servers, and could provide streamlined infrastructure and optimised performance.

For Craig Murphy, general manager of channels at HPE, moving to an on-prem solution was an interesting scenario given the current wider trends of migrating to the cloud.

"With everyone saying that mass migration to the cloud is the solution to everything, we are saying, 'No, not quite. What's fit for purpose? Where does the data need to reside to get the best outcome?'" he said. "We have come to the realisation that mass migration to the cloud is not the answer; hybrid models are the way to go."

"We're in a hybrid world and HPE has positioned itself to be the perfect partner for that. It doesn't matter what cloud you're in; we can interact with it and we can also have an on-prem or co-load solution at the same time. That's the way of the future for us."

With a partnership set in motion, this would be the first time such a solution had been employed in the Asia Pacific region.

"There's a lot of drive to go to someone else's cloud but not as much to create your own," Nicod said.

"Because of that, this engagement was exciting and different from most projects. We gained direct access to the HPE team, including their engineers."

The delivery of the project occurred at the height of the COVID-19 lockdown, presenting the additional challenges of masking and keeping engineers in bubbles. Following an eight-week deployment process, 68 out of 74 applications were migrated to the private cloud in one day.

"It was really tough times; we were in the thick of COVID, we had our deployment guys in full PPE with lots of strong protocols around them and they had to be in discrete bubbles. Managing that, as well as trying to get the supply chain to function, was a challenge," Murphy said.

"Ingram Micro facilitated a lot of that side of things for us, so it was a great success from a partnership perspective; with the three of us involved, it all converged for us."

Devoli's network infrastructure and security team now runs entirely on the private cloud, with the exception of a few monitoring and alert systems in case the platform fails.

The software side of the business still operates through AWS due to flexibility for the development team, Nicod said, but over the next three to six months the company intends to migrate as many applications as possible to the private cloud.

While the number of platforms has been reduced from three to one to serve both Australia and New Zealand, Nicod says reliability has improved and costs have been reduced by 50 per cent in migrating to the private cloud.

Other notable outcomes include a 50 to 75 per cent improvement in the time to deploy support applications, provision new workloads and deploy new compute and storage resources. Devoli has also seen a 25 per cent improvement in lifecycle management tasks, less time troubleshooting issues and automated capacity planning that makes the team's routines significantly faster.

"Devoli's differentiator is our ability to deliver any broadband voice service within a few hours, not weeks or months," Nicod said.

"We have some large customers who consume a lot of voice traffic. Now we have the enhanced reliability to move faster, which has garnered us even more business."



Read more:
Devoli bucks cloud migration trend with hybrid on-prem move to HPE - Reseller News


BT picks AWS Wavelength for 5G and the cloud – Capacity Media

The partnership combines AWS's cloud with BT's 5G and 4G infrastructure. Specifically, EE's national mobile network with AWS Wavelength will bring AWS closer to the network edge, delivering faster, secure and high-bandwidth connectivity for use cases like policing, crowd management, healthcare and security.

"As we continue to build best-in-class 5G infrastructure for the UK, launching the AWS Wavelength service for our business and wholesale customers is a hugely important step on our journey bringing the power of the cloud to the UK's best network. It's set to unlock use cases like IoT cameras to help first responders keep communities safe: a real-life example of using tech to connect for good," said Alex Tempest, managing director for BT Wholesale.

"By building cloud edge services into our 5G and 4G EE network, we can accelerate innovation across industries, and bring fast, secure data processing closer to where our customers need it most. Ultimately, we want to give businesses and public sector organisations all the power of edge computing, wherever they are."

The news forms part of BT's investment in its existing mobile networks, to enable 5G-connected infrastructure as a service via AWS Wavelength.

This includes switching on a new AWS Wavelength Zone in Manchester, which will service trials for eligible businesses and public sector organisations within a 100km radius, with plans to roll out AWS Wavelength to more business customers across the UK in the near future.

AWS Wavelength embeds AWS compute and storage services within 5G and 4G networks, providing mobile edge computing infrastructure for ultra-low-latency applications. Hosting services directly at the edge of EE's UK network reduces lag, as application traffic can reach application servers running in the AWS Wavelength Zone without leaving BT's network.
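As a rough sketch of what that looks like from a developer's side, the snippet below places a subnet in a Wavelength Zone using the AWS SDK for Java v2. The VPC ID, CIDR block and zone name are placeholders and assumptions, not details of the BT/EE deployment described in the article.

```java
// Hedged sketch: creating a subnet pinned to a Wavelength Zone so that workloads
// run at the carrier edge. Uses the AWS SDK for Java v2; every identifier here is
// a placeholder, not a detail taken from the BT/EE deployment.
import software.amazon.awssdk.services.ec2.Ec2Client;
import software.amazon.awssdk.services.ec2.model.CreateSubnetRequest;
import software.amazon.awssdk.services.ec2.model.CreateSubnetResponse;

public class WavelengthSubnetSketch {
    public static void main(String[] args) {
        try (Ec2Client ec2 = Ec2Client.create()) {
            CreateSubnetResponse response = ec2.createSubnet(CreateSubnetRequest.builder()
                    .vpcId("vpc-0123456789abcdef0")    // placeholder VPC
                    .cidrBlock("10.0.2.0/24")          // placeholder CIDR block
                    // Wavelength Zones are addressed like availability zones; this
                    // zone name is an assumed example for illustration only.
                    .availabilityZone("eu-west-2-wl1-man-wlz-1")
                    .build());
            System.out.println("Edge subnet created: " + response.subnet().subnetId());
            // Traffic from devices on the mobile network reaches instances in this
            // subnet via a carrier gateway attached to the VPC, which is what keeps
            // application traffic from leaving the operator's network.
        }
    }
}
```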

Read this article:
BT picks AWS Wavelength for 5G and the cloud - Capacity Media


Living with data breaches in unregulated cyberspace – The Express Tribune

ISLAMABAD:

Data fusion, cloud computing and internet-enabled devices have brought us the greatest threat since the Cold War: the risk of cyber-attacks from proxy states.

With Pakistan's public sector institutions frequently being attacked by terrorist adversaries operating over anonymous TOR networks, it is becoming critical to train and organise a cyber force to assist the government in managing escalation in case of a cyber conflict.

Recently, LeakBase accessed the consumer data of Paysys Labs, an intermediary that integrates SBP's Raast services through its middleware, and published data of more than 50,000 users on the dark web.

The Philippine Cyber Alliance has attempted to attack over a dozen government websites this month, not to mention a cyber terror group that has published personal details of Punjab government employees.

Data of many private companies such as AM International and medIQ has reportedly been released on hacker forums.

It is crystal clear that individuals, businesses, and local governments can't bear this additional burden of ensuring cyber security, and this domain must be dealt with by specialist organisations with niche technology to safeguard against these attacks. Not only are short-term defensive measures urgently required, but there is also a need to take a strategic approach to build resilience in IT systems.

What it means for policymakers is to isolate database systems from each other, wherever possible, and avoid funding programmes that lead to data fusion. For example, integrating NADRA, FBR and banking systems is too dangerous, though such an integrated system offers a dream dashboard for authorities.

Though their individual APIs are secure, the integrated architecture inherently offers too much power to hackers.

Similarly, the fact that NTDC has an online dashboard available, which could be manipulated by a malicious user, leaves the entire electricity supply chain prone to attack. Russia has been attacking Ukrainian infrastructure, including power grids and banks, for a decade now.

Tracking such an attack or locating the cyber terrorists is tricky. A Russian hacker, over a VPN running in the US, may be using phishing emails to install malicious software on computers connected to our government's intranet, stealing data by uploading it to a Chinese cloud server.

Using the US as a proxy to launch the attack while collecting data on another server in China makes it difficult to geolocate such individuals. Tracking people in cyberspace becomes a jurisdictional nightmare, making cyber warfare a weapon of choice for ransom groups.

With multiple elections due to be conducted this year, adequate cyber security measures need to be taken timely as many countries have cyber weapons to influence election results as well as public opinion.

By leveraging social media platforms run by Meta, it is very easy to use behavioural tools along with targeted ads to influence public sentiment, along the lines of Cambridge Analytica.

What we need to do is to create awareness for promoting data privacy and best practices to handle the online public data at large. Resilience of our critical infrastructure and essential services must be the top priority and strict SOPs need to be built into the system.

Cyber audits of critical assets, including the banking system, need to be conducted by the concerned regulatory authorities, and security auditors need to thoroughly review protocol stacks and software components every quarter, building a list of every component's license, patch releases, and dependencies.

So, in case a particular software component gets compromised, all organisations whose IT systems were built using that component could be timely alerted.
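A minimal sketch of such a component register (an assumption about how the inventory described above might be structured, not a reference to any existing national system) could be as simple as a lookup from a compromised component to the organisations that depend on it:

```java
// Hypothetical sketch of the component inventory described above: each organisation
// reports the software components its systems were built with, so a compromised
// component can be traced back to everyone who needs to be alerted. All names,
// versions and licenses below are invented for illustration.
import java.util.List;
import java.util.Map;

public class ComponentRegisterSketch {

    record Component(String name, String version, String license) {}

    public static void main(String[] args) {
        // Organisation -> components used in its IT systems.
        Map<String, List<Component>> register = Map.of(
                "ExampleBank", List.of(
                        new Component("log4j-core", "2.14.1", "Apache-2.0"),
                        new Component("openssl", "1.1.1k", "OpenSSL")),
                "ExampleUtility", List.of(
                        new Component("openssl", "3.0.1", "Apache-2.0")));

        // A component flagged as compromised by a security advisory.
        String compromised = "log4j-core";

        register.forEach((org, components) -> components.stream()
                .filter(c -> c.name().equals(compromised))
                .forEach(c -> System.out.printf(
                        "Alert %s: uses %s %s (license %s)%n",
                        org, c.name(), c.version(), c.license())));
    }
}
```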

However, cyber security can also lead to less ease of doing business as a stringent SOP can slow down the clock speed of commercial operations.

For example, if NADRA stops issuing online ID cards and other certificates, fearing that fingerprints and signatures could be leaked and forged on illegally issued stamp papers to seal fake contracts, the inconvenience caused to an ordinary citizen will be enormous.

Similarly, the IoT devices that are quickly penetrating the mass market to control household appliances remotely are a great convenience, but unfortunately all our data also gets dumped into servers located overseas.

In a worst-case scenario, our electrical appliances could also be controlled by any foreign cyber terrorist or a ransomware group.

Overall, the future of cyber security will require continued investment in latest technologies and approaches to keep pace with the evolving threats.

However, surveillance of citizens on the pretext of cyber security must be discouraged, and a policy shift is needed towards cyber security that ensures minimal infringement on citizens' right to privacy while taking holistic security measures.

The writer is a Cambridge graduate and is working as a strategy consultant

Published in The Express Tribune, March 20th, 2023.


See the original post:
Living with data breaches in unregulated cyberspace - The Express Tribune


How memory management is key to scaling digital twins in the cloud – Diginomica

Scientists have been building supercomputers for simulating climate change for decades on special-purpose machines using specially crafted algorithms. Today powerful cloud computers are growing in compute and raw memory capacity for running industrial simulations. However, some consideration must be given to how this memory is managed to get all these different models to work together.

Memory issues may not be the first thing that comes to mind as enterprises and researchers build ever-larger digital twins, but they could become more significant as those models push the limits of adaptive planning scenarios like climate resiliency or building better products. The big challenge comes with predictive and prescriptive analytics designed to tease apart the knock-on effects of climate change on businesses and regions.

Building more accurate models means increasing the resolution and types of data. But this can also create hiccups that can stall models required to test various scenarios. This can be significant when running dozens or even hundreds of models to tease out the impact of multiple strategies or assumptions. Many of the largest models today, built in programming languages like C, require a lot of hand-tuning to keep memory requirements in check. Meanwhile, programming languages like Java, with ambitious memory management capabilities, could be vital in building more extensive and flexible digital twins.

Maarten Kuiper, Director of Water International at Darieus, a civil engineering firm based in the Netherlands, has been developing ever larger digital twins for planners, farmers, citizens, and businesses planning for climate change. In some respects, the Netherlands has been on the front lines of climate change for decades, with ambitious efforts to protect low-lying lands from rising seas.

These days, Kuiper is helping plan against new combinations of floods and droughts. During a flood, it may be tempting to try and run all the water out to sea as quickly as possible, but then groundwater loses a valuable buffer against salt water running in. He was an early adopter of digital twin simulation tools from Tygron that allowed him to combine and overlay data sets about land elevation, hydrological conditions, land values, and demographic conditions.

The software also makes mixing and matching models from different sources easier. For example, he finds the latest tree models do a better job at modeling a tree's ability to suck up water, its impact on nearby structures, and how they are affected by wind and elevation. Kuiper says:

Many people look at trees from different angles. You need to bring all those people together to make better decisions. With water security and climate change, we must bring citizens, governments, and businesses together.

Digital twin frameworks make it easier to bring new data sets, models, and visualizations for different use cases. A business might want to see how flooding or, conversely, lands subsiding might impact shipping routes, compromise the integrity of facilities, or affect supply chains. For example, the Port of Rotterdam used the same software to help plan a massive port expansion. This allowed them to align investment with new expansion with returns to guide profitable growth.

A big challenge is bringing more data to bear on better predictions and recommendations for planners. Kuiper explains:

We were early adopters. It started with a great visualization. But then we also need calculations for all kinds of simulations in their own domain. For example, we might need to calculate groundwater levels when the rain falls or what happens with a heat event. We needed software that could combine all those simulations in real time since the results are interconnected. This has helped us integrate analysis with all kinds of stakeholders who might be looking at something from different angles. It was also important to have information quickly in case of a disaster.

For example, in the wake of a flood, adding a relatively small earth bank in the right place can help adapt much better than a larger change elsewhere. A fast digital twin allows them to calculate all sorts of scenarios before acting in the real world. It also allows them to evaluate dynamic actions.

These larger digital twins would not have been possible without better memory management. Maxim Knepfle, CTO of Tygron, started working on the platform shortly out of high school. He adopted the Java programming language to strike the right balance between speed and performance. But he started running into long pauses as these digital worlds grew. Past a certain point, the simulations would pause for an extended period, which kept the simulations small or coarse. He had to keep the size of each grid cell about twenty to thirty meters on a side, which also limited the accuracy and precision of the models. Knepfle says:

In those large data sets, the normal Java virtual machine would freeze for about two or three minutes, and your entire application would freeze.

While at the JavaOne conference, he stumbled across Azul, which was doing cutting-edge work in building more performant garbage collection into the Java runtime. He tried the new runtime, which cut the pauses to several milliseconds versus several minutes. This enabled his team to scale the latest models past twenty terabytes to support grids as small as twenty-five cm on a side with over ten billion cells.

Even with the explosion in new languages, Knepfle is still a big fan of Java, since it is faster than Rust or Python and automates the underlying resources better than languages like C++. This is important in building better digital twins, since the team wants to be able to bring in the latest algorithms and have them run quickly, and that becomes a problem when the data sets get big.
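The pause behaviour described here is easy to reproduce with a toy workload. The sketch below is a hypothetical grid-of-cells allocation loop, not Tygron's code, and the JVM flags in the trailing comment are standard OpenJDK options rather than Azul-specific ones; running the same class under different collectors and comparing the GC logs shows where the long stop-the-world pauses come from.

```java
// Toy allocation workload loosely inspired by the grid-of-cells models described
// above. The Cell fields, cell counts and churn pattern are invented for illustration.
import java.util.ArrayList;
import java.util.List;

public class GridAllocationDemo {

    // Each cell is a small object; a fine-grained grid means hundreds of millions
    // of them, which is what stresses the garbage collector.
    record Cell(double elevation, double groundwater, int landUse) {}

    public static void main(String[] args) {
        int cells = args.length > 0 ? Integer.parseInt(args[0]) : 50_000_000;
        List<Cell> grid = new ArrayList<>(cells);
        long start = System.nanoTime();
        for (int i = 0; i < cells; i++) {
            grid.add(new Cell(i % 100, (i * 7) % 13, i % 5));
            // Periodically drop a slice of the grid to force old-generation churn,
            // which is where long stop-the-world pauses tend to show up.
            if (i > 0 && i % 10_000_000 == 0) {
                grid.subList(0, 1_000_000).clear();
            }
        }
        System.out.printf("Built %,d cells in %.1f s%n",
                grid.size(), (System.nanoTime() - start) / 1e9);
    }
}

// Compare collectors by running the same workload with GC logging enabled, e.g.:
//   java -Xmx16g -Xlog:gc GridAllocationDemo                (default G1 collector)
//   java -Xmx16g -Xlog:gc -XX:+UseZGC GridAllocationDemo    (low-pause ZGC, JDK 15+)
// and inspecting the reported pause times in the log output.
```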

Scott Sellers, CEO and co-founder of Azul, says that memory sizes available to work with have been growing thanks to cheaper memory and improvements in x86 architectures that give programmers access to more memory:

We would not have been able to do it without Moore's Law allowing more memory to be put into boxes and without help from Intel and AMD adding hooks in the microprocessor to tap into terabytes of memory. Five years from now, we will talk about maybe half a petabyte of memory in a physical box.

This is taking what used to be done on a supercomputer and enabling it in the cloud, which makes a lot of sense. Instead of building these $300 million data centers and populating them with expensive servers, we can replace them with lower-cost servers in the cloud.

The rapid advances in GPUs are paving the way for building ever-larger digital twins for industrial design, planning, predictive analytics and prescriptive analytics. Increasingly, these models will require running calculations across different types of data in parallel. For example, engineering and design teams are turning to multi-physics simulations that help identify the impact of design changes on mechanical, electrical, and thermal properties.

Other realms might similarly combine different kinds of economic, weather, demographic, and geologic models to adapt supply chains, plan expansions, or mitigate climate risks. Exploring multiple scenarios could require running lots of variations. Developers will need to consider the impact of allocating memory in creating these larger models at scale.

Read the original here:
How memory management is key to scaling digital twins in the cloud - Diginomica


Industry Insights: What will the newsroom of tomorrow be like? – NewscastStudio


Broadcast production vendors recently participated in an Industry Insights roundtable discussion on newsroom technology, looking at the current pain points and where the tech stack will help in the future.

The roundtable participants envisioned a newsroom of the future that is adaptable, diverse and more data-driven, with a focus on automation, collaboration, and integrated AI services. The workflow of the future will be streamlined, highly integrated, consolidated and agile, with greater emphasis on automation and collaboration.

Luis Fernandez, senior product marketing manager, Dalet: Newsrooms of all sizes will continue to face the challenge of catering to a wide and dispersed audience across various digital and broadcast platforms. Each platform, and its unique audience, has different expectations and ways to understand the story, which makes it difficult for newsrooms to plan, produce and distribute stories effectively and efficiently.

Miro Rusko, managing director APAC, Octopus Newsroom: I see two verticals. The first is speed, where professional newsrooms aim to deliver speedy, verified information while competing with opportunistic social media accounts that seek exposure and thus publish information without verification. The second is IT cyber-security policies, which are in some cases obstructing the concept of working from anywhere.

Craig Wilson, product evangelist in broadcast and media enterprise, Avid: One of the biggest pain points for news organizations is ensuring effective collaboration between teams to efficiently produce and deliver compelling content across multiple platforms. Of course, another pain point is the usual resistance from editorial teams to adapting to new workflows and technologies.

Adam Leah, creative director, Nxtedition: One of the main challenges that newsrooms are currently facing is the use of outdated software, complicated workflows, and a shortage of skilled professionals. The tendency to rely on traditional methods and resist change can impede the newsgathering process and make it difficult to keep up with changing trends. However, by embracing new technologies and letting go of old habits, newsrooms can overcome these challenges and adapt to the current demands of the industry.

Ionut "Johnny" Pogacean, senior product manager, Vizrt: Some of the most significant pain points are the monetization of content, the ability to produce and deliver multi-platform stories with speed, and the balance of retaining brand identity while trying to match the language of the platform they are published to.

Jenn Jarvis, product manager, Ross Video: Visibility of information, whether that is information on what stories are in progress or information from a source. With newsrooms producing more content than ever before and from more locations, lack of visibility is what leads to redundant efforts, mistakes and general frustration.

Luis Fernandez: Cloud-based tools enable newsrooms to transcend physical boundaries and be accessible from any location. With cloud-native solutions like Dalet Pyramid, news professionals can access the same technology inside and outside the newsroom with ease and familiarity. This seamless handoff helps journalists break the news faster, work more collaboratively, and access all assets, communication and production tools from wherever the action occurs.

Gianluca Bertuzzi, sales manager for Africa and Latin America, Octopus Newsroom: Cloud-based tools are helping the newsroom by providing efficient and cost-effective ways to store, share, and access large amounts of data and multimedia content. They also enable remote collaboration and streamline the workflow.

Craig Wilson: Cloud-based tools are enabling collaboration, whether it's working from the field or from other remote offices. These tools also enable access to integrated AI services to supplement technical metadata and assist in the editorial process.

Adam Leah: There is some nuance to the definition of cloud technology, as it can refer to both public cloud and private cloud servers on-premise. I understand that as cloudflation affects the cost of cloud services, many in the industry are considering alternatives to mitigate these expenses. However, it's important to remember that the focus should be on the technology itself and how it can benefit newsrooms, rather than the physical location of the servers.

Johnny Pogacean: There is some degree of familiarity with working with web tools that makes things easier and more approachable. It is vital for today's journalists to go live from anywhere, and cloud tools allow the journalist to be as close to the story as possible without having to remote into on-prem resources. In addition to that, another benefit is the quick updates that SaaS providers offer.

Jenn Jarvis: Centralizing information and collaboration in a single tool or set of tools is changing the way newsrooms work. And putting those tools in the cloud creates a consistent and cohesive workflow regardless of location. Journalists have always had to work on the go, but we've only recently gotten to a point where a remote workflow mirrors the same experience as working in the newsroom.

Luis Fernandez: Some newsrooms are more equipped than others, but the truth is, with time, keeping up with the sheer variety of content will become more and more demanding. The role of AI in this matter is critical: how can newsrooms generate all the metadata required for discoverability, repurposing, distribution and archive? Together with collaboration across different locations and between broadcast and digital teams, this is the second most mentioned concern.

Bob Caniglia, director of sales operations in the Americas, Blackmagic Design: To keep up with the large demand for content across a variety of platforms, ranging from long-form video to social media snippets, newsrooms need to invest in flexible, all-in-one tools that support all types of content production needs, while also simplifying workflows.

Gianluca Bertuzzi: Newsrooms are equipped to handle a multitude of content, but it can still be a challenge to keep up with the demand for multimedia and interactive content, as well as ensuring that all content meets the high standards for accuracy and impartiality.

Craig Wilson: There is a big demand to quickly produce a lot of quality content while tailoring and rapidly delivering it to different platforms. Today most newsrooms can either produce good content or produce it quickly, and they often have to compromise; a combination of skills training for staff and tools that can deliver content to any platform is needed.

Adam Leah: Not very many newsrooms are equipped to handle the plethora of content and platforms required in today's fast-paced news environment; they are too linear-led. The demands for content across different platforms and formats are constantly changing, and traditional newsroom installations are struggling to keep up with the pace. The lack of agile workflows and modern technologies, along with the industry-wide skills shortage, only exacerbates this issue.

Johnny Pogacean: While most cope well with gathering content, what happens after it varies significantly depending on the size of the newsroom and its resources. It's not uncommon for broadcasters to simply clip their on-air content and publish that, but that means compromising on quality. Increasingly distributed newsrooms and audiences wanting news on-demand on their preferred platforms are prompting newsrooms to adopt a story-first approach. The term story-centric is used a lot in our industry for workflows that are organized around the story, but how that looks in practice varies greatly.

Jenn Jarvis: Some more than others. The larger organizations are investing the time and energy into analyzing and building multi-platform workflows, while smaller newsrooms are often struggling to create the same content without the integrated tools. The biggest challenge for all is the rate at which content strategies and publishing platforms are changing.

Luis Fernandez: Newsrooms will get more complex with time, as new social and digital media outlets emerge and methods for reaching audiences and telling stories evolve. Newsrooms are already evolving from the linear model, focused on broadcast, and developing a more story-centric approach powered by new tools, workflows, and resources with specialized skills.

Bob Caniglia: The newsroom of the future will be adaptable and supported by powerful, hybrid technology, but most importantly, it will be diverse, as professional technology is no longer reserved for the big broadcasters. With the continued adoption of accessible virtual technologies and cloud tools, creators will collaborate from anywhere in the world and will be unconstrained by one fixed studio or location.

Gianluca Bertuzzi: The newsroom of the future is likely to be more data-driven and technology-focused, with an emphasis on automation and collaboration. The use of artificial intelligence and machine learning is likely to increase, allowing journalists to focus on more in-depth reporting and storytelling.

Craig Wilson: The newsroom of the future provides a creative, story-centric approach to writing and content creation, while enabling access to material regardless of its location. This newsroom will require integrated collaboration tools for planning, creating, tracking and distributing content to multiple platforms, and integrated AI services to aid journalists and editorial teams with their work.

Adam Leah: The future requires us to be more pragmatic and forward-thinking. If we use the new technologies in the same way we used the old technology, we will never realize their full potential. Then there is all the AI stuff, which is great for transcription, translation, subtitling and indexing content, but there may be moral and political issues around facial recognition and synthetic media; it's technologically achievable, but is it moral? That's going to be an interesting future debate.

Johnny Pogacean: AI (for better or worse) will revolutionize how content is created, processed, distributed and consumed. As journalists become more multi-disciplined, they are expected to do a lot more. The tools journalists use have to evolve to match their needs, as complexity is just moved and managed differently; it doesn't disappear. Efficiency and accessibility will become even more critical in the future.

Jenn Jarvis: What's exciting to me about the next generation of journalists is their general comfort level with technology and the rate at which technology can change. They are well positioned to adapt as delivery platforms and audience consumption change. I think we're going to see responsive newsrooms that are willing to experiment with new approaches and content formats.

Luis Fernandez: The newsroom workflow of the future will be hybrid, orchestrated and intelligent, as the needs and context in which newsrooms operate continue to evolve. Teams will need to work collaboratively and effectively regardless of the consumption platform or the work location, and their workflows will need to visualize, manage, assign, communicate, edit media and distribute stories quickly to different audiences on different platforms in order to make a real impact.

Bob Caniglia: The newsroom workflow of the future will be streamlined, as today's integrated and collaborative technologies empower creators to do more with less. Talent from all over the world will be able to create and share content in real time with their colleagues and newsrooms, contributing a diverse range of ideas and content.

Gianluca Bertuzzi: The newsroom workflow of the future is likely to be more streamlined, with a greater focus on collaboration and the use of technology to automate routine tasks. This will free up journalists to focus on more strategic and creative work.

Adam Leah: An exciting development we're working on is story versioning. With the need to cater to different age groups and various social media platforms, the ability to use ML to fork a story into multiple versions is a crucial asset. Another key requirement will be speed; speed is of the essence in breaking the news to the audience. To accomplish this, a highly integrated, consolidated and agile workflow will be a necessity, ensuring a seamless journey for the story from ideation to the viewer. Both points will need a technological step change in the newsroom.

Johnny Pogacean: I expect that content will become more interactive and highly individualized. Imagine content being augmented and enhanced by AI: users will choose their preferred style and the amount of graphics they'll see in a story. I also suspect that services like the ones we've seen in the last few months with ChatGPT will become ubiquitous, and that they will be leveraged to deliver content in a highly individualized manner and to provide the necessary context in a way that is more approachable and understandable to each individual, without the newsroom having to generate it all.

Jenn Jarvis: We're already seeing investment priorities shift to things like planning tools, asset management and analytics. The workflow of the future is going to be ecosystems where these tools are connected and cohesive. Many of the manual workflows we have today will be automated, but visibility of content and data will play important roles in how those automated workflows are built.


View original post here:
Industry Insights: What will the newsroom of tomorrow be like? - NewscastStudio
