
The Bumpy Road Toward Global AI Governance – NOEMA – Noema Magazine

Credits

Miranda Gabbott is a writer based in Barcelona. She studied art history at Cambridge University.

Just about two and a half years ago, artificial intelligence researchers from Peking University in Beijing, the Beijing Academy of Artificial Intelligence and the University of Cambridge released a fairly remarkable paper about cross-cultural cooperation on AI ethics that received surprisingly little attention beyond the insular world of academics who follow such things. Coming to a global agreement on how to regulate AI, the paper argues, is not just urgently necessary, but notably achievable.

Commentaries on the barriers to global collaboration on AI governance often foreground tensions and assume that Eastern and Western philosophical traditions are fundamentally in conflict. The paper, also published in Chinese, takes the unconventional stance that many of these barriers may be shallower than they appear. There is reason for optimism, the authors argue, because misunderstandings between cultures and regions do more to undermine cross-cultural trust than fundamental disagreements do, contrary to what is often supposed.

The narrative of a U.S.-China AI arms race sounded jingoistic and paranoid just a few years ago. Today, it is becoming institutionalized and borne out in policy in both countries, even as there has been growing recognition among researchers, entrepreneurs, policymakers and the wider public that this unpredictable, fast-growing and multiuse set of technologies needs to be regulated and that any effective attempt to do so would necessarily be global in scope.

So far, a range of public bodies, civil society organizations and industry groups have come forward with regulatory frameworks that they hope the whole world might agree on. Some gained traction but none have created anything like an enforceable global settlement. It seems possible that rivalry and suspicion between two great powers and their allies could derail any attempt at consensus.

Possible but not inevitable.

Getting policymakers from China and the U.S. around a table together is just the largest of many hurdles to a global agreement. Europe is likely to play a decisive role in shaping discussions. Though the EU is an ideological ally of the U.S., there are significant differences between the U.S. and the EU on strategic aims for AI regulation, with the former prioritizing innovation and the latter risk minimization.

More complex still, any global settlement on AI regulation that genuinely aspires to mitigate the negative consequences of this new technology must account for perspectives from regions often underrepresented in global discussions, including Africa, the Caribbean and Latin America. After all, it is overwhelmingly likely that the Global South will shoulder the brunt of the downsides that come with the age of AI, from the exploitative data-labeling jobs needed to train LLMs to extractive data mining practices.

Though a thaw in the rivalry between Washington and Beijing remains a distant prospect, there are still opportunities for dialogue, both at multilateral organizations and within epistemic communities.

A global settlement on AI ethics principles has clear advantages for all, since the effects of a transformational general-use technology will bleed across national and geographical boundaries. It is too far-reaching a tool to be governed on a nation-by-nation basis. Without coordination, we face a "splinternet" effect, wherein states develop and protect their technological systems to be incompatible with or hostile to others.

There are immediate dangers of technologists seeking an advantage by releasing new applications without pausing over ethical implications or safety concerns, including in high-risk fields such as nuclear, neuro- and biotechnologies. We also face an arms race in the literal sense, with the development of military applications justified by great-power competition: the principle of "If they're doing it, we've got to do it first."

With stakes this high, there is, superficially at least, widespread goodwill to find common ground. Most national strategies claim an ambition to work together on a global consensus for AI governance, including policy documents from the U.S. and China. A paper released by the Chinese government last November called for an international agreement on AI ethics and governance frameworks, "while fully respecting the principles and practices of different countries' AI governance," and one of the strategic pillars of a Biden administration AI research and development strategic plan is international collaboration.

There are some prime opportunities to collaborate coming up this year and next, like the numerous AI projects under the U.N.'s leadership and next year's G7, which Giorgia Meloni, the Italian prime minister and host, suggested would focus on international regulations of artificial intelligence. This July, the U.N. Security Council held its first meeting dedicated to the diplomatic implications of AI, where Secretary-General António Guterres reiterated the need for a global watchdog, something akin to what the International Atomic Energy Agency does for nuclear technology.

Yet the disruptive influence of fraught relations over everything from the war in Ukraine to trade in advanced technologies and materials shows no sign of ending. U.S. politicians frequently and explicitly cite Chinese technological advancements as a national threat. In a meeting with Secretary of State Antony Blinken this June, top Chinese diplomat Wang Yi blamed Washington's "wrong perception" of China as the root of their current tensions and demanded the U.S. stop suppressing China's technological development.

Which is why the first of four arguments from Seán Ó hÉigeartaigh, Jess Whittlestone, Yang Liu, Yi Zeng and Zhe Liu that these problems are surmountable and a near-term settlement on international AI law is achievable is so important. In times of geopolitical tension, academics can often go where politicians can't. There are precedents for epistemic communities from feuding nations agreeing on shared solutions to mitigate global risks. "You can look back at the Pugwash Conference series during the Cold War," Ó hÉigeartaigh told me. "There were U.S. and U.S.S.R. scientists sharing perspectives all the way through, even when trust and cooperation at a government level seemed very far away."

Differences in ideas about governing ethics across cultural and national boundaries are far from insurmountable.

There is evidence that Chinese and U.S. academics working on AI today are keen to cooperate. According to Stanford University's 2022 AI Index report, AI researchers from the two countries teamed up on far more published articles than collaborators from any other pair of nations, though such collaborations have decreased as geopolitical tension between the two countries has increased. Those efforts, meanwhile, took place even amid threats to the lives and livelihoods of Chinese researchers living in or visiting the U.S.: in 2018, the Trump administration seriously debated a full ban on student visas for Chinese nationals, and in 2021, according to a survey of nearly 2,000 scientists, more than 42% of those of Chinese descent who were based in the U.S. reported feeling racially profiled by the U.S. government.

Although technology occupies a different place in Chinese society, where censorship has dominated since the early days of the internet, than in the U.S., which is still somewhat aligned with Californian libertarians and techno-utopians, Ó hÉigeartaigh and his colleagues' second argument is that these differences aren't so great that no values are held in common at all.

Western perceptions of the internet in China are frequently inaccurate, which can obscure certain points of common ground. Take, for instance, the issue of data privacy. Many in the West assume that the Chinese state, hungry to monitor its citizens, allows corporations free rein to harvest users' information as they please. But according to China's Artificial Intelligence Industry Alliance (AIIA), a pseudo-official organization that includes top tech firms and research organizations, AI should adhere to the principles of "legality, legitimacy and necessity" when collecting and using personal information, as well as "strengthen technical methods, ensure data security and be on guard against risks such as data leaks." In 2019, the Chinese government reportedly banned over 100 apps for user data privacy infringements.

In the U.S., meanwhile, policies on data privacy are a mess of disparate rules and regulations. There is no federal law on privacy that governs data of all types, and much of the data companies collect on civilians isn't regulated in any way. Only a small handful of states have comprehensive data protection laws.

This brings us to the third reason why a global settlement on AI regulation remains possible. Given the complexities of governing a multiuse technology, AI governance frameworks lean toward philosophical concepts, with similar themes emerging time and again: human dignity, privacy, explainability. These are themes that both countries share.

As China's AIIA puts it: "The development of artificial intelligence should ensure fairness and justice and avoid placing disadvantaged people in an even more unfavorable position." And the White House's draft AI Bill of Rights reads, in part, that those creating and deploying AI systems should "take proactive and continuous measures to protect individuals and communities from algorithmic discrimination and to use and design systems in an equitable way."

This is not to say that incompatibilities genuinely rooted in divergent philosophical traditions can be wished away, nor that shallow accords are any foundation for lasting agreements. Rather, the point is that there is often scope to agree on specific statements, even while arriving at them from different places and perhaps even while disagreeing on abstract principles.

Here again, academia has a valuable role to play. Scholars are working to understand how different ethical traditions shape AI governance and to uncover areas where consensus can exist without curtailing culturally divergent views. Sarah Bosscha, a researcher who studies how European and Chinese AI legislation differs, told me that with respect to the EU, the greatest point of divergence is the absence of a parallel to the Confucian value of harmony, often interpreted as the moral obligation of an individual to the flourishing of their community. In China, following norms derived from Confucius, a person is not primarily an individual but a family member, part of a social unit. This order of prioritization can clearly come into conflict with the supremacy in Europe (and even more so in America) of the individual.

But as Joseph Chan at the University of Hong Kong has argued, these are not mutually exclusive values. Chinese Confucianism, by his reading, can support multiple context-independent human rights. And the Universal Declaration of Human Rights contains collectivist aspects that carry similar meanings to the Confucian value of harmony: Human beings should act towards one another "in a spirit of brotherhood" (Article 1) and have "duties to the community" (Article 29).

This overlap is borne out in policy documents, with a 2019 EU document outlining principles that emphasize community relations and containing a section on nondiscrimination against minorities. According to Bosscha, the European Union would do well to name harmony in its regulations and acknowledge its own investment in this value.

The Beijing AI Principles (2019), meanwhile, echo the language of human rights law, stating that human privacy, dignity, freedom and rights "should be sufficiently respected." Though of course, China's deployment of AI and surveillance technologies against minorities reveals that this commitment is far from fully implemented.

A fourth line of reasoning in the paper by higeartaigh and his colleagues is that a noteworthy amount of the mistrust between the West and East is due to a rich history of misunderstandings. This is due, at least in part, to an asymmetrical language barrier. Scholars and journalists in China often have a strong command of English, the lingua franca of Western academia, and can access the work of their counterparts. Meanwhile, those working in the West rarely master Chinese languages. As such, knowledge-sharing often only flows one way, with English-speaking scholars and politicians alike almost entirely reliant on translations to access policy documents from China.

Political language is usually nuanced, its subtleties rarely translatable in full. This is especially true in China. Translations of relatively ambiguous statements from Beijing on AI law have caused some high-stakes misunderstandings. For example, a 2017 Chinese AI development plan was largely interpreted by Western commentators as a statement of intent toward technological domination. This was partly thanks to a translation worded as a declaration of China becoming the world's "primary AI innovation center" by 2030. However, according to Fu Ying, a former Chinese diplomat, that was a misreading of the intent of the plan. What China wants to achieve, she wrote, is to become "a global innovative center, not the only or exclusive center," clearly a gentler goal.

But apprehension based on the translation of the Chinese plan reverberated in the American tech community nonetheless. As Eric Schmidt, a former executive chairman of Google parent Alphabet, put it at a summit in 2017: "By 2030, they will dominate the industries of AI. Just stop for a sec. The [Chinese] government said that."

There is already an overlap in AI ethics frameworks between the two nations. And debunkable myths can inflate U.S. fears of China's technology strategies.

For Ó hÉigeartaigh, the reason global efforts to create shared regulation on AI are so vulnerable to derailment lies in asking who stands to benefit from crystallizing the narrative of a U.S.-China tech race from rhetoric into policy. "If there is a race," he told me, "it's between U.S. tech companies. I am concerned that the perspective of needing to stay ahead of China is used to justify pushing ahead faster than would be ideal."

In his view, many technologists are deliberately amplifying U.S.-China race rhetoric to justify releasing software as fast as possible, cutting corners on safety checks and ethical considerations.

Schmidt chaired the National Security Commission on Artificial Intelligence and is a highly influential proponent of the race-against-China viewpoint. For years, Schmidt has pushed the Pentagon to procure smarter software and invest in AI research while maintaining a strong preference for technology deregulation. Meanwhile, his venture capital firm has invested in companies that won multimillion-dollar contracts from federal agencies.

According to AI Now's 2023 report, the crux of the problem is that AI products and the businesses behind them are increasingly perceived as national assets. The continued global dominance of America's Big Tech companies (Google, Apple, Facebook, Amazon and Microsoft) is tied to U.S. economic supremacy. Any attempt to set limits on what those companies can develop or the data they can use risks ceding vital ground to Chinese companies, which are often presumed, falsely, to operate in a regulatory vacuum.

This argument has proved remarkably influential, particularly with regard to privacy regulations. In 2018, shortly after the Cambridge Analytica scandal, Mark Zuckerberg applied this line of reasoning to warn against strengthening data rights. In particular, he stated at a Senate hearing that implementing certain privacy requirements for facial recognition technology could increase the risk of American companies "fall[ing] behind Chinese competitors." Just last year, the executive vice president of the U.S. Chamber of Commerce argued that data privacy guidelines outlined within the AI Bill of Rights, which were intended to bring the U.S. closer to the EU's GDPR, were a bad idea "when the U.S. is in a global race in the development and innovation of artificial intelligence." Needless to say, conflating deregulation with a competitive edge against China doesn't bode well for attempts to cooperate with its policymakers to agree on global regulations.

Fortunately, the U.S. government is not entirely batting on behalf of Big Tech. The Biden administration has taken clear steps to enforce competition through antitrust law, against the wishes of tech monopolists. A 2021 executive order declared that "the answer to the rising power of foreign monopolies and cartels is not the tolerance of domestic monopolization, but rather the promotion of competition and innovation by firms small and large, at home and worldwide."

So, though a thaw in the rivalry between Washington and Beijing remains a distant prospect, there are still opportunities for dialogue, both at multilateral organizations and within epistemic communities. As academics have shown, differences in ideas about governing ethics across cultural and national boundaries are far from insurmountable. There is already an overlap in AI ethics frameworks between the two nations. But unfortunately, durable myths continue to inflate U.S. fears of China's technology strategies.

Though the path to agreeing on a set of global ethical guidelines between rivals may be bumpy, there's nothing inevitable about the future direction this technological rivalry will take.


July mining output falls to lowest level in four months – IOL

Mining output in South Africa will struggle to recover for the remainder of the year after unexpectedly slipping in July, falling to its lowest level in four months on the back of intensified power cuts and slow global demand.

Data from Statistics SA (Stats SA) yesterday showed that mining production plunged by 3.6% from a year ago following an upwardly revised 1.3% rise in June, defying market expectations of a 0.5% increase.

This was the steepest contraction in mining activity since February, with platinum group metals (PGMs), coal and diamonds being the biggest drags on growth.

Stats SA's principal survey statistician, Juan-Pierre Terblanche, said PGMs contracted by 10.4% following robust growth of 11.1% year-on-year in the previous month.

Terblanche said coal also fell by 7%, a deterioration from the 1.8% decline in June, while diamond production fell for the 10th consecutive month, down 33.4% year-on-year.

"Nickel, manganese ore and chromium ore were also weak in the month," Terblanche said.

On the upside, miners in copper, gold and iron ore recorded a positive month. Iron ore reached its highest growth rate with production expanding by 13.8% year-on-year.

On a seasonally adjusted monthly basis, mining production decreased by 1.7% in July, following a downwardly revised 1.2% rise in the previous month.
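The figures above mix two bases of comparison: year-on-year changes (July against July of last year) and seasonally adjusted month-on-month changes (July against June). The percent-change arithmetic behind both is the same; a minimal sketch, using hypothetical production index values since the article does not give Stats SA's underlying index levels:

```python
def pct_change(current, prior):
    """Percentage change from a prior index value to the current one."""
    return (current - prior) / prior * 100.0

# Hypothetical index values, for illustration only: a year-ago reading of
# 100.0 falling to 96.4 corresponds to the reported 3.6% year-on-year drop.
july_last_year, july_this_year = 100.0, 96.4
print(round(pct_change(july_this_year, july_last_year), 1))  # -3.6
```

The same function applied to June's and July's seasonally adjusted index values would give the month-on-month figure.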

In the year-to-date January to July period, mining output is down by 1.4% year-on-year, reflecting poor growth within the coal, iron ore and PGMs divisions.

However, output growth performance has been robust at 17.5% in the year-to-date in the gold division and modest in the manganese ore division at 2.9%.

Stats SA said the seasonally adjusted mining output is critical for the official calculation of quarterly GDP growth.

FNB senior economist Thanda Sithole said that this data, along with manufacturing output released on Monday, and electricity production painted a gloomy picture at the start of the third quarter. Sithole said it was also consistent with the general expectation of a GDP growth moderation after a better-than-expected 0.6% quarterly expansion in the second quarter.

"Overall, the mining sector remains challenged by domestic load-shedding intensity and logistics constraints, as well as moderating external demand, with growth challenges in China and Europe boding ill for exports of critical commodities," Sithole said.

Commodity prices have decreased relative to last year, weighing on earnings and the mining sectors contribution to government tax revenue collection.

In addition to domestic energy and logistical challenges, South Africas mining production has also been affected by the weakening global demand on the back of Chinas economic woes.

Investec economist Lara Hodes said the subdued global environment has weighed heavily on commodity demand, with the World Banks metals and minerals index down around 13% in the year-to-date to end August.

Hodes said the fragile global economic environment, with a slower-than-projected rebound in demand from China, has weighed on diamond sales, while competition from the lab-grown diamond industry persists.

"Conversely, gold output has benefited from a sluggish greenback combined with geopolitical tensions, which has seen investors seeking safe-haven options," Hodes said.

Notwithstanding global factors, domestically the mining sector continues to deal with logistical impediments, while unreliable energy supply remains a primary operational hindrance.

Indeed, these key challenges continue to weigh heavily on SA's competitive position, impeding exports and deterring investment.


What Is Cloud Networking? Definition, Types & Benefits – Forbes


Leeron is a New York-based writer with experience covering technology and politics. Her work has appeared in publications such as Quartz, the Village Voice, Gothamist, and Slate.

For over 15 years, Kiran has served as an editor, writer and reporter for publications covering fields including advertising, technology, business, entertainment and new media. He has served as a reporter for AdAge/Creativity and spent several years as an editor and writer at Adweek. Along the way, he has also served in managing editor roles at the likes of PSFK and Ladders, worked in PR as a director of content, and most recently served as a senior editor at Dotdash Meredith for personal finance brand The Balance and then Entertainment Weekly. At Forbes Advisor, Kiran brings his experience and expertise to reinforce the brand's reputation as the most informative, accessible and trusted resource in small business.


Contributor, Editor

Matt is a proven leader in IT, combining a master's degree in Management Information Systems with a proven track record of leading business initiatives to help organizations meet their goals. He has led the security practices at two different MSPs and been a health IT director, a project manager, business analyst, system administrator and systems architect... if it has to do with IT, he's probably done it. He helped author the CMMC Certified Professional and CMMC Certified Assessor field guides and has spoken at conferences all over the country regarding CMMC, IT security and risk. Matt has worked with Fortune 500 companies and small businesses, in areas ranging from engineering to marketing and supply chain to health care.


Published: Sep 13, 2023, 12:00pm



These days, almost every business relies on the cloud in some capacity. Cloud networking is scalable and flexible, allowing organizations to expand their infrastructure according to changing demands. It also saves costs, as companies pay only for the services they use. This article covers the terms cloud networking and cloud computing, the various types of cloud networks and the benefits of this technology for small and medium businesses.

Cloud networking is an element of cloud computing and refers to the way the networking infrastructure works within it.

Cloud computing has revolutionized the way companies run, making it easier, faster and cheaper to complete functions that previously required a company to have its own data center. Almost every type of business today uses cloud computing for a wide range of purposes, including data backup, email and customer-facing web applications.

Cloud computing refers to the on-demand delivery of IT products online, which enables businesses to access databases, power and storage through the cloud, instead of through a physical data center. Microsoft Azure and Amazon Web Services (AWS) are the two main cloud service providers.

Examples of cloud services include:

Cloud networking capabilities are provided by the cloud service providers.

There are various types of cloud networking. Here are the main ones:

A public cloud means that the servers are shared with other customers. You might think of a public cloud as similar to a public swimming pool. This type of cloud can be adjusted to whatever capacity a company's IT department needs. Multiple users may share a public cloud, but all benefit from the service.

While a public cloud is like sharing a pool, a virtual private cloud (VPC) is more like putting a rope around the pool and creating a private area. Companies can choose to build their own private cloud within a public cloud. Increased security is the main reason a company would choose a VPC.
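One concrete way to picture a VPC is as a private block of addresses that a company subdivides into smaller networks, one per workload or location. As an illustration only, using hypothetical, provider-agnostic addresses rather than any specific vendor's setup, Python's standard ipaddress module can sketch how a /16 block might be carved into /24 subnets:

```python
import ipaddress

# Hypothetical private address block for a VPC (RFC 1918 space).
vpc_block = ipaddress.ip_network("10.0.0.0/16")
print(vpc_block.num_addresses)  # 65536 addresses available in the block

# Carve out the first three /24 subnets, e.g. one per availability zone.
subnets = list(vpc_block.subnets(new_prefix=24))[:3]
for net in subnets:
    print(net)  # 10.0.0.0/24, then 10.0.1.0/24, then 10.0.2.0/24
```

In practice, a cloud provider's console or API handles this planning for you, but the underlying idea is the same: the VPC's address space is private to the company even though the physical hardware is shared.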

Hybrid cloud refers to the combination of public clouds and VPCs. The term hybrid cloud network is also used to refer to the connection between a physical data center and a public cloud.

The term multicloud refers to using more than one cloud provider. For example, a company might choose to use services both from AWS and Microsoft Azure. While they both offer cloud computing, they do have differences. A company may find the differences between the two providers make it worth it to use both.

All sorts of businesses use cloud networking for a range of reasons. The capabilities are endless and the use of cloud networks is only expected to grow in the coming years. Essentially, the cloud gives companies capabilities, storage and infrastructure to build and develop in a way that was not previously possible.

Small and medium businesses from a wide range of industries are able to benefit from the use of cloud networking. Here are a few examples of how cloud networking can be used:

Cloud computing refers to delivering cloud services over the internet and the on-demand delivery of IT products online, which enables businesses to access databases, power and storage through the cloud. Cloud networking refers to the connection between the different devices required for cloud computing. Though these terms are distinct, they are often used interchangeably.

There are countless benefits to using cloud technology. The main benefits of cloud networking include:

The first steps to creating a cloud network include familiarizing yourself with cloud networking concepts, developing a network architecture plan and choosing a cloud provider. Amazon Web Services and Azure are the two most popular providers. You'll want to consult with the cloud provider for a full set of instructions on how to create a cloud network. Amazon Web Services has extensive online video tutorials that can help with the process.

Cloud technology has revolutionized the way companies run. The cloud has opened up endless possibilities for small and medium businesses, making it faster and more affordable to fulfill tasks, scale, keep track of large sets of data and communicate and collaborate remotely. Almost every type of business uses cloud computing today, from working on shared documents that are stored on the cloud, to hosting customer-facing web applications. There are many benefits of cloud networking, including scalability, flexibility, mobility and reduced operating costs.

There are plenty of reasons to use cloud networking: it is scalable and flexible, it lets organizations adjust their infrastructure to changing demands, and companies pay only for the services they use. In addition, cloud networking offers increased security for businesses that don't have the capacity to run an entire cybersecurity team in-house.

Amazon Web Services' Virtual Private Cloud (VPC) is an example of a cloud network. AWS VPC enables users to create a private cloud within the AWS cloud, which increases security.


Because almost every company today uses cloud computing, expertise in cloud technologies is in high demand. There are plenty of career paths to choose from in this field such as cloud architect, cloud engineer, cloud consultant and DevOps engineer. These jobs are intellectually challenging and offer high earning potential in a field with growing demand.


Leeron is a New York-based writer with experience covering technology and politics. Her work has appeared in publications such as Quartz, the Village Voice, Gothamist, and Slate.

For over 15 years, Kiran has served as an editor, writer and reporter for publications covering fields including advertising, technology, business, entertainment and new media.He has served as a reporter for AdAge/Creativity and spent several years as an edito and writer at Adweek. Along the way, he has also served in managing editor roles at the likes of PSFK and Ladders, worked in PR as a director of content, and most recently served as a Senior Editor at Dotdash Meredith for personal finance brand The Balance and then Entertainment Weekly. At Forbes Advisor, Kiran brings his experience and expertise to reinforce the brand's reputation as the most informative, accessible and trusted resource in small business.

Matt is a proven leader in IT, combining a master's degree in Management Information Systems with solid experience and a proven track record leading business initiatives to help organizations meet their goals. He has led the security practices at two different MSPs, been a Health IT Director, a project manager, business analyst, system administrator, systems architect... if it has to do with IT, he's probably done it. He helped author the CMMC Certified Professional and CMMC Certified Assessor field guides and has spoken at conferences all over the country regarding CMMC, IT security and risk. Matt has worked with Fortune 500 companies and small businesses, in areas ranging from engineering to marketing and supply chain to health care.

View original post here:
What Is Cloud Networking? Definition, Types & Benefits - Forbes

Read More..

Cisco Swaps Hyperflex for Nutanix with New Strategic Relationship – Forbes

Nutanix and Cisco Announce Strategic Partnership.

Cisco and Nutanix recently announced that the two companies have entered a global strategic partnership to accelerate the adoption of hybrid multi-cloud solutions with what the companies say is the industry's most complete hyperconverged infrastructure (HCI) solution for IT modernization and business transformation.

The partnership starts with two new offerings, while taking a third off the market entirely. Let's look at what's being offered.

Cisco and Nutanix are collaborating to offer an integrated hyperconverged infrastructure (HCI) product line that combines Cisco's SaaS-managed compute and networking solutions with Nutanix's software-defined storage platform.

Cisco Compute Hyperconverged with Nutanix is a new offering combining Cisco's SaaS-managed compute and networking solution with Nutanix's Cloud Platform, a robust suite of solutions that includes Nutanix Cloud Infrastructure, Nutanix Cloud Manager, Nutanix Unified Storage, and Nutanix Desktop Services.

The Nutanix Cloud Platform offers a uniform cloud operational framework through a single platform that facilitates deploying applications and data across various environments, including data centers, edge locations, and public clouds. It achieves linear scalability in performance and capacity and prioritizes resilience through self-healing nodes built into the system. Additionally, the Nutanix Cloud Platform seamlessly integrates persistent storage into its architecture.

Nutanix Cloud Platform

The integrated solution supports various deployment options and integrates Cisco servers, networking, security, and management with Nutanix's Cloud Platform. It promises a consistent cloud operating model across data centers, edges, and public clouds, offering scalability, resilience, and native storage integration.

Initially available on Cisco C-Series servers, the combined offering will expand to Cisco's X-Series modular servers later. The companies plan to make the solution available through Cisco's global sales teams by the end of November 2023.

A few weeks after the announcement of its partnership with Cisco, Nutanix further announced that it has integrated its built-in hypervisor, AHV, and Nutanix Flow Network Security with Cisco ACI (Application Centric Infrastructure) VMM (Virtual Machine Manager), providing security policies through Nutanix AHV VLANs into Cisco ACI EPGs.

Cisco ACI integrates with Nutanix Flow Network Security (FNS) to enable micro-segmentation and contextual security policies for VMs on AHV through its SDN technology. The integration is straightforward, requiring just a few simple steps to initiate.

The integration allows Cisco and Nutanix administrators to leverage their expertise without learning new technologies. This collaboration empowers Nutanix admins to establish intuitive security rules for enhanced application security. At the same time, Cisco administrators validate the extension of a secure network within a software-defined framework, resulting in a robust approach to security operations.

IT organizations can leverage their existing investment in Cisco's software-defined networking technology by using Nutanix Cloud Platform and hyper-converged infrastructure (HCI) in conjunction with Cisco ACI. It facilitates the creation of a secure hybrid multi-cloud environment, making networks easily extensible, enabling seamless security policy implementation, and ensuring secure access to data and applications.

After Nutanix and Cisco announced their strategic relationship in late August, speculation began almost immediately about the future of Cisco's HyperFlex HCI solution. As such, it was no surprise today when Cisco officially announced that it will discontinue the solution.

Cisco isn't leaving its HyperFlex customers stranded, however. The company promises five years of service and support for existing HyperFlex installations and will continue taking orders until March 2024.

There's a certain amount of irony in the discontinuation of HyperFlex, along with Cisco's fresh embrace of Nutanix. Industry watchers will remember that just before Nutanix went public in 2016, Cisco tried and failed to acquire Nutanix. Reportedly coming in a few billion dollars too low with its offer, Cisco instead bought Springpath for $320M, bringing its HyperFlex technology to market.

Much has changed in the HCI market in the intervening years. As nearly every competing solution has faded away, Nutanix maintains strong momentum amidst a small handful of competitors. A big part of Nutanix's continuing success is that it found a way to leverage its HCI technology to do more than consolidate datacenter infrastructure, a move that's made all the difference.

Hyperconverged infrastructure collapses disparate networking, compute, and storage elements into a single manageable whole, allowing a single pane-of-glass view of the combined infrastructure. Bringing an HCI approach to managing hybrid-cloud simply makes sense; after all, what is hybrid-cloud but a disaggregated bunch of storage, compute, and networking that are all begging to be managed together? Thats what Nutanix delivers.

At the same time, no company better represents the idea of hybrid-cloud than Cisco, whose networking and UCS server technology is fundamental to the interconnected world of edge, cloud, and on-prem datacenters we all live in. Cisco is all about distributed infrastructure, while Nutanix, with its Nutanix Cloud Platform, has the right solutions to harness the power of that infrastructure to better deliver on the promise of data modernization and digital transformation.

The new partnership between Cisco and Nutanix is a natural one. It's also compelling. Nutanix continues to do what it does best, while Cisco now has access to a proven solution that's found broad and deep acceptance among enterprises of all sizes. Both companies will benefit, but IT organizations will benefit even more with the potential offered by the joint solutions. It's a strong story.

Disclosure: Steve McDowell is an industry analyst, and NAND Research is an industry analyst firm, that engages in, or has engaged in, research, analysis, and advisory services with many technology companies, which may include those mentioned in this article. Mr. McDowell does not hold any equity positions with any company mentioned.

Steve McDowell is principal analyst and founding partner at NAND Research. Steve is a technologist with over 25 years of deep industry experience in a variety of strategy, engineering, and strategic marketing roles, all with the unifying theme of delivering innovative technologies into the enterprise infrastructure market.

Read more:
Cisco Swaps Hyperflex for Nutanix with New Strategic Relationship - Forbes

Read More..

Google Cloud to verify messages sent between blockchains in agreement with $3 billion startup LayerZero – Fortune

Google Cloud dug deeper into the world of blockchains on Tuesday when LayerZero, a crypto startup recently valued at $3 billion, announced that the cloud provider will verify data sent between blockchains on the startup's messaging protocol.

Most blockchains exist in isolation, meaning information on one chain isn't accessible to another. As the number of blockchains, or decentralized databases, has multiplied, developers increasingly use many at once. Hence, products that can transmit data between blockchains, like LayerZero's protocol, are increasingly in demand.

Blockchains are defined by their trustlessness, or the fact that it's extremely difficult to change or fabricate data on them. However, outside data transmitted from one blockchain to another can be fabricated, which is why messaging protocols like LayerZero use outside verifiers to attest to the veracity and reliability of information sent between chains.

This is where Google Cloud comes in. Developers using LayerZero already rely on oracles, or outside verifiers, like Chainlink or Polyhedra, to verify messages sent between blockchains. Now, Google Cloud will be added to the mix, and for any developer spinning up a future application that uses the protocol, Googles cloud computing arm will be added as the default verifier, LayerZero CTO Ryan Zarick told Fortune.
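The verification model described here can be illustrated with a small, hypothetical Python sketch: a message is accepted only when enough independent verifiers attest to the same payload hash. The oracle names and threshold below are illustrative, not LayerZero's actual protocol:

```python
import hashlib

def payload_hash(payload: bytes) -> str:
    """Fingerprint a cross-chain message with SHA-256."""
    return hashlib.sha256(payload).hexdigest()

def is_verified(payload: bytes, attestations: dict, threshold: int = 2) -> bool:
    """Accept a message only if at least `threshold` independent
    verifiers attest to the same payload hash."""
    expected = payload_hash(payload)
    matching = sum(1 for h in attestations.values() if h == expected)
    return matching >= threshold

msg = b"transfer 100 tokens from chain A to chain B"
attestations = {
    "oracle-1": payload_hash(msg),                 # honest verifier
    "oracle-2": payload_hash(msg),                 # honest verifier
    "oracle-3": payload_hash(b"tampered message"), # disagreeing verifier
}
print(is_verified(msg, attestations))  # True: two of three attestations match
```

Adding a cloud provider as a default verifier, as described above, amounts to adding one more entry to a scheme like this.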

Outside verifiers get a small fee for each transaction, he told Fortune, but he declined to provide any projections of the revenue Google Cloud may reap for becoming a LayerZero oracle. "They're really getting into the infrastructure of Web3 and really kind of all-in in that space," he added.

Google Cloud's entrance as an infrastructure provider for blockchain interoperability is yet another bet on Web3. In 2022, it announced the creation of its own dedicated digital assets and Web3 engineering teams. Since then, the cloud computing giant has announced a suite of partnerships with crypto firms and blockchains, including Coinbase, BNB Chain, Celo, and Casper Labs. It has offered up its servers as validators, or computers that help secure and maintain blockchains, for the Sky Mavis, Solana, and Tezos blockchains. And, in October, it unveiled the Blockchain Node Engine, a streamlined method for developers to access and use blockchains on Google's servers.

Google Cloud's most recent partnership with LayerZero marks its first step into yet another subset of Web3 infrastructure. "Teaming up with LayerZero as an oracle across 15 chains will not only enhance the security of LayerZero's cross-chain messaging capabilities but further accelerate its commitment to Web3 interoperability and enterprise adoption," James Tromans, head of Web3 at Google Cloud, said in a statement.

Continue reading here:
Google Cloud to verify messages sent between blockchains in agreement with $3 billion startup LayerZero - Fortune

Read More..

The best web hosting companies in 2023 – CBS News

Here's the situation: You've got a website to launch, and you don't want to spend forever figuring out the best place to host it. You just want something that works well, keeps your site running fast, and won't break the bank. Is that too much to ask? We don't think so.

Let's face it: Picking the best web hosting company for your needs can feel like you're lost in a maze of tech jargon and endless options. How much is too much per month? What kind of server load can your provider handle? And what's your site's uptime going to be like? Don't sweat it. We've got your back.

We've sorted through the noise to bring you a no-nonsense list of the best web hosting companies in 2023. Whether you're setting up a small blog or launching the next online empire, we've found options for everyone. So sit back, relax, and read on to find the perfect fit for your website.

And if you're thinking about launching a Taylor Swift fan site or something, a word to the wise: You're going to want to spring for something that can handle all that traffic.

If you need to host a website but don't want to get lost in a maze of plans and tech jargon, HostGator is your go-to. It's perfect for anyone looking for something solid but straightforward. Starting at just $12 a month, its Linux-based Hatchling plan hooks you up with unlimited disk space, data transfers, and email. Want more bang for your buck? They've got "Baby" and "Business" plans that pile on the perks like unlimited domains and nifty SEO tools.

More of a Windows person? They've included an option for that OS, too. Starting at $10 a month, you get all the space and data transfers you could want. And if you're big on domains, the Enterprise plan lets you juggle up to five of them.

If you need help building your website, the Gator website builder's drag-and-drop interface makes it simple to put together your new online home. Need something fancier, or have a soft spot for WordPress? You can run pretty much any content management system you want. And its File Manager tool isn't just a glorified FTP client. You can edit files without jumping through hoops.

Plus, HostGator has a pretty good offer that not all of the competition matches. You can sign up for a year and snag a free domain name. It's not groundbreaking, but hey, it's a freebie that can save you some additional money by not having to go through a third party to register.

Plus, if you need help after getting started, the host's 24/7 customer service is fantastic, based on our testing. You're talking to a human in under a minute -- not a robot -- and the reps are quick and savvy. They'll make sure you have what you need, and the know-how to tackle whatever issues come your way.

So, if you're after no-nonsense web hosting that doesn't skimp on features, HostGator is where you want to be, especially if you're a beginner to web design or hosting in general.

Key features of HostGator:

Looking for web hosting that offers a lot of features without a big price tag? Hostinger might be for you. It's super affordable, especially considering what's on offer.

For example, its "Business" plan is only $4 a month currently (down from $15) and includes a variety of features such as NVMe storage, daily backups, and free SSL. It even supports up to 100 websites on a single plan, which is great news if you have multiple projects that need hosting.

It's also surprisingly easy to use, even though Hostinger uses a custom control panel called hPanel instead of the industry-standard cPanel. This change might be a bit frustrating for anyone who's used cPanel in the past, but hPanel is surprisingly intuitive and user-friendly. It's great for anyone new to website management while serving up advanced features for the people who already know their way around it.

Hostinger's shared hosting options start at $1.99 a month with basic features. They also offer a Premium plan for $2.99 a month that significantly expands the capabilities, including support for up to 100 websites and a free domain.

It's important to keep in mind, however, that while Hostinger is a solid choice for smaller sites or personal projects, it's not the best choice for any bigger, busier, enterprise-level needs. For everyone else, though? It's a great investment.

Key features of Hostinger:

Bluehost is known for its ease of use and affordability. So whether you're hosting your hundredth website or you're getting started on your very first page, it's a great choice to kick things off with.

Like many of its hosting competitors, Bluehost offers you a free domain name that comes with your subscription, which can save you a small sum upfront. The hosting plans start at a reasonable $3 per month, and you get plenty of great features to help set up your website without much hassle.

New to building websites and need a little more help getting started? Bluehost's cPanel and customer support are super user-friendly. The company also offers 24/7 customer service, so you're never without help. It may not seem like the greatest offering in terms of perks, but you never know when you're going to need more help.

Bluehost also offers shared hosting, which is a great option for small sites. That service comes with the added benefits of free SSL certification and a free domain. But if you plan on sticking with it and growing your website, you may find that shared hosting doesn't meet your performance requirements. In that case, Bluehost has tons of more robust options, like dedicated hosting and VPS hosting for more flexibility and bandwidth.

Bluehost offers a good balance of price and features for anyone just getting started with building websites or anyone who's satisfied with a smaller host. If you need something more robust you'll have to shop elsewhere, but for everything else, Bluehost has you covered.

Key features of Bluehost:

DreamHost has carved out a space in web hosting, thanks to its reliability and affordable plans. Unlike some hosts that scale back features in their monthly plans, DreamHost keeps most of its offerings available for both monthly and yearly subscribers.

When it comes to reliability, DreamHost is hard to beat. The company is so confident in its uptime that it offers a 100% guarantee. If you have any downtime at all, you'll likely get a refund. There's also a super generous 97-day money-back guarantee that applies to certain hosting plans, giving you the peace of mind to try the service and see if you like it before committing.

On the security front, DreamHost doesn't cut corners. It offers free SSL certificates and additional security protocols. All of these come wrapped in a user-friendly, custom control panel that differs from the industry-standard cPanel, but it's still pretty intuitive.

However, DreamHost isn't without its drawbacks. It has fewer global server locations than some competitors; its servers are based only in Virginia and Oregon. This can be a major concern for anyone serving a global audience, since having servers closer to your visitors generally means faster load times. Also, unlike other providers that offer free email accounts, DreamHost charges extra for this, at least in its basic plans.

Customer support is an area where DreamHost excels, though it charges a bit extra for phone callbacks. It offers robust email and ticket support, along with live chat and an active online forum.

DreamHost does simplify the installation process for some popular content management systems (CMSs) like WordPress. But if you're planning to use a less common CMS, be prepared for a steeper learning curve. If WordPress has what you're looking for and more, you should be good to go.

All things considered, DreamHost offers a lot of bang for your buck, especially for those who value flexibility and robust features.

Key features of DreamHost:

GoDaddy goes beyond its well-known role as a domain registrar to offer a variety of hosting services. And of course, it has its brand recognition to lean on, too. It also has plenty of user-friendly features, especially its one-click installations for popular apps like WordPress. This is a major time-saver and eliminates the need for manual setup, making it approachable for both beginners and experienced developers.

GoDaddy's website builder streamlines the process further. You can pick from a variety of templates to kick-start your website, although this convenience might be a trade-off for those who want more creative control.

But where GoDaddy really stands out is in its customer service. In an era where automated bots are becoming the norm, GoDaddy offers 24/7 phone and web chat support. The quick response times are a big plus, often less than two minutes for phone support.

On the performance front, GoDaddy offers quick load times, thanks to multiple data centers across different continents. While it plans to extend its reach with additional data centers, its current setup already provides solid global performance. And as you might expect, the service offers some pretty awesome extras, like a free domain for the first year and complimentary Office 365 email.

You might not think it based on the somewhat cringeworthy ads we've seen from the service over the years, but GoDaddy remains a strong choice for web hosting, even if it isn't the cheapest one you'll run across.

Key features of GoDaddy:

Web hosting just means finding a place online to store and display your website. Your content is uploaded and published so that others can see it. Using a web hosting service essentially gives you space on a third-party server where you can store all the elements that make up your website. Consider your hosting provider as your website's landlord, looking after all the server maintenance and security issues. Often, hosts even throw in some extras, like email. In short, if you want your website to be accessible on the internet, you're going to need some type of web hosting.

First, you choose a domain name, which is basically your website's address. Now what? You have to link that address to server space that your hosting company provides. So, when someone wants to visit your website, they'll type in your domain or click a link to your site. That action sends a request through the internet to your server. Your server responds by sending back the files that make up your website. The end user's browser takes those files and puts together the website for them to see. You must have somewhere for all of your website's materials to reside if you want to publish it for the world to see.
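The request/response loop described above can be seen in miniature with Python's standard library: a toy "host" stores a site's files and returns them when a visitor's browser asks. This is an illustration of the mechanism, not how a production host is built:

```python
import functools
import tempfile
import threading
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler
from pathlib import Path

# The "host": a directory holding the site's files, served over HTTP.
root = Path(tempfile.mkdtemp())
(root / "index.html").write_text("<h1>Hello from my host</h1>")

handler = functools.partial(SimpleHTTPRequestHandler, directory=str(root))
server = HTTPServer(("127.0.0.1", 0), handler)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "visitor": a request goes to the server, which sends the file back.
url = f"http://127.0.0.1:{server.server_port}/index.html"
body = urllib.request.urlopen(url).read().decode()
print(body)  # the file the host stored is what the browser receives

server.shutdown()
```

A real hosting provider adds DNS, TLS, backups and scale on top, but the core exchange is exactly this: request in, files out.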

There are several ways to get web hosting. Shared hosting is the budget option of the hosting world. You're sharing a server (and all its resources like CPU, memory, etc.) with other websites. It's the cheapest option, but you're also getting what you pay for: lower speed, less security, the works. Next up is VPS, or Virtual Private Server hosting. It's like having your own apartment in a building; you still share some amenities (the server), but you've got a dedicated partition that's all yours. It offers a bit more oomph in terms of speed and reliability.

Dedicated hosting is the penthouse suite, essentially, and the most expensive. You get an entire server to yourself -- lots of space, lots of resources, and lots of control -- but it'll cost you. Lastly, cloud hosting is a newer option. It's a network of servers that work together to host your website. You can start small and grow your hosting resources as your website gains traction, without the hassle of moving everything to a larger server.

Your first order of business is figuring out what you actually need from a web host. The type of website you plan to launch will determine your hosting requirements. Performance is another important factor. You'll typically want to aim for fast server speeds and a minimum uptime of 99.9%. Good customer support should also be high on your list. Go with the provider that gives you the best support available: That's typically 24/7 support across multiple channels, so you can get help whenever you need it.
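To put that 99.9% uptime figure in concrete terms, a quick back-of-the-envelope calculation shows how much downtime a given percentage allows per year (the function name below is ours, for illustration):

```python
# Convert an uptime percentage into allowed downtime per (non-leap) year.
def downtime_hours_per_year(uptime_pct: float) -> float:
    return (1 - uptime_pct / 100) * 365 * 24

for pct in (99.0, 99.9, 99.99):
    hours = downtime_hours_per_year(pct)
    print(f"{pct}% uptime allows about {hours:.2f} hours of downtime per year")
```

So 99.9% uptime still permits nearly nine hours of outages a year, which is why that figure is a minimum, not a luxury.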

Of course, as your website grows, your hosting needs could change. Pick a provider that allows for easy scalability without hefty fees or downtime. It might be super tempting to opt for a cheap plan, but remember that you usually get what you pay for. Balance the cost with the features you genuinely need. Security is non-negotiable when it comes to web hosting, so make sure the provider you go with offers robust features like firewalls and SSL certificates. Lastly, don't forget to read reviews and seek recommendations to ensure you're making an informed choice.

Brittany Vincent has been covering gaming, tech, and all things entertainment for 16 years for a variety of online and print publications. She's been covering the commerce space for nearly a decade. Follow her on Twitter at @MolotovCupcake.

See the original post:
The best web hosting companies in 2023 - CBS News

Read More..

There are lots of ways to put a database in the cloud here’s what to consider – The Register

Feature It has been a decade since Amazon RDS launched support for PostgreSQL. Since then, the relational system authored by Turing Award winner Michael Stonebraker in the 1980s has gone on to become the most popular database among professional developers, used by nearly half of them, according to Stack Overflow's 2023 Developer Survey.

I've seen people seduced by the cloud provider, and they fire their DBAs and everybody who knows about databases, but then they figure out, when they need schema design and query optimization, that Amazon's not going to help them

In parallel to PostgreSQL's rise in popularity comes a bewildering array of ways to deploy the database system in the cloud or exploit PostgreSQL-compatible database services. For example, as well as hosting standard versions of PostgreSQL as in RDS, the three major cloud providers, including Azure and Google, also provide PostgreSQL-compatible enhanced database services such as Aurora and AlloyDB.

Meanwhile, some vendors have created serverless systems with PostgreSQL-compatible front ends, such as CockroachDB and Yugabyte.

And that's just PostgreSQL. Similar options are available for other popular database systems, including MySQL, MongoDB and MariaDB. To navigate these choices, developers and database administrators need to understand the strengths and weaknesses of each approach.

As the author of the MySQL performance bible and founder and former CEO of open-source database consultancy Percona, Peter Zaitsev has witnessed the rise of the various ways of deploying databases in the cloud and cautions against making choices lightly.

Whether users want to manage their deployment in a VM or adopt a serverless system managed by a vendor will depend on how much work they want to do, how much control and flexibility they want to have and how much they can tolerate being locked into a particular vendor.

Added into the mix, the cloud vendors offer proprietary databases for specific workloads: Amazon offers DynamoDB, a fully managed proprietary key-value database, while Google offers BigQuery, a fully managed, serverless data warehouse.

"These systems are only available if you buy them from a specific cloud vendor: you cannot run it on your own," Zaitsev said.

Alternatively, users can get a standard system based on a popular open source database like PostgreSQL or MySQL, but significantly enhanced and presented as a fully managed service like Amazon Aurora, and Google's AlloyDB.

Lastly, there are fully managed "shrinkwrapped" services based on MySQL or PostgreSQL, such as Google's Cloud SQL or Amazon's RDS.

"This is some standard database technology just with some GUI and interface on top of it and some automatic backups and stuff like that," Zaitsev said.

Going from first to last, users face the most lock-in to the least lock-in with each of these choices. But they should also question what cloud vendors mean by a "fully managed service."

"That is what the cloud vendors recommend to users and what they push them towards, and it also typically comes with the highest cost, because they charge more for that compared to just the basic infrastructure to run a database," he said.

"But when they talk about a fully managed service, you can ask, 'OK, who's responsible for performance or security?' And they come back and say, 'This is a shared responsibility.' They expect you to do your part while they keep the environment up and running. That is often misunderstood. I've seen people seduced by the cloud provider, and they fire their DBAs and everybody who knows about databases, but then they figure out, when they need schema design and query optimization, that Amazon's not going to help them. Any cloud provider would turn around and say, 'Hey, guys, we are keeping the database up and running, but all that stuff, which is specific to application and database usage, is on you'," Zaitsev said.
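A tiny SQLite example (illustrative only; the table and index names are hypothetical) shows the kind of work Zaitsev says stays with the user: the database runs either way, but only an index you create yourself turns a full-table scan into an index lookup.

```python
import sqlite3

# The managed service keeps the database "up and running"; indexing
# decisions like this one remain the user's responsibility.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
con.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = 42"

# Before indexing: the planner must scan every row.
before = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before)

# After indexing: the planner can search the index instead.
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = con.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after)
```

The plan's detail column flips from a scan of the table to a search using the index, the sort of tuning no cloud provider does on your behalf.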

Another challenge to using shrink-wrapped or enhanced database services from the cloud vendors arrives when users want to use systems across cloud infrastructure from different cloud providers, according to corporate policy or geographic limitations.

"Amazon RDS, for example, sounds simple until you have to run it in different clouds. Then you have to deal with the nuances of RDS and the cloud infrastructure as well, and then it becomes very complicated," Zaitsev said.

Users can manage database deployment in the cloud themselves using virtual machines, but the fastest growing approach to cloud deployment of database is via Kubernetes, the open-source container orchestration which originated with Google.

"It gives us a programmable infrastructure, which is much more flexible and advanced than what you get just dealing with VMs. At the same time, you can run it on-prem and on all the clouds. Kubernetes has become much more mature and much more capable of running a database compared to the early stages, when it was designed as a solution for stateless applications," Zaitsev said.

Into the throng of database options in the cloud, a group of vendors have begun offering serverless systems that pair their own back ends with a front end compatible with a common database. For example, CockroachDB and Yugabyte both offer serverless databases with PostgreSQL-compatible front ends.

In June, Cockroach CEO and co-founder Spencer Kimball told The Register it took five years to port the serverless system to Azure, a "non-trivial amount of work" that involved understanding the tolerances and failure of a different cloud architecture.

While Yugabyte claims 100 percent compatibility with PostgreSQL, and MariaDB recently launched a PostgreSQL-compatible front end to its distributed MariaDB back end, Kimball admitted CockroachDB does not have full PostgreSQL compatibility, but it is getting there.

Users, however, should question what lies behind serverless databases, Zaitsev said. "There are really servers in the end, right? It is just you are not charged for them and you may or may not be aware about what is going on with the servers."

One approach to serverless is to scale the instance size up and down according to the load. Another, taken by Google Spanner and CockroachDB, is multi-tenancy.

"They have a different idea. You have a distributed database which is shared by multiple tenants. The benefit of that approach is, you have more ready to use capacity, which can be dynamically shared. If you need more resources, you don't need to reallocate and spin up the larger instance size," Zaitsev said.

Serverless is convenient if the load is very irregular. Users do not pay for keeping a system up and running when it is not in use. On the other hand, if the system is well used, and the operator understands and can predict demands on the system, then it can become less valuable from a pricing perspective, he said.

Earlier this year, Gartner said the DBMS market grew by 14.4 percent in 2022, reaching $91 billion, with the cloud platform-as-a-service model capturing nearly all the gain, with cloud spend at 55 percent exceeding on-premises at 45 percent.

The progress to the cloud is slower than Gartner predicted in 2019, when it said that by 2022, 75 percent of all databases would be deployed or migrated to a cloud platform. Users seem to be taking their time navigating the many options available to them as they decide their future database strategy.

Original post:
There are lots of ways to put a database in the cloud here's what to consider - The Register

Read More..

New features for Premiere Pro, After Effects, and Frame.io from Adobe – RedShark News

Adobe is announcing the next set of enhancements to Creative Cloud for video, adding features to Premiere Pro, After Effects and Frame.io in this update cycle.

In May 2023, Adobe released text-based editing in Premiere Pro after a public beta cycle. It is important to note, as our Adobe press briefers reiterated to us, that text-based editing does not require an internet connection for speech-to-text transcription, which sets it apart from other applications that send audio to cloud servers. Adobe is adding Filler Word Detection to this feature in response to user requests. This means that Premiere Pro can detect pauses and so-called filler words (all the "uhs" and "ums") and, with one click, delete them. It is also possible to set a duration of pause to be detected and deleted. This also works across multi-track audio.
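A rough, hypothetical sketch of what filler-word deletion does to a transcript segment (the filler list and function below are ours, not Adobe's implementation):

```python
# Hypothetical stand-in for whatever Premiere Pro actually detects.
FILLERS = {"uh", "um", "er"}

def strip_fillers(transcript: str) -> str:
    """Drop hesitation words from a transcript, ignoring trailing
    punctuation and case when matching."""
    words = transcript.split()
    kept = [w for w in words if w.strip(",.").lower() not in FILLERS]
    return " ".join(kept)

print(strip_fillers("So, um, the new feature, uh, ships next week"))
# -> So, the new feature, ships next week
```

In the real feature, deleting the word in the transcript also removes the corresponding span of audio and video from the timeline.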

Enhance Speech removes noise from voice with just one click. AI technology can make voices sound like studio rather than field recordings. This feature too comes from user requests, in this case from the podcast community.

A frequent request from users is better timeline performance. Adobe claims the timeline is up to 5x faster, thanks in good measure to the removal of considerable legacy code. I guess it is always good to clean out your closets occasionally.

Color sees improvements in automatic tone mapping, with three new tone mapping methods. Settings in the Lumetri Color panel are also consolidated, and there are improvements to LUT management. But the most significant improvement in color is a fix for the dreaded QuickTime gamma differences: the sequence gamma can be set to match QT, so no more manual gamma 2.2 or 2.4 gyrations. Writer's note: thank you, Adobe.

Ever had a third-party plug-in crash the system? Effects Manager can detect a third-party crash, isolate that plug-in, and, on relaunching the application, recover your place automatically.

According to Adobe, a few more features are based on community feedback. Metadata and timecode (TC) can be burned in. New project templates, custom export across projects and export to Media Encoder have been added. There is guidance for installing Blackmagic RAW, and, perhaps the most significant feature for many of us, batch selection of markers is implemented.

While Premiere Pro is a mature product to which Adobe adds refinements, After Effects continues to gain substantial new capabilities.

Advancing After Effects 3D capabilities is True 3D Workspace for motion graphics. It is now possible to import, animate, light, shade and render 3D models regardless of the source of those 3D models. Users can combine 2D and 3D elements in this new workspace. A new GPU-accelerated rendering engine can deliver photo-realistic results. And through Creative Cloud libraries, Substance 3D assets are available for free.

Rotoscoping gets AI! Using the rotoscope tool, just draw a line along the object or subject and AI will figure out the rest. It is particularly effective for overlapping hair or limbs as well as transparent elements. Manual adjustments can of course still be made, but it seems remarkably accurate at picking up details that could have taken hours of manual refinement before AI rotoscoping.

Adobe continues the development of Frame.io at a rapid pace in response to individual and enterprise-level users.

Frame.io now recognises audio, video, images or PDF assets, and these can be compared side by side, matched and annotated.

ProRes RAW and 10bit 4K workflows are now possible with Atomos Ninja and Ninja Ultra.

The newly announced 102MP Fujifilm GFX-100 II, also with 8K video, is now supported for camera to cloud at the high end, and at the other end, the Accsoon SeeMo and SeeMo Pro are supported for c2c.

Enterprise users constitute a significant part of the Frame.io base, and in direct response to the storage requirements of this segment, an AWS S3 bucket can now be connected directly to Frame.io. This represents both a cost saving and a workflow efficiency gain for enterprise users in c2c workflows.

All of these features will be released in public beta on September 13, 2023, with release versions coming at some point in Fall 2023.


Two-thirds of small businesses plan to cut cloud spending – Information Age

Braced for a 10 per cent increase in cloud costs this year, almost two-thirds of small business users plan to cut their cloud spending.

One third of SMEs plan to reduce the amount of data they store in the cloud, and 24 per cent plan to reduce the number of cloud services they use, according to business internet service provider Beaming.

Beaming's "Making the cloud work for UK businesses" report, which draws on a study of SME leaders conducted by Opinium, reveals that, on average, UK SMEs spent almost 2 per cent of their turnover on cloud services in 2022. This amounts to more than £4 billion in cloud spending across the whole population.


Although more than a quarter (27 per cent) of SMEs initially adopted cloud to reduce computing expenditure, companies expect a 10 per cent increase in the cost of cloud services during 2023. Several major cloud providers have introduced double-digit price increases for services used by SMEs this year, including IBM, which last week announced plans to increase cloud prices by up to 29 per cent.

Facing increases in the cost of cloud computing that far exceed inflation, just one in five (20 per cent) SMEs that use the cloud said they would absorb the extra costs.

One in six SMEs (17 per cent) plan to move data or applications from the cloud onto on-premise servers.

Sonia Blizzard, managing director at Beaming, said: "While the cloud has delivered many benefits to businesses, the cost of cloud has been creeping up for some time now, and at some providers, that creep is starting to look unjustified to businesses dealing with wider inflationary pressures.

"Many SMEs, some of which rushed to the cloud to support remote working during the pandemic, are questioning the value of these services for the first time and taking action to get on top of those cost increases."



CORRECTION — SAI.TECH Announces an Immersion Containerized Data Center Paired with GIGABYTE's HPC Immersion Servers – Yahoo Finance

SAI.TECH Global Corporation

SINGAPORE, Sept. 14, 2023 (GLOBE NEWSWIRE) -- This release is a full correction of the previous one, with the headline "SAI.TECH releases AI mobile liquid cooling computing center product A1, equipped with Gigabyte's A100/H100 immersion servers," issued on September 12, 2023 at 5:31 AM ET by SAI.TECH Global Corporation (NASDAQ: SAI, SAITW). The corrected release follows:

SAI.TECH Global Corporation ("SAI.TECH," "SAI" or the "Company," NASDAQ: SAI, SAITW) announced today that its business unit ULTIWIT had begun the research, development and production of a containerized data center (the "Product") with immersive liquid cooling capabilities, in conjunction with GIGABYTE's HPC immersion servers.

The preliminary design of the Product is a 40-ft container built to the Tier III standard, able to house HPC/AI immersion servers from GIGABYTE in four 36U cooling tanks, for a total rack capacity of 144U.

The Product will provide a stable operating environment for AI-dedicated GPUs. A key feature will be an interface designed to recycle computing waste heat, a step toward energy efficiency and sustainability. The prototype of the A1 Product will be tested and operated at the SAI NODE Marietta Computing Heat Recycle Center. In the future, SAI plans to help customers deploy the A100, H100, A800 and other models of the same class in the A1 Product, and to achieve faster, centralized and modularized deployment of large-scale computing power. Meanwhile, the B1 product's Bitcoin mining boxes, with liquid cooling and heat recycle capabilities, are operating at SAI NODE Marietta.

Beyond the Product's hardware features, SAI.TECH intends to provide AI services globally. Its subsidiary, Boltbit Limited, is researching and developing GPU cloud services, including IaaS (Infrastructure as a Service) and MaaS (Model as a Service), for AI-savvy companies worldwide.

About SAI.TECH

SAI.TECH is a Nasdaq-listed (SAI) company headquartered in Singapore. SAI is dedicated to providing a zero-carbon energy system (HEATNUC) based on small modular reactors, clean computing services based on liquid cooling and chip waste heat utilization technology (ULTIWIT), and cloud computing services based on blockchain and AI technology (BOLTBIT).


In May 2022, SAI became a publicly traded company under the new ticker symbol SAI on the Nasdaq Stock Market (NASDAQ) through a merger with TradeUP Global Corporation (TradeUP). For more information on SAI.TECH, please visit https://sai.tech/.

About Giga Computing

Giga Computing Technology is an industry innovator and leader in the enterprise computing market. Having spun off from GIGABYTE, we maintain hardware expertise in manufacturing and product design, while operating as a standalone business that can drive more investment into core competencies. We offer a complete product portfolio that addresses all workloads from the data center to edge including traditional and emerging workloads in HPC and AI to data analytics, 5G/edge, cloud computing, and more. Our longstanding partnerships with key technology leaders ensure that our new products will be the most advanced and coincide with new partner platforms. Our systems embody performance, security, scalability, and sustainability. To find out more, visit https://www.gigabyte.com/Enterprise and join our newsletter.

Safe Harbor Statement:

This press release may contain forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. The words "believe," "expect," "anticipate," "project," "targets," "optimistic," "confident that," "continue to," "predict," "intend," "aim," "will" or similar expressions are intended to identify forward-looking statements. All statements other than statements of historical fact are statements that may be deemed forward-looking statements. These forward-looking statements include, but are not limited to, statements concerning SAI.TECH and the Company's operations, financial performance and condition, and are based on current expectations, beliefs and assumptions which are subject to change at any time. SAI.TECH cautions that these statements by their nature involve risks and uncertainties, and actual results may differ materially depending on a variety of important factors such as government and stock exchange regulations, competition, and political, economic and social conditions around the world, including those discussed in SAI.TECH's Form 20-F under the headings "Risk Factors," "Results of Operations" and "Business Overview" and in other reports filed with the Securities and Exchange Commission from time to time. All forward-looking statements are applicable only as of the date they are made, and SAI.TECH specifically disclaims any obligation to maintain or update the forward-looking information, whether of the nature contained in this release or otherwise, in the future.

Media Contact

pr@sai.tech

Investor Relations Contact

ir@sai.tech
