
Unlocking Data from Graphs: How to Digitise Plots and Figures with WebPlotDigitizer – Towards Data Science

Unlocking Digital Potential from Static Image Data

Going from paper to digital. Image generated using DALL·E by the author.

When working in data science, geoscience or petrophysics, we often come across charts that exist only as images, such as figures within publications. The underlying data is not available, which makes it difficult to use in our own interpretation or research.

This is where a tool like WebPlotDigitizer becomes really useful. This online tool helps us take those charts from images and turn them into data that we can use for further research and analysis.

There are a number of areas in petrophysics and geoscience where digitising charts can be very beneficial, including:

In this article, we will see how to use WebPlotDigitizer to extract data from a scatter plot made with synthetic data. In practice, the figures we deal with will often be of poorer quality.

Also, it is important to remember that when we use data from other sources, we should always cite where it came from, as well as the method by which the data was obtained.

After capturing the image from the publication, it is time to load it into the WebPlotDigitizer.

To do this, we first navigate to:

File -> Load Image Files

Here, we can choose what type of plot we are dealing with.
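Once the axes have been calibrated and the points picked, WebPlotDigitizer can export the extracted values as a CSV file. As a hedged illustration (the file name and column names below are placeholders, not something from the original article), the recovered data can then be loaded and re-plotted in Python:

```python
# Minimal sketch: load a CSV exported from WebPlotDigitizer and re-plot it.
# "digitised_points.csv" and the column names are placeholders; adjust them
# to match whatever you named the dataset when exporting from the tool.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("digitised_points.csv", header=None, names=["x", "y"])
print(df.describe())  # quick sanity check of the recovered values

plt.scatter(df["x"], df["y"], s=10)
plt.xlabel("X value from the digitised chart")
plt.ylabel("Y value from the digitised chart")
plt.title("Data recovered with WebPlotDigitizer")
plt.show()
```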

View original post here:

Unlocking Data from Graphs: How to Digitise Plots and Figures with WebPlotDigitizer - Towards Data Science


The 11 Best AI Tools for Data Science to Consider in 2024 – Solutions Review

Solutions Review's listing of the best AI tools for data science is an annual sneak peek of the top tools included in our Buyer's Guide for Data Science and Machine Learning Platforms. Information was gathered via online materials and reports, conversations with vendor representatives, and examinations of product demonstrations and free trials.

The editors at Solutions Review have developed this resource to assist buyers in search of the best AI tools for data science to fit the needs of their organization. Choosing the right vendor and solution can be a complicated process, one that requires in-depth research and often comes down to more than just the solution and its technical capabilities. To make your search a little easier, we've profiled the best AI tools for data science all in one place. We've also included platform and product line names and introductory software tutorials straight from the source so you can see each solution in action.

Note: The best AI tools for data science are listed in alphabetical order.

Platform: DataRobot Enterprise AI Platform

Related products: Paxata Data Preparation, Automated Machine Learning, Automated Time Series, MLOps

Description: DataRobot offers an enterprise AI platform that automates the end-to-end process for building, deploying, and maintaining AI. The product is powered by open-source algorithms and can be leveraged on-prem, in the cloud, or as a fully managed AI service. DataRobot includes several independent but fully integrated tools (Paxata Data Preparation, Automated Machine Learning, Automated Time Series, MLOps, and AI applications), and each can be deployed in multiple ways to match business needs and IT requirements.

Platform: H2O Driverless AI

Related products: H2O 3, H2O AutoML for ML, H2O Sparkling Water for Spark Integration, H2O Wave

Description: H2O.ai offers a number of AI and data science products, headlined by its commercial platform H2O Driverless AI. The company's open-source H2O platform is a distributed, in-memory machine learning platform with linear scalability. H2O supports widely used statistical and machine learning algorithms, including gradient boosted machines, generalized linear models, deep learning and more. H2O has also developed AutoML functionality that automatically runs through all the algorithms to produce a leaderboard of the best models.
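To make the leaderboard idea concrete, here is a minimal, hedged sketch of the AutoML workflow using H2O's open-source Python API; the CSV path and the "target" column are placeholders rather than anything from the Solutions Review write-up.

```python
# Hedged sketch of H2O AutoML producing a leaderboard of candidate models.
# "train.csv" and its "target" column are hypothetical placeholders.
import h2o
from h2o.automl import H2OAutoML

h2o.init()                                   # start or connect to a local H2O cluster
train = h2o.import_file("train.csv")         # load the data into an H2OFrame

aml = H2OAutoML(max_models=10, seed=1)       # cap how many models AutoML trains
aml.train(y="target", training_frame=train)  # run through the supported algorithms

print(aml.leaderboard)                       # models ranked by the default metric
```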

Platform: IBM Watson Studio

Related products: IBM Cloud Pak for Data, IBM SPSS Modeler, IBM Decision Optimization, IBM Watson Machine Learning

Description: IBM Watson Studio enables users to build, run, and manage AI models at scale across any cloud. The product is a part of IBM Cloud Pak for Data, the company's main data and AI platform. The solution lets you automate AI lifecycle management, govern and secure open-source notebooks, prepare and build models visually, deploy and run models through one-click integration, and manage and monitor models with explainable AI. IBM Watson Studio offers a flexible architecture that allows users to utilize open-source frameworks like PyTorch, TensorFlow, and scikit-learn.

https://www.youtube.com/watch?v=rSHDsCTl_c0

Platform: KNIME Analytics Platform

Related products: KNIME Server

Description: KNIME Analytics Platform is an open-source platform for creating data science workflows. It enables the creation of visual workflows via a drag-and-drop graphical interface that requires no coding. Users can choose from more than 2,000 nodes to build workflows, model each step of the analysis, control the flow of data, and ensure work stays current. KNIME can blend data from any source and shape it to derive statistics, clean data, and extract and select features. The product leverages AI and machine learning and can visualize data with classic and advanced charts.

Platform: Looker

Related products: Powered by Looker

Description: Looker offers a BI and data analytics platform that is built on LookML, the company's proprietary modeling language. The product's application for web analytics touts filtering and drilling capabilities, enabling users to dig into row-level details at will. Embedded analytics in Powered by Looker utilizes modern databases and an agile modeling layer that allows users to define data and control access. Organizations can use Looker's full RESTful API or the schedule feature to deliver reports by email or webhook.

Platform: Azure Machine Learning

Related products: Azure Data Factory, Azure Data Catalog, Azure HDInsight, Azure Databricks, Azure DevOps, Power BI

Description: The Azure Machine Learning service lets developers and data scientists build, train, and deploy machine learning models. The product offers productivity for all skill levels via a code-first experience, a drag-and-drop designer, and automated machine learning. It also features expansive MLOps capabilities that integrate with existing DevOps processes. The service touts responsible machine learning, so users can understand models with interpretability and fairness, as well as protect data with differential privacy and confidential computing. Azure Machine Learning supports open-source frameworks and languages like MLflow, Kubeflow, ONNX, PyTorch, TensorFlow, Python, and R.

Platform: Qlik Analytics Platform

Related products: QlikView, Qlik Sense

Description: Qlik offers a broad spectrum of BI and analytics tools, which is headlined by the company's flagship offering, Qlik Sense. The solution enables organizations to combine all their data sources into a single view. The Qlik Analytics Platform allows users to develop, extend and embed visual analytics in existing applications and portals. Embedded functionality is done within a common governance and security framework. Users can build and embed Qlik as simple mashups or integrate within applications, information services or IoT platforms.

Platform: RapidMiner Studio

Related products: RapidMiner AI Hub, RapidMiner Go, RapidMiner Notebooks, RapidMiner AI Cloud

Description: RapidMiner offers a data science platform that enables people of all skill levels across the enterprise to build and operate AI solutions. The product covers the full lifecycle of the AI production process, from data exploration and data preparation to model building, model deployment, and model operations. RapidMiner provides the depth that data scientists need but simplifies AI for everyone else via a visual user interface that streamlines the process of building and understanding complex models.

Platform: SAP Analytics Cloud

Related products: SAP BusinessObjects BI, SAP Crystal Solutions

Description: SAP offers a broad range of BI and analytics tools in both enterprise and business-user-driven editions. The company's flagship BI portfolio is delivered via on-prem (BusinessObjects Enterprise) and cloud (BusinessObjects Cloud) deployments atop the SAP HANA Cloud. SAP also offers a suite of traditional BI capabilities for dashboards and reporting. The vendor's data discovery tools are housed in the BusinessObjects solution, while additional functionality, including self-service visualization, is available through the SAP Lumira tool set.

Platform: Sisense

Description: Sisense makes it easy for organizations to reveal business insight from complex data in any size or format. The product allows users to combine data and uncover insights in a single interface without scripting, coding or assistance from IT. Sisense is sold as a single-stack solution with a back end for preparing and modeling data. It also features expansive analytical capabilities, and a front-end for dashboarding and visualization. Sisense is most appropriate for organizations that want to analyze large amounts of data from multiple sources.

Platform: Tableau Desktop

Related products: Tableau Prep, Tableau Server, Tableau Online, Tableau Data Management

Description: Tableau offers an expansive visual BI and analytics platform, and is widely regarded as the major player in the marketplace. The company's analytic software portfolio is available through three main channels: Tableau Desktop, Tableau Server, and Tableau Online. Tableau connects to hundreds of data sources and is available on-prem or in the cloud. The vendor also offers embedded analytics capabilities, and users can visualize and share data with Tableau Public.

More here:

The 11 Best AI Tools for Data Science to Consider in 2024 - Solutions Review


Are autonomous labs the future of science? | by Batman | Jan, 2024 – Medium

Photo by Hyundai Motor Group on Unsplash

The scientific community is on the brink of a revolution, driven by the emergence of autonomous labs.

These cutting-edge facilities mark a significant shift in how research and experiments are conducted.

As technology advances, the potential impact of autonomous labs on the scientific landscape becomes increasingly evident.

One key advantage of autonomous labs is the unparalleled efficiency they bring to experimentation.

Traditionally, scientists have spent substantial time and effort on manual tasks that are often prone to human error.

With autonomous labs, these repetitive tasks are automated, allowing researchers to focus on the core aspects of their work.

This streamlined process not only accelerates the pace of experiments but also enhances the reliability of results.

Moreover, autonomous labs operate round the clock without the need for constant human supervision.

This continuous workflow ensures that experiments can be conducted efficiently, leading to faster data generation and analysis.

The elimination of downtime associated with traditional labs results in a significant boost to overall productivity in the scientific research domain.

Precision and accuracy are paramount in scientific research. Autonomous labs leverage state-of-the-art technologies, such as advanced sensors and artificial intelligence, to ensure precise data collection and analysis.

These technologies significantly reduce the margin of error, providing researchers with more reliable and reproducible results.

Furthermore, the integration of machine learning algorithms within autonomous labs enables real-time data interpretation.

The ability to analyze data on the fly allows researchers to adapt their experimental approaches dynamically, enhancing the quality of research outcomes.

The synergy between automation and intelligent data processing marks a substantial leap forward in scientific methodology.

Read this article:

Are autonomous labs the future of science? | by Batman | Jan, 2024 - Medium


Run Mixtral-8x7B on Consumer Hardware with Expert Offloading – Towards Data Science

Activation pattern of Mixtral-8x7B's expert sub-networks. Source (CC BY).

While Mixtral-8x7B is one of the best open large language models (LLMs), it is also a huge model with 46.7B parameters. Even when quantized to 4-bit, the model can't be fully loaded on a consumer GPU (e.g., an RTX 3090 with 24 GB of VRAM is not enough).
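A rough back-of-the-envelope calculation (mine, not the article's) shows why: at 4-bit precision each parameter occupies half a byte, so the weights alone approach the 24 GB of an RTX 3090 before activations or the KV cache are even counted.

```python
# Illustrative estimate of Mixtral-8x7B weight memory at 4-bit precision.
# Ignores quantization overhead (scales, zero-points), activations and KV cache.
n_params = 46.7e9          # total parameters
bytes_per_param = 0.5      # 4 bits per parameter
weights_gb = n_params * bytes_per_param / 1e9
print(f"~{weights_gb:.1f} GB of weights")  # ~23.4 GB, leaving little headroom on a 24 GB GPU
```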

Mixtral-8x7B is a mixture of experts (MoE). It is made of 8 expert sub-networks of 6 billion parameters each.

Since only 2 of the 8 experts are active during decoding, the 6 remaining experts can be moved, or offloaded, to another device, e.g., the CPU RAM, to free up some of the GPU VRAM. In practice, this offloading is complicated.

Choosing which experts to activate is a decision taken at inference time, for each input token and each layer of the model. Naively moving some parts of the model to CPU RAM, as with Accelerate's device_map, would create a communication bottleneck between the CPU and the GPU.
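For reference, the naive approach looks roughly like the sketch below, using Hugging Face Transformers with Accelerate's device_map to spill whatever does not fit in VRAM over to CPU RAM. This is not the mixtral-offloading method the article describes; it is the baseline whose CPU-GPU traffic makes decoding slow.

```python
# Hedged sketch of naive offloading with Accelerate's device_map.
# Layers that do not fit on the GPU are placed in CPU RAM (or spilled to disk),
# which forces costly transfers whenever an offloaded expert is needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",          # place what fits on the GPU, offload the rest
    offload_folder="offload",   # optional spill-to-disk location
)

prompt = "Mixture-of-experts models route each token to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```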

Mixtral-offloading (MIT license) is a project that proposes a much more efficient solution to reduce VRAM consumption while preserving a reasonable inference speed.

In this article, I explain how mixtral-offloading implements expert-aware quantization and expert offloading to save memory and maintain a good inference speed. Using this framework, we will see how to run Mixtral-8x7B on consumer hardware and benchmark its inference speed.

The tutorial section is also available as a notebook that you can find here:

Get the notebook (#37)

MoE language models often allocate distinct experts to sub-tasks, but not consistently across long token sequences. Some experts are active in short 2-4 token sequences, while others have intermittent gaps in their activation. This is well illustrated by the following figure:

Follow this link:

Run Mixtral-8x7B on Consumer Hardware with Expert Offloading - Towards Data Science


Attacker Targets Hadoop YARN, Flink Servers in Stealthy Campaign – Dark Reading

A threat actor is targeting common misconfigurations in Hadoop YARN and Apache Flink to try to drop Monero cryptominers in environments running the two big data technologies.

What makes the campaign especially notable is the adversary's use of sophisticated evasion techniques, such as rootkits, packed ELF binaries, directory content deletion, and system configuration modifications to bypass typical threat detection mechanisms.

Researchers from Aqua Nautilus uncovered the campaign when they recently spotted new attacks hitting one of their cloud honeypots. One attack exploited a known misconfiguration in Hadoop YARN's ResourceManager, the component that manages resources for applications running on a Hadoop cluster. The other targeted a similarly well-known misconfiguration in Flink that, like the YARN issue, gives attackers a way to run arbitrary code on affected systems.

Hadoop YARN (Yet Another Resource Negotiator) is a resource management subsystem of the Hadoop ecosystem for big data processing. Apache Flink is a widely used open-source stream and batch processor for event-driven data analytics and data pipeline applications.

Assaf Morag, lead researcher for Aqua Nautilus, says the YARN misconfiguration gives attackers a way to send an unauthenticated API request to create new applications. The Flink misconfiguration allows an attacker to upload a Java archive (JAR) file that contains malicious code to a Flink server.

"Both misconfigurations permit remote code execution, implying that an attacker could potentially gain complete control over the server," Morag says. Given that these servers are used for data processing, their misconfigurations present a data exfiltration risk. "Furthermore, these servers are typically interconnected with other servers within the organization, which could facilitate lateral movement by the attacker," Morag says.

In the attack on Aqua Nautilus' honeypots, the adversary exploited the misconfiguration in Hadoop YARN to send an unauthenticated request to deploy a new application. The attacker was then able to execute remote code on the misconfigured YARN server by sending a POST request asking it to launch the new application with the attacker's command. To establish persistence, the attacker first deleted all cron jobs or scheduled tasks on the YARN server and created a new cron job.

Aqua's analysis of the attack chain showed the attacker using the command to delete the content of the /tmp directory on the YARN server, downloading a malicious file to the /tmp directory from a remote command-and-control server, executing the file, and then again deleting the contents of the directory. Aqua researchers found the secondary payload from the C2 server to be a packed ELF (Executable and Linkable Format) binary that served as a downloader for two different rootkits as well as a Monero cryptocurrency miner. Malware detection engines on VirusTotal did not detect the secondary ELF binary payload, Aqua said.

"As these servers are designed for processing big data, they possess high CPU capabilities," Morag says. "The attacker is exploiting this fact to run cryptominers, which also require a substantial amount of CPU resources."

Morag says the attack is noteworthy for the different techniques the attacker used to conceal their malicious activity. These included the use of a packer to obfuscate the ELF binary, the use of stripped payloads to make analysis more challenging, an embedded payload within the ELF binary, file and directory permissions modifications, and the use of two rootkits to hide the cryptominer and shell commands.

See the original post here:

Attacker Targets Hadoop YARN, Flink Servers in Stealthy Campaign - Dark Reading


Vitalik Buterin Commends $100M Boost to Ethereum Ecosystem – U.Today

Alex Dovbnya

Ethereum co-founder Vitalik Buterin has praised Optimism for distributing over $100 million in its third RetroPGF round

Over $100 million has been distributed in the third round of Optimism's Retroactive Public Goods Funding (RetroPGF).

In a recent post on the X social media platform, Ethereum co-founder Vitalik Buterin expressed his admiration for Optimism's ongoing dedication to funding public goods.

This initiative is particularly significant as it supports developers and other contributors to the Ethereum ecosystem who may lack a traditional business model.

Optimism's RetroPGF has been described as an economic flywheel that propels the Optimism Collective forward.

It functions by rewarding individuals and projects that build essential infrastructure, tooling, and content, thus enabling the ecosystem to flourish.

This round of funding has acknowledged 501 projects and individuals for their positive impact on the Collective.

The initiative shows the value of community-driven development in the cryptocurrency world, where projects are often collaboratively built and maintained.

The beneficiaries of this round of funding include both well-established names in the Optimism Collective and newcomers who are contributing to the future of the ecosystem.

In total, 643 projects have received awards across the three rounds of RetroPGF. This demonstrates a commitment to building a self-sustaining economic system that consistently rewards contributors.

The process involves significant community participation. This includes badgeholders who vote on the allocation of awards, and teams dedicated to building the public goods infrastructure.

Optimism has also announced that more rounds are planned for 2024, signaling continued support for this experimental and iterative process.


The rest is here:

Vitalik Buterin Commends $100M Boost to Ethereum Ecosystem - U.Today


Ethereum devs air concern over Vitalik’s plan to increase gas limit – Cointelegraph

Ethereum developers, node operators and users have yet to agree on Vitalik Buterin's recent suggestion to increase the gas limit on Ethereum.

On Jan. 11, Buterin advocated for a modest 33% gas limit increase to potentially improve network throughput.

Increasing the gas limit to the proposed 40 million from the current 30 million would allow more transactions for each block, theoretically increasing the overall throughput and capacity of the network, he argued.

However, there are some drawbacks, according to Ethereum developer Marius van der Wijden, who aired his concerns in a Jan. 11 blog post titled "Why increasing the gas limit is difficult."

The primary concern would be the increase in the size of the blockchain state, which contains account balances and smart contract data.

"The total space needed right now is roughly 267 gigabytes (GB) only for the state," he said, adding, "If we increase the gas limit, this size will grow even quicker."

The Ethereum blockchain's full history data size is currently around 900GB, according to Blockchair.

Van der Wijden argued that storage is cheap, so size is not the issue and everyone will be able to store that amount of data; however, accessing and modifying it will become slower and slower. He added that there are no concrete solutions yet for state growth.

Moreover, higher limits also raise synchronization times and make building diverse clients harder, he added.

Gnosis co-founder Martin Köppelmann also aired concerns, stating that bandwidth requirements would also increase should the gas limit be raised.

Ethereum team lead Péter Szilágyi was another who echoed concerns about increasing gas limits.

The gas limit refers to the maximum amount of work and gas spent executing Ethereum transactions or smart contracts in each block. It is set to ensure that blocks are not too large, which would impact network performance and synchronization.

Potential solutions include upgrades like EIP-4444, tackling chain history expiration, and EIP-4844 for rollup data availability using blobs, which will help curb long-term growth trends.

Related: Big changes coming to Ethereum's account abstraction to save on gas

Software developer Micah Zoltu replied to Vitalik's Reddit post, saying that the goal should be enabling real-world users to run Ethereum nodes on their everyday machines. However, this will be a greater challenge as the state and full blockchain size grows over time.

"Our goal should not be to ensure that you can run an Ethereum node on an $X machine. It should be that demographic X can run an Ethereum node," he said.

Magazine: Account abstraction supercharges Ethereum wallets: Dummies guide

Read this article:

Ethereum devs air concern over Vitalik's plan to increase gas limit - Cointelegraph


Vitalik Buterin endorses raising Ethereum gas limit by 33% – The Block – Crypto News

Ethereum co-founder Vitalik Buterin voiced support for increasing the block gas limit, a move that could enhance the network's capacity.

In a Reddit ask-me-anything session, when questioned about the safe increment of the gas limit, Buterin recommended a 33.3% increase, proposing to raise the Ethereum block gas limit from its current 30 million to 40 million units of gas.

If implemented, this adjustment would allow more transactions in each Ethereum block and improve capacity. Buterin stressed that the current gas limit, set at 30 million, has remained unchanged for nearly three years.

The gas limit in Ethereum represents the maximum computational effort that can be expended on processing transactions or executing smart contracts in a single block.
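To put the proposed numbers in perspective (a back-of-the-envelope illustration, not a figure from The Block), a plain ETH transfer costs a fixed 21,000 gas, so the change would raise the ceiling on simple transfers per block from roughly 1,428 to roughly 1,904.

```python
# Illustrative arithmetic: maximum simple ETH transfers per block at the
# current and proposed gas limits. A plain value transfer costs 21,000 gas;
# real blocks contain a mix of far more gas-hungry transactions.
GAS_PER_TRANSFER = 21_000

for label, gas_limit in [("current", 30_000_000), ("proposed", 40_000_000)]:
    max_transfers = gas_limit // GAS_PER_TRANSFER
    print(f"{label}: {gas_limit:,} gas -> up to {max_transfers:,} simple transfers per block")
```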

Martin Köppelmann, co-founder of Gnosis, observed that increasing the gas limit would also push up the operational requirements for nodes, likely adding costs. Nevertheless, Köppelmann believed that the benefits of a higher gas limit, such as improved network efficiency and capacity, outweigh these potential issues.

Jesse Pollak, Coinbase's protocols lead, also expressed strong support for the change. He stated, "I'm strongly in support of increasing the Ethereum gas limit to 40-45M; we have the network headroom and it will be beneficial for all parties."

Unlike several changes to Ethereum that necessitated hard forks, the increase in the Ethereum block gas limit can be achieved through validators adjusting their node configurations.

At Ethereum's inception in 2015, the gas limit was approximately three million. It has incrementally risen over time, reflecting the network's escalating usage and adoption.

Disclaimer: The Block is an independent media outlet that delivers news, research, and data. As of November 2023, Foresight Ventures is a majority investor of The Block. Foresight Ventures invests in other companies in the crypto space. Crypto exchange Bitget is an anchor LP for Foresight Ventures. The Block continues to operate independently to deliver objective, impactful, and timely information about the crypto industry. Here are our current financial disclosures.


View original post here:

Vitalik Buterin endorses raising Ethereum gas limit by 33% - The Block - Crypto News


ETH developers voice concern over gas limit hike proposed by Vitalik Buterin – crypto.news

Ethereum developers are expressing concerns over Vitalik Buterin's latest proposal to raise the gas limit by 33%, citing worries about the growing size of the blockchain state.

Ethereum, the second-largest blockchain, is currently at a crossroads over a proposal by its co-founder Vitalik Buterin to increase its gas limit.

On Jan. 11, Buterin recommended a 33% increase in the gas limit, a change aimed at enhancing the network's throughput. The proposal suggests raising the limit from the current 30 million to 40 million, potentially allowing more transactions per block and increasing the network's capacity.

The concept of the gas limit has evolved since Ethereum's inception in 2015. Initially set at around 3 million, the limit has gradually increased in response to the network's growing usage and adoption.

The Ethereum blockchain state, which stores account balances and smart contract data, currently requires approximately 267 gigabytes (GB) of space. The entire history of the blockchain is even larger, around 900GB, according to Blockchair.

However, the proposal has met with some resistance. Key among the concerns is the impact on the blockchain's state size.

Marius van der Wijden, an Ethereum developer, expressed these worries in his Jan. 11 blog post, "Why increasing the gas limit is difficult." He pointed out that while storage might be affordable, a larger state size could slow down the process of accessing and modifying data.

I wrote down some of my thoughts on raising the gas limit today: https://t.co/gX0eihUyYa

(Haven't proof-read it, so if you find a mistake, you can keep it)

Van der Wijden also noted the need for definitive solutions for managing the growth of the state.

The potential increase also poses technical challenges, including longer synchronization times and difficulties developing diverse clients. Martin Köppelmann, co-founder of Gnosis, highlighted that a higher gas limit could necessitate increased bandwidth.

Ethereum team lead Péter Szilágyi shared similar concerns, emphasizing the trade-offs of a higher gas limit. While it might improve transaction capacity, it could also lead to faster state growth, slower synchronization, and increased risks of network attacks and spam.

What problem does increasing the gas limit solve?

Increasing it definitely has a downside. State will grow faster, sync time will get slower quicker, DoS potential will grow. Would be nice to have a number on those.

That said, what does increasing the gas limit net us?

The gas limit in Ethereum serves as a cap on the amount of gas spent on transactions or smart contracts in each block. It is crucial for keeping block sizes manageable and ensuring optimal network performance.

Validators have the flexibility to adjust the gas limit within certain parameters as they produce blocks.

While increasing the gas limit could theoretically expand the network's throughput and capacity, it comes at the cost of higher hardware load and potential network security risks.

Developers are exploring potential solutions, such as EIP-4444, which deals with chain history expiration, and EIP-4844, focusing on rollup data availability using blobs. These upgrades are aimed at addressing long-term growth concerns within the Ethereum network.

Continued here:

ETH developers voice concern over gas limit hike proposed by Vitalik Buterin - crypto.news


Ethereum’s Vitalik Buterin Proposes Gas Limit Increase – CoinDesk

Ethereum co-founder Vitalik Buterin suggested raising the network's gas limit by 33% on Wednesday, a move that would increase the network's transaction capacity and could reduce fees for end users, but could also raise operational costs for validators.

Users of the Ethereum blockchain pay gas fees to ensure that their transactions are added to the network, and the gas that one pays to execute a transaction roughly correlates to its computational complexity (e.g. a simple token swap will cost less gas than opening up a convoluted lending position).

Ethereum's gas limit refers to the total amount of gas that can be squeezed into an individual Ethereum block; blocks are the bundles of transactions that get added to the Ethereum network at regular intervals. Increasing the gas limit would mean increasing the number (and complexity) of transactions that can be added to a block.

"The gas limit has not been increased for nearly three years, which is the longest time ever in the protocol's history," Buterin wrote in response to a commenter who asked whether Ethereum could "safely increase" its gas limit. Buterin suggested increasing the gas limit to 40 million gas units a 33% increase over today's 30 million limit.

A ramp-up in Ethereum's gas limit wouldn't require a big update or "hard fork" of the network's core code. Instead, the validators that operate the network should be able to implement the change by adjusting certain parameters in their node software.

Calls for increasing the gas limit started back in December when some of Ethereum's layer 2 (L2) networks were experiencing record usage. Martin Köppelmann, the co-founder of Gnosis Chain, wrote on X that for Ethereum to become a settlement layer for L2s it needs to increase its block gas limit.

Following Buterin's Reddit comments on Wednesday, more users on X, the platform formerly known as Twitter, chimed in with words of support for the suggested increase. Jesse Pollak, the head of protocols at Coinbase and creator of the layer-2 blockchain Base, shared his support for the move and suggested the gas limit could be increased even further, to 45 million.

Others expressed more caution about the gas change, like Ethereum core developer Dankrad Feist, who suggested that calldata and blobs per block should be targeted in addition to the overall gas limit.

As for what the limit increase accomplishes, "it simply allows more activity on L1 - it will either reduce tx costs - or more likely IMO just increase capacity at similar cost -> more burn," said Köppelmann.

Link:

Ethereum's Vitalik Buterin Proposes Gas Limit Increase - CoinDesk
