
$35+ Billion Data Pipeline Tools Markets – Global Opportunity Analysis and Industry Forecasts, 2021-2022 & 2022-2031 – Surge in Adoption of ML…

Data Pipeline Tools Market

Dublin, March 23, 2023 (GLOBE NEWSWIRE) -- The "Data Pipeline Tools Market By Product Type, By Deployment Mode, By Application Area: Global Opportunity Analysis and Industry Forecast, 2021-2031" report has been added to ResearchAndMarkets.com's offering.

The global data pipeline tools market was valued at $6.8 billion in 2021, and is estimated to reach $35.6 billion by 2031, growing at a CAGR of 18.2% from 2022 to 2031.

Data pipeline tools are a category of software that allow large volumes of data to be moved from several disparate data sources to a central destination, often a data warehouse. Data is normalized or transformed so that it's in a consistent format and schema in the data warehouse and can be used for analysis and reports.

Key factors driving the growth of the data pipeline tools market include an increase in demand for cloud data storage, an increase in demand for real-time data analytics, and a surge in the need for data protection. Strong security protocols are essential when planning a data pipeline.

Automated extract, transform, and load (ETL) platforms remove much of this risk, as data is never directly exposed. Instead, the ETL platform queries the source systems via an application programming interface (API) and then securely transports the data to its destination. Risk is low because there is no manual interaction with the data during transfer.
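To make the pattern concrete, here is a minimal sketch of an extract-transform-load job in Python; the API endpoint, field names, and SQLite destination are illustrative placeholders rather than details from the report.

```python
# Minimal ETL sketch: pull records from an API, normalize them, load them to a warehouse table.
# The endpoint, fields, and destination below are illustrative placeholders, not from the article.
import sqlite3
import requests

def extract(api_url: str) -> list[dict]:
    """Query the source system via its API rather than touching raw files directly."""
    response = requests.get(api_url, timeout=30)
    response.raise_for_status()
    return response.json()

def transform(records: list[dict]) -> list[tuple]:
    """Normalize each record into the destination's consistent schema."""
    return [
        (r["id"], r["name"].strip().lower(), float(r["amount"]))
        for r in records
        if "id" in r and "amount" in r
    ]

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Append the cleaned rows to a warehouse table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (id TEXT, customer TEXT, amount REAL)"
        )
        conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract("https://example.com/api/orders")))
```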

For instance, in November 2022, Amazon Web Services, Inc. outlined the shared responsibility model that applies to data protection in AWS Data Pipeline: AWS protects the global infrastructure that runs the AWS Cloud, while customers handle the security configuration and management tasks for the AWS services they use. Such factors have helped the growth of the data pipeline tools market.

The market also offers growth opportunities to key players. Machine learning is a subfield of computer science that deals with tasks such as pattern recognition, computer vision, speech recognition, and text analytics, and it has strong links with statistics and mathematical optimization. Machine learning automates model building for data analysis.


Data pipeline tools in machine learning form an infrastructural path for the entire ML workflow. Pipelines help automate that workflow, from data gathering, exploratory data analysis (EDA), and data augmentation to model building and deployment. After deployment, they also support reproduction, tracking, and monitoring.
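As one common way to express such a workflow, the sketch below chains preprocessing and model training with scikit-learn's Pipeline; the synthetic data stands in for a real gathering step and is not drawn from the report.

```python
# Sketch of an ML pipeline chaining preprocessing and model fitting, as one example
# of automating the workflow stages described above (synthetic data for illustration).
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# "Data gathering" stand-in: random features and labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each step runs in order every time the pipeline is fit or scored,
# so preprocessing is reproduced identically at training and deployment time.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("model", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X_train, y_train)
print("held-out accuracy:", pipeline.score(X_test, y_test))
```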

Many key players have introduced different frameworks to enhance their pipeline services. For instance, in January 2022, Metaflow introduced a framework for real-life data pipelines and machine learning. It helps build and manage real-life data science and ML projects and addresses the needs of data scientists who work on demanding real-life data analytics and ML projects. As a result, there has been a surge in the adoption of machine learning and data analytics tools, which helps boost the growth of the data pipeline tools market.

The key players profiled in the study include Amazon Web Services, Inc., Google LLC, IBM Corporation, Microsoft Corporation, Oracle Corporation, Precisely Holdings, LLC, SAP SE, Snowflake, Inc., Software AG, and Tibco Software.

Players in the market have actively adopted various strategies, such as acquisitions, product launches, and expansions, to remain competitive and gain an advantage over competitors. For instance, in June 2021, Precisely Holdings, LLC acquired Winshuttle.

The Winshuttle portfolio of SAP automation capabilities (Winshuttle Studio and Evolve) is now part of the Precisely Automate product family, and its master data management solutions continue as Precisely EnterWorks, which in turn helps improve data pipeline tool services.

Key Market Insights

By product type, the ELT Data Pipeline segment was the highest revenue contributor to the market, and is estimated to reach $10,845.20 million by 2031, with a CAGR of 17.88%. However, the ETL Data Pipeline segment is estimated to be the fastest growing segment, with a CAGR of 19.22% during the forecast period.

By deployment mode, the cloud-based segment was the highest revenue contributor to the market, with $5,234.10 million in 2021, and is estimated to reach $28,685.60 million by 2031, with a CAGR of 18.67%.

Based on application area, the Real Time Analytics segment was the highest revenue contributor to the market, with $2,722.80 million in 2021, and is estimated to reach $17,763.30 million by 2031, with a CAGR of 20.73%.

Based on region, North America was the highest revenue contributor, accounting for $2,649.70 million in 2021, and is estimated to reach $12,880.00 million by 2031, with a CAGR of 17.25%.

Key Attributes:

No. of Pages: 152

Forecast Period: 2021 - 2031

Estimated Market Value (USD) in 2021: $6,782 million

Forecasted Market Value (USD) by 2031: $35,609.6 million

Compound Annual Growth Rate: 18.0%

Regions Covered: Global

Market Dynamics

Drivers

Increase in demand for cloud data storage

Increase in demand for real-time data analytics

Surge in need of data protection facilities

Restraints

Opportunities

COVID-19 Impact Analysis on the market

Key Market Players

Key Market Segments

By Product Type

Batch Data Pipeline

ELT Data Pipeline

ETL Data Pipeline

Streaming Data Pipeline

By Deployment Mode

By Application Area

By Region

North America

U.S.

Canada

Europe

Germany

Italy

France

Spain

UK

Rest of Europe

Asia-Pacific

China

Japan

India

South Korea

Rest of Asia-Pacific

LAMEA

Latin America

Middle East

Africa

For more information about this report visit https://www.researchandmarkets.com/r/k9ybm7

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.


Read more from the original source:

$35+ Billion Data Pipeline Tools Markets - Global Opportunity Analysis and Industry Forecasts, 2021-2022 & 2022-2031 - Surge in Adoption of ML...

Read More..

Student Debt and the Spending Crisis: What Trustees Need to Know … – American Council of Trustees and Alumni

EVENT DETAILS

In the next few months, the U.S. Supreme Court will hand down one of the most consequential decisions for higher education in recent years. But regardless of the future of the Biden administration's loan forgiveness plan, to do the best for today's and tomorrow's graduates, colleges and universities must think creatively if they are to use scarce resources efficiently. And for some governing boards, it is an existential matter for their institution.

Please join the American Council of Trustees and Alumni (ACTA) on April 18, 2023, at 1 p.m. EST for a one-hour webinar designed specifically for college trustees and other higher education leaders to discuss college spending, budgets, and in what areas institutions can tighten their belts without harming academic quality or student outcomes. After this timely discussion, trustees will be better equipped to understand their institutions' budgets and maximize their institutions' resources.

Matthew Hendricks, Founder, Perspective Data Science

Dr. Matthew Hendricks is the founder of Perspective Data Science, a data consultancy firm that provides organizations, particularly those in the education sector operating on small budgets, with state-of-the-art analytics. He works with institutions to help them make better policy decisions that promote financial stability and improve student outcomes. Dr. Hendricks previously served as chair of the Department of Economics and associate professor of economics at the University of Tulsa, where his scholarship focused on labor economics, applied econometrics, and education policy. His research on the impact of changes in base salaries on teacher productivity was published in the Journal of Public Economics and Economics of Education Review. Dr. Hendricks holds a B.S. in economics from St. John's University and a Ph.D. in applied economics from the University of Minnesota.

The Honorable George "Hank" Brown, Former President, University of Colorado, and Former U.S. Senator

The Honorable Hank Brown served as president of the University of Colorado from 2005 to 2008. He was then named to the Quigg and Virginia S. Newton Endowed Chair in Leadership at the University of Colorado Boulder and was an adjunct professor in the law school. Over his distinguished career in public service, he has served as the president of the University of Northern Colorado (1998-2002); as a member of the Colorado Senate (1972-1976); and as a member of both the U.S. House of Representatives (1981-1991) and the U.S. Senate (1991-1997). He led the university to record growth in enrollment, donations, and diversity and spearheaded the largest increase in state funding in the school's history. Dr. Brown is one of 22 signatories to ACTA's Governance for a New Era report, which calls on college trustees to work with faculty and presidents to form effective responses to ever-increasing tuition costs, outsized administrative expenditures, and the erosion of academic freedom. He received a B.S. in accounting from the University of Colorado, a master's degree in law from George Washington University, and a J.D. from the University of Colorado School of Law, and is a certified public accountant (CPA).

Robert C. Dickeson, Cofounder, Academic Strategy Partners

Dr. Robert C. Dickeson is cofounder of Academic Strategy Partners, now the Academic Strategy Consortium, which provides expert consultation to colleges, universities, and other organizations to improve institutional leadership, systems and processes, governance, planning, enrollment management, performance analytics, and financial standing. A national leader in higher education, Dr. Dickeson served as president of the University of Northern Colorado from 1981 to 1991. He has chaired blue-ribbon commissions appointed by three governors in two states; has been an officer of 80 corporate, government, foundation, and public affairs organizations; and served as commissioner from Colorado to the Education Commission of the States. As co-founder and Senior Vice President of Lumina Foundation for Education, he led the national effort to control college costs. He is the author of more than 200 publications in the fields of higher education leadership and policy and public administration. Dr. Dickeson's book, Prioritizing Academic Programs and Services, is based on his extensive consulting experiences serving several hundred two- and four-year colleges and corporations ranging from hospitals to bank holding companies. Dr. Dickeson holds an A.B., M.A., and Ph.D. in political science and public administration from the University of Missouri-Columbia.

Alice Lee Williams Brown, Principal, AWB & Associates

Alice Lee Williams Brown taught at Appalachian State University, Eastern Kentucky University, Ohio University, and the University of Kentucky prior to leading the Appalachian College Program at the University of Kentucky. After 10 years, the Program became the non-profit Appalachian College Association housed in Berea, KY. For the next 15 years, Dr. Brown raised over $50 million to provide fellowships for faculty and research experiences for students at 35 small private colleges across the five states of central Appalachia. Since retiring as President Emerita from that association, she has received funding from various foundations to research colleges that closed or almost closed and the importance of trusteeship. Her research has resulted in the publication of 15 articles, three books, and a confidential report on the near-closing of Sweet Briar College. She reviews proposals for the Skelly Foundation and the U.S. Department of Education; has served on the boards of the Southern Education Foundation, Colby-Sawyer College, HERS, the Association of Collaborative Leadership, Teaching Learning Technology, and the Appalachian Studies Association; and has advised non-profit organizations and private colleges. She earned her B.S. and M.A. from Appalachian State University and her Ed.D. from the University of Kentucky.

Anna Sillers, Data Analyst Fellow, ACTA

Anna Sillers serves as the data analyst fellow in ACTA's Trustee & Government Affairs Department, where she oversees HowCollegesSpendMoney.com. She is responsible for examining education data related to college spending and tuition to understand how spending can hurt or help students and led the quantitative research for ACTA's The Cost of Excess. Prior to joining ACTA, she was an associate consultant for Manhattan Strategy Group, where she served as a researcher and data analyst for the U.S. Department of Education and U.S. Department of Labor. She holds a B.A. in economics from Mount Holyoke College and an M.P.P. from Georgetown University.

Go here to read the rest:

Student Debt and the Spending Crisis: What Trustees Need to Know ... - American Council of Trustees and Alumni

Read More..

Birmingham-Southern College is Important to Our City, Our State … – birminghamal.gov

Randall L. Woodfin

Mayor of Birmingham

As the mayor of Birmingham, I am deeply troubled by the potential loss of Birmingham-Southern College, a private liberal arts college formed in 1918, and I am fully supportive of the college's strategy to conquer its financial challenges once and for all.

I support this effort for many reasons, not the least of which is the $70.5 million annual economic impact the college has on Jefferson County, though that is compelling on its own.

While BSC is small compared to Alabama's state-supported institutions, it has an outsized impact on our city and state in many other ways.

The college has a well-known reputation for producing doctors, lawyers, teachers, ministers, performing artists, and business owners. What is especially important about those BSC graduates is that more than half of them stay in Alabama, and that two-thirds of those remain in Birmingham after graduation or return here after earning additional degrees.

Equally important is the fact that BSC borders College Hills and Bush Hills, and partners with leadership in those historic neighborhoods to develop opportunities for residents and students to connect thoughtfully and intentionally. Should BSC close, what will happen on those nearly 200 acres? There is no buyer waiting in the wings; no other college sitting ready to move onto the campus and provide the stability that BSC has brought to the western edge of the city for more than a century.

Finally, at a time when so many are working so hard to keep Alabama moving forward, the loss of this nationally ranked liberal arts college would be an enormous setback, and not just for Birmingham. Smart, ambitious, service-focused students looking to get their start at a nationally ranked liberal arts college will undoubtedly leave Alabama to attend the BSCs of neighboring states. And most of them will not return to Birmingham after college.

With a relatively small investment, saving BSC would signal that Alabama values education, that one size does not fit all, and that there is room on this state's educational landscape for colleges large and small, public and private.

Saving BSC will also send a powerful message to innovation-focused businesses thinking about settling in Birmingham: That Alabama is committed to changing the fact that only 25 percent of adults over age 25 have a college degree so that they can find their next great hires right here.

Those great hires from BSC will include graduates working in Birmingham's growing community of technology-focused companies. BSC has launched a summer program in data science and has been approved to include a data science master's degree in fall 2023. Tech entrepreneurs need those graduates, and Birmingham needs them to stay here to live, work, serve, and be part of our city's bright future.

I encourage BSC leadership and trustees to continue to rally support from every corner of the state and beyond, to build understanding of the college's important role in the past and present, and of the need for a sustainable BSC in the future.

More here:

Birmingham-Southern College is Important to Our City, Our State ... - birminghamal.gov

Read More..

Domino Data Lab’s Spring Release Offers Accessible and … – Database Trends and Applications

Domino Data Lab, the enterprise MLOps platform company, is announcing updates to its platform that will drive accessibility to open source tools and techniques, including Ray 2.0, MLflow, and Feast's feature store for machine learning (ML), allowing enterprises to see tangible value from their AI sooner. The announcement is also accompanied by the launch of Domino Cloud, the fully managed MLOps Platform-as-a-Service, and the general availability of Domino's hybrid and multi-cloud Nexus capability.

Now supporting Ray 2.0, an open source framework, Domino's platform features accelerated development and training of generative AI models at scale. The development process is further streamlined with Domino's auto-scaling compute clusters, paired with data prep via Apache Spark, as well as ML and deep learning with PyTorch, TensorFlow, and XGBoost.

"The incorporation of on-demand, auto-scaling clusters and Ray 2.0 support in Domino's spring release accelerates both development and data preparation for teams at any scale. Ray speeds up the process by providing a unified, distributed compute framework that makes it easy to scale AI and Python workloads, from reinforcement learning to deep learning to model tuning," explained Chris Lauren, SVP of product at Domino Data Lab. "This single-platform integration enables data scientists to be more productive by streamlining data preparation and model training from end to end."
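For readers unfamiliar with Ray, the toy sketch below shows the general remote-task pattern the open source library provides; it is a generic illustration of that API, not Domino's integration, and the shard function is a placeholder.

```python
# Toy illustration of Ray's distributed task API (generic open source usage,
# not Domino's integration): the same Python function is fanned out across
# workers in parallel and the results are gathered back.
import ray

ray.init()  # starts a local Ray runtime when no cluster address is given

@ray.remote
def train_shard(shard_id: int) -> float:
    # Placeholder for a real training or tuning step on one data shard.
    return sum(i * 0.001 for i in range(shard_id * 1000))

# Launch the tasks in parallel, then block until all results are ready.
futures = [train_shard.remote(i) for i in range(8)]
print(ray.get(futures))
```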

The MLflow integration targets ML lifecycle management, enabling data scientists to more simply track, reproduce, and share experiments and artifacts directly within their Domino projects. Domino's security protocols are maintained across artifacts, metrics, and logs.

"As data scientists iteratively explore which new breakthroughs in algorithms, fine-tuning foundational models, and tuning hyperparameters yield the best results, it's important to track their progress in a consistently sharable way for model review and audits," said Lauren. "Our customers can now leverage MLflow to automatically log key metrics and artifacts that help them manage experimentation at scale, streamline their work, and increase collaboration with other team members, team leads, or auditors."
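Experiment logging of the kind Lauren describes typically looks like the following with the open source MLflow API; the parameters, metrics, and artifact here are placeholders, and this is a generic sketch rather than Domino's embedded workflow.

```python
# Generic MLflow experiment tracking (placeholder values, not Domino's workflow):
# parameters, metrics, and artifacts are logged so runs can be compared and audited.
import mlflow

with mlflow.start_run(run_name="hyperparam-trial-1"):
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_param("num_layers", 4)

    for epoch in range(5):
        # In practice this value would come from a real training loop.
        mlflow.log_metric("val_loss", 1.0 / (epoch + 1), step=epoch)

    # Persist any file (model card, plot, config) alongside the run.
    with open("notes.txt", "w") as f:
        f.write("baseline run for model review")
    mlflow.log_artifact("notes.txt")
```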

Native integration of Feast within Domino streamlines access to querying and transforming ML features. This introduces cost-saving, reusable feature logic across data science projects, while tracing feature lineage and ensuring data accuracy and security.
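Reusable feature retrieval with the open source Feast API looks roughly like the sketch below; the feature names and entity keys are hypothetical, and this shows generic library usage rather than Domino's native integration.

```python
# Generic Feast lookup sketch (feature names and entity keys are hypothetical,
# and this shows the open source API rather than Domino's native integration).
from feast import FeatureStore

# Assumes a configured Feast feature repository in the working directory.
store = FeatureStore(repo_path=".")

# The same named features can be reused by any project, with lineage tracked by the store.
online_features = store.get_online_features(
    features=["customer_stats:avg_order_value", "customer_stats:orders_last_30d"],
    entity_rows=[{"customer_id": 1001}],
).to_dict()

print(online_features)
```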

The launch of Domino Cloud focuses on accelerating time-to-value for AI projects with scalable resources and a secure, governed, enterprise-grade platform, according to the company. The platform needs no setup or management investment, ensuring that data science workflows can maintain focus on more critical tasks. This solution allows teams to do more with less, reducing operational burden; customers only pay for compute used and can access GPUs and distributed compute frameworks.

"In contrast to more limited fully managed data science platforms offered by cloud providers, Domino Cloud allows teams to integrate workflows and accelerate the full lifecycle from experiment to production. This means that teams can use end-to-end workflows with common patterns and practices, even if different users have different tool preferences," said Lauren. "In this way, Domino Cloud is ideal for teams with an urgent need to scale AI while maintaining full access to the complete ecosystem of professional data science tools and scalable infrastructure they need to drive immediate business impact."

Domino is additionally announcing the general availability of Domino Nexus, built for enterprises with complex accelerated computing associated with generative AI projects existing throughout hybrid and multi-cloud environments. Workload deployment ranges from data centers to the edge, empowering enterprises with seamless workload migration.

Due in part to Domino's achievement of membership within the NVIDIA AI Accelerated program, enterprise customers are assured that Domino's platform supports the latest GPU and DPU technologies.

Once again alleviating cost pains, Domino Nexus is accompanied by a Vultr validation, enabling Domino Nexus customers to burst to Vultr Cloud with virtualized fractional NVIDIA A100 Tensor Core GPUs.

"Our partnership with Vultr Cloud provides an exciting new opportunity for our customers to access the latest NVIDIA GPUs at a competitive price point. This further increases customers' flexibility and choice in their hybrid and multi-cloud data science strategies," said Lauren. "By offering more options from companies like Vultr, we're continuously evolving Nexus to support our customers' changing needs and helping them streamline their migration to the cloud or hybrid/multi-cloud strategies."

This Vultr infrastructure, working in combination with the NVIDIA NGC catalog and the NVIDIA AI Enterprise software suite, helps enterprises reduce costs while driving innovative generative AI projects.

The Spring release of Domino's platform (Domino 5.5), as well as Domino Cloud and Domino Nexus, are available today. Integrations with MLflow and the Feast feature store are available in preview. The Domino and Vultr solution will be released later this year.

"Domino's Spring '23 release unveils powerful new capabilities which give every enterprise access to cutting-edge, open source tools and techniques to achieve real business value from AI in a fraction of the time," concluded Lauren. "No company, regardless of scale, needs to be without the tools to unlock new insights and capabilities, or the ability to drive innovation and create new AI-infused products and services that were previously beyond reach."

For more information about this news, visit http://www.dominodatalab.com.

Read more from the original source:

Domino Data Lab's Spring Release Offers Accessible and ... - Database Trends and Applications

Read More..

Data science is the future; get rural girls in colleges, says IIT Madras director V Kamakoti – Times of India

CHENNAI: India should pursue divergent goals such as providing cutting-edge data science education to students at large on one hand and ensuring higher education for girls in villages on the other, said IIT Madras director V Kamakoti on Wednesday.

"Oil of the world is data. Education has come to our houses through technology. For India to become technologically independent, we as a society need to invest in data education. Every student should aspire to pursue a data analytics course. With massive digitization, the next level of business is driven by data," Kamakoti said at the college day of MOP Vaishnav College for Women.

He urged colleges to provide education to girls from rural areas. "IIT Madras is ready to welcome girl children from rural areas. If we don't improve our educational policies, we will become one of the least educated countries by the next decade," he said.

College principal Lalitha Balakrishnan said MOP students would soon have twinning options. "They will be encouraged to take up data science along with their current subjects," she said.

BCom (Hons) student Rahini S, 20, a first-generation graduate, was awarded the best outgoing student award. "My older sister did not go to college. I've realised that education is important for my livelihood. The college has supported me not just in studies, but in my extra-curricular activities too," she said.

Read more:

Data science is the future; get rural girls in colleges, says IIT Madras director V Kamakoti - Times of India

Read More..

NMDSI Symposium on untapped AI, April 13 – Marquette Today

The Northwestern Mutual Data Science Institute (NMDSI) will host its inaugural symposium, Untapped AI Next Frontiers, on Thursday, April 13, at Northwestern Mutual's downtown campus.

This event is free and open to the public and will focus on AI research, ethics, emerging trends and best practices. Attendees will hear from a variety of experts in the data science industry and academia and have the opportunity to network with industry peers.

The NMDSI is a collaboration among Marquette, the University of Wisconsin-Milwaukee and Northwestern Mutual with the goals of furthering research, corporate and community applications, and talent development in data science.

The event will kick off with a keynote address from Dr. Desmond Patton, a nationally recognized AI expert. Following the keynote, the NMDSI will host a fireside chat with NMDSI leadership, including:

Attendees will then hear from a variety of NMDSI experts via a series of topical Lightning Talks.

The day will conclude with a student poster session highlighting students from Marquette University and the University of Wisconsin-Milwaukee as well as Northwestern Mutual employees, who will create and share posters to showcase their research projects.

Participants will present to attendees and judges and should be willing to answer questions and discuss their work.

Apply online to be a part of the poster session by Friday, March 31.

Join the NMDSI for a networking reception with refreshments during the poster session.

The event will take place virtually and in person at Northwestern Mutual's downtown Milwaukee campus.

Learn more and register via Eventbrite.

With questions, contact Stephanie Van Wieringen.

Follow this link:

NMDSI Symposium on untapped AI, April 13 - Marquette Today

Read More..

Learning to grow machine-learning models | MIT News | Massachusetts Institute of Technology – MIT News

It's no secret that OpenAI's ChatGPT has some incredible capabilities; for instance, the chatbot can write poetry that resembles Shakespearean sonnets or debug code for a computer program. These abilities are made possible by the massive machine-learning model that ChatGPT is built upon. Researchers have found that when these types of models become large enough, extraordinary capabilities emerge.

But bigger models also require more time and money to train. The training process involves showing hundreds of billions of examples to a model. Gathering so much data is an involved process in itself. Then come the monetary and environmental costs of running many powerful computers for days or weeks to train a model that may have billions of parameters.

"It's been estimated that training models at the scale of what ChatGPT is hypothesized to run on could take millions of dollars, just for a single training run. Can we improve the efficiency of these training methods, so we can still get good models in less time and for less money? We propose to do this by leveraging smaller language models that have previously been trained," says Yoon Kim, an assistant professor in MIT's Department of Electrical Engineering and Computer Science and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL).

Rather than discarding a previous version of a model, Kim and his collaborators use it as the building blocks for a new model. Using machine learning, their method learns to grow a larger model from a smaller model in a way that encodes knowledge the smaller model has already gained. This enables faster training of the larger model.

Their technique saves about 50 percent of the computational cost required to train a large model, compared to methods that train a new model from scratch. Plus, the models trained using the MIT method performed as well as, or better than, models trained with other techniques that also use smaller models to enable faster training of larger models.

Reducing the time it takes to train huge models could help researchers make advancements faster with less expense, while also reducing the carbon emissions generated during the training process. It could also enable smaller research groups to work with these massive models, potentially opening the door to many new advances.

"As we look to democratize these types of technologies, making training faster and less expensive will become more important," says Kim, senior author of a paper on this technique.

Kim and his graduate student Lucas Torroba Hennigen wrote the paper with lead author Peihao Wang, a graduate student at the University of Texas at Austin, as well as others at the MIT-IBM Watson AI Lab and Columbia University. The research will be presented at the International Conference on Learning Representations.

The bigger the better

Large language models like GPT-3, which is at the core of ChatGPT, are built using a neural network architecture called a transformer. A neural network, loosely based on the human brain, is composed of layers of interconnected nodes, or neurons. Each neuron contains parameters, which are variables learned during the training process that the neuron uses to process data.

Transformer architectures are unique because, as these types of neural network models get bigger, they achieve much better results.

This has led to an arms race of companies trying to train larger and larger transformers on larger and larger datasets. "More so than other architectures, it seems that transformer networks get much better with scaling. We're just not exactly sure why this is the case," Kim says.

These models often have hundreds of millions or billions of learnable parameters. Training all these parameters from scratch is expensive, so researchers seek to accelerate the process.

One effective technique is known as model growth. Using the model growth method, researchers can increase the size of a transformer by copying neurons, or even entire layers of a previous version of the network, then stacking them on top. They can make a network wider by adding new neurons to a layer or make it deeper by adding additional layers of neurons.
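As a rough PyTorch illustration of this kind of growth step (copying a layer to deepen, seeding a larger layer with old weights to widen), consider the sketch below; the layer sizes are arbitrary and the code is not taken from any of the methods discussed in the article.

```python
# Naive "model growth" sketch in PyTorch: deepen by copying an existing layer,
# widen by allocating a larger layer seeded with the old weights (sizes arbitrary,
# illustrative only; not code from the paper).
import copy
import torch
import torch.nn as nn

small = nn.Sequential(nn.Linear(64, 64), nn.ReLU(), nn.Linear(64, 64))

# Deepen: duplicate the last layer and stack it on top of the existing network.
deeper = nn.Sequential(*small, copy.deepcopy(small[-1]), nn.ReLU())

# Widen: create a 128-unit layer and copy the old 64-unit weights into the new one.
old = small[0]
wide = nn.Linear(64, 128)
with torch.no_grad():
    wide.weight[:64, :] = old.weight
    wide.bias[:64] = old.bias

print(deeper)
print(wide.weight.shape)
```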

In contrast to previous approaches for model growth, parameters associated with the new neurons in the expanded transformer are not just copies of the smaller network's parameters, Kim explains. Rather, they are learned combinations of the parameters of the smaller model.

Learning to grow

Kim and his collaborators use machine learning to learn a linear mapping of the parameters of the smaller model. This linear map is a mathematical operation that transforms a set of input values, in this case the smaller models parameters, to a set of output values, in this case the parameters of the larger model.

Their method, which they call a learned Linear Growth Operator (LiGO), learns to expand the width and depth of a larger network from the parameters of a smaller network in a data-driven way.

But the smaller model may actually be quite large (perhaps it has a hundred million parameters) and researchers might want to make a model with a billion parameters. So the LiGO technique breaks the linear map into smaller pieces that a machine-learning algorithm can handle.

LiGO also expands width and depth simultaneously, which makes it more efficient than other methods. A user can tune how wide and deep they want the larger model to be when they input the smaller model and its parameters, Kim explains.
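A heavily simplified way to picture the idea, and not the authors' actual LiGO implementation: treat the expansion itself as trainable matrices that project a small model's weights to the larger model's shapes, then optimize those matrices.

```python
# Heavily simplified picture of a learned growth operator (not the paper's actual
# LiGO code): trainable expansion matrices map a small weight matrix to a larger one.
import torch
import torch.nn as nn

d_small, d_large = 64, 128
W_small = torch.randn(d_small, d_small)          # stand-in for a pretrained small-model weight

# The expansion operators themselves are the learned parameters.
grow_out = nn.Parameter(torch.randn(d_large, d_small) * 0.02)
grow_in = nn.Parameter(torch.randn(d_small, d_large) * 0.02)

def expanded_weight() -> torch.Tensor:
    # The large weight is a learned linear combination of the small model's parameters.
    return grow_out @ W_small @ grow_in          # shape: (d_large, d_large)

# In a real setup these operators would be optimized so the grown model
# matches or improves on the small one; the loss below is a placeholder.
optimizer = torch.optim.Adam([grow_out, grow_in], lr=1e-3)
loss = expanded_weight().pow(2).mean()
loss.backward()
optimizer.step()
print(expanded_weight().shape)
```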

When they compared their technique to the process of training a new model from scratch, as well as to model-growth methods, it was faster than all the baselines. Their method saves about 50 percent of the computational costs required to train both vision and language models, while often improving performance.

The researchers also found they could use LiGO to accelerate transformer training even when they didnt have access to a smaller, pretrained model.

"I was surprised by how much better all the methods, including ours, did compared to the random initialization, train-from-scratch baselines," Kim says.

In the future, Kim and his collaborators are looking forward to applying LiGO to even larger models.

The work was funded, in part, by the MIT-IBM Watson AI Lab, Amazon, the IBM Research AI Hardware Center, Center for Computational Innovation at Rensselaer Polytechnic Institute, and the U.S. Army Research Office.

Read more:
Learning to grow machine-learning models | MIT News | Massachusetts Institute of Technology - MIT News

Read More..

Machine Learning Programs Predict Risk of Death Based on Results From Routine Hospital Tests – Neuroscience News

Summary: Using ECG data, a new machine learning algorithm was able to predict death within 5 years of a patient being admitted to hospital with 87% accuracy. The AI was able to sort patients into 5 categories ranging from low to high risk of death.

Source: University of Alberta

If you've ever been admitted to hospital or visited an emergency department, you've likely had an electrocardiogram, or ECG, a standard test involving tiny electrodes taped to your chest that checks your heart's rhythm and electrical activity.

Hospital ECGs are usually read by a doctor or nurse at your bedside, but now researchers are using artificial intelligence to glean even more information from those results to improve your care and the health-care system all at once.

In recently published findings, the research team built and trained machine learning programs based on 1.6 million ECGs done on 244,077 patients in northern Alberta between 2007 and 2020.

The algorithm predicted the risk of death from that point for each patient from all causes within one month, one year and five years with an 85 percent accuracy rate, sorting patients into five categories from lowest to highest risk.

The predictions were even more accurate when demographic information (age and sex) and six standard laboratory blood test results were included.
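As a rough sketch of the tabular side of such a model (synthetic stand-in data, not the study's dataset or code), one could fit a gradient-boosted classifier on ECG measurements plus demographics, evaluate discrimination with AUROC, and bin patients into five risk groups:

```python
# Rough sketch of a tabular mortality-risk model (synthetic data, not the study's):
# fit a gradient-boosted classifier on ECG measurements plus age/sex, score risk,
# and bin patients into five groups from lowest to highest risk.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
X = rng.normal(size=(n, 8))                      # e.g. heart rate, QRS duration, age, sex, labs
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 1.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

model = GradientBoostingClassifier().fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]

print("AUROC:", round(roc_auc_score(y_te, risk), 3))
# Quintile cut points give five risk categories, 0 = lowest risk, 4 = highest risk.
risk_group = np.digitize(risk, np.quantile(risk, [0.2, 0.4, 0.6, 0.8]))
print("patients per risk group:", np.bincount(risk_group))
```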

The study is a proof-of-concept for using routinely collected data to improve individual care and allow the health-care system to learn as it goes, according to principal investigator Padma Kaul, professor of medicine and co-director of the Canadian VIGOUR Centre.

"We wanted to know whether we could use new methods like artificial intelligence and machine learning to analyze the data and identify patients who are at higher risk for mortality," Kaul explains.

These findings illustrate how machine learning models can be employed to convert data collected routinely in clinical practice to knowledge that can be used to augment decision-making at the point of care as part of a learning health-care system.

A clinician will order an electrocardiogram if you have high blood pressure or symptoms of heart disease, such as chest pain, shortness of breath or an irregular heartbeat. The first phase of the study examined ECG results in all patients, but Kaul and her team hope to refine these models for particular subgroups of patients.

They also plan to focus the predictions beyond all-cause mortality to look specifically at heart-related causes of death.

"We want to take data generated by the health-care system, convert it into knowledge and feed it back into the system so that we can improve care and outcomes. That's the definition of a learning health-care system."

Author: Ross Neitz
Source: University of Alberta
Contact: Ross Neitz, University of Alberta
Image: The image is in the public domain

Original Research: Open access. "Towards artificial intelligence-based learning health system for population-level mortality prediction using electrocardiograms" by Padma Kaul et al., npj Digital Medicine

Abstract

Towards artificial intelligence-based learning health system for population-level mortality prediction using electrocardiograms

The feasibility and value of linking electrocardiogram (ECG) data to longitudinal population-level administrative health data to facilitate the development of a learning healthcare system has not been fully explored. We developed ECG-based machine learning models to predict risk of mortality among patients presenting to an emergency department or hospital for any reason.

Using the 12-lead ECG traces and measurements from 1,605,268 ECGs from 748,773 healthcare episodes of 244,077 patients (2007-2020) in Alberta, Canada, we developed and validated ResNet-based Deep Learning (DL) and gradient boosting-based XGBoost (XGB) models to predict 30-day, 1-year, and 5-year mortality. The models for 30-day, 1-year, and 5-year mortality were trained on 146,173, 141,072, and 111,020 patients and evaluated on 97,144, 89,379, and 55,650 patients, respectively. In the evaluation cohort, 7.6%, 17.3%, and 32.9% of patients died by 30 days, 1 year, and 5 years, respectively.

ResNet models based on ECG traces alone had good-to-excellent performance with area under receiver operating characteristic curve (AUROC) of 0.843 (95% CI: 0.838-0.848), 0.812 (0.808-0.816), and 0.798 (0.792-0.803) for 30-day, 1-year and 5-year prediction, respectively; and were superior to XGB models based on ECG measurements with AUROC of 0.782 (0.776-0.789), 0.784 (0.780-0.788), and 0.746 (0.740-0.751).

This study demonstrates the validity of ECG-based DL mortality prediction models at the population-level that can be leveraged for prognostication at point of care.

Read the original:
Machine Learning Programs Predict Risk of Death Based on Results From Routine Hospital Tests - Neuroscience News

Read More..

A.I. and machine learning are about to have a breakout moment in finance – Fortune

Good morning,

There's been a lot of discussion on the use of artificial intelligence and the future of work. Will it replace workers? Will human creativity be usurped by bots? How will A.I. be incorporated into the finance function? These are just some of the questions organizations will face.

I asked Sayan Chakraborty, copresident at Workday (sponsor of CFO Daily), who also leads the product and technology organization, for his perspective on a balance between tech and human capabilities.

"Workday's approach to A.I. and machine learning (ML) is to enhance people, not replace them," Chakraborty tells me. "Our approach ensures humans can effectively harness A.I. by intelligently applying automation and providing supporting information and recommendations, while keeping humans in control of all decisions." He continues, "We believe that technology and people, working together, can allow businesses to strengthen competitive advantage, be more responsive to customers, deliver greater economic and social value, and generate more meaning and purpose for individuals in their work."

Workday, a provider of enterprise cloud applications for finance and HR, has been building and delivering A.I. and ML to customers for nearly a decade, according to Chakraborty. He holds a seat on the National Artificial Intelligence Advisory Committee (NAIAC), which advises the White House on policy issues related to A.I. (And as much as I pressed, Chakraborty is not at liberty to discuss NAIAC efforts or speak for the committee, he says.) But he did share that generative A.I. continues to be a growing part of policy discussions both in the U.S. and in Europe, which has embraced a risk-based approach to A.I. governance.

Tech's future in finance

Chakraborty's Workday colleague Terrance Wampler, group general manager for the Office of the CFO at Workday, has further thoughts on how A.I. will impact finance. "If you can automate transaction processes, that means you reduce risk because you reduce manual intervention," Wampler says. Finance chiefs are also looking for the technology to help in accelerating data-based decision-making and recommendations for the company, as well as play a role in training people with new skills, he says.

Consulting firm Gartner recently made three predictions on financial planning and analysis (FP&A) and controller functions and the use of technology:

By 2025, 70% of organizations will use data-lineage-enabling technologies including graph analytics, ML, A.I., and blockchain as critical components of their semantic modeling.

By 2027, 90% of descriptive and diagnostic analytics in finance will be fully automated.

By 2028, 50% of organizations will have replaced time-consuming bottom-up forecasting approaches with A.I.

Workday thinks about and implements A.I. and ML differently than other enterprise software companies, Wampler says. I asked him to explain. Enterprise resource planning (ERP) is a type of software that companies use to manage day-to-day business activities like accounting and procurement. "What makes Workday's ERP for finance and HR different is A.I. and ML are embedded into the platform," he says. "So, it's not like the ERP is just using an A.I. or ML program. It is actually an A.I. and ML construct." And having ML built into the foundation of the system means there's a quicker adaptation of new ML applications when they're added. "For example, Workday Financial Management allows for faster automation of high-volume transactions," he says.

ML gets better the more you use it, and Workday has over 60 million users representing about 442 billion transactions a year, according to the company. So ML improves at a faster rate. The platform also allows you to use A.I. predictively. "Let's say an FP&A team has its budget for the year. Using ML, they predictively identify reasons why they would meet that budget," he says. And Workday works on a single cloud-based database for both HR and financials. You have all the information in one place. For quite some time, the company has been using large language models, the technology that has enabled generative A.I., Wampler says. Workday will continue to look into use cases where generative A.I. can add value, he says.

It will definitely be interesting to have a front-row seat as technology in the finance function continues to evolve over the next decade.

Sheryl Estrada
sheryl.estrada@fortune.com

Upcoming event: The next Fortune Emerging CFO virtual event, "Addressing the Talent Gap with Advanced Technologies," presented in partnership with Workday (a CFO Daily sponsor), will take place from 11 a.m.-12 p.m. EST on April 12. Matt Heimer, executive editor of features at Fortune, and I will be joined by Katie Rooney, CFO at Alight Solutions, and Andrew McAfee, cofounder and codirector of MIT's Initiative on the Digital Economy and principal research scientist at MIT Sloan School of Management. Click here to learn more and register.

The race to cloud: Reaching the inflection point to long-sought value, a report by Accenture, finds that over the past two years, there's been a surge in cloud commitment, with more than 86% of companies reporting an increase in cloud initiatives. To gauge how companies today are approaching the cloud, Accenture asked them to describe the current state of their cloud journeys. Sixty-eight percent said they still consider their cloud journeys incomplete. About a third of respondents (32%) see their cloud journeys as complete and are satisfied with their abilities to meet current business goals. However, 41% acknowledge their cloud journeys are ongoing and continue to evolve to meet changing business needs. The findings are based on a global survey of 800 business and IT leaders in a variety of industries.

The workforce well-being imperative, a new report by Deloitte, explores three factors that have a prominent impact on well-being in today's work environment: leadership behaviors at all levels, from a direct supervisor to the C-suite; how the organization and jobs are designed; and the ways of working across organizational levels. Deloitte refers to these as work determinants of well-being.

Lance Tucker was promoted to CFO at Papa John's International, Inc. (Nasdaq: PZZA). Tucker succeeds David Flanery, who will retire from Papa John's after 16 years with the company. Flanery will continue at the company through May, during a transition period. Tucker, 42, has served as Papa John's SVP of strategic planning and chief of staff since 2010. He has 20 years of finance and management experience, including previously serving in manager and director of finance roles at Papa John's from 1994 to 1999. Before Papa John's, Tucker was CFO of Evergreen Real Estate, LLC.

Narayan Menon was named CFO at Matillion, a data productivity cloud company. Menon brings over 25 years of experience in finance and operations. Most recently, Menon served as CFO of Vimeo Inc., where he helped raise multiple rounds of funding and took the company public in 2021. Hes also held senior executive roles at Prezi, Intuit, and Microsoft. Menon also served as an advisory board member for the Rutgers University Big Data program.

This was a bank that was an outlier.

Federal Reserve Chair Jerome Powell said of Silicon Valley Bank in a press conference following a Fed decision to hike interest rates 0.25%, Yahoo Finance reported. Powell referred to the bank's high percentage of uninsured deposits and its large investment in bonds with longer durations. "These are not weaknesses that are there at all broadly through the banking system," he said.

See the rest here:
A.I. and machine learning are about to have a breakout moment in finance - Fortune

Read More..

A Growing Phenomenon: The Importance of Saudi Women In Data … – About Her

On Tuesday, Prince Sultan University and Stanford University collaborated to launch the Women in Data Science Conference 2023 in Riyadh with the objective to encourage female participation and interest in the field of data science, engineering and computer science.

The conference is conducted annually in nearly 150 universities across more than 60 countries, showcasing the most current research and practices in data science.

Data science plays a crucial role in today's world as it enables individuals and organizations to extract meaningful insights and knowledge from vast amounts of data. It involves the use of statistical and computational methods to analyze and interpret data, and to identify patterns and trends that can be used to make informed decisions.

With that being said, the risk of implementing practices that do not come from a diverse perspective is high, which is why, plain and simple, we need more women in the field of data.

In KSA, institutions have picked up on that, and so many renowned personalities and entities were in attendance, including Saudi Arabian and international researchers. In accordance with the agenda, they presented scientific papers and participated in dialogue sessions.

Several Saudi women who have excelled in this field were in attendance, such as Maysa Al-Qurashi, a Professor of Mathematics with a PhD in Analysis from KSU. Her main academic focus falls in strategic planning, quality assurance, and science, and that is not even the most interesting thing about her.

Dr. Al-Qurashi has more than 60 publications in prestigious journals covering various fields of mathematics and analysis, such as operator algebras, harmonic analysis, applied mathematics, and biological mathematics.

Another Saudi data superstar at the conference is Reem Alattas, a director of SAP Value Advisory, techpreneur, inventor, NASA Datanaut, and speaker. That is what her LinkedIn header says, but we are sure she is being humble.

This Saudi woman has successfully launched many innovative products over the years and raised capital across multiple funding rounds from investors. A household name, she is.

As for the conference, it included workshops and a datathon competition to evaluate the students' proficiency, while also promoting mentorship and interdisciplinary collaboration in accordance with technological advancements.

Vice President of Prince Sultan University's female campus, Heba Khoshaim, stated at the conference that these types of platforms are an opportunity to highlight women's achievements in the digital field, and to discuss and review the most prominent research and practices.

Meanwhile, Ahmed bin Saleh Al-Yamani, president of Prince Sultan University, said: "The goals of the Kingdom's Vision 2030 and the initiatives taken in the digital field have enabled Saudi women to draw their own path in the data science field."

This is not the first time Saudi Arabia has launched an initiative to push for the field of data science, as a partnership was initiated in the past that yielded Saudi Arabia's first female data scientists.

The largest university for women in the world, Princess Nourah bint Abdulrahman University (PNU), had partnered up with Dell, the American tech company, to train data scientists in the kingdom. That collaboration resulted in training 57 female data science and big data analytics students in one semester only, with an additional 103 students certified in cloud infrastructure.

With that training, we are on the right path to ensure data is not biased. The reason that the world, and not just Saudi Arabia, is pushing for female data scientists is that the analysis needs to be made from both genders, not only to ensure collaboration and innovation at all levels, but also to ensure inclusivity in decision making.

Data science is used in a wide range of fields such as finance, healthcare, marketing, and social media, among others, to improve business processes, develop new products and services, and enhance decision-making.

So, imagine having all this vast amount of data analyzed and worked on from just one perspective. Recently, we've written an article on how women's pain is dismissed (LINK ARTICLE), and now the kingdom is taking firm steps to rectify that by not only giving women a seat at the table, but having them head these conversations.

More:

A Growing Phenomenon: The Importance of Saudi Women In Data ... - About Her

Read More..