
NOT-HL-23-118: Request for Information (RFI): National Heart Lung … – National Institutes of Health (.gov)

Request for Information (RFI): National Heart Lung and Blood Institute (NHLBI) Strategic Vision Refresh

The NHLBI provides global leadership for research and training to promote the prevention, diagnosis, and treatment of heart, lung, and blood diseases and sleep disorders and to enhance the health of all individuals so that they can live longer and more fulfilling lives. The NHLBI stimulates basic discoveries about the causes of disease, enables the translation of basic discoveries into clinical practice, leverages implementation science to take innovations toward application, and communicates research advances to the public.

By design, the NHLBI Strategic Vision is dynamic, reflecting input from partners who are at the leading edge of scientific exploration and from communities with a vested interest. As part of updating the Strategic Vision, the NHLBI wants to harness community-level participation. While the NHLBI Strategic Vision goals and objectives set in 2016 remain timely, the refresh aims to keep them aligned with evolving scientific needs and advances.

The NHLBI's Strategic Vision comprises four goals and eight objectives identified in 2016.

Strategic goals:

Strategic objectives:

For the refresh of the NHLBI Strategic Vision, this RFI invites input on the current relevance of the NHLBI strategic objectives and on whether additional Compelling Questions (CQs) and Critical Challenges (CCs) are needed to address topics that have surfaced as priorities over the past five years and could drive important scientific and health advances.

Compelling Questions are unanswered questions or poorly understood areas of research requiring NHLBI facilitation because their complexity exceeds the capacity of any single investigator-initiated program.

Critical Challenges are barriers or impediments to scientific progress, and overcoming these obstacles will result in significant impact.

The NHLBI Strategic Vision is inclusive of a broad portfolio of scientific ideas spanning basic through implementation sciences. This refresh will adhere to our commitment to scientific stewardship and accountability. This RFI particularly seeks input on novel research needs and approaches in the following focus areas:

Input sought includes the following:

Perspective on critical research needs or compelling research questions for any of the refresh focus areas

Perspective on challenges or barriers that need to be addressed to support progress in any of the refresh focus areas

The continued relevance of the eight objectives of the NHLBI Strategic Vision, and any critical questions or challenges that are not already incorporated into it

Comments must be submitted electronically on the submission website: https://rfi.grants.nih.gov/?s=65119386df04aef8db066042. Responses will be accepted through 11:59:59 pm (ET) on December 15, 2023.

Responses to this RFI are voluntary. Respondents are advised that the Government is under no obligation to acknowledge receipt of the information shared or provide feedback to respondents with respect to any information submitted. No proprietary, classified, confidential, or sensitive information should be included in your response. This RFI is for information and planning purposes only and should not be construed as a solicitation or as an obligation on the part of the Federal Government in general, the NIH, or the NHLBI specifically.

Other than your name and contact information, the Government reserves the right to use any submitted information on public websites, in reports, in summaries of the state of the science, in any possible resultant solicitation(s), grant(s), or cooperative agreement(s), or in the development of future funding opportunity announcements. Please note that the Government will not pay for the preparation of any information submitted or for use of that information.

We appreciate your input and invite you to share this RFI opportunity with your colleagues and others in your community.


Data Science Platform Market to grow by USD 249.15 billion … – PR Newswire

NEW YORK, Oct. 13, 2023 /PRNewswire/ -- The Data Science Platform Market report has been added to Technavio's offering. With ISO 9001:2015 certification, Technavio has proudly partnered with more than 100 Fortune 500 companies for over 16 years. The potential growth difference for the data science platform market between 2022 and 2027 is USD 249.145 billion. Get deeper insights into the market size, current market scenario, future growth opportunities, major growth driving factors, the latest trends, and much more. Buy the full report here

The major driving factor for the global data science platform market is the high volume of data being generated. Since 2014, data volumes have exploded, and more data is being created than ever; business applications are generating enormous volumes of data, and this will continue throughout the forecast period and beyond. Further, the growing volume of data generated through multiple channels and sources has pushed organizations to implement big data analytics, which saves them significant costs. However, data analysts and scientists need to thoroughly analyze large amounts of data and convert insights into real-time action. For instance, a popular data science application such as big data analytics can be used to retrieve and analyze data to discover significant weaknesses, develop indicator patterns that identify opportunities and threats, and optimize business decisions. Therefore, as data volumes rise, the demand for data analytics also grows, which is anticipated to boost the growth of the market during the forecast period.

The data science platform market is segmented by Component (Platform and Services), Deployment (On-premise and Cloud), and Geography (North America, Europe, APAC, South America, and Middle East and Africa).

Key Companies in the Data Science Platform Market:

Alphabet Inc., Altair Engineering Inc., Alteryx Inc., Anaconda Inc., Cloudera Inc., Databricks Inc., Dataiku Inc., DataRobot Inc., Domino Data Lab Inc., International Business Machines Corp., Microsoft Corp., Oracle Corp., Rapid Insight Inc., RapidMiner Inc., Rexer Analytics, Rstudio PBC, SAS Institute Inc., The MathWorks Inc., Vista Equity Partners Management LLC, Wolfram

Related Reports:

The online data science training programs market share is expected to increase by USD 3.76 million from 2021 to 2026, and the market's growth momentum will accelerate at a CAGR of 25.1%.

The India professional online courses market size is estimated to grow at a CAGR of 16.24% between 2022 and 2027. The market size is forecast to increase by USD 2,782.59 million.

ToC:

Executive Summary

Market Landscape

Market Sizing

Historic Market Sizes

Five Forces Analysis

Market Segmentation by Component

Market Segmentation by Deployment

Market Segmentation by Geography

Customer Landscape

Geographic Landscape

Drivers, Challenges, & Trends

Company Landscape

Company Analysis

Appendix

About Technavio

Technavio is a leading global technology research and advisory company. Its research and analysis focus on emerging market trends and provide actionable insights to help businesses identify market opportunities and develop effective strategies to optimize their market positions.

With over 500 specialized analysts, Technavio's report library consists of more than 17,000 reports and counting, covering 800 technologies and spanning 50 countries. Their client base consists of enterprises of all sizes, including more than 100 Fortune 500 companies. This growing client base relies on Technavio's comprehensive coverage, extensive research, and actionable market insights to identify opportunities in existing and potential markets and assess their competitive positions within changing market scenarios.

Contacts

Technavio Research
Jesse Maida, Media & Marketing Executive
US: +1 844 364 1100
UK: +44 203 893 3200
Email: [emailprotected]
Website: www.technavio.com

SOURCE Technavio


Robinson Receives New Mathematics Teacher of the Year Award – University of Arkansas Newswire


Vice Chair and teaching associate professor Samantha Robinson

Samantha Robinson, vice chair and teaching associate professor in the Department of Mathematical Sciences in the Fulbright College of Arts and Sciences, will receive the 2023 Arkansas Council of Teachers of Mathematics Excellence in Four-Year College/University Mathematics Teaching Award for making significant contributions to mathematics, statistics and data science education in the state.

The mission of the Arkansas Council of Teachers of Mathematics is to provide vision and leadership in improving the teaching and learning of mathematics so that every student is ensured of an equitable standards-based mathematics education, and every teacher of mathematics is ensured the opportunity to grow professionally.

This inaugural ACTM teaching award is one of six newly introduced Mathematics Teacher of the Year awards, each representing a different level of mathematics teaching, e.g., Elementary School, Middle School, Secondary School, etc.

Robinson currently serves as vice chair in the Department of Mathematical Sciences but previously served as course coordinator for all sections of Principles of Statistics (STAT 2303) and Biostatistics (STAT 2823), which together enroll approximately 1,500-1,700 students annually. She drafted the Arkansas Course Transfer System learning objectives for Principles of Statistics, directly impacting all two-year and four-year institutions in the state of Arkansas; helped to align K-12 and college-level statistics and data science education objectives while serving on the Arkansas Mathematics Pathways Task Force Alignment Working Group; and was recently elected to the executive committee of the American Statistical Association Section on Statistics and Data Science Education.

In recent years, Robinson has led groups of student researchers to conferences, presenting statistical research that has resulted in publications, oral presentations, poster presentations and numerous awards for the student researchers. Robinson has also been previously recognized for her distinguished university teaching and her contributions to mathematics, statistics and data science education at the university, regional and national levels.

Robinson has been teaching in the Department of Mathematical Sciences in a variety of different roles for approximately 10 years, with her first full-time teaching appointment in 2013.

Robinson will be the very first recipient of this newly established award at the Four-Year College/University level. She (along with other ACTM awardees) will be honored at the annual ACTM conference this fall in Little Rock.


worldsteel Safety and Health Excellence Recognition 2023 … – World Steel Association

As part of its commitment to the highest safety and health standards, the World Steel Association (worldsteel) recognises excellence in six of its member companies for delivering demonstrable improvements in safety and health.

Andrew Purvis, Director, Sustainable Manufacturing, said, "I am proud to recognise the commitment and effort of our members towards the wellbeing of their workforce and contractor community. The stories shared here are more than just examples; they highlight the remarkable progress made in safeguarding lives and promoting and preserving health."

The recognised companies this year are:

BlueScope Steel Limited – Integrating HOP into foundational HSE processes

In 2019, BlueScope started its global HOP (Human and Organisational Performance) journey by proactively piloting HOP-based Leadership Workshops as top management was curious about evolved safety thinking. BlueScope is at the stage of systems and processes being simplified and updated to embed the HOP philosophy into everything it does, so that the practice is sustained.

Liberty Steel – Transforming safety culture and performance through human performance principles

In 2019, a decision was made to initiate a transformative journey to reshape the safety culture across the organisation. To drive this transformation, a comprehensive roadmap was developed, known as The WRIB [We are InfraBuild] Safe Way. Underpinning this are four strategic pillars, with the overarching goal of creating a world-class safety culture and safety performance.

ACERINOX S.A. – Innovative roll cover solution enhances safety and operational efficiency in hot mill operations

In February 2022, a critical challenge arose at the Columbus Stainless hot strip mill when a finishing mill backup roll (BUR) suffered a catastrophic failure, posing a risk to personnel and equipment. To address this, a cross-functional team embarked on a mission to create a preventive solution that prioritised safety without compromising operational efficiency.

JFE Steel Corporation – Safe work support using safety monitoring system

At JFE Steel, the latest information and communications technology (ICT), artificial intelligence (AI) and data science technologies are being used to develop and commercialise more new technologies to ensure the safety of workers at manufacturing sites.

Tenaris – Ergonomics programme

Confab, Tenaris' production centre in Brazil, started evaluating the ergonomic conditions in its pipe manufacturing mills back in 2016. Before implementing its ergonomics programme, the production centre reported an average of 42 employees per year with work restrictions due to injuries associated with poor ergonomics. Following this assessment, a three-year ergonomics programme was introduced, including an annual review and evaluations by a cross-functional team to establish investment priorities.

Tata Steel – Real-time visualisation of risk movement

All high-potential safety risk scenarios were identified at Tata Steel by implementing a Process Safety Management framework. To prevent and mitigate high-potential scenarios, a number of safety barriers were identified. However, in some instances, there was a fair probability of some early failure indications going unnoticed, which could cause the failure of barriers, leading to high-potential incidents. Consequently, Tata Steel felt that tracking the health of the barriers on a real-time basis was needed. The company's innovative approach to real-time visualisation of risk movement aims to provide real-time insights and alerts on the level of risk.

More details on each of the initiatives can be found in worldsteel's Safety and Health Excellence Recognition 2023 publication.

#Ends#

Notes

The World Steel Association (worldsteel) is one of the largest and most dynamic industry associations in the world, with members in every major steel-producing country. worldsteel represents steel producers, national and regional steel industry associations, and steel research institutes. Members represent around 85% of global steel production.


New York Life Hires Industry Veteran Don Vu as Chief Data and Analytics Officer – Yahoo Finance

NEW YORK, October 12, 2023--(BUSINESS WIRE)--New York Life today announced the hiring of industry veteran Don Vu as senior vice president and chief data and analytics officer. Mr. Vu will lead a newly formed artificial intelligence (AI) and data team with responsibility for AI, data, and insights capabilities and for aligning data architecture with business architecture in support of the company's business strategy and objectives. Mr. Vu will report to Alex Cook, senior vice president and head of Strategic Capabilities.

"Considering Dons impressive experience and track record of success, we are delighted to welcome him to the team at this exciting point in our innovation journey," said Cook. "We look forward to the role Don will play in furthering our digitization efforts and leveraging AI to provide industry-leading experiences for our customers, agents, advisors, and employees."

Mr. Vu joins New York Life from Northwestern Mutual, where he served as chief data officer since early 2020. In leading the company's data and analytics function, he drove organizational transformation and enterprise data and AI strategy across the company. He led a consolidated team across various disciplines, including data product and strategy; data science, AI, and analytics; data engineering; and data governance. He also served on the executive steering committee of the company's Data Science Institute.

Previously, Mr. Vu served as vice president of data and analytics at WeWork, where he led its central data and analytics organization. He also spent 13 years at Major League Baseball (MLB), where he led its consolidated analytics organization as vice president of data and analytics. While at MLB, Mr. Vu led data and advanced analytics efforts at BAMTech, a streaming media technology spin-off of MLB Advanced Media.

Mr. Vu received a B.S. in Information Systems and Commerce from the University of Virginia's McIntire School of Commerce. He currently serves on the advisory board for McIntire's Business Analytics program.


ABOUT NEW YORK LIFE

New York Life Insurance Company (www.newyorklife.com), a Fortune 100 company founded in 1845, is the largest1 mutual life insurance company in the United States and one of the largest life insurers in the world. Headquartered in New York City, New York Life's family of companies offers life insurance, retirement income, investments, and long-term care insurance. New York Life has the highest financial strength ratings currently awarded to any U.S. life insurer from all four of the major credit rating agencies.2

1Based on revenue as reported by "Fortune 500 ranked within Industries, Insurance: Life, Health (Mutual)," Fortune magazine, 6/5/2023. For methodology, please see https://fortune.com/franchise-list-page/fortune-500-methodology-2023/.

2Individual independent rating agency commentary as of 10/18/2022: A.M. Best (A++), Fitch (AAA), Moody's Investors Service (Aaa), Standard & Poor's (AA+).

View source version on businesswire.com: https://www.businesswire.com/news/home/20231012892088/en/

Contacts

Kevin Maher New York Life (212) 576-6955 kevin_b_maher@newyorklife.com


Tiger Analytics: Generative AI has improved productivity – Storyboard18

Tiger Analytics, a data sciences and advanced analytics company, has undergone a rebranding exercise. Storyboard18 caught up with Mahesh Kumar, founder and chief executive officer, and Pradeep Gulipalli, co-founder of Tiger Analytics, who spoke about the rebranding initiative, the combination of AI and analytics, and the benefits of generative AI.

Rebranded logo

Could you touch upon the rebranding initiative of Tiger Analytics?

Mahesh Kumar: The last time we may have done something similar on a much smaller scale was when we were a team of 100 people, which was almost eight to 10 years ago. Today, our team has grown to 4,000 people. Now, we have multiple large offices in India and in seven to eight other countries.

We provide services in areas including data science, data engineering, ML (machine learning) engineering, MLOps (machine learning operations), quality engineering and application engineering. We are doing quite a lot of work in generative AI and large language models.

Today, we have 100+ clients. These are large companies spread globally, and the nature of the work spans across retail, consumer product goods, pharmaceutical, insurance, banking, transportation, logistics and manufacturing.

We have been constantly evolving our messaging when we interact with the external world, our clients and agencies, as well as internally. But we felt that this is the time to take a holistic look at what we are trying to do here and create a consistent, simple story to convey to everyone in this complex world.

What are the goals you aim to accomplish through this exercise?

Pradeep Gulipalli: What does building the world's best AI and analytics firm mean? What do we do? This is what the whole branding exercise was about.

The whole business ecosystem has gotten pretty complex. You don't know what the outcome of any particular business action will be. Customer behaviours are changing with all the technology and the revolution that is happening. The competitive landscape is also quite dynamic. ChatGPT didn't exist a year ago, and see how dramatically things have changed. So it's a pretty complex situation that our clients deal with.

And how do they make decisions? Sometimes they have a lot of data. Sometimes they have no data. Signals are hidden. It's ambiguous. It's not easy. So many times, they end up just sitting on it. This is where we come in and we say, 'We will make sense of all of this for you.'

Generative AI is a big buzzword. How are you making use of this tool in your daily practices to solve the toughest of problems?

Gulipalli: If you take a look at our work, we are working on some business problems. It could be trying to predict something, some forecasting or trying to optimise something. Now, generative AI is doing two things for us.

One is that the process of developing these models is becoming easier now. It's been said that generative AI can now replace software engineers, or that it's able to write code. Parts of it are definitely happening. Not the replacement, but it has certainly improved productivity substantially.

But that's one part. The second part is solving actual business problems. Things which were unsolvable earlier, now some solutions are coming up. There is a live project we are working on with a very large company. I consider it to be among the top three BPO companies in the world. They run call centres across multiple industries globally.

We are working with the group that works with airlines. With airlines, it's easiest to buy a ticket. But then once you ring the call centre, it can be a frustrating experience.

So the solution we are building for them, we're doing it in two phases. Phase one, when a customer calls, they are going to listen to that conversation live with the help of AI.

And then, there are generative AI models which understand it, and they give a solution to the query being raised by the customer. Now, the question for phase one was: will there still be a human who answers?

We are giving that solution to the call centre agent, saying, 'Here is the answer to that question.'

Now, in the next phase, we are saying, 'Can we experiment where the generative AI or the bot directly answers the question, without a human in the loop?'

And there will be some pockets which are complex enough that only humans can handle them, and people can spend more time there. So that's phase two. So this is an example. And you can draw parallels to how other people would be using the massive amount of information available when we don't know what is relevant. That's where AI comes in handy.
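The two-phase rollout described here (AI suggests, a human relays the answer; later the bot answers directly when it is confident) can be sketched in a few lines. This is only a hypothetical illustration, not Tiger Analytics' actual system: a keyword lookup over a toy FAQ stands in for the generative model, and the names `suggest_reply`, `handle_call` and the confidence threshold are all invented.

```python
from dataclasses import dataclass

# Toy FAQ "knowledge base". In the system described, generative AI models
# listen to the live conversation; here a keyword lookup stands in for them.
FAQ = {
    "baggage": "Checked baggage allowance is shown on your booking confirmation.",
    "refund": "Refund requests can be filed within 24 hours of booking.",
}

@dataclass
class Suggestion:
    answer: str
    confidence: float

def suggest_reply(utterance: str) -> Suggestion:
    """Phase one: propose an answer for the human agent to relay."""
    words = utterance.lower().split()
    for topic, answer in FAQ.items():
        if topic in words:
            return Suggestion(answer, confidence=0.9)
    return Suggestion("No matching answer found.", confidence=0.1)

def handle_call(utterance: str, auto_threshold: float = 0.8) -> str:
    """Phase two: answer directly when confident, otherwise escalate to a human."""
    s = suggest_reply(utterance)
    if s.confidence >= auto_threshold:
        return f"BOT: {s.answer}"
    return "ESCALATE: route to a human agent"
```

The threshold is the design knob phase two turns on: the "pockets which are complex enough that only humans can handle" are exactly the low-confidence calls that fall through to escalation.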

Kumar: About 1.4 billion people in India can register their complaints with the ministry of consumer affairs. For any product or service you buy as a consumer, you can register a complaint.

And earlier there used to be fewer complaints just because the channels were cumbersome. Now they have opened up mobile channels, email, phone, etc. So, the current government is encouraging more registration of these complaints, suggestions and feedback.

The problem for them is that the volume is now so high. There are only 25-30 people manually processing those complaints. So, they are launching an RFP right now for the use of gen AI to process these complaints, with the purpose of categorising them.

First, many complaints are duplicates arriving through the multiple channels, so how do you deduplicate them? Second, some could be bogus complaints, so how can they classify them as relevant or non-relevant, depending on the intensity of the complaints?

There's a question: how important is the complaint? Everyone says their complaint is important, but they do internal categorisation. Then also, which category of complaint it is.

So all of that can be done by gen AI, which was earlier done manually. Eventually, if a complaint is more serious, one can approach the court. What they are trying to do is see whether AI can provide an intermediate solution, almost like a negotiation between the service provider and the consumer. So there are a lot of interesting applications coming up with gen AI, which is going to touch the day-to-day life of every individual.
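The triage pipeline outlined here (deduplicate across channels, then classify relevance and category) can be sketched as follows. This is only an illustrative sketch: hash-based deduplication and keyword rules stand in for the gen-AI models the RFP envisions, and all of the names and categories below are invented for the example.

```python
import hashlib

def normalise(text: str) -> str:
    """Lower-case and collapse whitespace so channel formatting differences vanish."""
    return " ".join(text.lower().split())

def dedupe(complaints):
    """Collapse duplicate complaints submitted through multiple channels."""
    seen, unique = set(), []
    for c in complaints:
        key = hashlib.sha256(normalise(c).encode()).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(c)
    return unique

# Invented keyword-to-category rules; in the scenario described, gen AI
# would perform this classification step instead.
CATEGORIES = {"refund": "billing", "broken": "product defect", "rude": "service"}

def triage(complaint: str) -> dict:
    """Mark a complaint relevant/non-relevant and assign a coarse category."""
    text = normalise(complaint)
    for keyword, category in CATEGORIES.items():
        if keyword in text:
            return {"relevant": True, "category": category}
    return {"relevant": False, "category": None}
```

Exact-hash deduplication only catches near-identical resubmissions; catching paraphrased duplicates is precisely where a language model would earn its keep.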

What trends are you expecting for 2023 in the space you cover? And could you explain the disruption that you will witness in this space?

Kumar: Tiger Analytics was started in 2011 and since then, we have witnessed a lot of changes in the work of traditional data science and data engineering. In the post-Covid era, the demand for and adoption of these technologies has increased significantly with the entry of cloud.

The demand for gen AI has increased, which has led to increased demand for our services. So gen AI is in that category where it has the promise of high value.

Earlier, if one went to large Fortune 500 companies, there would be 100 people talking about artificial intelligence (AI), data science and data engineering.

But gen AI has everyone talking about it. So, it has expanded the total addressable market of the services we provide. That's a big change we're seeing. And it will translate into much wider adoption of what we do.

Through AI and analytics, what are the challenges and opportunities Tiger Analytics has helped companies overcome and embrace?

Gulipalli: AI and analytics mainly help with decision making. Coming up with decisions based on data is a much more scientific way of making a decision. Sometimes the decision can be manual or it can be automated. AI can make the decision for you. But at the end of the day, we are making a decision.

Now think of looking across the organisation. You have a product which you are trying to sell. So it starts with sales. How can I better sell? Who is the right audience? How do I sell? Who do I market it to? How much do I spend on marketing? Today, there are a number of ways a product can be marketed.

There are hundreds of television channels. Digital has exploded. Where do I market? AI can give you answers for all of these. For example, you have acquired a customer. How do you provide a good customer experience? How do you retain the customer? How do you address their pain points? Identifying issues manually is a huge problem.

AI can help identify these hotspots. Then, you approach the operations of a company. Are you having optimal usage of resources? Is there wastage? Through AI and analytics, there can be better optimal usage of resources.

When you try to do planning as a business, you want to know how much to manufacture. For that, you would want to know what the demand is going to look like. How do you predict demand? There could be so many things that impact demand. The decisions spanning sales, marketing, customer, operations, financial, risk, manufacturing and supply chain are all things that can be addressed using analytics and AI.
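As a toy illustration of the demand-prediction step mentioned above (not a method attributed to Tiger Analytics), a trailing moving average is about the simplest possible demand forecast; the figures here are invented:

```python
def moving_average_forecast(history, window=3):
    """Forecast next-period demand as the mean of the last `window` periods."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    return sum(history[-window:]) / window

# Monthly units sold (hypothetical figures).
demand = [120, 135, 128, 140, 150, 145]
next_month = moving_average_forecast(demand)  # mean of 140, 150, 145
```

Real demand models layer in the "so many things that impact demand" (seasonality, price, promotions); this baseline is what they are measured against.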

What makes AI and analytics a very deadly combination? Is there any other tool you would combine with analytics or AI?

Gulipalli: AI is like an engine. Think of it as if you're building a Formula One car. AI is the engine. But then just that engine will not get you to your destination. You need to build the overall chassis, have the steering, the seats, the body, etc.

There are different technologies which will come in to complete the picture. For example, AI is great but let's say for you to understand what AI is saying and for you to act on it, maybe you need a certain technology which is an interpreter between you and AI. So there are a variety of adjacent technologies which come in and help. And there'll be more that will keep coming in.

What AI was 10 years ago is not the same today. A lot of advancements have taken place, and a lot of embellishments and enablement will come in terms of other technologies. We have long heard about software development; the two are very much tied together.

This is because, at the end of the day, what you'll use is a software product but at the heart of the software product is AI. For you to understand something much better, maybe you might need certain visualisations so that you can grasp the complexity of a situation better. Now, this is where AI and the visualisation technology will work together.

Currently, how many clients do you have?

Kumar: Right now, we are working with 130-140 companies. Seventy percent of our clients are from Fortune 1,000 companies.

In the consumer product goods category, among the top 10 companies, we work with six of them. The majority of our business is in the US right now. We do have some business in Australia, Asia, Singapore, the Philippines, the UK and Malaysia. We recently started working with some government entities.

In India, we work with Tata Steel and with Star Network at Disney India. We are just starting our first project with the government of Bihar, which revolves around AI. In the US, we have clients like PepsiCo, The Hartford (The Hartford Financial Services Group) and Nestle. We are about to start work with Citibank.


Better Data Decisions on the Journey to a Single Source of Truth – Spiceworks News and Insights

The enterprise journey to the single source of truth (SSOT) has been long and winding. The benefits of this direction are many, from a common data set to enterprise governance to unlocking valuable use cases and transformation across the enterprise, says Paula Hansen of Alteryx.

We can all relate to the desire for every employee to arrive at the same answer, once and for all putting to rest the question of who has the better, more accurate data. But while organizations are on this journey, should they wait to pursue analytic insights and automation?

The exponential growth of data within the enterprise and the accelerating pace of business both suggest that pausing analytic automation until the single source of truth is completed will separate analytic leaders from laggards. Data is often scattered across various databases and applications, and different departments will migrate to a single source in a phased approach. Many business use cases, competitive decisions and transformation opportunities can be lost in this wait.

Simply put, driving business value through data and becoming analytically mature can advance in parallel with data centralization.

The reality is most enterprises today are pulling data from 6 input sources, from legacy databases and applications to modern cloud data warehouses and cloud platforms. A CFO pursuing a reduction in the time to close the quarter or a Head of Supply Chain wanting to optimize complex logistics is certainly pulling data from multiple sources and in multiple formats. In fact, the triangulation of different data sets and scenarios increases the quality of insights, creates better models and promotes enterprise collaboration.

For example, the development of a global go-to-market strategy likely pulls data from multiple CRMs, ERPs, legacy spreadsheets, third-party data sets and SaaS applications across a global enterprise. Different stakeholders within the business also want different insights from the data, ranging from partner insights to product insights to geographic insights to margin insights. The list goes on and on.
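As a minimal sketch of that kind of multi-source blending (the account names, fields, and figures are invented for illustration), records from a hypothetical CRM export and ERP extract can be reconciled on a shared key, after which the one combined set supports several stakeholder views:

```python
# Illustrative sketch (invented records): reconciling data pulled from two
# hypothetical sources -- a CRM export and an ERP extract -- keyed on account.
crm_rows = [
    {"account": "Acme", "region": "EMEA", "pipeline_usd": 120_000},
    {"account": "Globex", "region": "AMER", "pipeline_usd": 80_000},
]
erp_rows = [
    {"account": "Acme", "margin_pct": 0.31},
    {"account": "Globex", "margin_pct": 0.24},
]

# Index the ERP extract by the shared key, then left-join onto the CRM rows.
erp_by_account = {r["account"]: r for r in erp_rows}
combined = [{**c, **erp_by_account.get(c["account"], {})} for c in crm_rows]

# The combined set now supports different insights, e.g. pipeline by region
# for geographic stakeholders, alongside margin views for finance.
pipeline_by_region = {}
for row in combined:
    region = row["region"]
    pipeline_by_region[region] = (
        pipeline_by_region.get(region, 0) + row["pipeline_usd"]
    )
```

The same join-on-a-shared-key pattern extends to spreadsheets, third-party data sets, and SaaS exports once they are normalized to a common schema.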

To reach the highest levels of analytic maturity (what the International Institute of Analytics refers to as "analytical nirvana") and accelerate time-to-insight, it is important for an organization to democratize data and analytics for all of its employees. Organizations across every industry are increasing data literacy through analytics, using governance to scale responsibly, and evaluating the latest technologies, such as cloud, machine learning, and generative AI, to drive business outcomes faster than ever before.


Centralized data science teams, while very valuable, lack the specific business expertise needed to effectively solve business challenges in each department. Data scientists are not trained as accountants, HR professionals, marketing experts or supply chain managers. Business context is required to ask relevant questions about your data and to reduce the time from insight to action.

The first step to increasing analytic maturity is to increase organizational data literacy. In this scenario, all employees are empowered to marry their domain expertise with the ability to ask more precise questions of the data, accurately analyze the data, and draw out valuable insights, all through self-service technology that meets them where they are, regardless of analytic skillset or analytic language preference.

Once you start your analytic program, there are many ways to democratize analytics and encourage greater data literacy. One approach is through gamification to increase employee engagement and upskilling. Jones Lang LaSalle (JLL), for example, established a gamification program that incorporates training to learn the functions of their analytics platform, provides users with challenges to work on their problem-solving skills, and issues certifications for awards and recognition of capability.


Often, disaggregated analytics tools lack interoperability and create silos that result in complexity, duplication, and inefficiency for users and IT. Hybrid architectures that span on-premises and cloud-based data and applications are becoming the norm. Therefore, companies must implement the right tools and processes to pull disparate data types into analytics processes wherever their data resides.

Further, the cloud can play a pivotal role in accelerating democratization through its flexibility, scalability, speed, and self-service. Using cloud-based analytic platforms streamlines IT management, removes overhead costs, and makes it easier for users to collaborate on their analytics solutions.

In today's fast-paced world, the ability to communicate insights quickly and effectively to stakeholders is critical. Generative AI will supercharge time-to-insight through its ability to rapidly accelerate content creation. Additionally, generative AI will accelerate the spread of analytic best practices and ML models across the enterprise.

For example, generative AI can automate communications that synthesize and deliver trusted analytical insights to stakeholders. The tone and language of the communications can be selected based on the intended audience. Instead of spending hours drafting reports or presentations for stakeholders, you can instead focus on absorbing insights and planning the best course of action based on results. This benefits users in several ways, including improved time-to-value, operational efficiency, and decision-making.
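A hypothetical sketch of that workflow, with the generative-model call stubbed out (the `generate` function below just echoes its prompt; a real deployment would call an actual model), shows how audience-specific tone can be wrapped around trusted analytical inputs:

```python
# Sketch of automated stakeholder communications. Everything here (the
# audiences, metrics, and the generate() stub) is invented for illustration.
def generate(prompt: str) -> str:
    # Stand-in for a real generative-AI call; echoes the prompt as a draft.
    return "DRAFT: " + prompt

def stakeholder_update(metrics: dict, audience: str) -> str:
    # Tone and length are selected based on the intended audience.
    tones = {"executive": "two crisp sentences", "analyst": "full detail"}
    facts = "; ".join(f"{k} = {v}" for k, v in metrics.items())
    prompt = f"Summarize for an {audience} reader in {tones[audience]}: {facts}"
    return generate(prompt)

update = stakeholder_update(
    {"Q3 revenue": "$4.2M", "churn": "2.1%"}, audience="executive"
)
```

The point of the pattern is that the analytical inputs stay governed and trusted; only the narrative wrapper around them is generated per audience.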

Governance is also a key consideration when scaling analytics across the enterprise. Governance doesn't mean democratization will come to a standstill. Proper governance helps organizations strike a balance: democratization at the speed the business demands, with the controls that IT requires.

Establishing data governance means creating frameworks that define who can take what actions, with what data, in what situations, and what methods they can use. These principles guide data analytics at every stage of the collection of data and will be unique to each organization depending on the type of data, data systems, and regulatory requirements.
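One way such a framework might be encoded, purely as an illustrative sketch (the roles, actions, and data classes are invented), is a deny-by-default rule table mapping who can take what actions with what data:

```python
# Hypothetical governance rule table: (role, action, data class) -> allowed.
# Each organization's table will differ with its data systems and regulations.
POLICY = {
    ("marketing", "read", "aggregated"): True,
    ("marketing", "read", "pii"): False,
    ("data_science", "read", "pii"): True,
    ("data_science", "export", "pii"): False,
}

def is_allowed(role: str, action: str, data_class: str) -> bool:
    # Deny by default: anything not explicitly granted is refused.
    return POLICY.get((role, action, data_class), False)
```

Centralizing the rules in one table (rather than scattering checks through each tool) is what lets governance evolve with regulatory requirements without slowing down self-service users.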

Data governance also helps to ensure that data is usable and accessible for analytics. This becomes even more important as companies adopt generative AI, where large language models rely on the quality of data inputs.

Digital transformation agendas have put data analytics at the center of every organization's focus, in every industry. Whether your data is in a single location or not, empowering employees across the enterprise to make data-driven decisions through analytics will directly impact company performance. Plus, employees stay longer in organizations that invest in their development and help them automate the mundane and focus on value-add work. Improving organizational analytic maturity can be done while maintaining proper governance. Analytic leaders will leverage their insights to uncover new revenue streams, improve operational efficiency, and stay hyper-competitive in today's evolving world.



Read more from the original source:

Better Data Decisions on the Journey to a Single Source of Truth - Spiceworks News and Insights


The role of data skills in the modern labour market – CEPR

The increasing use of data and advanced analytics across countries has driven demand for new types of jobs (Acemoglu and Restrepo 2017). Individuals who master those skills are in high demand in the labour market and typically earn a wage premium (Sostero and Tolan 2022). With rapid technology changes, it is less clear which jobs are becoming more digital, and which occupations and industries hire people with these skills. Traditional labour market statistics provide only limited possibilities to explore these questions, as they often lack detailed information on skill requirements.

In recent years, online job advertisement data have gained popularity as an alternative data source relating to labour markets, as they provide timely and granular information. In many cases, they have advantages over existing occupation classifications and are often complementary to official employment or vacancy statistics (Atalay et al. 2022). Recent studies using online job advertisements showed that digital skillsets have evolved over the past decade and can be found at the core of some traditionally non-digital domains (Sostero and Tolan 2022). Online job advertisements have also contributed to an improved understanding of the impact of crises on labour markets, such as the COVID-19 recession in the US and Canada (Soh et al. 2022, Bellatin and Galassi 2022) and the war in Ukraine (Pham et al. 2023).

In a recent paper (Schmidt et al. 2023), we aim to estimate the data intensity of occupations and sectors (i.e. the share of data-related jobs involved in the production of data). First, we put forward a novel methodology applying natural language processing (NLP) to online job advertisements from Lightcast to generate occupation- and industry-level estimates of data intensity. Second, the methodology can be used to advance cross-country comparable results on measuring the value of data assets in the data economy and the evolution of digital skills in the labour market. Third, the NLP algorithm is flexible and can be applied to concepts that are difficult to capture in traditional labour market statistics, such as green and AI-related jobs. The algorithm can also be adapted to over 66 languages, meaning the scope of the analysis could be broadened.

The NLP techniques enable the extraction of relevant skills and tasks from the raw text of the online job advertisements. Due to the high granularity of the data, the algorithm can classify the extracted information into data entry, database, and data analytics related activities. Finally, the methodology computes occupation- and industry-level aggregates of the share of jobs involved in data production activities. The methodology relies on an open-source NLP pipeline provided by the spaCy Python library (spaCy 2022), which allows for efficient treatment of large amounts of text data and flexible deployment of NLP models.
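As a much-simplified stand-in for that classification step (the paper's pipeline uses spaCy; the keyword lists here are invented), skill phrases found in a job ad's text can be bucketed into the three activity categories:

```python
# Toy classifier: bucket skill phrases from a job ad into data-entry,
# database, and data-analytics activities. A real pipeline would use
# spaCy's tokenization and matching rather than plain substring search.
CATEGORIES = {
    "data_entry": {"data entry", "typing", "record keeping"},
    "database": {"sql", "database administration", "data warehousing"},
    "data_analytics": {"machine learning", "statistics", "data visualisation"},
}

def classify_skills(ad_text: str) -> dict:
    text = ad_text.lower()
    return {
        category: sorted(kw for kw in keywords if kw in text)
        for category, keywords in CATEGORIES.items()
    }

hits = classify_skills(
    "Seeking analyst with SQL, statistics and machine learning experience."
)
```

Aggregating such per-ad classifications over millions of postings is what yields the occupation- and industry-level intensity shares reported below.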

Results illustrate that the share of data analytics skills is higher than that of other data-related skills in all three countries studied (the UK, Canada, and the US). At the sectoral level, the emerging picture is more heterogeneous across countries. While the information and communication industry as well as the finance industry are highly data-intensive, larger differences in data intensity exist for agriculture, mining and quarrying, and electricity, gas, steam and air conditioning supply. Differences in labour demand mostly explain these variations, with low data-intensity professions contributing most to economy-wide data intensity. Preliminary estimates show that the results remain stable for pre-COVID years, too.

In the UK, the occupation with the highest level of data intensity is data scientist, with a rate of 92.3% in 2020. Following closely are data engineer at 69% and data entry clerk at 68% (Figure 1). Most of these occupations revolve primarily around data analytics skills, with a few exceptions. For instance, data entry clerks and database administrators exhibit data intensities largely linked to data-entry and database-related capabilities. In general, the highly data-intensive occupations tend to be specialised, technology-oriented professions, with occupations such as biostatistician and clinical data manager showing connections to fields such as biology and medicine. Similar trends are observed in Canada and the US.

Figure 1 Data intensity at occupation level in the UK is linked to data analytics skills

Per cent of labour demand, 2020

Financial and insurance, and information and communication activities are the two most data-intensive industries in all three countries, with shares close to or above 10% in 2020 (Figure 2). Shares are similar in most sectors with low data intensity, in particular accommodation and food service activities, construction, and transportation and storage. This is consistent with Calvino et al. (2018), who use a different methodology.

Figure 2 Data intensity in the UK, Canada, and the US by industry

Per cent of labour demand, 2020

However, these numbers can mask some structural differences across countries. For instance, in the finance and insurance sector, the UK's share is almost on par with those of the US and Canada, with data mining analysts making the largest contribution to the data intensity of the sector in all three countries. The high demand for data mining analysts in the sector more than compensates for the lower average data intensity of the profession in the UK (30%, as compared to 70% in 2020 in the US and Canada). Overall, the contribution of the profession to the data intensity of the sector is about twice as high in the UK (0.8 percentage point) as in Canada (0.3 percentage point) and the US (0.4 percentage point).

Differences in the data intensity of individual industries are noticeable in all three countries in professional, scientific and technical activities, with data intensity much higher in the US, and to a lesser extent Canada, than in the UK. Similarly, the data intensity in agriculture and forestry and electricity, gas, steam and air conditioning supply differs across countries, with labour demand being more data-intensive in the UK than in Canada or the US. In a few sectors, such as mining and quarrying, arts, entertainment and recreation, and public administration and defence, the US and Canada exhibit similar data intensity rates, which are much lower than in the UK.

At the economy-wide level, the UK and Canada appear to be less data-intensive than the US (Figure 3). The overall share of data-intensive jobs in the UK was estimated to be 3.4% in 2020, weighting the data intensity at occupation level by the number of job advertisements posted. This compares to 3.9% in Canada and 4.6% in the US.
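The posting-weighted aggregation can be illustrated with invented numbers (the posting counts and intensities below are not from the paper; they are chosen so the result lands near the UK's 3.4% figure):

```python
# Economy-wide data intensity as the posting-weighted average of
# occupation-level intensities. All counts here are illustrative.
occupations = [
    # (number of job postings, occupation-level data intensity)
    (1_000, 0.923),   # e.g. data scientist
    (5_000, 0.30),    # e.g. data mining analyst
    (94_000, 0.01),   # all remaining, low-intensity postings
]

total_postings = sum(n for n, _ in occupations)
aggregate_intensity = sum(n * i for n, i in occupations) / total_postings
# With these invented counts, aggregate_intensity is about 0.0336 (3.4%).
```

The same arithmetic explains the Figure 4 decomposition: a very numerous low-intensity occupation can contribute as much to the aggregate as a small, highly data-intensive one.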

Figure 3 Low data-intensity occupations contribute most to data intensity in the UK

Aggregate data intensity, per cent, 2020

In the UK, low data-intensity occupations contribute more to the overall data economy than medium and high data-intensity jobs (Figure 4, Panel A). For instance, office assistants, which represent 6% of labour demand in the low data-intensity occupation class, contribute as much to the overall data intensity of the economy as data scientists.

Figure 4 Data intensity across occupations

Data intensity of an occupation in per cent, contribution to aggregate data intensity in percentage points, 2020

A) United Kingdom

B) Canada

C) United States

In Canada and the US, medium data-intensity occupation classes contribute the largest proportion to the overall data intensity, and the level of data intensity across professions is generally higher (Figure 4, Panels B and C). A data scientist has a data intensity score of 94.5% in Canada and 95.1% in the US, compared to 92.3% in the UK. Among the high data-intensity professions in Canada, data entry clerks, database administrators, and data mining analysts contribute most to aggregate data intensity, alongside professions such as business management analyst and software developer at the medium level. The US has the widest range of professions contributing at the medium and high data-intensity levels, among them network system analysts and computer system engineers.

Acemoglu, D and P Restrepo (2017), Robots and Jobs: Evidence from US Labor Markets, NBER Working Paper No. 23285.

Atalay, E, S Sotelo and D Tannenbaum (2022), The geography of job tasks, VoxEU.org, 12 November.

Bellatin, A and G Galassi (2022), What COVID-19 May Leave Behind: Technology-Related Job Postings in Canada, Bank of Canada Staff Working Paper 2022/17.

Calvino, F, C Criscuolo, L Marcolin and M Squicciarini (2018), A taxonomy of digital intensive sectors, OECD Science, Technology and Industry Working Paper 2018/14.

Pham, T, O Talavera and Z Wu (2023), Labour markets during wartime: Evidence from online job advertisements, VoxEU.org, 22 July.

Schmidt, J, G Pilgrim and A Mourougane (2023), What is the role of data in jobs in the United Kingdom, Canada, and the United States? A natural language processing approach, OECD Statistics Working Paper 2023/05.

Soh, J, M Oikonomou, C Pizzinelli, I Shibata and M Mendes Tavares (2022), Did the COVID-19 Recession Increase the Demand for Digital Occupations in the United States? Evidence from Employment and Vacancies Data, IMF Working Paper 2022/195.

Sostero, M and S Tolan (2022), Digital skills for all? From computer literacy to AI skills in online job advertisements, JRC Working Papers Series on Labour Education and Technology.

spaCy (2022), "Language Processing Pipelines".

Excerpt from:

The role of data skills in the modern labour market - CEPR


11 most in-demand gen AI jobs companies are hiring for – CIO

Generative AI is quickly changing the landscape of the business world, with rapid adoption rates across nearly every industry. Businesses are turning to gen AI to streamline business processes, develop proprietary AI technology, and reduce manual efforts in order to free up employees to take on more intensive tasks. A recent survey of senior IT professionals from Foundry found that 57% of IT organizations have identified several areas for gen AI use cases, 25% have started pilot programs, and 41% are engaged in training and upskilling employees on gen AI.

In the next six to 12 months, some of the most popular anticipated uses for gen AI include content creation (42%), data analytics (53%), software development (41%), business insight (51%), internal customer support (45%), product development (40%), security (42%), and process automation (51%). Organizations are also optimistic that gen AI will boost productivity and improve business outcomes, with 58% saying that they believe gen AI will make employees more productive, 55% saying that gen AI-infused products lead to better business outcomes, and 55% saying that gen AI enables employees to focus more on value-adding tasks.

As this technology becomes more popular, it has increased the demand for relevant roles to help design, develop, implement, and maintain gen AI technology in the enterprise. Foundry's AI survey also identified several roles that companies are looking to hire to help with the integration of gen AI in the workplace. Here are the top 11 roles companies are currently hiring for, or have plans to hire for, to directly address their emerging gen AI strategies.

As companies embrace gen AI, they need data scientists to help drive better insights from customer and business data using analytics and AI. For most companies, AI systems rely on large datasets, which require the expertise of data scientists to navigate. Responsibilities include building predictive modeling solutions that address both client and business needs, implementing analytical models alongside other relevant teams, and helping the organization make the transition from traditional software to AI-infused software. It's a role that requires experience with natural language processing, coding languages, statistical models, and large language and generative AI models. According to the survey, 28% of respondents said they have hired data scientists to support generative AI, while 30% said they have plans to hire candidates.

Machine learning engineers are tasked with transforming business needs into clearly scoped machine learning projects, along with guiding the design and implementation of machine learning solutions. This role is responsible for training, developing, deploying, scheduling, monitoring, and improving scalable machine learning solutions in the enterprise. It's a role that requires a wide range of skills, including model architecture, data and ML pipeline creation, software development skills, experience with popular MLOps tools, and experience with tools such as BERT, GPT, and RoBERTa, among others. The goal of a machine learning engineer is to ultimately make machine learning more accessible across the organization so that everyone can benefit from the technology. According to the survey, 22% of respondents say they have already hired machine learning engineers to support generative AI, while 28% say they have plans to hire for the role.

AI is new territory for businesses, and there's still a lot to discover about the technology, which is why they're looking to hire AI researchers to help identify the best possible applications of AI within the business. AI researchers help develop new models and algorithms that will improve the efficiency of generative AI tools and systems, improve current AI tools, and identify opportunities for how AI can be used to improve processes or achieve business needs. AI researchers need to understand data and automation infrastructure, machine learning models, AI tools and algorithms, data science, programming, and how to build AI models from scratch. According to the survey, 31% of respondents say they have already hired AI researchers to support generative AI, while 19% say they have plans to hire for the role.

Algorithm engineers, sometimes referred to as algorithm developers, are tasked with building, creating, and implementing algorithms for software and computer systems to achieve specific tasks and business needs. The role of algorithm engineer requires knowledge of programming languages, testing and debugging, documentation, and of course algorithm design. These engineers are responsible for solving complex computational problems in the organization, often working with large data sets to design intricate algorithms that address and solve business needs. Businesses rely on algorithm engineers to help navigate gen AI technology, relying on these experts to scale and deploy gen AI solutions, consider all the ethical and bias implications, and ensure they're aligned with all compliance and regulatory requirements. According to the survey, 16% of respondents say they have already hired algorithm engineers to support generative AI, while 31% say they have plans to hire for the role.

Deep learning engineers are responsible for heading up the research, development, and maintenance of the algorithms that inform AI and machine learning systems, tools, and applications. Deep learning is a subset of AI, and vital to the development of gen AI tools and resources in the enterprise. This role is responsible for building and maintaining powerful AI algorithms, identifying data requirements, and finding better ways to automate processes in the business to improve performance. Technologies such as chatbots, virtual assistants, facial recognition, medical devices, and automated cars rely on deep learning to create effective products. As companies continue to embrace gen AI, deep learning engineers are critical for businesses that want to capitalize on AI and integrate it into business processes, services, and products. According to the survey, 16% of respondents say they have already hired deep learning engineers to support generative AI, while 28% say they have plans to hire for the role.

Natural language processing (NLP) engineer is a vital role for embracing gen AI in any organization. Gen AI relies heavily on NLP to improve communication and to create chatbots and other AI services that need to communicate effectively with users, no matter the query. This role is responsible for training NLP systems, developing models, running experiments, identifying proper tools and algorithms, and performing regular maintenance and analysis of the models. Candidates typically have experience in big data, coding, model selection and customization, language modeling, language translation, and text summarization using NLP tools. NLP plays a big role in technologies such as text-to-speech (TTS) and speech-to-text (STT), chatbots and virtual assistants, and other gen AI tools that are designed to interact in real-time with users. According to the survey, 15% of respondents say they have already hired NLP engineers to support generative AI, while 27% say they have plans to hire for the role.

Chatbots are one of the earliest and most common uses of gen AI in a business setting; it's very likely that you have interacted with an AI chatbot at some point in the past several years. They help direct customers to the right associates, connect users with important documentation, and can alleviate some of the load on customer service representatives. With gen AI, chatbots are becoming even more sophisticated, with the rise of services such as ChatGPT, Bard, Replika, Cleverbot, and others, which have shown to be powerful tools that are useful to businesses. Chatbot technology is in demand across every industry, and businesses are eager to develop their own chatbot tools to help streamline customer service, appointment scheduling, social media engagement, user support, and even marketing and promotions. According to the survey, 15% of respondents say they have already hired AI chatbot developers to support generative AI, while 27% say they have plans to hire for the role.

Prompt engineers are responsible for ensuring that tools using gen AI, especially text-to-text and text-to-image AI models, can accurately assess user prompts and deliver the correct information. It's a role that requires extensive knowledge of natural language processing, coding, natural language queries, and artificial neural networks. Examples of prompt engineering can be seen in tools such as ChatGPT, which takes user queries and generates a unique response, and AI image tools such as Midjourney, which produces unique art and imagery based on user requests. For businesses interested in leveraging AI, especially with chatbots, automated assistants, and image generators, prompt engineering is a vital role to ensure those tools are effective and useful. According to the survey, 11% of respondents say they have already hired prompt engineers to support generative AI, while 26% say they have plans to hire for the role.

Chief AI officer is a relatively new senior executive position that helps organizations tackle the rapid progress of and demand for AI in the workplace. There are so many considerations when integrating AI into the workplace, especially around security, bias, compliance, and privacy. A chief AI officer is responsible for AI strategy: navigating and overseeing the development and implementation of AI in the business. Other responsibilities include overseeing data management and governance, business unit collaboration, ethics and compliance, risk management, talent acquisition and team building for AI, and monitoring overall performance and analytics reporting on AI tools. According to the survey, 11% of respondents say they have already hired a chief AI officer to support generative AI, while 21% say they have plans to hire for the role.

More companies are turning to AI for content creation, including writing blog posts, product descriptions, and more. But the results aren't always perfect and often need a human eye to edit and rework gen AI output into something that sounds more human and relatable to readers. Companies are looking for experienced writers and editors who can help turn around content quickly, using generative AI, while ensuring that the content is well-written and easy to understand by the intended audience. According to the survey, 10% of respondents say they have already hired AI writers to support generative AI, while 21% say they have plans to hire for the role.

AI art is one of the newer applications of gen AI, with tools such as Midjourney and Stable Diffusion taking off in the past year. These tools can take a prompt, or an image, and either create entirely unique content, or make specific edits to already-existing imagery. There is a lot of potential for organizations looking to create marketing materials, product images, stock images, and other art-related content. Organizations are seeking experienced artists and graphic designers who can take that expertise and knowledge to get the most out of image generation tools. Artists have the right knowledge and expertise to create prompts that will garner better results from generative AI. They know the lingo, terminology, and nuances of various artistic areas, be it film, artwork, or visual graphics, which can help ensure that businesses get the results they want from these services. According to the survey, 7% of respondents say they have already hired AI artists to support generative AI, while 15% say they have plans to hire for the role.

Read this article:

11 most in-demand gen AI jobs companies are hiring for - CIO


Data and Privacy Challenges Most Marketers Are Still Faced With – MarTech Series

From public to private, the customer experience has evolved over these years, and with the demise of third-party cookies, the privacy experience is leading the way ahead for customers as well as for organizations.

In the era of heightened data privacy concerns, building an efficient privacy experience that aligns equally with your business objectives, customer expectations, and legal regulations is challenging. It requires businesses to consider several factors because the entire digital ecosystem is lingering around consent and privacy.

Let us walk you through the emerging factors that pose challenges when dealing with privacy concerns but also offer opportunities for enterprise-scale companies to craft a privacy strategy that holds up even during volatile times.

Over the years, companies have spent a considerable amount of time, money, and other resources to become privacy-compliant. Nevertheless, the ownership of such initiatives keeps being passed from one department to another. What started as a task for legal teams gradually passed to the data and IT teams, and now the baton is in the hands of marketing and sales teams, as they are the primary generators and users of customer data.

Data breaches make up only a small part of possible customer data privacy violations; other functions are also at risk of violating data privacy laws while collecting, managing, processing, storing, using, and disposing of customer data.

All the internal stakeholders in an organization should work together to align and build a privacy-first organization. Data privacy should be an independent function that can drastically improve the chances of alignment. The scope of such a privacy function could be:

Privacy is a complex and ever-evolving issue. While trying to operationalize privacy protocols, organizations must consider multiple stakeholders that use data to accomplish their tasks, for example, UX and product design, data science, marketing, sales, legal, HR, procurement, finance teams, and more.

Cross-functional collaboration works, but it takes an approach beyond just partnering with all the departments. There needs to be operational alignment with existing workflows, technical integration with multiple systems, and legal compliance with a myriad of ever-changing laws.

A robust privacy roadmap must contain a design, plan, and implementation strategy that complies with current laws and is agile enough to evolve with the changing needs of the business.


As third-party cookies come to an end, collecting data from different channels has become all the more challenging. A constant data storm, growing in frequency, volume, and diversity across a spiralling set of data channels, sources, platforms, and devices, means marketers struggle to control and streamline data management workflows across all touchpoints.

Marketing leaders should address the problem of data deluge with innovative strategies: for generation and collection, focus on premium zero- and first-party data, and reconcile customer data from digital and physical sources for a seamless customer experience.

Blockchain, generative AI, AR, and VR have made their way through all industries and sectors. Unfortunately, companies still do not have clarity on how to process the data collected or shared with these technologies following compliance and protocols.

Generating and utilizing data using AI has become a double-edged sword for marketing and sales teams. AI has offered intelligent automation, but it can create look-alike identities that may violate anyone's privacy and security. Marketers must own responsibility for how their models use data and control how it is shared in the public domain.

How do you create a privacy tech stack? An ideal tech stack is built around tools and technologies that support privacy-related initiatives like consent, compliance, and preference management. While the martech and fintech tools in your ecosystem have certain privacy features, privacy-first organizations should further invest in an independent privacy tech stack to facilitate a more advanced range of privacy use cases.

Coping with privacy concerns is becoming more challenging by the day. The field is interconnected and evolving. Modern enterprises must strive to elevate it from a mere compliance task to a strategic initiative, as part of their commitment to positive privacy experiences.

Addressing the challenges mentioned above can go a long way toward creating a sustainable privacy strategy that withstands market volatility and offers the brand a competitive edge over the long term.


Read more:

Data and Privacy Challenges Most Marketers Are Still Faced With - MarTech Series
