
Data Architect Job Description, Skills, and Salary in 2023 | Spiceworks – Spiceworks News and Insights

Data architects are technical professionals in charge of an organization's data systems. They use their IT and design skills to plan, analyze, and implement data solutions for internal use and user-facing applications. Years of study and experience are required to become a data architect; however, it's an in-demand role that can get you a six-figure pay package.

Sample Data Architecture Certification from IBM

Source: Algirdas Javtokas Certifications

Data is among the most valuable enterprise assets in the world today. NewVantage Partners' 11th annual survey of chief data officers (CDOs) and chief data and analytics officers indicates that 82% of organizations intend to increase investments in data modernization in 2023.

Investment in data products, artificial intelligence (AI), and machine learning was cited as a top priority. This suggests that most businesses require experts who can ensure that their data is readily accessible, organized, and constantly updated. That is where the role of a data architect comes in.

A data architect specializes in developing and optimizing database models to hold and easily access company data. These professionals analyze system specifications, build data models, and guarantee the data's integrity and security.

Data architects are typically part of a company's data science team. They oversee data system initiatives and work closely with data analysts, other data architects, and data scientists. They typically report to the data system and data science leaders. If a company has a CDO, a senior data architect could report directly to that individual.

So, why are data architects necessary in the first place, given that companies could hire data scientists, data analysts, and other data experts? The role of data architects is vital. Without these specialists, data security compliance and data flow can be compromised. This is because they provide the framework for data infrastructure protocols. The architect determines how the other members of the data science team and the company as a whole will construct these systems and manage stored data.

Data architects are specialists at collecting and storing enormous amounts of data. They use their understanding of data collection, analysis, and storage to create an enterprise-wide data structure. Being excellent at math, having the capacity to solve complex problems, and possessing programming expertise are all necessary for this position. They can work for government agencies, universities, IT, financial, or even technical services firms.

Data is at the core of all applications and software. Over time, improperly structured data can look like a pile of spaghetti. This can lead to extended disorder in the software development process and long-term issues for developers. This is where the role of a data architect is important.

Contrary to the widespread belief that data architects solely focus on databases, there is much more to their job than simply creating structured query language (SQL) tables. In the software development process, the role of a data architect is to convert business requirements into established guidelines and standards. This requires deciphering obscure details and transforming them into something logically tangible. Usually, this takes the shape of technical specifications: models of data structures, their interrelationships and connections, and their long-term viability relative to business requirements.

When coders create databases ad hoc, the resulting construction could become unstable over time. A data architect would analyze current requirements and design data pipelines that are versatile enough to accommodate future changes, feature additions, or any other need that may arise over time.

While a data architect's and a data engineer's roles are closely related, they are not the same.

Together, data architects and engineers create a company's data framework. The data architect conceptualizes the entire framework while the data engineer implements the plan. Data architects prepare the data and construct the framework that data scientists or analysts utilize. Data engineers assist data architects in developing the search and retrieval architecture.

Regarding the difference between data analysts and data architects, the former operates more on the business side of things. Their daily responsibilities include cost estimates, business case writing, stakeholder consultations, and high-level content, which is closer to marketing and sales functions.

The function of a data architect is more hands-on and positioned closely to the software development divisions. Their technical proposals often get turned into software. A data architect's day would be spent organizing, refactoring, and unraveling data at a macroscopic level. This includes reorganizing or establishing new data structures for an app and resolving how these models are standardized, passed on, and utilized.

However, like analysts, a data architect can navigate between the various business layers and stakeholders. This activity aims to collect specifications and convert them into a suitable format for software development.

To become a data architect, aspirants should follow these steps:

1. Obtain a bachelor's degree in a related discipline

A bachelor's degree in computer engineering, computer science, information technology, or a comparable field is typically necessary for data architects. Master's degrees can prove helpful but are not mandatory. Usually, data architects have many years of experience in application design, system development, and data management. Therefore, you should successfully finish coursework in these areas.

2. Apply for a summer internship while in college

Data architecture isn't generally an entry-level position. As such, you should gain as much experience as possible early on to prepare for this role. Look for apprenticeships in IT that will help you develop application frameworks and network administration skills. Most leading technology companies offer summer internships to seniors in college, which can give you a leg up in your career as a data architect.

3. Get certified

Being certified always helps. The Institute for Certification of Computing Professionals offers the most popular certification, Certified Data Professional (CDP). Before taking a certification test, applicants must possess at least two years of IT work experience and a bachelor's degree.

4. Build on your experience

Those interested in data architecture may require three to five years of work experience and proven project success. Apply for entry-level positions in programming and database management. Keep honing your database development, design, management, modeling, and warehousing capabilities. This is a good time to take on gig projects to add to your portfolio, such as helping a small business migrate its data systems.

5. Apply for a data architect job

After four to five years of experience, you're ready to apply for a data architect position. Look for work in financial markets, educational institutions, healthcare and insurance firms, and other organizations that gather and analyze massive amounts of client data. Software as a service (SaaS) and artificial intelligence companies also employ data architects to power their applications.

See More: What Is Data Science? Definition, Lifecycle, and Applications

Like a regular architect, a data architect designs an organization's data layout blueprint. These designs are then used to create databases as well as other systems. However, this is just a basic explanation, and the roles and responsibilities of a data architect encompass much more.

A data architect bridges business and IT. Consequently, the data framework they create must conform to both their organization's goals and broader industry standards. For instance, C-suite executives would want to enhance the accuracy and availability of data insights to make better decisions. As a result, a data architect will prioritize this. They will also advise, or push back with arguments, if something is technically impossible.

The foundations of an organization's IT infrastructure are data models, metadata structures, and pipelines. Throughout the organization's life cycle, they recommend how data is collected, used, controlled, shared, and restored. In addition, they ensure compliance with regulatory requirements and data security. A data architect establishes and distributes a common data vocabulary alongside more technical artifacts. This helps maintain consistency across the organization, even in non-technical teams.

Data architects must track and sustain system health by performing regular tests, resolving issues, and quickly fixing bugs. In addition, they identify key performance indicators (KPIs) to gauge and track the efficacy of the data infrastructure and its individual components. If KPI targets aren't met, a data architect would need to suggest methods, such as new technologies, that can improve the current framework.

A data architect calculates how information is safeguarded and who can control it. Further, this professional must ensure compliance with data-related rules, regulations, and guidelines. Consider healthcare data that contains confidential details, referred to as protected health information (PHI), and is bound by HIPAA regulations. If an organization works with medical records and paperwork not stored in medical facilities, it is the data architect's job to set up access controls, data encryption, anonymity, and additional security measures.

The General Data Protection Regulation (GDPR) governs the gathering, storing, and processing of personal information in the European Union. This privacy legislation must be taken into consideration when creating any data architecture.

A data architect oversees the tasks of data engineers, comparable to how a building architect supervises a construction crew setting the foundation for a new building. This ensures their databases, apps, or other data systems conform to the framework. Depending on the development of their data unit, they would also need to coordinate with third-party data suppliers to develop architecture-compliant guidelines.

A data governance policy is an annotated document that details the objectives, processes, and company standards for data management. It outlines metrics and best practices to guarantee data quality, confidentiality, and security. This document ensures all parties agree on who is liable for what reasons and how information must be administered at various phases of its lifecycle. While data architects aren't the only individuals who create policies, they significantly contribute to developing data-related regulations and norms.

See More: How to Build Tech and Career Skills for Web3 and Blockchain

Like most technical professionals, data architects need both hard and soft skills to succeed in their roles. The top skill requirements for a data architect include:

A data architect's daily tasks and responsibilities involve direct collaboration with data engineers or data scientists. This professional must, therefore, be intimately familiar with an extensive range of data-related technologies such as SQL/NoSQL databases, ETL/ELT tools, etc. Furthermore, a data architect's experience with popular tools such as Microsoft Power BI and Tableau is a major asset.

A data architect would frequently need to resolve complex issues in data systems, quickly locate the root cause of a problem, and create efficient solutions. In addition, data architects serve as mediators between organizations and data science experts. Their goal is to match technical specifications with business requirements. To succeed in this challenging endeavor, they must demonstrate critical thinking abilities. This helps them identify a company's objectives and apply their technical expertise to reduce expenses and maximize profits.

Data management reveals the value of a company's data, and it is the duty of a data architect to ensure that metadata rules are relevant to all of the company's data. This means that a data architect must have a solid understanding of data lifecycle management (DLM) and how metadata is applied during each phase of DLM.

Even though data architects rarely need to write code, proficiency in various prominent programming languages is necessary. This is because they must adapt data architectures for various applications in different programming languages.

The top skills here include:

Data architects increasingly need to understand AI, machine learning, natural language processing, and pattern recognition. This is because AI is applied to real-world use cases, including data-related issues. An understanding of these tools is also required since they facilitate the use of clustering in text mining and data administration by data architects.

A key skill set for any data architect is data modeling. It entails depicting data flow with structured and architecturally correct diagrams to simplify an elaborate software system. Before creating an app, data models help stakeholders find and address flaws or vulnerabilities.
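
To make the idea concrete, here is a minimal, hypothetical sketch (not taken from the article) of how a simple logical model with entities and relationships might be expressed as normalized tables. It uses Python's built-in sqlite3 module and an invented customer/order domain; the table and column names are illustrative assumptions only.

import sqlite3

# Build an in-memory database with a small normalized schema: each entity gets
# its own table, and relationships are made explicit through foreign keys.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    email       TEXT UNIQUE
);
CREATE TABLE product (
    product_id  INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    unit_price  REAL NOT NULL
);
CREATE TABLE "order" (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    ordered_at  TEXT NOT NULL
);
CREATE TABLE order_line (
    order_id    INTEGER NOT NULL REFERENCES "order"(order_id),
    product_id  INTEGER NOT NULL REFERENCES product(product_id),
    quantity    INTEGER NOT NULL CHECK (quantity > 0),
    PRIMARY KEY (order_id, product_id)
);
""")
print("tables:", [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")])

In practice, a data architect would capture such a model in entity-relationship diagrams and a data dictionary before any tables are created, so the structure can be reviewed against business requirements first.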

See More: 5 Hottest Tech Jobs To Go For in 2023

A data architect is a mid-level or a senior-level role. As a result, this professional commands a high salary of $131,375 annually in the US, according to Glassdoor data last updated on June 21, 2023. On top of that, these professionals can earn additional cash compensation of $23,271 on average from bonuses, commissions, etc.

Data architect salaries can be as high as $200,000 annually or more, depending on the company one joins. For example, on average, Cisco pays its data architects $224,214 annually, while IBM salaries are close to $180,000. Further, this is an in-demand role in the financial services industry, with jobs available at leading banks such as JP Morgan and Bank of America.

Healthcare providers such as HCA Healthcare and Intermountain Health also employ data architects at a six-figure salary. Therefore, it is worth putting in the hard work, getting certified, honing your data architecture skills, and gaining experience since there is much room to grow in this career.

See More: Five Best Career Choices For Certified Data Scientists

Typically, data architects join as data architecture associates and move on to a more senior position until they ascend to the chief data officer (CDO) role. However, your skills as a data architect are highly transferable, and several other jobs can also be explored.

See More: 9 Skills You Need to Become a Freelance Data Scientist in 2021

Data architecture is now an in-demand role that companies such as Salesforce and IBM offer certifications for. Information is now central to nearly every business process, and enterprise applications need to utilize data meaningfully. A data architect can be a valuable asset to an organization and command a high salary by successfully mobilizing and monetizing information.

Did this article answer your queries about a data architect? Tell us on Facebook, X, and LinkedIn. We'd love to hear from you!

Image Source: Shutterstock

Continued here:

Data Architect Job Description, Skills, and Salary in 2023 | Spiceworks - Spiceworks News and Insights

Read More..

Internet Of Things (IOT): Application In Hazardous Locations … – Data Science Central

Introduction to Internet of Things (IOT):

The Internet of Things (IoT) represents the fourth-generation technology that facilitates the connection and transformation of products into smart, intelligent, and communicative entities. IoT has already established its footprint in various business verticals such as medical, health care, automobile, and industrial applications. IoT empowers the collection, analysis, and transmission of information across various networks, encompassing both server and edge devices. This information can then undergo further processing and distribution to multiple interconnected devices through cloud connectivity.

IoT is used in the Oil and Gas industry for two basic reasons: first, low-power design, a fundamental requirement for intrinsically safe products; and second, two-way wireless communication. These two advantages are a boon for products used in Oil and Gas industries. The only challenge is for the product design to meet the hazardous location certification.

Intrinsically safe certification is mandatory for any device placed in hazardous locations. The certification code depends on the type of protection, the zone, and the region where the product will be installed.

In the North American and Canadian markets, the area classification is done in three classes:

Class I: Location where flammable gases and vapors are present.

Class II: Location where combustible dust is present.

Class III: Location where easily ignitable fibers or flyings are present.

The hazardous area is further divided into two divisions, based on the probability that a dangerous fuel-to-air mixture will occur.

Division-1: Location where there is a high probability (by underwriting standards) that an explosive concentration of gas or vapor is present during normal operation of the plant.

Division-2: Location where there is a very low probability that the flammable material is present in an explosive concentration during normal operation of the plant; an explosive concentration is expected only in case of a failure of the plant containment system.

The GROUP designation is another meaningful part of hazardous area nomenclature.

The four gas groups were created so that electrical equipment intended to be used in hazardous (classified) locations could be rated for families of gases and vapors and tested with a designated worst-case gas/air mixture to cover the entire group.

The temperature class definitions are used to designate the maximum operating temperatures on the surface of the equipment, which should not exceed the ignition temperature of the surrounding atmosphere.

Areas classified per NEC Article 505 are divided into three zones based on the probability of an ignitable concentration being present, rather than into two divisions as per NEC Article 501. Areas that would be classified Division 1 are further divided into Zone 0 and Zone 1. A Zone 0 area is more likely to contain an ignitable atmosphere than a Zone 1 area. Division 2 and Zone 2 areas are essentially equivalent.

Zone-0: Presence of ignitable concentration of combustible gases and vapors continuously, or for long periods of time.

Zone-1: Intermittent hazard may be present.

Zone-2: Hazard will be present under abnormal conditions.

IoT-based products can be designed for various applications, a few of which are listed below:

A typical block diagram of the IoT application is shown below:

Figure 1: IOT Block Diagram

An IoT product might use a battery as a power source, or it can be powered externally from either a 9 V ~ 36 V DC supply available in process control applications or a 110/230 V AC input.

The microcontroller can be selected based on the application, power consumption, and peripheral requirements. The microcontroller converts the analog signal to digital and, based on the configuration, can send the data over a wired or wireless link to the remote station. Analog signal conditioning is a pivotal component of the product, bridging the sensor and the microcontroller and conditioning the analog signal for conversion. The Bluetooth interface is suggested in the example due to its wide acceptance and low power consumption. The wireless interface depends on the end application of the product.

The electronics design of an IoT product for a hazardous location is very complex and needs careful selection of the architecture and base components as compared to IoT products developed for commercial applications. If the IoT product is intended for a hazardous location, it must be intrinsically safe and should not cause an explosion under fault conditions. The product architecture should be designed considering the various mechanical and electronic requirements defined in the IEC 60079 standards, the certification requirements, and the functional specifications.

Power Source: This is one of the main elements in an IoT-based product. Battery selection should meet the overall power budget of the product, followed by the battery lifetime. In the case of intrinsic safety, special consideration is required for where the battery is charged. IEC 60079-11 clause 7.4 provides details on the type of battery and its construction. Separation distances between the battery and the electrical interface should be maintained as per Table 5 of IEC 60079-11. If the battery is used in a compartment, sufficient ventilation must be provided to ensure that no dangerous gas accumulation occurs during discharge or inactivity periods. In scenarios where the IoT product operates on DC power sources such as 9~36 V DC (nominal 24 V DC), the selection of power supply barrier protection becomes a critical consideration, particularly when catering to intrinsic safety norms. This necessitates a thorough analysis of the product's prerequisites and the mandatory certifications. Adding to the complexity, IoT devices functioning on 230 V AC demand intrinsically safe calculations and certifications aligned with Um = 250V.
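
As a rough illustration of the power-budget point above, a simple duty-cycle calculation can estimate battery lifetime for a node that wakes periodically to measure and transmit. The numbers below are hypothetical placeholders, not values from the article:

# Minimal sketch: average current from an active/sleep duty cycle, then lifetime.
def battery_life_days(capacity_mah, active_ma, sleep_ua, active_s, period_s):
    duty = active_s / period_s                        # fraction of time awake
    avg_ma = active_ma * duty + (sleep_ua / 1000.0) * (1 - duty)
    return capacity_mah / avg_ma / 24.0               # hours -> days

# Example: a 2400 mAh cell, 20 mA for 2 s of measurement and transmission every
# 60 s, and 10 uA sleep current the rest of the time (all assumed figures).
print(f"{battery_life_days(2400, 20, 10, 2, 60):.0f} days")

Such an estimate only bounds the design; the intrinsic safety analysis of the battery circuit still has to follow IEC 60079-11 as described above.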

Microcontroller: This is the central processing unit of the IoT product. The microcontroller's architecture, power consumption, and clock frequency must be carefully selected for a particular application. The analog-to-digital converter (ADC) of the microcontroller should be selected based on the required accuracy, update rate, and resolution. The microcontroller should have enough sleep modes so that power is optimally utilized for IoT applications, and it should have sufficient memory and peripheral interfaces to meet the product specifications.
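
For example, the required ADC resolution can be sanity-checked by computing the smallest voltage step (least significant bit, LSB) for a given reference voltage and bit depth. The reference voltage and bit depths below are illustrative assumptions only:

# Minimal sketch: LSB size for a few common ADC bit depths.
def lsb_volts(vref, bits):
    return vref / (2 ** bits)

for bits in (10, 12, 16):
    step = lsb_volts(3.3, bits)                       # 3.3 V reference assumed
    print(f"{bits}-bit ADC: LSB = {step * 1e3:.3f} mV")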

Analog Signal Conditioning: The front-end block should meet the intrinsically safe requirements as per the IEC 60079 standards and should also allow the product to withstand EMI/EMC testing. The barrier circuit should provide enough isolation to meet the spark-ignition requirements and the impedance requirement of the transducer. Along with the safety requirements, the designer should ensure that the extracted sensor signal is not degraded by the excessive noise present in the outside environment. All the sensors used for collecting data from the process parameters to the signal conditioning block must be certified for the particular zone.

Wireless Communications: There are various wireless options available for sending data from the IoT product to the remote station, such as 6LoWPAN, ZigBee, Z-Wave, Bluetooth, Wi-Fi, and WirelessHART. Selection of a particular wireless interface requires knowledge of the end application, RF power, antenna, and protocol. Selection of the interface for a particular IoT application should be done keeping these basic things in mind:

In the case of intrinsically safe applications, it's important to note that the use of certified modules does not directly confer suitability for deployment in hazardous locations. The product must undergo fresh testing within an intrinsically safe lab to assess both the countable and non-countable faults, along with spark testing. The RF power transmitted from the devices should be limited as per Table-1x of IEC 60079-0.

When building IoT solutions for hazardous locations, special conditions relating to creepage and clearance, encapsulation, and separation distance must be carefully considered. Also, when batteries and RF signals are used, it's expected that the designer is aware of the applicable standards and the limitations of these standards for such products.

With more than 25 years of experience in designing mission-critical and consumer-grade embedded hardware designs, eInfochips is well poised to make products which are smaller, faster, reliable, efficient, intelligent and economical. We have worked on developing complex embedded control systems for avionics and industrial solutions. At the same time, we have also developed portable and power efficient systems for wearables, medical devices, home automation and surveillance solutions.

eInfochips, as an Arrow company, has a strong ecosystem of manufacturing partners who can help right from electronic prototype design, manufacturing, production, and certification. eInfochips works closely with the contract manufacturers to make sure that the designs are optimized for testing (DFT) and manufacturing (DFM) to reduce design alterations on production transfer. To know more about this contact us.

eInfochips can help product-based companies develop intrinsically safe products and get them certified by labs for various certifications like ATEX/IECEx/CSA.

Kartik Gandhi, currently serving in the capacity of Senior Director of Engineering, possesses a distinguished career spanning over two decades, marked by profound expertise in fields including Business Analysis, Presales, and Embedded Systems. Throughout his professional journey, Mr. Gandhi has demonstrated his proficiency across diverse platforms, notably Qualcomm and NXP, and has contributed his talents to several esteemed product-based organizations.

Dr. Suraj Pardeshi has more than 20 years of experience in Research & Development, Product Design & Development, and testing. He has worked on various IoT-enabled platforms for industrial applications. He has more than 15 publications in various national and international journals. He holds two Indian patents, is a gold medalist, and earned his Ph.D. (Electrical) from M.S. University, Vadodara.

See the article here:

Internet Of Things (IOT): Application In Hazardous Locations ... - Data Science Central

Read More..

UB takes AI chat series to Grand Island Central School District – University at Buffalo

BUFFALO, N.Y. Artificial intelligence has the potential to drastically alter education systems.

It can provide students personalized learning experiences and instantaneous feedback, as well as deliver data to teachers on how to better engage students and improve curriculum.

But there are concerns, ranging from students using AI to write essays to unintentional bias within AI programs and job loss among teachers.

These topics and more will be discussed Thursday at Grand Island Central School District during the second installment of UB | AI Chat Series, Advancing Education with Responsible AI.

News media are invited to the panel discussion, as well as AI demonstrations and a poster session that will follow.

When: 6-7:30 p.m. on Thursday, Oct. 19.

Where: Grand Island Senior High School, 1100 Ransom Road, Grand Island, New York, 14072.

Best time for visuals: From 7-7:30 p.m., UB students will demonstrate AI programs and display posters that describe their work.

Who: The panel discussion will feature:

Suzanne Rosenblith, dean of the UB Graduate School of Education, will moderate the discussion. Brian Graham, superintendent of Grand Island Central School District, will deliver a welcome address. And Venu Govindaraju, SUNY Distinguished Professor and UB vice president for research and economic development, will provide opening remarks.

Background: The two-year AI chat series will feature faculty-led and moderated discussions that explore how UB researchers from a wide variety of academic disciplines are harnessing artificial intelligence for the betterment of society.

It will spotlight significant new projects underway at UB such as the National AI Institute for Exceptional Education, which the National Science Foundation (NSF) funded with $20 million in January, as well as nearly $6 million in NSF-sponsored research to help older adults recognize and combat online scams and disinformation, among other endeavors.

Go here to see the original:

UB takes AI chat series to Grand Island Central School District - University at Buffalo

Read More..

Elke Rundensteiner Receives the Prestigious IEEE Test-of-Time … – WPI News

Elke Rundensteiner, the William Smith Dean's Professor in Computer Science and Founding Head of WPI's Data Science Program, recently received the InfoVis 20-Year Test-of-Time Award from the Institute of Electrical and Electronics Engineers (IEEE) for her pioneering work on data visualization and visual analytics in 2003.

This award honors articles published at previous IEEE VIS (Visualization) conferences, in this case in 2003, that have withstood the test of time by remaining useful 20 years later and that have had significant impact and influence on future advances within and beyond the visualization community, according to the award's organizers. Award selection is based on measures such as the number of citations, the quality and influence of its ideas, and other criteria.

Rundensteiner and her team, which included the late computer science professor Matthew Ward and former PhD students Jing Yang and Wei Peng, are being honored for their work on interactive hierarchical dimension ordering, spacing, and filtering for the exploration of high-dimensional datasets.

I fondly remember my close research collaboration with my colleague Matt Ward over a 17-year time span from 1998 to 2014 that resulted in a series of 7 National Science Foundation (NSF) research grants and one National Security Agency (NSA) grant for our work at the intersection of visualization and data analytics, Rundensteiner said. This allowed us to collaborate with countless joint PhD students, contributing cutting-edge advances to the then-newly emerging area of visual analytics, which led to this inspiring award. Matt was not only a creative thinker at the forefront of his time, he was a supportive colleague and generous friend and remains a true inspiration for me.

According to the award selection committee, the work that Rundensteiner and her team undertook presents a thoughtful, elegant, and powerful approach to managing the complexities of high-dimensional data and reducing clutter in visualizations such as parallel coordinates. The team's research provided insight by clustering the dimensions of high-dimensional data sets into a hierarchical structure (instead of just clustering the data itself), which then can be exploited to make sense of this complex data more efficiently. The paper, "Interactive hierarchical dimension ordering, spacing and filtering for exploration of high dimensional datasets," laid the groundwork for subsequent research and influenced the design of other tools and techniques, the award committee noted.
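
As a rough sketch of the general idea (not the paper's actual algorithm), the dimensions of a dataset can be clustered hierarchically by measuring how similar the columns are to one another, for example via correlation, and the resulting hierarchy can then be used to order or filter dimensions. The synthetic data and parameter choices below are illustrative assumptions only:

import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

# Synthetic data: 500 rows, 8 dimensions, with two pairs of related dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=500)
X[:, 5] = -X[:, 4] + 0.1 * rng.normal(size=500)

# Cluster the dimensions (columns), not the rows: similar columns get small
# distances, and the linkage builds a hierarchy over the dimensions.
corr = np.corrcoef(X, rowvar=False)
dist = 1.0 - np.abs(corr)
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
print(dendrogram(Z, no_plot=True)["ivl"])   # dimension order implied by the hierarchy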

Citations to the original paper have increased over time, showing evidence of lasting value, and the ideas introduced in the work are still relevant today, the award committee wrote. The paper shows us how we can solve a problem through interactive visualization design and presents convincing options for future analysts and designers. These ideas underpin subsequent research on synthesizing new summary dimensions, contribute to contemporary thinking on explainability, and have influenced the design of many other high dimensional visualization tools and techniques.

Read this article:

Elke Rundensteiner Receives the Prestigious IEEE Test-of-Time ... - WPI News

Read More..

The AUC Data Science Initiative partners with Mastercard to further … – PR Newswire

Through a $6.5M grant, Mastercard will support the expansion of data science education and research efforts across the nation's Historically Black Colleges and Universities.

ATLANTA, Oct. 16, 2023 /PRNewswire/ -- The Atlanta University Center (AUC) Data Science Initiative announces the launch of a new partnership with Mastercard at 12:00 PM on October 18th at the AUC Robert W. Woodruff Library. The event will detail the innovative partnership which is supported by a $6.5 million grant from Mastercard to drive the expansion of data science across the nation's Historically Black Colleges and Universities (HBCUs).

"The AUC Data Science Initiative has had great success engaging AUC students and faculty resulting in significant national impacts, primarily increasing the presence and employment of Black data scientists in the workforce," said David Thomas, Ph.D., chair of the Atlanta University Center Consortium Board of Trustees and Morehouse College president. "This partnership with Mastercard will amplify these efforts by providing a resource to all HBCUs creating pathways of innovation in data science."

Through a $6.5M grant, Mastercard will support the expansion of data science education and research efforts at HBCUs.

"As technology advancements in the field of data science impact both our local and global economic foundation, we need to ensure we are enabling the future workforce with pathways in data science knowledge that prioritize equitable access to opportunity for all," said Salah Goss, senior vice president for social impact for the Mastercard Center for Inclusive Growth.

The partnership seeks to develop new or reframed courses created across HBCUs guided by industry needs. New computer science faculty will be hired at an AUC institution and will work across HBCUs to strengthen data-specific curriculum and programming. This partnership will expand successful AUC Data Science Initiative programs.

Talitha Washington, Ph.D., Director of the AUC Data Science Initiative, will lead collaboration with other HBCUs to create new innovations in curricula and research. "There is a growing workforce need for data scientists and other professionals who possess data science skills," said Washington. "Data science impacts everything that we do, and we need all talent at all HBCUs to drive innovations."

The $6.5 million investment builds on and is informed by Mastercard's previous work with HBCUs leveraging Mastercard's unique expertise to create industry-informed programs to increase student placement in the workforce.

Learn more about the AUC Data Science Initiative at https://datascience.aucenter.edu. To attend the Oct. 18 Mastercard partnership event, visit https://tinyurl.com/MastercardDSI.

Media Contact:[emailprotected]

SOURCE Atlanta University Center Data Science Initiative

Go here to see the original:

The AUC Data Science Initiative partners with Mastercard to further ... - PR Newswire

Read More..

Opinion: The Rise of the Data Physicist – American Physical Society

In the search for new physics, a new kind of scientist is bridging the gap between theory and experiment.

By Benjamin Nachman | October 13, 2023

Traditionally, many physicists have divided themselves into two tussling camps: the theorists and the experimentalists. Albert Einstein theorized general relativity, and Arthur Eddington observed it in action as bending starlight; Murray Gell-Mann and George Zweig thought up the idea of quarks, and Henry Kendall, Richard Taylor, Jerome Friedman, and their teams detected them.

In particle physics especially, the divide is stark. Consider the Higgs boson, proposed in 1964 and discovered in 2012. Since then, physicists have sought to scrutinize its properties, but theorists and experimentalists don't share Higgs data directly, and they've spent years arguing over what to share and how to format it. (There's now some consensus, although the going was rough.)

But there's a missing player in this dichotomy. Who, exactly, is facilitating the flow of data between theory and experiment?

Traditionally, the experimentalists filled this role, running the machines and looking at the data. But in high-energy physics and many other subfields, there's too much data for this to be feasible. Researchers can't just eyeball a few events in the accelerator and come to conclusions; at the Large Hadron Collider, for instance, about a billion particle collisions happen per second, which sensors detect, process, and store in vast computing systems. And it's not just quantity. All this data is outrageously complex, made more so by simulation.

In other words, these experiments produce more data than anyone could possibly analyze with traditional tools. And those tools are imperfect anyway, requiring researchers to boil down many complex events into just a handful of attributes, say, the number of photons at a given energy. A lot of science gets left out.

In response to this conundrum, a growing movement in high-energy physics and other subfields, like nuclear physics and astrophysics, seeks to analyze data in its full complexity to let the data speak for itself. Experts in this area are using cutting-edge data science tools to decide which data to keep and which to discard, and to sniff out subtle patterns.

Machine learning, in particular, has allowed scientists to do what they couldn't before. For example, in the hunt for new particles, like those that might comprise dark matter, physicists don't look for single, impossible events. Instead, they look for events that happen more often than they should. This is a much harder task, requiring data-parsing at herculean scales, and machine learning has given physicists an edge.
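
As a toy illustration of what "happen more often than they should" means in practice, a simple counting experiment compares the observed event yield in a signal region with the expected background and converts the excess into an approximate significance. The numbers are invented, and real searches use far more sophisticated statistical treatments:

import math

# Minimal sketch: Gaussian approximation to the significance of a counting excess.
def excess_significance(n_observed, n_background_expected):
    excess = n_observed - n_background_expected
    return excess / math.sqrt(n_background_expected)

# Example: 1,150 events observed where 1,000 background events are expected.
print(f"{excess_significance(1150, 1000):.1f} sigma")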

Nowadays, the experimentalists who manage the control rooms of particle accelerators are seldom the ones developing the tools of machine learning. The former are certainly experts; they run colliders, after all. But in projects of such monumental scale, nobody can do it all, and specialization reigns. After the machines run, the data people step in.

The data people aren't traditional theorists, and they're not traditional experimentalists (though many identify as one or the other). But they're here already, straddling different camps and fields, proving themselves invaluable to physics.

For now, this scrappy group has no clear name. They are data scientists or specialized physicists or statisticians, and they are chronically interdisciplinary. It's high time we recognize this group as distinct, with its own approaches, training regimens, and skills. (It's worth noting, too, that data physics is distinct from computational physics. In computational physics, scientists use computing to cope with resource limitations; in data physics, scientists deal with data randomness, making statistics, what you might call phystatistics, a more vital piece of the equation.)

Naming delivers clout and legitimacy, and it shapes how future physicists are educated and funded. Many fields have fought to earn this recognition, like biological physics, sidelined for decades as an awkward meeting of two unlike sciences and now a full-fledged and vibrant subfield.

It's the data wranglers' turn. I propose that we give these specialists a clear identity: the data physicists. Unlike a traditional experimentalist, a data physicist probably won't have much hands-on experience with instrumentation. They probably won't spend time soldering together detector parts, a typical experience for experimentalists-in-training. And unlike a theorist, they may not have much experience with first-principles physics calculations, outside of coursework.

But the data physicist does have the core skills to understand and interrogate data, complete with a strong foundation in data science, statistics, and machine learning, as well as the computational and theoretical background to relate this data to underlying physical properties.

The data physicists have their work cut out for them, given the enormous amount of data being churned out by experiments in and beyond high-energy physics. Their efforts will, in turn, improve the development of new experimentation methods, which are today often developed from simpler, synthetic datasets that don't map perfectly to the real world.

But this data will go underutilized without a skilled cohort of scientists who can deftly handle it with new tools, like machine learning. In this sense, I'm not merely arguing for name recognition. We need to identify and then train the next generation, to tackle the data we have right now.

How? First, we need the right degrees: Universities should develop programs explicitly for data physicists in graduate school. I expect the data physicist to have a strong physics background and extensive training in statistics, data science, and machine learning. Take my own path as a starting point: I studied computational aspects of particle theory as a master's student and took many courses in statistics as a PhD student, which led to naturally interdisciplinary research between physics and statistics/machine learning and between theorists and experimentalists.

The right education is a start, but the field also needs tenure-track positions and funding. There are promising signs, including new federal funding to help institutions launch Artificial Intelligence Institutes dedicated to advancing this research. But while investments like this fuel interdisciplinary research, they don't support new faculty, not directly, at least. And if you're not at one of the big institutions that receive these funds, you're out of luck.

This is where small-scale funding must step in, including money for individual research groups, rather than for particular experiments. This is easier said than done, because a typical group grant, which a PI uses to fund themselves and a student or postdoc, forces applicants to adhere to the traditional divide: theory or experiment, or hogwash. The same goes for the Department of Energy's prestigious Early Career Award: there is no box to check for interdisciplinary data physics.

As tall an order as this funding is, it could be easier to achieve than a change in attitude. Physicists might well be famous for many of humanity's greatest discoveries, but they're also notorious for their exclusionary, if not outright purist, suspicion of interdisciplinary science. Physics that borrows tools and draws inspiration from other fields (from cells in biological physics, say, or from machine learning in data science) is often dressed down as not real physics. This is wrong, of course, but it's also a bad strategy: A great way to lose brilliant physicists is to scoff at them.

Not all are skeptical; far more, in fact, are excited. Within APS, the Topical Group on Data Science (GDS) is growing rapidly and might soon become a Division on Data Science, a reflection of the field's growing role in physics. My own excitement about working directly with data inspired me to become an experimentalist myself, although I realize now how restrictive that label was.

As available data grows, so does our need for data physicists. Let's start by calling them what they are. But then let's do the hard work: educating, training, and funding this brilliant new generation.

Benjamin Nachman is a Staff Scientist at Berkeley Lab, where he leads the Machine Learning for Fundamental Physics Group, and a Research Affiliate at the UC Berkeley Institute for Data Science. He is also a Secretary of the APS Topical Group on Data Science.

The author wishes to thank the Editor, Taryn MacKinney, for her work on this article, and David Shih for coining the term 'data physicist' at a recent Particle Physics Community Planning Exercise.

Read more from the original source:

Opinion: The Rise of the Data Physicist - American Physical Society

Read More..

KnowBe4’s Vice President of Data Science and Engineering Paras … – PR Web

KnowBe4's vice president of data science and engineering, Paras Nigam, has been recognized with a 2023 3AI Zenith Award in the leaders in the AI and analytics industry category.

Nigam is a seasoned entrepreneur with a keen interest in cybersecurity and data science. He serves as the vice president of data science and engineering at KnowBe4, overseeing the development of cutting-edge AI and data-driven products and aligning them with the organization's overarching AI adoption strategy. Additionally, he leads KnowBe4's SecurityCoach product teams across India. Nigam is dedicated to building a high-caliber AI team, with a particular focus on generative AI, and fostering a culture of innovation within KnowBe4. His was one of 8,400 nominations for the 2023 3AI Zenith Awards.

"I am humbled to be recognized as an inspiring leader within the AI and analytics industry," said Nigam. "I am grateful for my family's unwavering support and the invaluable guidance of my mentors. I also want to express my heartfelt gratitude to KnowBe4's CEO Stu Sjouwerman, all of the research and development leaders and my team who have entrusted me with the opportunity to drive the AI practice at KnowBe4. Let's continue to inspire and innovate together!"

For a full list of the 2023 3AI Zenith Award recipients, visit here. To learn more about KnowBe4 and view open positions, visit here.

About KnowBe4

KnowBe4, the provider of the world's largest security awareness training and simulated phishing platform, is used by more than 65,000 organizations around the globe. Founded by IT and data security specialist Stu Sjouwerman, KnowBe4 helps organizations address the human element of security by raising awareness about ransomware, CEO fraud and other social engineering tactics through a new-school approach to awareness training on security. The late Kevin Mitnick, who was an internationally recognized cybersecurity specialist and KnowBe4's Chief Hacking Officer, helped design the KnowBe4 training based on his well-documented social engineering tactics. Organizations rely on KnowBe4 to mobilize their end users as their last line of defense and trust the KnowBe4 platform to strengthen their security culture and reduce human risk.

Media Contact

Amanda Tarantino, KnowBe4, (727) 748-4221, [emailprotected], https://www.knowbe4.com/

SOURCE KnowBe4

Link:

KnowBe4's Vice President of Data Science and Engineering Paras ... - PR Web

Read More..

Ducera Partners and Growth Science Ventures Announce the Formation of Ducera Growth Ventures – Yahoo Finance

NEW YORK, October 16, 2023--(BUSINESS WIRE)--Ducera Partners LLC ("Ducera"), a leading investment bank, and Growth Science Ventures, a data science focused venture capital firm, today announced the launch of Ducera Growth Ventures ("Ducera Growth").

Ducera Growth Ventures will focus on identifying, analyzing, and managing innovation-based venture capital investments in funds that include strategic corporate clients. The platform will be led by Michael Kramer, Founding Partner and Chief Executive Officer of Ducera, and Thomas Thurston, one of the world's leading data scientists, Founder of Growth Science Ventures, and a Senior Advisor to Ducera.

Unlike traditional venture capital investing, Ducera Growth will combine Ducera's investment banking expertise with Growth Science's proprietary analytics and big data systems to identify unique early-stage growth companies. With an adherence to classic disruption theory and competitive threat regression analyses, the platform will deploy growth capital on behalf of its strategic corporate clients in future market leaders and next-generation companies that demonstrate the potential to produce new customers, reduce costs, and/or create new markets that are consistent with a client's long-term vision.

Michael Kramer, Chief Executive Officer of Ducera Partners, said, "Ducera continues to evolve as a full-service investment bank and strategic advisory firm, and we are focused on providing our clients with access to innovative solutions that we believe have the ability to add significant value to their businesses. Ducera Growth Ventures is an exciting new initiative that I believe will disrupt traditional venture capital investing and redefine how companies access and/or acquire innovative external technologies."

Thomas Thurston, Founder of Growth Science Ventures and Senior Advisor to Ducera, added, "Innovation is now happening at speeds, scales, and levels of complexity that only advanced computing can adequately analyze and make sense of. Yet with the right technological tools and a mastery of how to use them, companies can identify and capture growth from disruptive opportunities more rapidly and consistently than ever before. Utilizing data science and artificial intelligence presents a significant opportunity for companies to enhance their productivity and decision making in support of their organic and inorganic growth strategies. I am thrilled to partner with Michael to form Ducera Growth Ventures and look forward to working with a broad array of Ducera's corporate clients in support of their venture investing interests."


Mr. Thurston and Ducera previously launched the first of a six-part mini-series focused on how Ducera is using data science and artificial intelligence to advise clients in their development of corporate innovation and growth. Learn more by visiting: https://ducerapartners.com/news/thomas-thurston-partner-and-founder-of-growth-science-ventures-has-joined-the-firm-as-a-senior-advisor/

About Ducera Partners

Ducera Partners is a leading investment banking advisory practice with expertise in restructuring, strategic advisory, liability management, capital markets, wealth management, and growth capital. Since its founding in June 2015, Ducera Partners has advised on over $750 billion in transactions across various industries. Ducera Partners has offices in New York, Los Angeles, and Stamford. For more information about Ducera Partners, please visit http://www.ducerapartners.com.

About Ducera Growth Ventures

Ducera Growth Ventures focuses on identifying, analyzing, and managing innovation-based investments across a broad array of market segments and industries. The platform seeks to combine Ducera's investment banking expertise with Growth Science Ventures' proprietary analytics and big data to identify early-stage growth companies that have the potential to be successful over the long term. With an adherence to classic disruption theory and competitive threat regression analyses, Ducera Growth Ventures invests growth capital on behalf of its strategic corporate clients in future market leaders and next-generation companies that have the potential to produce new customers for Ducera's clients, reduce costs, and/or create new markets that are consistent with a client's long-term vision. For more information about Ducera Growth Ventures, please visit http://www.ducerapartners.com.

About Growth Science Ventures

Growth Science Ventures was founded by Thomas Thurston, one of the world's leading data scientists. The firm utilizes data science to identify disruptive startups and counsels clients in connection with the development and launch of new products and services. For nearly 20 years, Growth Science has continued to evolve its proprietary analytics, capabilities, and AI infrastructure through research collaborations with more than 60 of the world's largest, market-leading multinational companies spanning more than 1,000 market segments. For more information about Growth Science Ventures, please visit http://www.gsventures.com.

View source version on businesswire.com: https://www.businesswire.com/news/home/20231016545173/en/

Contacts

Mike Geller, Prosek Partners, mgeller@prosek.com

View post:

Ducera Partners and Growth Science Ventures Announce the Formation of Ducera Growth Ventures - Yahoo Finance

Read More..

Berkeley Space Center at NASA Ames to become innovation hub for … – UC Berkeley

The University of California, Berkeley, is teaming up with NASA's Ames Research Center and developer SKS Partners to create research space for companies interested in collaborating with UC Berkeley and NASA scientists and engineers to generate futuristic innovations in aviation, space exploration and how we live and work in space.

The Berkeley Space Center, announced today (Monday, Oct. 16), aims to accommodate up to 1.4 million square feet of research space on 36 acres of land at NASA Ames' Moffett Field in Mountain View, leased from NASA.

The new buildings, some of which could be ready for move-in as early as 2027, will house not only state-of-the-art research and development laboratories for companies and UC Berkeley researchers, but also classrooms for UC Berkeley students. These students will benefit from immersion in the Silicon Valley start-up culture and proximity to the nation's top aeronautical, space and AI scientists and engineers at Ames.

"We would like to create industry consortia to support research clusters focused around themes that are key to our objectives, in particular aviation of the future, resiliency in extreme environments, space bioprocess engineering, remote sensing and data science and computing," said Alexandre Bayen, a UC Berkeley professor of electrical engineering and computer sciences and associate provost for Moffett Field program development.

"We're hoping to create an ecosystem where Berkeley talent can collaborate with the private sector and co-locate their research and development teams, he added. And since we will be close to NASA talent and technology in the heart of Silicon Valley, we hope to leverage that to form future partnerships."

Ever since Naval Air Station Moffett Field was decommissioned in 1994 and NASA Ames acquired an additional 1,200 acres, NASA has been focused on developing those acres into a world-class research hub and start-up accelerator. Initiated in 2002, NASA Research Park now has some 25 companies on site, including Google's Bay View campus.

"We believe that the research and the capabilities of a major university like Berkeley could be a significant addition to the work being done at Ames," said NASA Ames Director Eugene Tu. "In a more specific way, we would like the potential of having proximity to more students at the undergraduate and graduate level. We would also like the possibility of developing potential partnerships with faculty in the future. The NASA mission is twofold: inspiring the next generation of explorers, and dissemination of our technologies and our research for public benefit. Collaboration between NASA and university researchers fits within that mission."

UC Berkeley hopes eventually to establish housing at Moffett Field to make working at the innovation center easier for students without a 47-mile commute each way. Bayen noted that Carnegie Mellon University already occupies a teaching building at Moffett Field. With the addition of UC Berkeley and the proximity of Stanford University, he expects the intensity of academic activities in the area, both instructional and research, to increase immensely.

"We have major facilities here at Ames the world's largest wind tunnel, NASA's only plasma wind tunnel to test entry systems and thermal protection systems, the agency's supercomputers and the university will likely build facilities here that that we might leverage as well. So, I look at that as a triad of students, faculty and facilities," Tu added. "Then the fourth piece, which is equally important: If the project is approved to move forward, the university will likely bring in partners, will bring in industry, will bring in startups, will bring in incubators that could be relevant to NASA's interest in advancing aeronautics, science and space exploration."

"What they're doing at NASA Ames is transformational, but in order to make it heroic, in order to make it even larger than what is now possible, they have to use the combined resources of the number one public university in the world, private industry and the most innovative place on the planet, which is Silicon Valley," said Darek DeFreece, the projects founder and executive director at UC Berkeley.

Bayen emphasized that many academic institutions are now becoming global universities: New York University has demonstrated the ability to operate independent campuses on different continents (the Middle East and Asia), while Cornell has successfully opened a second campus in Manhattan, five hours from Ithaca. In the same vein, UC Berkeley is innovating by launching this research hub that, over the decades to come, could evolve into a campus as instructional and research and development activities grow.

"This expansion of Berkeley's physical footprint and academic reach represents a fantastic and unprecedented opportunity for our students, faculty and the public we serve," said UC Berkeley Chancellor Carol Christ. "Enabling our world-class research enterprise to explore potential collaborations with NASA and the private sector will speed the translation of discoveries across a wide range of disciplines into the inventions, technologies and services that will advance the greater good. We are thrilled. This is a prime location and a prime time for this public university."

Claire Tomlin, now professor and chair of electrical engineering and computer sciences at UC Berkeley, conducted her first research on automated collision avoidance systems for drones at Moffett Field, and foresees similar opportunities there for UC Berkeley students, especially those enrolled in the College of Engineering's year-old aerospace engineering program.

"With our new aerospace engineering major, it is the right time to get started at Moffett Field. It offers an outdoor testbed for research on how to integrate drones or other unpiloted aerial vehicles, which are being used increasingly for aerial inspection or delivery of medical supplies, into our air traffic control system," she said. "I anticipate great collaborations on topics such as new algorithms in control theory, new methods in AI, new electronics and new materials."

Tomlin envisions research on networks of vertiports to support operations of electric autonomous helicopters or e-VTOLs (electric vertical takeoff and landing vehicles), much like UC Berkeley's pioneering research in the 1990s on self-driving cars; collaborative work on how to grow plants in space or on other planets to produce food, building materials and pharmaceuticals, similar to the ongoing work in UC Berkeley's Center for the Utilization of Biological Engineering in Space (CUBES); and collaborations on artificial intelligence with top AI experts in the Berkeley Artificial Intelligence Research lab (BAIR).

"This is the decade of electric automated aviation, and the Berkeley Space Center should be a pioneer of it, not just by research, but also by experimentation and deployment," Tomlin said. "We're interested in, for example, how one would go about designing networks of vertiports that are economically viable, that are compatible with the urban landscape, that are prone to public acceptance and have an economic reality."

"Advanced air mobility and revolutionizing the use of the airspace and how we use drones and unpiloted vehicles for future air taxis or to fight wildfires or to deliver cargo are other areas of potential collaboration," Tu added.

Hannah Nabavi is one UC Berkeley student eager to see this proposed collaboration with NASA Ames and industry around Silicon Valley, even though she will have graduated by the time it comes to fruition. A senior majoring in engineering physics, she is the leader of a campus club called SpaceForm that is currently tapping NASA Ames scientists for research tips on projects such as how materials are affected by the harsh environment on the moon.

"I think one of the primary advantages to UC Berkeley of having this connection is it allows students to obtain a perspective on what's happening in the real world. What are the real-world problems? What are the goals? How are things getting done?" said Nabavi, who plans to attend graduate school on a path to a career in the commercial space industry. "It also helps students figure out what they want to focus on by providing an early understanding of the research and industrial areas in aerospace."

But beyond the practical benefits, she said, "I think that seeing all of these scientists and engineers tackling issues and questions at the forefront of aerospace can serve as a huge inspiration to students."

In addition, data science and AI/machine learning are rapidly disrupting the aviation and space industry landscape as it evolves toward automation and human-machine interaction and as ever bigger datasets are being produced. The workforce needs retraining in these rapidly evolving fields, and UC Berkeley's College of Computing, Data Science, and Society (CDSS) is well positioned to provide executive and professional education to meet these needs.

"Berkeley Space Center offers the possibility for CDSS students to work on these new challenges, particularly in the fields of aeronautics and astronautics, planetary science and quantum science and technology," said Sandrine Dudoit, associate dean at CDSS, professor of statistics and of public health and a member of the Moffett Field Faculty Steering Committee.

DeFreece noted that there are NASA collaborations already happening on the UC Berkeley campus. Many leverage the mission management and instrument-building skills at the Space Sciences Laboratory, which is responsible for the day-to-day operation of several NASA satellites and is building instruments for spacecraft that NASA will land on the moon or launch to monitor Earth and the sun.

UC Berkeley researchers are already investigating how to print 3D objects in space, how to create materials to sustain astronauts on Mars, how to test for life-based molecules on other planets and moons, and whether squishy robots could operate on other planets. UC Berkeley spin-offs are developing ways to monitor health in space and provide low-cost insertion of satellites into orbit.

"The Berkeley Space Center could be a place where half of the day students are collaborating with center neighbors, and the other half of the day they might be taking classes and seeing their mentors who are supervising class projects on the satellite that is hovering over their heads at that very moment," Bayen said. "Experiences like these just don't exist anywhere else at the present time."

UC Berkeley's Haas School of Business and Berkeley Law are also working on issues surrounding the commercial exploitation of space, including asteroids and other planets, and the laws that should govern business in space.

"Space law and policy are also areas where I think there's some tremendous opportunities to collaborate with the university," Tu said. "What are we going to do when we find resources on the moon, and other countries do as well, and companies want to make money from that?"

In return for its investment and partnership, UC Berkeley will receive a portion of the revenues that the real estate development is projected to generate. While market-based returns are always subject to change, the joint venture conservatively estimates that the research hub will receive revenues more than sufficient to ensure that Berkeley Space Center is self-sustaining, as well as provide new financial support to the core campus, its departments and colleges, and faculty and students.

UC Berkeley also expects significant additional revenue from other, project-related sources, including new research grants, industry participation and partnerships, and the incubation and commercialization of emerging companies born from translational research and technologies created at the site.

SKS Partners, a San Francisco-based investor and developer of commercial real estate properties in the western U.S., will lead the venture. The planning team for the Berkeley Space Center will pursue LEED certification for its buildings, a mark of sustainability, by using solar power, blackwater and stormwater treatment and reuse, and emphasizing non-polluting transportation.

While construction is tentatively scheduled to begin in 2026, subject to environmental approvals, UC Berkeley is already creating connections with Silicon Valley companies on the NASA Ames property, including executive education programs.

"In the next couple of years, we could conceivably have a semester rotation program, where UC Berkeley students spend one semester at Berkeley Space Center, take three classes taught there, do their research there, are temporarily housed there for a semester, just like they would do a semester abroad in Paris," Bayen said. "Ultimately, we hope to build experiences that currently do not exist for students, staff and faculty and create an innovation ecosystem where breakthroughs that require public-private partnerships are enabled."

The development team includes as co-master planners HOK, an architecture, engineering and planning firm, and Field Operations, an interdisciplinary urban design and landscape architecture firm.

See the article here:

Berkeley Space Center at NASA Ames to become innovation hub for ... - UC Berkeley


IEO evaluation of the Bank of England’s use of data to support its … – Bank of England

Foreword from the Chair of Court

Data are critical to the work of a central bank. The Bank of England has long recognised this. Most recently, we defined 'decision-making informed by the best available data, analysis and intelligence' as a timeless enabler of our mission. And, to deliver on that, in 2021 we made 'modernise the Bank's ways of working' a strategic priority for the years 2021 to 2024.

At the same time, the pace of innovation in data and analytics continues to increase, as the recent advances in the capabilities of large language models make clear. Every day, the Bank makes decisions that affect millions of the UK's people and businesses; the Bank's data and analytics capabilities support and power that decision-making process. It is therefore vital that we stand back and consider whether our data capabilities will remain fit for purpose in a rapidly changing world, such that we deliver our timeless enabler and ultimately our mission.

To that end, in October 2022 the Bank's Court of Directors commissioned its Independent Evaluation Office (IEO) to conduct an evaluation of the Bank's use of data to support its policy objectives.

The IEO's report is clear. Overall, and despite many positive steps, looking forward the Bank must ensure that its data capabilities advance to match its ambition, especially as data and analytics best practice advances rapidly. While the Bank is not alone in facing this challenge, addressing it is strategically critical. The Bank will therefore need to set itself up for success by stepping up the pace of change, investing in its technology and people, and overcoming the barriers that will impede progress in a rapidly changing data and analytics landscape.

The IEO's recommendations provide a foundation for doing so. The report makes 10 detailed recommendations, grouped into three broad themes: committing to a clear vision for data and analytics, supported by a comprehensive strategy and effective governance; overcoming the institutional, cultural and technological barriers faced by organisations as they move to new and emerging data-centric ways of working to keep in step with a changing world; and ensuring the Bank's staff have the support and skills they need.

At our 22 September meeting, Court welcomed the Bank's commitment to taking forward these recommendations. We will monitor their implementation as part of the IEO's follow-up framework.

David Roberts, Chair of Court
October 2023

Data have long been at the heart of central banking. But the availability of data and the capabilities to draw insights from them have developed rapidly over the past decade or so. These changes, when coupled with expanding remits and global shocks, have created both opportunities and challenges for central banks. In that context, in October 2022 the Court of Directors (the Bank's board) commissioned its IEO to conduct an evaluation of the Bank's use of data to support its policy objectives.

In response to rapid change, central banks have innovated in multiple dimensions, from institutional structures to technological infrastructure, to new analytical methods and data sources. But, like many organisations, they have faced a range of challenges along the way, whether from legacy systems, established working practices or the practicalities of cloud migration.

The Bank of England has been on a similar journey to peer central banks. It made data a prominent feature of the 2014 One Bank strategy and in the strategic priorities for the next three years that it set out in 2021. It created the role of Chief Data Officer, supported by an expanding team. It has developed a sequence of data strategies (Box A), founded on credible problem statements. It has rolled out new analytical and storage capabilities with associated training for staff. And, supported by its emerging centres of excellence, it has done pioneering analysis with new techniques and data sources, with examples ranging from the use of machine learning to predict individual bank distress from regulatory returns and to plausibility-check returns from regulated firms, through to tracking the macroeconomy at high frequency during the pandemic with unconventional measures of activity.footnote [1] footnote [2]

The Bank's current data and analytics operating model devolves a large amount of responsibility for data and analytics to its business functions. The central Data and Analytics Transformation Directorate, led by the Chief Data Officer, is responsible for enabling those areas in their delivery of the central data strategy, which is half of one of the Bank's seven strategic priorities for 2021-24. This model is currently in transition, partly in response to our evaluation and partly as a result of leadership change, with a new Chief Data Officer having started in role in April 2023.

Our evaluation took the overarching research question 'Is decision-making to support the Bank's policy objectives informed by the best available data, analysis and intelligence, and can it expect to be so in the future?'. We adapted this from the Bank's timeless enabler on data, which was set out alongside the 2021-24 strategic priorities. We broke that down into four detailed areas of investigation, covering broad questions of strategy and governance and three detailed areas of data management: acquisition and preparation; storage and access; and analysis and dissemination. Our evidence gathering involved: conducting around 175 interviews, across the Bank and a range of other organisations, including peer central banks and regulators; a staff survey, complemented by targeted focus groups; and consulting an advisory group of senior Bank staff and, separately, two external expert advisors.

With best practice in data and analytics advancing rapidly, the Bank will need to step up the pace of change and associated investment if it is to take advantage of new opportunities. While progress has been made using a devolved operating model, data capabilities are inconsistent across the organisation and in some cases the current approach is sub-optimal. To progress further, management will need to systematically address a range of foundational technology and process issues and build the capabilities necessary to enable the Bank to take advantage of new data tools so it can be in the best position to deliver on the Bank's mission. We make 10 detailed recommendations, which we grouped into three broad themes: committing to a clear vision, supported by a comprehensive strategy and effective governance; breaking down institutional, cultural and technological barriers to keep in step with a changing world; and ensuring staff have the support and skills they need.

Theme 1: Agree a clear vision for data and analytics, supported by a comprehensive strategy and effective governance.

1. Agree and champion a vision for data use, matching funding to ambition.

2. Collaboratively design deliverable Bank-wide and local business area data strategies to meet measurable business outcomes.

3. Ensure governance structures can support the agreement, co-ordination and monitoring of data transformation, with clear accountability for delivery.

Theme 2: Break down institutional, cultural and technological barriers to keep the Bank's data and analytical practices in step with a changing world.

4. Improve day-to-day collaboration across the business on data and analytics.

5. Agree the approach to sharing data and analytics inside and outside the Bank.

6. Narrow the gap with modern data and analytics practices, with the most impactful initial step being a phased migration to cloud.

7. Systematically monitor and experiment with new approaches and technology for data and analytics.

Theme 3: Ensure staff have the support and skills they need to work effectively with data.

8. Embed common standards to make data and analysis easily discoverable and repeatable.

9. Provide staff with the easily accessible support and guidance they need across the data lifecycle.

10. Develop a comprehensive data skills strategy encompassing hiring, training, retention and role mix.

In addition, the Bank is currently taking a range of actions to strengthen key foundational enablers. Successful execution of these wider initiatives will be crucial to fully delivering the Bank's data ambitions: i) improvements to the approach to setting organisational strategy, prioritisation and budgeting; ii) tackling technology obsolescence; and iii) strengthening of the Bank's central services and change management capabilities. The appointment of an Executive Director to lead a new Change and Planning function, the delivery of the Central Services 2025 programme, and future iterations of the Bank's wider talent strategy will contribute across these areas of focus.footnote [3]

The evaluation was conducted by a dedicated project team reporting directly to the Chair of Court.footnote [4] The IEO team benefited from feedback and challenge from a Bank-wide senior-level advisory group (including Bank Governors). David Craig (founder and former CEO, Refinitiv, former Head of Data and Analytics, LSEG, and Executive Fellow, London Business School) and Kanishka Bhattacharya (Expert Partner, Bain & Company, and Adjunct Associate Professor, London Business School) provided support and independent challenge to the team and reviewed and endorsed the findings in this report.

This report was approved for publication by the Chair of Court at the September 2023 Court meeting.

Data have long sat at the heart of central banking, including at the Bank of England. At least since the heyday of the gold standard, monetary policy makers have drawn on data to determine the stance of monetary policy. The Bank's Quarterly Bulletin, the Bank's flagship publication from its introduction in 1960 through to the 1993 launch of the Inflation Report, offered a detailed commentary on economic and financial developments, supported by an extensive range of statistics. These days, the Monetary Policy Report and Financial Stability Report continue to provide detailed coverage of the data and analytics that have gone into policy formulation. Supervisors now work with a broad range of regulatory returns, with the volume of supervisory data available having increased materially since the global financial crisis.

Nonetheless, the world of data has been changing rapidly and central banks have had to adapt to at least three continuing developments. Global events have presented new policy challenges, most notably the global financial crisis and Covid-19 pandemic. Central banks have often broadened their focus, with some taking on additional macroprudential, microprudential and supervisory roles. More broadly, technological change has led to both vastly more data being available to central banks and the development of powerful new tools to interpret them.

Central banks have had to innovate in response to these developments, although this has not always been easy. They have explored institutional change, including appointing chief data officers, adopting data strategies and experimenting with a range of structures for data governance and management. Many have migrated to cloud. In 2020, 80% of respondents to a BIS survey said that they were using big data sources, up from 30% in 2015.footnote [5] But, at the same time, they have struggled with legacy systems and migrating to new technology, including the unfamiliar IT arrangements that this can involve. New analytics and data practices have needed to fit into existing policy frameworks, including generating reliable results that can be interpreted by policymakers.

The Bank of England has been on the same journey as its major peers. It acquired new responsibilities following the global financial crisis, including: a statutory committee responsible for macroprudential policy; and microprudential policy for, and supervision of, banks, insurers and financial market infrastructures. It has had to adapt to major economic events, including the global financial crisis, the UK's exit from the European Union, the Covid-19 pandemic and, most recently, Russia's invasion of Ukraine. And over the past decade or so the Bank has acquired large amounts of new data (including microdata on firms and households, regulatory data on banks and insurers and asset- or even transaction-level data on key financial products) and unconventional data from operational, administrative and digital sources.

The Bank made data a prominent feature of its 2014 One Bank strategy and the strategic priorities for the next three years that it set out in 2021, with both strategies supported by credible assessments of the Bank's analytics and data capabilities. In 2014 it created the role of Chief Data Officer, initially at a relatively junior level, before it was made an Executive Director role in 2019. Its data transformation efforts have been supported by an expanding team; from a small Division reporting to the Chief Information Officer it has grown to a full Directorate, bringing together data transformation with Divisions that were already part of the Monetary Policy area, covering advanced analytics and the collation and publication of statistical and regulatory data.

Over the past decade the Bank has taken significant steps to enhance data and analytics. It launched a rationalised and improved suite of analytical tools, which allowed it to focus support and training resources more effectively. It has expanded the range of storage options available, most notably introducing the Data and Analytics Platform to host large data sets. In 2014 it created an Advanced Analytics Division, to act as a centre of excellence. Together, these steps have facilitated increased uptake of programmatic analytical tools and have allowed further centres of excellence to emerge across the organisation.

As a result, the Bank has been able to conduct innovative data and analytics work. Notable examples include: embedding machine learning into the plausibility checking of returns from regulated firms; a predictive analysis tool to support selection of regulated firms for the Prudential Regulation Authority's (PRA's) Watchlist; tools to analyse insights from firms' management information; and, during the pandemic, the rapid adoption of high-frequency indicators from unconventional sources to track economic developments.footnote [6]
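
To make the plausibility-checking idea concrete, the sketch below shows one common way such a check can be built: an unsupervised anomaly detector scores each submitted return and flags the most unusual ones for human review. It is purely illustrative and not a description of the Bank's implementation; the column names, data source and contamination rate are assumptions made for the example.

```python
# Illustrative sketch only: anomaly-based plausibility checking of regulatory
# returns. Column names, file paths and thresholds are assumptions, not the
# Bank's actual configuration.
import pandas as pd
from sklearn.ensemble import IsolationForest


def flag_implausible_returns(returns: pd.DataFrame,
                             features: list[str],
                             contamination: float = 0.01) -> pd.DataFrame:
    """Score each return and flag the most anomalous submissions for review."""
    model = IsolationForest(contamination=contamination, random_state=0)
    model.fit(returns[features])
    scored = returns.copy()
    scored["anomaly_score"] = model.decision_function(returns[features])
    scored["flag_for_review"] = model.predict(returns[features]) == -1
    return scored


# Hypothetical usage with a stylised balance-sheet return:
# returns = pd.read_csv("quarterly_returns.csv")
# flagged = flag_implausible_returns(returns, ["total_assets", "tier1_ratio", "leverage_ratio"])
# print(flagged[flagged["flag_for_review"]])
```

In a workflow like this, a supervisor rather than the model makes the final call; the detector simply prioritises which returns merit a closer look.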

The Bank's current approach to delivering its mission of promoting the good of the people of the United Kingdom is summarised in its strategic priorities, which support cross-Bank prioritisation. Data appear twice within the current strategic plan, both as a timeless enabler of the Bank's mission (decision-making informed by the best available data, analysis and intelligence) and as Strategic Priority 7, 'modernise the Bank's ways of working'. Strategic Priority 7 has two sponsors at Executive Director level, the Chief Data Officer and the Chief Information Officer, as the majority of actions fall to the Data and Analytics Transformation (DAT) and Technology Directorates. The Bank's data strategy is an integral part of Strategic Priority 7 and is led by the Chief Data Officer and DAT.

Strategic priorities 2021-24

The 2021 data strategy had three broad strands. The first focused on enabling, consisting primarily of expanding and refining existing offerings around data collection, storage, support and training. The second consisted of targeted improvements in business outcomes, such as the work in the PRA on RegTech. The third was the Transforming Data Collection programme, run jointly with the Financial Conduct Authority (FCA), which aimed to ensure regulators get the data they need to fulfil their mission, at the lowest possible cost to industry.footnote [7]

DAT, under the ultimate oversight of the Deputy Governor for Monetary Policy and (since 2022) the Chief Operating Officer, is a central function with four roles, all relevant to delivering the data strategy:

DSID plays a key role in several components of the Bank's data strategy. They provide key support services, including: management of the Data and Analytics Platform; a Data and Analytics email-based helpdesk; provision of training and guidance to support analytical and data management best practice; and ownership of the Data Catalogue, which is intended to support data governance and act as a repository of key data sources in use across the Bank. DSID also partners with business areas to help them to deliver their priority outcomes through better use of data and analytics. That is supported by the Data and Analytics Business Partners team, which provides a formal link between business areas and experts in the data function, and the Analytics Enablement Hub (AEH), which works with business areas on targeted projects. AEH also provides training and guidance to support the use of a range of modern analytical tools (eg R, Python), strategically selected for the Bank's use cases. DSID and the PRA are also working with the FCA and industry to transform data collection from the UK financial sector and have recently established a cross-Bank taskforce to more effectively combine expertise.footnote [8]

More broadly, many of DAT's functions are intended to enable business areas to deliver the central data strategy in line with business-area priorities. The management of data assets and many data transformation initiatives sit with individual business areas. For example, after DAT processes and plausibility-checks collections, regulatory data on banks are held and managed by the PRA, while Monetary Analysis manages a database of macroeconomic time series. Business areas across the Bank have begun to experiment with different approaches to strengthening data science skills, whether through training or hiring. Many areas have developed their own centres of analytical excellence specialising in data science techniques. For example, the PRA RegTech team has developed natural language processing tools and acts as a co-ordinating hub for other data specialist teams working in PRA supervision and policymaking areas. Similarly, the Financial Markets Infrastructure Directorate's Data team has developed expertise in techniques required to analyse large transaction-level markets data sets. With extensive autonomy, some areas have well-developed data strategies focused on business area priorities (in addition to the transforming data collection agenda, the PRA's data strategy covers how regulatory data are accessed, the development of dashboards for supervisors, as well as coaching and digital skills), while others have more minimal arrangements. This dispersion of responsibilities was mirrored in oversight, which at the outset of this evaluation was spread across a large number of data or (for investment) programme boards, as well as strategic committees like the Executive Policy Co-ordination Committee, the Executive Operational Co-ordination Committee and the Operations and Investment Committee.

Overall, the current operating model for data and analytics is in transition, partly in response to our evaluation and partly as a result of leadership change.footnote [9] More broadly, important enablers of the Bank's data activities are undergoing change: a new plan is being drawn up to tackle technology obsolescence, alongside a cloud migration strategy; central services are being upgraded through the Central Services 2025 (CS2025) programme; and the Bank's change capabilities are being strengthened by the appointment of an Executive Director for Change and Planning. Successful execution of these wider initiatives will be crucial to fully delivering the Bank's data ambitions.

We took the overarching research question 'Is decision-making to support the Bank's policy objectives informed by the best available data, analysis and intelligence, and can it expect to be so in the future?'. We adapted this from the Bank's timeless enabler on data, which was set out alongside the 2021-24 strategic priorities (Figure 1). We broke that down into four evaluation criteria, each underpinned by a set of benchmarks:

We conducted an extensive evidence-gathering exercise, drawing on three main sources:

Launched in 2014, the One Bank strategy was intended as a transformative strategic plan to help the Bank, which had recently expanded to accommodate the newly created Financial Policy Committee and Prudential Regulation Authority, operate successfully as a single organisation.footnote [10] One of the plan's four pillars was dedicated to analytic excellence, including making creative use of the best analytical tools and data sources to tackle the most challenging and relevant issues. The strategy saw the creation of the Bank's first Chief Data Officer and the Advanced Analytics Division. Specific actions included: external partnering to explore the use of big data and advanced inductive analytics capabilities; and the creation of a One Bank data architecture. The One Bank data architecture aimed to: integrate all the Bank's data under the common oversight of the Chief Data Officer; increase the efficiency of data collection and management; share data more widely inside the Bank; and make greater use of third-party providers with economies of scale.

The National Audit Office (NAO) evaluated progress on the One Bank strategy in 2017.footnote [11] It found that of the 15 initiatives planned as part of the strategy, only one was substantively incomplete, the One Bank data architecture. The NAO found that: This turned out to be much more complex than expected, with the Bank identifying that the new IT would need to support around 182 data systems and 2,700 data sets.

Vision 2020, launched in 2017, was the successor strategic plan to the One Bank strategy. Formally, Vision 2020 had a reduced focus on data relative to its predecessor, with data touched on only in the context of data visualisation, under 'Creative, targeted content', and data sharing, under 'Unlocking potential'. However, in parallel to Vision 2020 in 2017, a data programme was developed as a successor to the One Bank data architecture. Recognising that the previous initiative had been very ambitious relative to the available expertise, budget and planned timescales, the scope agreed in 2017 was narrower and focused on providing self-service tools. Despite the reduced ambition, the 2014 strapline for the programme was preserved: 'an integrated data infrastructure across the Bank, to enable information sharing'. When the programme closed in mid-2020 it was considered to have substantively delivered on the narrower 2017 objectives, though delivery of the Data and Analytics Platform was separated out and was not fully rolled out until 2022.

In 2018, the Bank commissioned Huw van Steenis to write a report on the future of finance.footnote [12] His 2019 report recommended that the PRA embrace digital regulation, including developing a long-term strategy for data and regulatory technology. In its response, the Bank committed to develop a world-class regtech and data strategy.footnote [13] Specific commitments included: consulting supervised firms on how to transform the hosting and use of regulatory data; enhancing the Bank's analytics, including peer analysis, machine learning and artificial intelligence; proofs of concept around enhanced analytics and process automation; and making the PRA's rulebook machine readable. As part of delivering its response, the Bank elevated the role of Chief Data Officer to Executive Director level and created the Data and Analytics Transformation Directorate, bringing together a range of existing Divisions.

Launched in 2021, the Bank's strategic priorities for 2021-24 include Strategic Priority 7, 'modernise the Bank's ways of working'.footnote [14] This has two elements, one focused on data (described in more detail under current operating model) and one on strengthening the Bank's technology. The data elements of the Bank's response to the Future of Finance report were incorporated into Strategic Priority 7.

The Bank has consistently set itself a high level of ambition on data and analytics over the past decade, and the effective use of data appears prominently in its current strategic plan. The Bank's ambitions have been grounded in convincing assessments of the Bank's data and analytics capabilities. However, progress has been inconsistent across the organisation, with variation in the degree to which ambition has been matched by resources, plans and management oversight. This highlights the challenges inherent in a more devolved data operating model, especially during times of significant transformation in the external data and analytics landscape. Prompted by emerging findings from this evaluation and the arrival of a new Chief Data Officer in April, this area has seen the most change since our evaluation began, which offers a strong foundation for addressing our Theme 1 recommendations around vision, resources, strategy and governance.

Ensuring consensus around the Bank's vision for data will be vital, because delivering it will require funding that consistently matches ambition and more concerted championing. The broad support that we heard from the Bank's Executive for the current level of ambition suggests the Bank's existing timeless enabler could continue to serve as a benchmark for the Bank's ambitions. But the renewed strategic conversation currently occurring is needed to restate the case and galvanise support. A renewed vision will need to be met with plans and budgets that consistently match the level of ambition, even if the Bank faces competing priorities. The Bank will also need to review how it tracks spending on data and analytics and make sure funding remains consistent with plans. The Bank's senior leaders will need to build on the efforts of the new Chief Data Officer to ensure the centrality of effective data use to the Bank's mission is understood both inside and outside the Bank. The PRA has piloted a data skills coaching programme for senior leaders which, if extended Bank-wide, would help support championing efforts.

With an agreed vision and commitment to funding (Recommendation 1), the Bank's central functions and business areas will need to work together to refresh the Bank's data strategy. This collaborative approach will require common understanding of the art of the possible and of the Bank-wide and individual business areas' target operating models, encompassing data, technology and skills. These inputs would support the development of enterprise, data and technology architectures describing their current and target states.footnote [15] As the Bank-wide strategy is refreshed, business areas will need to develop local strategies that are embedded within that and champion them. Nor can the data strategy stand alone: it will need to be consistent with, and perhaps developed alongside, supporting strategies for technology and people. In order to build trust, it is important that stakeholders can see measurable progress. That would be aided by: quantified and planned expected benefits ex-ante; mechanisms to track progress; and evaluated outcomes ex-post.

The newly established Data and Analytics Board fills an executive-level gap identified in the early stages of the evaluation and will need to ensure its membership, terms of reference and supporting structures allow the refreshed data strategy (Recommendation 2) to be developed, co-ordinated and monitored during implementation.footnote [16] The Board is a promising development; as it becomes established, its co-chairs (the Chief Data Officer and Chief Information Officer) and membership will want to ensure that it: remains a forum that effectively convenes central functions and business areas; forges consensus on Bank-wide data and analytics priorities, including the details of the strategy, such as benefits, deliverables and timescales; ensures all the Bank's data and analytics transformation activities are consistent with the organisational strategy; keeps abreast of the latest technological developments (Recommendation 7); monitors progress; and holds its members to account for delivery of the strategy and key dependencies. As part of this, it will need to review what supporting structures it requires, including: subcommittees (for example, to ensure data and technology initiatives are consistent with agreed strategies and architectures); monitoring tools (for example, executive scorecards); and accountability devices (such as published documents and member objectives).footnote [17] Our external advisors recommended that governance structures should evolve over time as data maturity increases, suggesting the Bank requires stronger central direction at the early stages of the journey before it can move to a more decentralised approach.

The data analysis produced by Bank staff is highly regarded. Policy committees praise the staff's outputs and its centres of excellence conduct innovative analysis with data. The Bank also continues to rank among the most transparent of central banks. But, not unlike other specialist organisations, the Bank has wrestled in recent years with a range of barriers to making the most of the large amounts of data it acquires. Notwithstanding progress made in recent years, difficulties remain. As with other large, specialist organisations, the Bank has found it difficult to combine different types of expertise and to collaborate effectively across business areas, and between business areas and central functions, with local areas preferring to develop their own solutions. A perhaps understandable risk aversion has contributed to: a relatively constrained approach to data sharing, both internally and externally, beyond that necessary due to statutory prohibitions; and a nervousness around adopting new technologies, notably cloud solutions, or working practices. Our recommendations focus on breaking down these institutional, cultural and technical barriers, through: strengthening collaboration, particularly through the use of partner roles linking central functions and local areas; articulating principles to guide greater sharing of data and analytics, internally and externally; strengthening the technological foundations of the Bank's data and analytics, particularly by migrating to cloud; and finding ways to draw more extensively on external technical expertise. Continued development of a unified data and technology architecture, supported by improved governance structures, will also be crucial.

The Bank should consider structures that could strengthen collaboration and more effectively combine expertise. This applies across business areas, between business areas and central functions, and between different professions (particularly data, change management and technology specialists). Its business partnerships programme, if fully implemented, offers a promising start, focused on building collaboration between business areas and the data function. This could play a crucial role in helping: central functions understand desired business outcomes; business areas understand what is possible; and the Bank in ensuring that data projects can be incorporated within Bank-wide data and technology architectures. The Bank will need to monitor progress on business partnerships, including a balanced assessment of how business areas have engaged with it, perhaps at the Data and Analytics Board. Further action may be needed to reinforce cultural change. The Bank should review lessons from the partnerships and the newly established cross-Bank AI and data collection taskforces when considering the most effective ways to bring together people with common interests and expertise. More broadly, we came across interesting models at peer central banks, including those focused on combining data and technology experts with business area specialists to produce repeatable products. There are also a range of models (eg guilds, tribes) and delivery frameworks (eg the Data Management Capability Assessment Model) established in the data management profession for combining expertise.footnote [18] This will have wider implications, since CS2025 also proposes partnership models for the Bank's Technology and People Directorates.

Greater openness around data and analytics, internally and externally, would foster greater scrutiny and challenge, helping the Bank gain additional insights and keep up with a rapidly evolving world. The Bank produces large amounts of extremely valuable data and analysis, but much of it is only easily available to subsets of Bank staff. The Bank could adopt a presumption of sharing, but would need to further consider the implications and appropriate guardrails. The Bank has important legal obligations and constraints when it comes to sharing data but, within those, it should articulate and highlight a set of principles for disseminating data and analytics, internally and externally. Guiding principles would allow the Bank to consider how to safely open up wider access to data and analysis and might facilitate external collaboration. This is consistent with the IEO's Research Evaluation, which recommended that the Bank needed to support access to data for external co-authors to broaden the expertise and perspectives that the Bank can draw on. We note that some other organisations that face similar binding restrictions have found means to facilitate access to internal data, for example the ONS Secure Research Service.footnote [19] Moving to cloud (Recommendation 6) could help overcome technical barriers to sharing.

The Bank will need to develop an achievable plan to modernise most of its data and analytics practices, to avoid falling further behind a rapidly evolving frontier. A reliance on inefficient manual processes generates risks and staff frustration. A move to cloud would be the most powerful technological step the Bank could take to close this gap, supported by common standards (Recommendation 8) and upskilling (Recommendation 10). While a move to cloud is no panacea and brings new challenges, we have seen that peers and other organisations have been able to unlock capabilities through the use of modern tools and provide increased computing capacity. Cloud offers a range of enhanced capabilities that could improve data collection, discoverability (for example, automation of data cataloguing) and analysis, and allow some embedding of modern data management practices (Recommendation 8). This could include being more open to buying in off-the-shelf tools than is currently the case. Peers' experience suggests cloud migration might also help with other issues such as efficient use of licences, facilitating access for external experts (Recommendation 5) and obsolescence, with cloud providers keeping tools up to date. As a late adopter of cloud, the Bank can learn lessons from other organisations' experience of making the transition.
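
As an illustration of what automated data cataloguing can look like in practice, the sketch below scans tabular files and emits minimal catalogue entries (schema, row counts, ownership). It is a hedged example built on assumed folder paths, fields and an assumed output format; it does not describe the Bank's Data Catalogue or any particular cloud service.

```python
# Illustrative sketch only: automated generation of data catalogue entries.
# The folder layout, fields and output format are assumptions for the example.
import json
from pathlib import Path

import pandas as pd


def catalogue_dataset(path: Path, owner: str = "unknown") -> dict:
    """Build a minimal catalogue entry (schema, row count, owner) for one CSV file."""
    df = pd.read_csv(path)
    return {
        "name": path.stem,
        "location": str(path),
        "owner": owner,
        "rows": len(df),
        "columns": [{"name": col, "dtype": str(dtype)} for col, dtype in df.dtypes.items()],
    }


def build_catalogue(folder: str) -> list[dict]:
    """Catalogue every CSV in a folder; on a cloud platform this would typically run on ingest."""
    return [catalogue_dataset(path) for path in sorted(Path(folder).glob("*.csv"))]


# Hypothetical usage:
# print(json.dumps(build_catalogue("data/regulatory_returns"), indent=2))
```

The value of this kind of automation is that discoverability stops depending on manual registration: every dataset that lands on the platform gets an entry that staff can search.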

The Bank will need to consider the role of its emerging centres of excellence in raising data maturity; centres of excellence like Advanced Analytics and local business area data science hubs have brought deep expertise into the Bank and, through collaboration, have helped develop others' skills. The Bank should ask whether there is more it can learn from others' experience of innovation hubs and how their role should evolve as maturity rises. Further mechanisms might include an external advisory board made up of experienced experts to provide challenge to the Data and Analytics Board; the Bank has used such arrangements effectively in a number of areas and the Monetary Authority of Singapore has used it for data and technology.footnote [20] footnote [21] Coaching senior staff on data could be expanded Bank-wide (Recommendation 1) and draw on lessons from elsewhere, for example reverse-mentoring, where talented analysts and data specialists coach senior staff on the art of the possible.

The Bank has long understood the importance of enabling individuals to work effectively with data. The 2014 data strategy focused on enabling business areas, and encouraging individuals to get and make better use of data within their roles is a key outcome in the Bank's 2021 data strategy, with DAT identifying data, tools, platforms, training and other support services as important enablers to facilitate that outcome. While the Bank has introduced data specialist roles and built up its training offering, developing data skills and embedding new, more modern approaches takes time. Recent technological developments have increased analytical capabilities and offer the potential to automate more processes, freeing staff time to focus more on higher value-added analysis. This may have implications for the skills the Bank wants to develop and how staff best work with each other. We have identified three recommendations to help the Bank make progress by ensuring staff have the support and skills they need: embed common standards to make data and analysis easily discoverable and repeatable; provide staff with accessible support and guidance across the data lifecycle; and develop a comprehensive data skills strategy encompassing hiring, training, retention and skills mix.

The Bank's data and analytics guidance needs to be comprehensive, easier to find and better incentivised. The Bank has recently refreshed its guidance on data management, which will be launched this year. It would benefit from doing the same for analytical common standards, not least to ensure that greater use of programmatic analytical tools is suitably resilient. When these are established, the Bank should consider how to raise engagement and adherence. Training is one option, and is standard at induction in professional services firms, consultancies and banks. Other options include greater leadership championing, recognition of good practice in performance reviews, mandatory training, audits and individuals attesting to compliance with the standards through the Our Code process.footnote [22] In the past, the Bank has used the performance management process to influence behaviour, or enforced compliance top-down. Best practice could be built into future data and analytics platforms as part of cloud migration (Recommendation 6), which could include automation of data cataloguing, which our advisors indicated was widely adopted in data-mature private sector firms.

The Bank could go further in joining up existing data and analytics support. This would materially enhance the support accessible to staff, who currently struggle to navigate the fragmented range of services available. A single front door, clearly visible on the desktop or intranet front page and spanning the data lifecycle, could effectively triage data and analytics requests, directing them to existing resources or escalating to deeper support, as appropriate. The Bank already operates elements of this approach in interactions between hubs, helpdesks and users; joining it up would significantly ease the experience of staff. The Bank should define how this service would relate to that offered by business partners and how those business partners are resourced to meet any increased demand.

The Bank should develop a comprehensive data skills strategy, embedded within wider initiatives around talent and skills. Such a strategy will need to consider the career proposition for data specialists, including both data scientists and the technology specialists that support data work. It will need to articulate where skills should be developed inside the Bank, across all levels of seniority and supported by a training offer, or hired into the organisation. It will need to be informed by a clearly defined operating model (Recommendation 2) that articulates the role of data specialists relative to other skillsets in the organisation, including analysts and technology specialists. This should be linked to the People Directorate's wider talent strategy. The Bank can draw lessons from the mix of approaches business areas have adopted when experimenting with building area-wide data skills.

Read more here:

IEO evaluation of the Bank of England's use of data to support its ... - Bank of England
