High-Performance Computing as a Service Market interpreted by a new report – WhaTech Technology and Markets News

Asia-Pacific region held the highest share in 2022.

High Performance Computing Market, Riding the Cloud Wave

Driving Forces and Challenges:

Technological advancements, including 3D imaging, AI, and IoT, coupled with the influx of data for analysis, contribute significantly to the growth of the HPCaaS market. The adoption of these services for real-time data processing in analyzing stock trends and streaming live sports events is on the rise. However, the report highlights concerns about the safety and legitimate use of data as a major challenge for global market growth.

http://www.maximizemarketresearch.com/market-ket/63767/

High Performance Computing as a Service Market Overview:

A prominent global market research organization has recently published an all-encompassing research report focusing on the High Performance Computing as a Service market. The report presents a wealth of data and visually engaging representations, facilitating an in-depth analysis of both regional and global markets. It provides insights into the market's objectives while offering detailed information about leading competitors, their market value, current trends, strategies, targets, and product portfolios. Moreover, the report highlights recent market growth and includes historical data, providing valuable information to stakeholders and decision-makers.

High Performance Computing as a Service Market Scope:

The research report analyzes trending competitors, their growth patterns, and the dynamic nature of the market. It offers insights into regional and global market values and demand, fostering a better understanding of the competitive landscape and the market's potential in terms of production, demand, and supply. The segmentation analysis accounts for psychographic, demographic, geographic, and behavioral factors that influence marketing strategies, targeted products, offers, and customer experiences. Porter's analysis is used to assess organizations' competitive positions and improve profitability, PESTLE analysis to validate the relevance of existing products and services in context, and SWOT analysis to evaluate the internal and external factors behind a company's strengths, weaknesses, advantages, and disadvantages. Overall, this report offers a comprehensive and informative overview of the High Performance Computing as a Service market.

http://www.maximizemarketresearch.com/requestmple/63767

Segmentation Analysis:

by Component

Solutions, Services

by Deployment Type

Private Cloud, Public Cloud, Hybrid Cloud

by Industry Vertical

Manufacturing, BFSI, Healthcare, Government, Media & Entertainment, Others

Major Players are:

Regional Analysis:

The report provides detailed regional analyses, with a focus on high-demand regions such as Asia Pacific, North America, Latin America, the Middle East, Europe, and Africa. The analysis offers insights into the distinct targets, strategies, and market values specific to each region.

Report: http://www.maximizemarketresearch.com/requestmple/63767

Key Questions Answered in the High Performance Computing as a Service Market Report:

Key Offerings:

Excerpt from:
High-Performance Computing as a Service Market interpreted by a new report - WhaTech Technology and Markets News

Read More..

8 Hybrid Cloud Security Challenges and How to Manage Them – TechTarget

While AI is taking the world by storm, the gold at the end of the rainbow for many CIOs follows digital transformation initiatives that lower operational costs and transition legacy systems to virtual environments on private and public clouds.

Consumer websites and development environments run applications in privately controlled data centers and take advantage of the compute, network and storage resources of public cloud service providers (CSPs), better known as infrastructure as a service (IaaS) providers. Attracted by the flexibility and cost savings, businesses use this "cloud burst" pay-as-you-go model for high-volume data processing, load balancing and redundancy, avoiding downtime during peak demand, such as a holiday selling period.

But, for many organizations, connecting private and public clouds over the internet using a dedicated network connection isn't that simple. Business transitions, incompatible technology environments and rapid changes in dynamic public cloud services can cause hybrid cloud security challenges.

Single hybrid cloud is now multiple clouds, said Mark Buckwell, executive cloud security architect at IBM, during last April's RSA Conference. It's not unusual for organizations to run Microsoft Active Directory as a managed service on AWS and connect to on-premises workloads, he told the audience during a session called "Architecting Security for Regulated Workloads in Hybrid Cloud."

"They still don't want to move the crown jewels of the organization off premise into cloud," Buckwell said, "so we end up integrating different parts of an application with different components, sitting on different technologies ... and this seems to be the way the world is going. And that just makes the whole solution a lot more complex because now we have data flowing in all sorts of different places." The result, he added, is different policies, depending on the technology and cloud provider, as well as a potential "split of responsibilities" among the cloud provider, other third parties and the organization.

Legacy systems might work with some public cloud services and not others. Security teams need to ensure on-premises security controls and processes coexist with native-cloud technologies to meet business and compliance requirements.

The "SANS 2022 Multicloud Survey" of IT professionals reported that 86% of respondents said their organizations used services from multiple cloud providers and 28% used private cloud for at least one-fourth of their compute workloads. "You're juggling clouds, each from a different vendor, wanting flexibility and the best tools," said the survey's co-author, Kenneth Hartman, owner of Lucid Truth Technologies, a digital private investigation agency and forensic consulting firm. "Sounds cool, right? But there's a catch, like a security gremlin hiding in each cloud. That gremlin is complexity."

Organizations need a centralized security architecture and governance to keep those "gremlins" in check, Hartman advised, without blowing the budget on fancy systems that just add to the complexity. It's critical to choose the right cloud providers and individual services, he added, "like picking trustworthy housemates who won't let just anyone in."

Companies focused on solving a business problem might lift and shift existing systems and controls to a CSP. And IT teams might be tasked with addressing tricky integration issues involving technology, protocols and standards after the hybrid cloud environment is up and running. Fixing these issues can cost more than addressing security and compliance upfront. "A weak spot for the cloud, [but] network security is improving," said Dave Shackleford, founder and principal consultant at Voodoo Security.

Fortune 1000 organizations used to bring their existing network security stack with them to meet regulatory requirements. Now, many of these companies use native firewall services and native logging and management tools. "We've done a good job of moving from A to B," noted Shackleford, a SANS instructor and analyst who serves on the institute's board of directors. Still, it's a struggle to find skilled personnel for even basic network security, like firewall administration. "Every tiny bit of processing costs money," he explained. "The expectation of security management is that [IT administrators] are comfortable doing cost optimization around these firewalls."

In theory, API tools and protocols enable web apps, containers and microservices to securely communicate with each other over the internet. But securing APIs remains a major problem. In the SANS survey on multi-cloud, 58.9% of respondents said "poorly configured or insecure APIs or interfaces" was their top concern. APIs can expose the application's back-end logic, as well as sensitive data, making APIs prime targets for attackers. It's almost impossible to have visibility into which APIs are exposed. A critical resource on API vulnerabilities is the Open Web Application Security Project's Top 10 API Security Risks -- 2023.
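
Part of regaining that visibility is simply enumerating what is exposed. As a rough illustration (not drawn from the article), the following Python sketch uses boto3 to list the REST APIs an AWS account publishes through Amazon API Gateway so they can be reviewed against the OWASP API Security Top 10; the region and read-only permissions are assumptions.

```python
# A minimal inventory sketch: list the REST APIs an AWS account exposes
# through Amazon API Gateway so they can be reviewed for insecure
# configuration. Assumes boto3 and read-only API Gateway permissions.
import boto3

def list_exposed_apis(region: str = "us-east-1") -> list[dict]:
    client = boto3.client("apigateway", region_name=region)
    apis = []
    paginator = client.get_paginator("get_rest_apis")
    for page in paginator.paginate():
        for api in page.get("items", []):
            apis.append({
                "id": api["id"],
                "name": api.get("name", ""),
                # Public invoke URL follows API Gateway's standard format.
                "endpoint": f"https://{api['id']}.execute-api.{region}.amazonaws.com",
            })
    return apis

if __name__ == "__main__":
    for api in list_exposed_apis():
        print(api["name"], api["endpoint"])
```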

For most companies, security is ultimately about protection of sensitive data -- where it is, who has access to it and how it's used. Hybrid cloud deployments enable organizations to house sensitive data and applications on private clouds or on premises and take advantage of wider network infrastructure provided by public clouds for managed services, workload distribution and storage. Mapping data flows through these systems, ensuring traceability and understanding how the data is protected in transit and at rest are necessary for legal and regulatory compliance in financial services, healthcare and other industries. Encryption protects data privacy in communication and storage. With IaaS, organizations can specify the physical or geographic storage location, also known as data residency, where their digital data is stored and processed.
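
As a hedged illustration of how data residency and encryption at rest can be pinned down in an IaaS deployment, the sketch below uses boto3 to create an S3 bucket in a specific region and enable default server-side encryption. The bucket name, region, and KMS choice are placeholder assumptions, not a prescription from the article.

```python
# A minimal sketch of enforcing data residency and encryption at rest with
# boto3: create an S3 bucket pinned to a specific region and turn on default
# server-side encryption. Bucket and region names are placeholders.
import boto3

REGION = "eu-central-1"          # example data residency requirement
BUCKET = "example-phi-records"   # hypothetical bucket name

s3 = boto3.client("s3", region_name=REGION)

# Pin the bucket to the chosen region so data is stored and processed there.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Require encryption at rest by default for every object written to the bucket.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
        ]
    },
)
```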

Security managers need to have visibility into all resources, systems and data in motion in their organization's hybrid cloud environment. Their number one concern is: "We don't know what is going on," Shackleford said. Better visibility can improve security and compliance. In addition to taking inventory, security teams should monitor all access attempts and configuration changes.
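
One way to act on that advice is to query the provider's audit trail for recent configuration changes. The following minimal sketch, again assuming AWS and boto3 with CloudTrail enabled, pulls the last 24 hours of one kind of change event; the event name filter is only an example.

```python
# A minimal monitoring sketch: query AWS CloudTrail for a recent write-type
# event so configuration changes can be reviewed alongside the resource
# inventory. The event name filter is an illustrative example.
import boto3
from datetime import datetime, timedelta, timezone

cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")

start = datetime.now(timezone.utc) - timedelta(hours=24)
resp = cloudtrail.lookup_events(
    LookupAttributes=[
        {"AttributeKey": "EventName", "AttributeValue": "PutBucketPolicy"}
    ],
    StartTime=start,
)

for event in resp.get("Events", []):
    print(event["EventTime"], event["EventName"], event.get("Username", "unknown"))
```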

Security teams rely on logs and syslog to monitor application files and network devices for anomalies and potential security events. IaaS providers are starting to offer native security information and event management (SIEM) as a service through tools such as Amazon Detective, Azure Sentinel and Google Chronicle. How do security teams figure out which data to collect for SIEM and which data to leave behind?

"Let's say you get your security logs from a service within four hours of an event of interest. But something changes at the cloud service provider, and now, you're not getting the logs until 12 hours later," Hartman theorized. "Or what if the event never shows up at all?" Threat modeling in the cloud, which has more trust boundaries, can help, he said, adding, "Just make sure that your list of possible threats includes lack of visibility."

The chief information security officer (CISO) protects the company's information assets by setting up a security strategy, policies supporting that strategy and incident response. Mixed environments such as hybrid cloud architecture have a shared security model. Security responsibilities should be documented in contractual agreements with the service provider before a security incident like a data leakage occurs. Supply chains, notorious for security risks, must be compliant with the service provider and the enterprise customer.

Companies focus on the resilience hybrid cloud offers, but "they don't have a cloud strategy," said Lisa McKee, co-founder of consultancy American Security and Privacy. "Where is the data going to go? Who is responsible for patching across these environments? Are access controls going to be outsourced?"

Responsibility for application security is shared with the SaaS provider, but organizations might have limited control over service configuration settings. At the same time, organizations are accountable for platform and application security in IaaS deployments, but the responsibility for configuring and securing the infrastructure is shared. CSPs are responsible for securing their locations and physical assets.

The responsibility for governance, risk and compliance is cross-functional at most organizations, ensuring that business activities align with the company's goals and industry regulations. Guidance is available in frameworks such as HIPAA, the Payment Card Industry Data Security Standard, the Federal Risk and Authorization Management Program, ISO standards and NIST SP 800-53.

CISOs need to align standards and frameworks to overall business and cybersecurity strategies. These efforts will come under the spotlight with the Securities and Exchange Commission's (SEC) new cybersecurity rules. Public companies must report "material cybersecurity incidents" within four business days of determining that an incident is material and provide information on board oversight, cybersecurity policies and procedures in annual reports (10-K and others). In an unprecedented move, the SEC sued SolarWinds, maker of the Orion IT management software used by government agencies, alleging the company and its CISO, who is named in the lawsuit, misled and defrauded investors by failing to disclose system vulnerabilities that led to cyberespionage by Russia-backed hackers in 2019.

With SEC reporting kicking in, Amazon in November offered AWS Cyber Insurance Competency Partners to quantify risk using customer data that's in AWS Security Hub. "This may be a tipping point of an ecosystem of cloud that we never saw coming," Shackleford said.

As hybrid cloud security challenges increase network complexity, CISOs and CIOs face resource cuts. Of the nearly 15,000 IT professionals surveyed in the global 2023 "ISC2 Cybersecurity Workforce Study," 47% said their organizations faced budget constraints. Respondents ranked cloud computing (35%) as the number one skills gap in their security teams, followed by AI and machine learning (32%) and zero-trust implementation (29%).

"The cloud security and operations professionals of today must be able to do so much more than plug in and configure a hardware device," Hartman said. "They need to be very comfortable with infrastructure code up to and including being able to read and write it. They must also have a good grasp of the principles of cloud security architecture and identity and access management systems -- someone who can roll up their sleeves and dive into the details yet keep the big picture in mind."

Organizations need to update their security strategies and design models to better manage their cloud infrastructures, including the following:

Kathleen Richards is a freelance journalist and industry veteran. She's a former features editor for TechTarget's Information Security magazine.

Go here to read the rest:
8 Hybrid Cloud Security Challenges and How to Manage Them - TechTarget

Read More..

IBM and American Tower help enterprises unlock the multi-cloud – ERP Today

IBM has announced a new collaboration with American Tower, a global digital infrastructure provider, to accelerate the deployment of a hybrid, multi-cloud platform at the edge.

As an independent owner, operator and developer of communications real estate solutions, American Tower has a broad portfolio of assets which include almost 225,000 wireless and broadcast towers, rooftops and in-building systems in 25 countries around the globe. This also includes an interconnected footprint of US data center facilities, having acquired CoreSite in 2021.

As part of the collaboration, American Tower plans to expand its neutral-host, Access Edge Data Center ecosystem to include IBM Hybrid Cloud capabilities and Red Hat OpenShift. Both companies will work together to help clients meet the ever-evolving needs of their customers and use technologies such as IoT, 5G, AI and network automation.

With AI and 5G on the rise and creating new business opportunities, both companies will provide the necessary infrastructure for enterprises to help them tap into the full potential of edge computing.

In a blog post, Hillery Hunter, CTO and GM of innovation at IBM Infrastructure, and Ed Knapp, CTO and senior VP at American Tower, said: "With interest in distributed edge computing on the rise, IBM and American Tower saw an opportunity to leverage their complementary assets and deliver customer value at scale. IBM plans to provide American Tower with a hybrid cloud platform and automated systems to create an edge cloud at American Tower distributed real estate locations.

"As a result of this collaboration, we aim to give enterprises more flexibility to deploy applications on public clouds, at the edge, or on premises. This can help to securely process and quickly analyze data closer to the point where it is created."

Across industries, companies are embracing technologies such as AI and 5G access networks at the edge to heighten innovation and create new business opportunities. American Tower and IBM will be providing the necessary infrastructure for these enterprises to help them tap into the full potential of edge computing.

See more here:
IBM and American Tower help enterprises unlock the multi-cloud - ERP Today

Read More..

Five Key Trends in AI and Data Science for 2024 – MIT Sloan Management Review

Topics AI in Action

This column series looks at the biggest data and analytics challenges facing modern companies and dives deep into successful use cases that can help other organizations accelerate their AI progress.

Carolyn Geason-Beissel/MIT SMR | Getty Images

Artificial intelligence and data science became front-page news in 2023. The rise of generative AI, of course, drove this dramatic surge in visibility. So, what might happen in the field in 2024 that will keep it on the front page? And how will these trends really affect businesses?

During the past several months, we've conducted three surveys of data and technology executives. Two involved attendees of MIT's Chief Data Officer and Information Quality Symposium: one sponsored by Amazon Web Services (AWS) and another by Thoughtworks (not yet published). The third survey was conducted by Wavestone, formerly NewVantage Partners, whose annual surveys we've written about in the past. In total, the new surveys involved more than 500 senior executives, perhaps with some overlap in participation.

Surveys don't predict the future, but they do suggest what those people closest to companies' data science and AI strategies and projects are thinking and doing. According to those data executives, here are the top five developing issues that deserve your close attention:

As we noted, generative AI has captured a massive amount of business and consumer attention. But is it really delivering economic value to the organizations that adopt it? The survey results suggest that although excitement about the technology is very high, value has largely not yet been delivered. Large percentages of respondents believe that generative AI has the potential to be transformational; 80% of respondents to the AWS survey said they believe it will transform their organizations, and 64% in the Wavestone survey said it is the most transformational technology in a generation. A large majority of survey takers are also increasing investment in the technology. However, most companies are still just experimenting, either at the individual or departmental level. Only 6% of companies in the AWS survey had any production application of generative AI, and only 5% in the Wavestone survey had any production deployment at scale.

Surveys suggest that though excitement about generative AI is very high, value has largely not yet been delivered.

Production deployments of generative AI will, of course, require more investment and organizational change, not just experiments. Business processes will need to be redesigned, and employees will need to be reskilled (or, probably in only a few cases, replaced by generative AI systems). The new AI capabilities will need to be integrated into the existing technology infrastructure.

Perhaps the most important change will involve data curating unstructured content, improving data quality, and integrating diverse sources. In the AWS survey, 93% of respondents agreed that data strategy is critical to getting value from generative AI, but 57% had made no changes to their data thus far.

Companies feel the need to accelerate the production of data science models. What was once an artisanal activity is becoming more industrialized. Companies are investing in platforms, processes and methodologies, feature stores, machine learning operations (MLOps) systems, and other tools to increase productivity and deployment rates. MLOps systems monitor the status of machine learning models and detect whether they are still predicting accurately. If they're not, the models might need to be retrained with new data.
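
As a rough sketch of the monitoring step such MLOps systems perform (not any particular vendor's API), the following Python compares a model's accuracy on a recent labeled window against its deployment-time baseline and flags it for retraining when the drop exceeds a tolerance; the thresholds and stand-in model are illustrative assumptions.

```python
# A minimal sketch of an MLOps monitoring step: score a model on a recent
# window of labeled data and flag it for retraining when accuracy drops too
# far below the accuracy measured at deployment time.
import numpy as np

def needs_retraining(model, X_recent, y_recent,
                     baseline_accuracy: float, tolerance: float = 0.05) -> bool:
    """Return True when live accuracy has drifted below the allowed band."""
    preds = model.predict(X_recent)
    live_accuracy = float(np.mean(preds == y_recent))
    return live_accuracy < baseline_accuracy - tolerance

# Example with a trivial stand-in "model" that always predicts class 1.
class AlwaysOne:
    def predict(self, X):
        return np.ones(len(X), dtype=int)

X = np.zeros((100, 3))
y = np.random.default_rng(0).integers(0, 2, size=100)  # labels have drifted
print(needs_retraining(AlwaysOne(), X, y, baseline_accuracy=0.90))  # likely True
```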

Producing data models, once an artisanal activity, is becoming more industrialized.

Most of these capabilities come from external vendors, but some organizations are now developing their own platforms. Although automation (including automated machine learning tools, which we discuss below) is helping to increase productivity and enable broader data science participation, the greatest boon to data science productivity is probably the reuse of existing data sets, features or variables, and even entire models.

In the Thoughtworks survey, 80% of data and technology leaders said that their organizations were using or considering the use of data products and data product management. By data product, we mean packaging data, analytics, and AI in a software product offering, for internal or external customers. It's managed from conception to deployment (and ongoing improvement) by data product managers. Examples of data products include recommendation systems that guide customers on what products to buy next and pricing optimization systems for sales teams.

But organizations view data products in two different ways. Just under half (48%) of respondents said that they include analytics and AI capabilities in the concept of data products. Some 30% view analytics and AI as separate from data products and presumably reserve that term for reusable data assets alone. Just 16% say they don't think of analytics and AI in a product context at all.

We have a slight preference for a definition of data products that includes analytics and AI, since that is the way data is made useful. But all that really matters is that an organization is consistent in how it defines and discusses data products. If an organization prefers a combination of data products and analytics and AI products, that can work well too, and that definition preserves many of the positive aspects of product management. But without clarity on the definition, organizations could become confused about just what product developers are supposed to deliver.

Data scientists, who have been called "unicorns" and holders of "the sexiest job of the 21st century" because of their ability to make all aspects of data science projects successful, have seen their star power recede. A number of changes in data science are producing alternative approaches to managing important pieces of the work. One such change is the proliferation of related roles that can address pieces of the data science problem. This expanding set of professionals includes data engineers to wrangle data, machine learning engineers to scale and integrate the models, translators and connectors to work with business stakeholders, and data product managers to oversee the entire initiative.

Another factor reducing the demand for professional data scientists is the rise of citizen data science, wherein quantitatively savvy businesspeople create models or algorithms themselves. These individuals can use AutoML, or automated machine learning tools, to do much of the heavy lifting. Even more helpful to citizens is the modeling capability available in ChatGPT called Advanced Data Analysis. With a very short prompt and an uploaded data set, it can handle virtually every stage of the model creation process and explain its actions.
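
To make concrete the kind of heavy lifting AutoML automates, here is a small scikit-learn sketch that searches over candidate models and hyperparameters with cross-validation and returns the best pipeline; real AutoML tools go much further, so treat this only as an illustration of the idea.

```python
# A small stand-in for what AutoML tools automate: search over candidate
# models and hyperparameters with cross-validation and keep the best pipeline.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

pipeline = Pipeline([("scale", StandardScaler()), ("model", LogisticRegression())])
search_space = [
    {"model": [LogisticRegression(max_iter=1000)], "model__C": [0.1, 1.0, 10.0]},
    {"model": [RandomForestClassifier(random_state=0)], "model__n_estimators": [100, 300]},
]

search = GridSearchCV(pipeline, search_space, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```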

Of course, there are still many aspects of data science that do require professional data scientists. Developing entirely new algorithms or interpreting how complex models work, for example, are tasks that haven't gone away. The role will still be necessary but perhaps not as much as it was previously and without the same degree of power and shimmer.

This past year, we began to notice that increasing numbers of organizations were cutting back on the proliferation of technology and data chiefs, including chief data and analytics officers (and sometimes chief AI officers). That CDO/CDAO role, while becoming more common in companies, has long been characterized by short tenures and confusion about the responsibilities. We're not seeing the functions performed by data and analytics executives go away; rather, they're increasingly being subsumed within a broader set of technology, data, and digital transformation functions managed by a supertech leader who usually reports to the CEO. Titles for this role include chief information officer, chief information and technology officer, and chief digital and technology officer; real-world examples include Sastry Durvasula at TIAA, Sean McCormack at First Group, and Mojgan Lefebvre at Travelers.

This evolution in C-suite roles was a primary focus of the Thoughtworks survey, and 87% of respondents (primarily data leaders but some technology executives as well) agreed that people in their organizations are either completely, to a large degree, or somewhat confused about where to turn for data- and technology-oriented services and issues. Many C-level executives said that collaboration with other tech-oriented leaders within their own organizations is relatively low, and 79% agreed that their organization had been hindered in the past by a lack of collaboration.

We believe that in 2024, we'll see more of these overarching tech leaders who have all the capabilities to create value from the data and technology professionals reporting to them. They'll still have to emphasize analytics and AI because that's how organizations make sense of data and create value with it for employees and customers. Most importantly, these leaders will need to be highly business-oriented, able to debate strategy with their senior management colleagues, and able to translate it into systems and insights that make that strategy a reality.

Thomas H. Davenport (@tdav) is the President's Distinguished Professor of Information Technology and Management at Babson College, a fellow of the MIT Initiative on the Digital Economy, and senior adviser to the Deloitte Chief Data and Analytics Officer Program. He is coauthor of All in on AI: How Smart Companies Win Big With Artificial Intelligence (HBR Press, 2023) and Working With AI: Real Stories of Human-Machine Collaboration (MIT Press, 2022). Randy Bean (@randybeannvp) is an industry thought leader, author, founder, and CEO and currently serves as innovation fellow, data strategy, for global consultancy Wavestone. He is the author of Fail Fast, Learn Faster: Lessons in Data-Driven Leadership in an Age of Disruption, Big Data, and AI (Wiley, 2021).

Excerpt from:

Five Key Trends in AI and Data Science for 2024 - MIT Sloan Management Review

Read More..

Five Key Trends in AI and Data Science for 2024 From MIT Sloan Management Review – Yahoo Finance

CAMBRIDGE, Mass., Jan. 9, 2024 /PRNewswire/ -- MIT Sloan Management Review reveals the insights of more than 500 senior data and technology executives in "Five Key Trends in AI and Data Science for 2024," a part of its AI in Action series.

Artificial intelligence and data science became front-page news in 2023 thanks to generative AI, state coauthors Thomas H. Davenport, the President's Distinguished Professor of Information Technology and Management at Babson College and a fellow of the MIT Initiative on the Digital Economy, and Randy Bean, an industry thought leader who currently serves as innovation fellow, data strategy, for global consultancy Wavestone.

To find out what might keep it on the front page in 2024, Davenport and Bean, during the past several months, conducted three surveys involving more than 500 executives closest to companies' data science and AI strategies to bring to light what organizations are thinking and doing.

"Data science is increasingly critical to every organization. But it's not a static discipline, and organizations need to continually adjust data science skills and processes to get the full value from data, analytics, and AI," said Davenport.

"Expect 2024 to be a year of transformation and change driven by adoption of AI and a reshaping of the data, analytics, and AI leadership role within leading companies," added Bean. "With 33% of midsize to large organizations having appointed or in search of a chief AI officer, and with 83.2% of leading companies having a chief data and analytics officer in place today, it is inevitable that we will witness consolidation of roles, restructuring of responsibilities, elimination of some positions, and some critical rethinking of data and AI leadership expectations during the course of 2024."

"Five Key Trends in AI and Data Science for 2024"culls the surveys to identify developing issues that should be on every leader's radar screen this year:

Generative AI sparkles but needs to deliver value. Survey responses suggest that although excitement is high, the value of generative AI has not been delivered. Large percentages of respondents believe the technology has the potential to be transformational; 80% in one survey said they believe it will transform their organizations, and 64% in another survey said it is the most transformational technology in a generation. A large majority of survey takers are also increasing investment in the technology.

Data science is shifting from artisanal to industrial. Companies are investing in platforms, processes and methodologies, feature stores, machine learning operations (MLOps) systems, and other tools to increase productivity and deployment rates. Automation is helping to increase productivity and enable broader data science participation.

Two versions of data products will dominate. Eighty percent of data and technology leaders in one survey said that their organizations were using or considering the use of data products and product management. But they mean two different things by "data products." Just under half (48%) of respondents said that they include analytics and AI capabilities in the concept of data products. Some 30% view analytics and AI as separate from data products and presumably reserve that term for reusable data assets alone. What matters is that an organization is consistent in how it defines and discusses data products.

Data scientists will become less sexy. The proliferation of roles, such as data engineers, that can address pieces of the data science problem, along with the rise of citizen data science, in which savvy businesspeople create models or algorithms themselves, is causing the star power of data scientists to recede.

Data, analytics, and AI leaders are becoming less independent.In 2023, increasing numbers of organizations cut back on the proliferation of technology and data "chiefs," including chief data and analytics officers (and sometimes chief AI officers). The functions performed by data and analytics executives haven't gone away; rather, they're increasingly being subsumed within a broader set of technology, data, and digital transformation functions managed by a "supertech leader" who usually reports to the CEO. In 2024, expect to see more of these overarching tech leaders who have all the capabilities to create value from the data and technology professionals reporting to them.

The MIT Sloan Management Review article "Five Key Trends in AI and Data Science for 2024" publishes at 8 a.m. ET on Jan. 9, 2024. This column is part of the series AI in Action.

About the Authors: Thomas H. Davenport is the President's Distinguished Professor of Information Technology and Management at Babson College, a fellow of the MIT Initiative on the Digital Economy, and senior adviser to the Deloitte Chief Data and Analytics Officer Program. He is coauthor of All-In On AI: How Smart Companies Win Big With Artificial Intelligence (HBR Press, 2023) and Working With AI: Real Stories of Human-Machine Collaboration (MIT Press, 2022). Randy Bean is an industry thought leader, author, founder, and CEO and currently serves as innovation fellow, data strategy, for global consultancy Wavestone. He is the author of Fail Fast, Learn Faster: Lessons in Data-Driven Leadership in an Age of Disruption, Big Data, and AI (Wiley, 2021).

About MIT Sloan Management Review: MIT Sloan Management Review is an independent, research-based magazine and digital platform for business leaders published at the MIT Sloan School of Management. MIT SMR explores how leadership and management are transforming in a disruptive world. We help thoughtful leaders capture the exciting opportunities and face down the challenges created as technological, societal, and environmental forces reshape how organizations operate, compete, and create value.

Connect with MIT Sloan Management Review on:

Tess Woods, Tess@TessWoodsPR.com, 617-942-0336

View original content to download multimedia: https://www.prnewswire.com/news-releases/five-key-trends-in-ai-and-data-science-for-2024-from-mit-sloan-management-review-302029337.html

SOURCE MIT Sloan Management Review

Read more:

Five Key Trends in AI and Data Science for 2024 From MIT Sloan Management Review - Yahoo Finance

Read More..

Five Key Trends in AI and Data Science for 2024 From MIT Sloan Management Review – PR Newswire

CAMBRIDGE, Mass., Jan. 9, 2024 /PRNewswire/ -- MIT Sloan Management Review reveals the insights of more than 500 senior data and technology executives in "Five Key Trends in AI and Data Science for 2024," a part of its AI in Action series.

Artificial intelligence and data science became front-page news in 2023 thanks to generative AI, state coauthors Thomas H. Davenport, the President's Distinguished Professor of Information Technology and Management at Babson College and a fellow of the MIT Initiative on the Digital Economy, and Randy Bean, an industry thought leader who currently serves as innovation fellow, data strategy, for global consultancy Wavestone.

Orgs need to continually adjust data science skills and processes to get the full value from data, analytics, and AI.

To find out what might keep it on the front page in 2024, Davenport and Bean, during the past several months, conducted three surveys involving more than 500 executives closest to companies' data science and AI strategies to bring to light what organizations are thinking and doing.

"Data science is increasingly critical to every organization. But it's not a static discipline, and organizations need to continually adjust data science skills and processes to get the full value from data, analytics, and AI," said Davenport.

"Expect 2024 to be a year of transformation and change driven by adoption of AI and a reshaping of the data, analytics, and AI leadership role within leading companies," added Bean. "With 33% of midsize to large organizations having appointed or in search of a chief AI officer, and with 83.2% of leading companies having a chief data and analytics officer in place today, it is inevitable that we will witness consolidation of roles, restructuring of responsibilities, elimination of some positions, and some critical rethinking of data and AI leadership expectations during the course of 2024."

"Five Key Trends in AI and Data Science for 2024"culls the surveys to identify developing issues that should be on every leader's radar screen this year:

Generative AI sparkles but needs to deliver value. Survey responses suggest that although excitement is high, the value of generative AI has not been delivered. Large percentages of respondents believe the technology has the potential to be transformational; 80% in one survey said they believe it will transform their organizations, and 64% in another survey said it is the most transformational technology in a generation. A large majority of survey takers are also increasing investment in the technology.

Data science is shifting from artisanal to industrial. Companies are investing in platforms, processes and methodologies, feature stores, machine learning operations (MLOps) systems, and other tools to increase productivity and deployment rates. Automation is helping to increase productivity and enable broader data science participation.

Two versions of data products will dominate. Eighty percent of data and technology leaders in one survey said that their organizations were using or considering the use of data products and product management. But they mean two different things by "data products." Just under half (48%) of respondents said that they include analytics and AI capabilities in the concept of data products. Some 30% view analytics and AI as separate from data products and presumably reserve that term for reusable data assets alone. What matters is that an organization is consistent in how it defines and discusses data products.

Data scientists will become less sexy. The proliferation of roles, such as data engineers, that can address pieces of the data science problem, along with the rise of citizen data science, in which savvy businesspeople create models or algorithms themselves, is causing the star power of data scientists to recede.

Data, analytics, and AI leaders are becoming less independent.In 2023, increasing numbers of organizations cut back on the proliferation of technology and data "chiefs," including chief data and analytics officers (and sometimes chief AI officers). The functions performed by data and analytics executives haven't gone away; rather, they're increasingly being subsumed within a broader set of technology, data, and digital transformation functions managed by a "supertech leader" who usually reports to the CEO. In 2024, expect to see more of these overarching tech leaders who have all the capabilities to create value from the data and technology professionals reporting to them.

The MIT Sloan Management Review article "Five Key Trends in AI and Data Science for 2024" publishes at 8 a.m. ET on Jan. 9, 2024. This column is part of the series AI in Action.

About the Authors: Thomas H. Davenport is the President's Distinguished Professor of Information Technology and Management at Babson College, a fellow of the MIT Initiative on the Digital Economy, and senior adviser to the Deloitte Chief Data and Analytics Officer Program. He is coauthor of All-In On AI: How Smart Companies Win Big With Artificial Intelligence (HBR Press, 2023) and Working With AI: Real Stories of Human-Machine Collaboration (MIT Press, 2022). Randy Bean is an industry thought leader, author, founder, and CEO and currently serves as innovation fellow, data strategy, for global consultancy Wavestone. He is the author of Fail Fast, Learn Faster: Lessons in Data-Driven Leadership in an Age of Disruption, Big Data, and AI (Wiley, 2021).

About MIT Sloan Management Review: MIT Sloan Management Review is an independent, research-based magazine and digital platform for business leaders published at the MIT Sloan School of Management. MIT SMR explores how leadership and management are transforming in a disruptive world. We help thoughtful leaders capture the exciting opportunities and face down the challenges created as technological, societal, and environmental forces reshape how organizations operate, compete, and create value.

Connect with MIT Sloan Management Review on:

Tess Woods, Tess@TessWoodsPR.com, 617-942-0336

SOURCE MIT Sloan Management Review

Visit link:

Five Key Trends in AI and Data Science for 2024 From MIT Sloan Management Review - PR Newswire

Read More..

The Future Of Clinical Data Science Is Closer Than It Appears – Clinical Leader

By Clinical Leader Editorial Staff

The pharmaceutical and biotech industries have experienced multiple sea changes in the last few years, as emerging science promises advanced therapeutics that require new, complex clinical trial designs. But has the technology that supports clinical research kept up with the science behind it? In 2021, Patrick Nadolny, global head of clinical data management at Sanofi, made several predictions about clinical trial technology in "Designing a Data Management Strategy for the Future." Clinical Leader editorial staff recently caught up with Nadolny to revisit his predictions, examine current trends in clinical trial technology, and imagine what innovations will shape the industry in the next few years.

Nadolny predicted that data management would evolve into clinical data science due to the influx of data sources and emerging protocols with new trial designs. This evolution has already begun, and its progress hinges on four main pillars: risk-based methodologies, AI, complex protocol designs, and DCTs.

How Are Risk-Based Methodologies Changing the Trial Life Cycle?

First, risk-based methodologies have transformed study conduct. The 2016 ICH E6 revision on good clinical practice demonstrated the urgency of adopting and adapting to risk-based study monitoring across functions, which is anticipated to be reinforced in the upcoming revision [1]. Going beyond study conduct, the ICH E8 revision advocates for quality by design, focusing on both critical quality factors and operational feasibility [2]. This requires clinical data scientists' earlier involvement to design appropriate data collection and review strategies. Likewise, the EMA's recent reflection paper on the use of AI in clinical research is fully risk-based, providing early insights on risk levels and validation needs for AI solutions during the drug development lifecycle [3].

Moreover, today's clinical trials generate enormous amounts of data, with a growing proportion being eSource, and Nadolny states that current technology is not yet as efficient as required in managing the 5Vs of clinical data (volume, velocity, variety, veracity, and value), but it is progressing in the right direction [4]. Also, risk-based approaches interconnect with the other three pillars. For example, companies may turn to AI to manage the large amounts of data generated from these studies to identify risks or automate repetitive activities. Likewise, these protocols are complex and often employ DCTs or hybrid technology in conjunction with risk-based approaches.

"In the past, companies used the same processes for multiple studies," Nadolny explained. "But now, the methods developed for one study may not translate to the next. Several years ago, choosing a trial design was like finding a recipe in a cookbook. But now, companies are given a list of ingredients and must decide the best way to combine them to maximize each element while creating a unified whole."

What Is The Role Of AI?

AI is also revolutionizing the pharmaceutical and biotech industries. Nadolny explained that although AI is not a new technology, the advent of generative AI platforms like ChatGPT has accelerated investment and interest.

"AI, especially generative AI, can be useful across the entire lifespan of a trial, from recruitment to post-study data management," Nadolny stated. Previously, AI utility was limited to simpler tasks such as identifying data patterns or reading medical images, but generative AI could ultimately also create study plans, read protocols, and suggest potential root causes of a study problem. Additionally, it can assist recruitment by better identifying potential participants. With DCTs, it can review data and identify complex data anomalies from wearable technologies. AI solutions will develop rapidly in the next few years as large and small pharmaceutical and biotech companies discover ways to automate or radically transform their processes to improve study timelines.
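
As one hedged illustration of the automated review described above (not Sanofi's actual tooling), the sketch below flags wearable heart-rate readings that sit far outside a participant's recent baseline using a simple z-score; the data and threshold are made up.

```python
# A minimal sketch of automated review of wearable data: flag heart-rate
# readings that sit far from a participant's recent baseline using a
# z-score. Real DCT pipelines would use richer models; values are made up.
import numpy as np

def flag_anomalies(readings: np.ndarray, z_threshold: float = 3.0) -> np.ndarray:
    """Return indices of readings more than z_threshold std devs from the mean."""
    mean, std = readings.mean(), readings.std()
    if std == 0:
        return np.array([], dtype=int)
    z_scores = np.abs((readings - mean) / std)
    return np.where(z_scores > z_threshold)[0]

rng = np.random.default_rng(0)
heart_rate = np.append(rng.normal(72, 3, size=60), 150)  # steady baseline, one spike
print(flag_anomalies(heart_rate))  # likely -> [60]
```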

According to Nadolny, there are not enough qualified people in the industry to manage the vast volume of information generated by today's studies, and AI is necessary to create insights from billions of data points and various data sources. Many processes could be automated to reduce workload and inefficiencies in clinical trials, expediting data management. However, AI may not be the best solution for every study currently underway. For ongoing large or long-term trials, such as many oncology studies, integrating AI after the study starts would not necessarily save time or effort due to the complexity of transitioning to a new model. However, for new studies, implementing this technology from the beginning can accelerate timelines and enable new processes that companies can use as a template for future studies.

How Will Protocols Become Patient-Driven?

Meanwhile, complex protocol designs present more challenges to clinical data scientists. Umbrella, basket, and adaptive trial designs are just a few study protocols that can accelerate drug development but create operational complexities for data management. For example, evaluating one therapy across several indications simultaneously in a basket study is more efficient than examining one indication at a time but adds complexity to data collection across multiple medical conditions. Likewise, adaptive study design improves the predictability of study outcomes by allowing sponsors to adjust dosages and timing based on individual participant responses to the IP. However, collecting and managing the interconnected web of data these studies generate is an intricate process that today's platforms aren't fully equipped for. Too often, information is siloed, and separate systems must be integrated and reconfigured for each design adaptation, adding time to the study.

Operationally complex protocol designs may also result from the desire to meet requirements from regulatory bodies, such as ensuring patient diversity and patient centricity. Nadolny emphasizes that being patient-driven is a complex issue and is not the same as being patient-centric. For example, decentralized clinical trial procedures appear patient-centric because they allow subjects to participate remotely. However, mandating telehealth technology or wearable devices may burden some participants who would rather go to a traditional clinical setting to receive care.

On the other hand, a truly patient-driven trial would be much more complex. A patient-driven trial considers these factors and creates a flexible operational design that best fits each participant's lifestyle. Subjects would choose between participating in the trial remotely, in-person, or a hybrid mix. However, this hypothetical trial design is not yet possible to deploy efficiently with today's technology because it would create protocols that are too complex to pragmatically operationalize. The push for greater patient centricity and growing recruitment needs may drive the industry toward achieving highly adaptable, customizable trials. Nadolny predicts that technology will adapt to make such trials possible in the next two to five years.

Are Fully Decentralized Trials Possible?

In addition to meeting patients' needs, the DCT trend that took off during the COVID-19 pandemic shows no signs of slowing down. Nadolny expects DCTs to continue to rise in popularity in response to other types of emergencies, such as wars or natural disasters, which can otherwise halt studies. By decentralizing trials and running global studies, companies can pivot when factors beyond their control shut down sites. However, Nadolny points out that currently, no single platform can run a fully decentralized pivotal clinical trial, and DCT technology is often a patchwork of integrated solutions. He expects the industry to invest heavily in creating new systems to accommodate the unique needs of DCTs.

"The pandemic forced the industry to rethink how we work to become more resilient and adaptable," Nadolny explained. "We've learned to balance the risk of implementing new technology against the risk of doing nothing. The industry is changing, if slowly. There's a divide between the old ways and the new, and we're still coping with legacy systems while investing in the future."

In his 2021 predictions, Nadolny stated that data managers weren't fully utilizing emerging technologies because decentralized workflows and shifting protocol designs were still very new, and users faced challenges adjusting to the new normal. Currently, however, data management is catching up with industry trends.

"Everything that's happened in the past few years has forced us to adapt and maximize all our solutions," Nadolny explained. "Data management has evolved significantly. However, we're still putting patches on things and learning what we can leverage regarding new protocol designs and technologies. We still have room to improve, especially regarding DCT support, but we're moving in the right direction."

What Is The Future Of Data Management?

In 2021, Nadolny stated that clinical data management needed to evolve into clinical data science. That evolution is still necessary and is ongoing. As risk-based methodologies, AI, complex protocols, and DCTs continue to shape the industry, data management platforms must adapt to meet their needs. In addition, resiliency to emergency crises has become an imperative to infuse into our daily operations. Therefore, technology must adapt to ongoing clinical research changes at the study, country, site, and even patient level. At the same time, technology should allow for greater patient-driven solutions by giving subjects more opportunities to participate on their terms. Nadolny is optimistic that these changes will benefit companies, sites, and patients.

"The industry will continue to show resiliency as we walk the tightrope between adaptive, highly complex protocol designs and patient centricity," Nadolny states. "We'll also see the gap close between clinical research and regular standards of care so that we don't have different processes for running a study and caring for patients. The technology we need to cope with today's challenges is still emerging, but we're closer today than we were three years ago."

Here is the original post:

The Future Of Clinical Data Science Is Closer Than It Appears - Clinical Leader

Read More..

Symposium highlights UGA's interdisciplinary AI and data science research and scholarship – University of Georgia

Ian Bogost, center, of Washington University in St. Louis speaks during a panel discussion at the AI and Data Science Across Disciplines Symposium on Nov. 30 at the University of Georgia Center for Continuing Education & Hotel. Meg Mittelstadt, left, director of UGA's Center for Teaching and Learning; Tianming Liu, right, Distinguished Research Professor in the School of Computing; and Youjin Kong (not pictured), assistant professor in the department of philosophy, joined Bogost in discussing advances in artificial intelligence. (Photo by Mike Wooten)

Faculty from across the University of Georgia campus gathered on Nov. 30 to discuss the expanding influence of artificial intelligence, share insights into their research and consider how AI may shape higher education and society in the future.

The university's inaugural Artificial Intelligence and Data Science Across Disciplines Symposium was hosted by the Institute for Artificial Intelligence with support from the Office of the Senior Vice President for Academic Affairs and Provost, the Office of Research, and the Franklin College of Arts and Sciences.

"The symposium is part of a series of events geared toward bringing together the AI and data science faculty at UGA," said Khaled M. Rasheed, director of the Institute for Artificial Intelligence and a professor in the School of Computing. "It was an exciting opportunity for the AI community at UGA to connect and learn."

The symposium, held at the University of Georgia Center for Continuing Education & Hotel, showcased UGA's significant investments in the fields of artificial intelligence and data science. Those investments include an ambitious presidential interdisciplinary faculty hiring initiative that aims to recruit 70 faculty members with expertise in applying data science and artificial intelligence to some of society's most urgent challenges.

Rather than being housed exclusively in a single department, the majority of UGA's newly recruited faculty will focus on the fusion of data science and AI in cross-cutting areas such as infectious diseases, integrative precision agriculture, ethics, cybersecurity, resilient communities and the environment.

"The breadth of experience and expertise at the University of Georgia uniquely positions our institution to advance AI and data science scholarship and research," said Jeanette Taylor, the university's vice provost for academic affairs. "We are able to integrate perspectives from a diverse array of disciplines as we consider not only potential uses for AI but also the ethical and social questions that arise."

Ian Bogost, Barbara and David Thomas Distinguished Professor at Washington University in St. Louis with a dual appointment as professor and director of film and media studies and professor of computer science and engineering, provided the symposium's keynote address.

Bogost urged attendees to avoid viewing generative AI, such as ChatGPT and Dall-E, as a tool for process optimization at the expense of imagination.

"AI works best for me when I use it to extend my imagination," he said.

The symposium also featured two lightning rounds of brief talks by UGA faculty members from a wide range of disciplines. Faculty highlighted their use of AI and data science in research topics such as crop modeling and assessment, physics-informed machine learning for infectious disease forecasting, data science in advanced manufacturing and AI's integration into society.

A panel discussion closed the symposium. Participants examined the impact of AI and ChatGPT on teaching and learning at UGA, industries that stand to benefit from AI and the ethics of AI in research and society, among other topics.

Building upon the momentum of the symposium, UGA's Office of Research will host an AI Team Acceleration Event on Feb. 5 at the Delta Innovation Hub. This event will include presentations from research teams funded by Presidential Interdisciplinary Seed Grants and an overview of major university resources available to research teams.

UGA is now gathering input from faculty regarding potential interdisciplinary research collaborations. The Office of Research will filter those responses through AI to identify affinity groups faculty can join, and the AI Team Acceleration Event will include time for those groups to meet and begin discussions of possible research projects.

See original here:

Symposium highlights UGAs interdisciplinary AI and data science research and scholarship - University of Georgia

Read More..

Examining the Influence Between NLP and Other Fields of Study – Towards Data Science

MOTIVATION

A fascinating aspect of science is how different fields of study interact and influence each other. Many significant advances have emerged from the synergistic interaction of multiple disciplines. For example, the conception of quantum mechanics is a theory that coalesced Planck's idea of quantized energy levels, Einstein's photoelectric effect, and Bohr's atom model.

The degree to which the ideas and artifacts of a field of study are helpful to the world is a measure of its influence.

Developing a better sense of the influence of a field has several benefits, such as understanding what fosters greater innovation and what stifles it, what a field has success at understanding and what remains elusive, or who are the most prominent stakeholders benefiting and who are being left behind.

Mechanisms of field-to-field influence are complex, but one notable marker of scientific influence is citations. The extent to which a source field cites a target field is a rough indicator of the degree of influence of the target on the source. We note here, though, that not all citations are equal, and they are subject to various biases. Nonetheless, meaningful inferences can be drawn at an aggregate level; for example, if the proportion of citations from field x to a target field y has markedly increased as compared to the proportion of citations from other fields to the target, then it is likely that the influence of x on y has grown.
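
A minimal sketch of that aggregate indicator, with made-up citation tuples, is shown below: for a chosen target field, it computes what fraction of the field's incoming citations come from each source field per year, so marked shifts in a source's share can be tracked over time.

```python
# A small sketch of the aggregate indicator described above: for a target
# field, compute what fraction of its incoming citations come from each
# source field, per year. The citation tuples here are made-up examples.
from collections import defaultdict

# (year, citing_field, cited_field)
citations = [
    (2010, "NLP", "Linguistics"), (2010, "Medicine", "Linguistics"),
    (2020, "NLP", "Linguistics"), (2020, "NLP", "Linguistics"),
    (2020, "Medicine", "Linguistics"),
]

def inward_citation_share(citations, target_field):
    totals = defaultdict(int)                      # citations received per year
    by_source = defaultdict(lambda: defaultdict(int))
    for year, source, target in citations:
        if target == target_field:
            totals[year] += 1
            by_source[year][source] += 1
    return {
        year: {src: count / totals[year] for src, count in sources.items()}
        for year, sources in by_source.items()
    }

# NLP's share of Linguistics' incoming citations rises from 0.5 to about 0.67
# here; at aggregate scale, such shifts are the signal described above.
print(inward_citation_share(citations, "Linguistics"))
```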

WHY NLP?

While studying influence is useful for any field of study, we focus on Natural language Processing (NLP) research for one critical reason.

NLP is at an inflection point. Recent developments in large language models have captured the imagination of the scientific world, industry, and the general public.

Thus, NLP is poised to exert substantial influence despite significant risks. Further, language is social, and its applications have complex social implications. Therefore, responsible research and development need engagement with a wide swathe of literature (arguably, more so for NLP than other fields).

By tracing hundreds of thousands of citations, we systematically and quantitatively examine broad trends in the influence of various fields of study on NLP and NLPs influence on them.

We use Semantic Scholar's field of study attribute to categorize papers into 23 fields, such as math, medicine, or computer science. A paper can belong to one or many fields. For example, a paper that targets a medical application using computer algorithms might be in medicine and computer science. NLP itself is an interdisciplinary subfield of computer science, machine learning, and linguistics. We categorize a paper as NLP when it is in the ACL Anthology, which is arguably the largest repository of NLP literature (albeit not a complete set of all NLP papers).
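
A small sketch of that categorization step, using hypothetical paper records, is shown below: each paper keeps its (possibly multiple) Semantic Scholar fields, and it is additionally labeled NLP when its identifier appears in an ACL Anthology ID set.

```python
# A minimal sketch of the categorization step: keep a paper's (possibly
# multiple) fields of study, and additionally label it NLP when its ID
# appears in the ACL Anthology. Paper records and IDs are hypothetical.
ACL_ANTHOLOGY_IDS = {"P19-1001", "2020.acl-main.1"}   # example anthology IDs

papers = [
    {"id": "P19-1001", "fields": ["Computer Science", "Linguistics"]},
    {"id": "MED-042", "fields": ["Medicine", "Computer Science"]},
]

def label_fields(paper: dict) -> set[str]:
    fields = set(paper["fields"])            # a paper can belong to many fields
    if paper["id"] in ACL_ANTHOLOGY_IDS:     # ACL Anthology membership implies NLP
        fields.add("NLP")
    return fields

for p in papers:
    print(p["id"], sorted(label_fields(p)))
```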

Read this article:

Examining the Influence Between NLP and Other Fields of Study - Towards Data Science

Read More..

Best Data Analytics Courses in the USA to Enroll in 2024 – Analytics Insight

Data analytics is a rapidly growing field that requires a combination of technical and analytical skills. With so many courses available online, it can be challenging to choose the right one. For individuals looking for the best data analytics courses in the USA in 2024, programs typically cover data interpretation, website tracking codes, marketing campaigns, and program management, with an emphasis on integrating analytics into various business functions. The USA, a premier destination for international students, boasts globally ranked universities and diverse study locations, and a master's degree follows the completion of an undergraduate program. In this article, we explore the best data analytics courses in the USA to enroll in for 2024.

University: Canisius University

Campus location: Buffalo, USA

Duration: 1-3 years

Tuition fee: US$910 per credit

Course Description: With the MS in Data Analytics program at Canisius University in the USA, set off on a life-changing adventure. The curriculum lasts one to three years and is based in Buffalo, USA. It offers a thorough education in the ever-evolving field of data analytics for US$910 per credit.

Enroll now

University: Pacific Lutheran University

Campus location: Tacoma, USA

Duration: 9-21 months

Tuition fee: US$1,104 per credit

Course Description:

Pacific Lutheran University offers a dynamic Master of Science in Marketing Analytics in Tacoma, USA. This 9-21 month program equips students with strategic insights. The tuition is USD 1,104 per credit, ensuring a high-quality education in the heart of the Pacific Northwest.

Enroll now

University: Drew University

Campus location: Madison, USA

Duration: 1-2 years

Tuition fee: US$22,248 per credit

Course Description: For a graduate credential in data analytics, Drew University offers a cutting-edge Master of Science in Data Science program in Madison, USA. Designed to be completed in 1-2 years, this advanced degree provides a comprehensive exploration of data science, equipping students with the skills and knowledge needed for success in this dynamic field.

Enroll now

University: Alliant International University

Campus Location: San Diego, USA

Duration: 1 year

Tuition fee: US$768 per credit

Course Description: Join the MS program in Healthcare Analytics at Alliant International University in San Diego, USA, to start a life-changing adventure. Immerse yourself in cutting-edge insights at a lively campus in only one year, opening the door to a fascinating career in the rapidly changing field of healthcare analytics.

Enroll now

University: Illinois Institute of Technology

Campus Location: Chicago, USA

Duration: 2 years

Tuition fee: US$1,712 per credit

Course Description:

With the Illinois Institute of Technology's Master of Science in Sustainability Analytics and Management, set off on a life-changing adventure. This two-year, STEM-designated program in the heart of Chicago gives students access to cutting-edge perspectives. It is a forward-thinking investment in a sustainable future, priced at US$1,712 per credit.

Enroll now

University: Southern Methodist University

Campus Location: Dallas, USA

Duration: 2 years

Tuition fee: US$ 74,000

Course Description:

The MS in Applied Statistics and Data Analytics at Southern Methodist University, located in Dallas, USA, spans two years. This program promises a comprehensive exploration of statistical methodologies and data analytics, preparing students for impactful roles in the dynamic field of data science.

Enroll now

University: Mercyhurst University

Campus Location: Erie, USA

Duration: 2 years

Tuition fee: US$33,000 per year

Course Description:

Pursue a Master of Science in Applied Intelligence at Mercyhurst University in the United States and set off on a revolutionary adventure. The two-year curriculum, which is based in Erie, provides a dynamic combination of theoretical knowledge and real-world application in the constantly changing field of applied intelligence.

Enroll now

Read the rest here:

Best Data Analytics Courses in the USA to Enroll in 2024 - Analytics Insight

Read More..