
Partners in data: Q&A with Vice Provost Dennis Manos – William & Mary

The U.S. Department of Energy's High Performance Data Facility hub will make its home in Newport News, as announced by the DOE Office of Science last week. A longtime partner of William & Mary, the Thomas Jefferson National Accelerator Facility (Jefferson Lab or JLab) will lead the new $300+ million hub specializing in advanced infrastructure for data-intensive science, accelerating scientific research and discovery.

William & Mary strongly supported Jefferson Lab's proposal to lead the new facility. W&M News discussed the importance of this partnership with Dennis Manos, CSX Professor of Applied Science and Physics and vice provost for research.

The interview has been edited for length and clarity.

A: Our relationship with Jefferson Lab goes back to its earliest inception. Historically, the state worked through William & Mary to possess a property that once served as the Space Radiation Effects Laboratory for NASA, where accelerators had been in use earlier. The SREL was on the site that Jefferson Lab currently occupies. At that time, W&M physics faculty members Robert Siegel, Hans von Baeyer and Franz Gross, working with James McCarthy of the University of Virginia, were instrumental in putting together a winning proposal to the Department of Energy.

After a lot of wrangling, by 1986 this effort led to the 4 GeV electron ring now known as the Continuous Electron Beam Accelerator Facility. One very novel feature of the later stages of that proposal was the idea that it should use superconducting cavities to replace what until that time had been normal-conducting copper components. In that decision, a lot of very interesting accelerator-related physics, engineering and material science emerged, allowing Jefferson Lab to make contributions not only in nuclear physics, but in related areas so long as the related work could be justified as serving the nuclear physics mission.

A: Since the time I came to Virginia, we have been working on a variety of topics to help Jefferson Lab to relax the constraints associated with being a single-purpose laboratory. In that regard, building the new High Performance Data Facility will create that change very quickly, permitting it to become a lead laboratory for the national lab system, and allowing JLab scientists to work on a very wide range of technical topics of great importance. That is the administrative importance of this hub.

The scientific, engineering and technology content of the hub will be truly amazing. It will create a platform to develop physical resources to allow exascale computing (equal to, or greater than, a billion-billion arithmetic operations each second). There will also be work on non-traditional computers whose hardware is based on entangled qubits to introduce quantum computing methods and quantum information systems for communications and security applications.

We hope ultimately to see great advances in information storage systems capable of rapidly storing more than yottabytes of information safely and securely.

What this means is that 10 years from now, JLab will be taking on problems that you wouldn't think of taking on now. Using a combination of traditional and quantum information systems, you will be able to take on problems in chaotic dynamics, or immensely complicated ecological or environmental problems, very large network problems, and more. This is all very, very different from what we've been doing so far. It's going to be an exciting time.

A: For the last five years, working with the provost and the president and many other people, William & Mary technologists have been putting together the necessary path to bring together physics, data science, computer science and applied science.

A detailed proposal for this new unit, which will offer every phase of education, including undergraduate and Ph.D. degrees, is almost ready to go forward to the Board of Visitors and the State Council on Higher Education in Virginia. Hopefully, once approved, we can bring it immediately to bear on these interesting problems.

If you look at the W&M strategic plan, Vision 2026, you'll find that we intend to bring the best version of high-touch liberal arts education, with both breadth and depth, to all the students who come to our school. And this hub will be a perfect example of tools to bring the full weight of technology to serve our liberal arts programs because every department and program has the capacity to bring something important to and take something important from the data science and engineering methods available in this enormously powerful center.

Some problems are born so hard that we call them wicked, like eliminating inequality, poverty and violent conflict or the problems of sustaining democracy itself. We do not know how to prove that these problems even have solutions, but our faculty seek to use the most powerful technology to examine the development of better predictive models of large-scale organizations that may guide potential directions for policy, law or institutional operations to at least relieve some of the pressures we are experiencing.

The technology that is yet to come, like quantum information systems, may help us approach problems that do not map onto simple algorithms. In the coming hub, we hope to seek solutions using machines built from qubits. Such computers don't yet exist at a large enough scale to be tried, and there appear to be many problems in creating large machines that might be able to run long enough to reach an answer. The quest to develop them is worth the effort though, since they just might allow us to predict the future a bit better, by simulating it, to include all of the various branching probabilities.

Prediction is one of the hallmarks of advanced science; good prediction allows better design. Good prediction also allows better control. Good design and control are the hallmark of advanced engineering technology, on which good economies depend. So, it is in the W&M strategic plan to develop data methods to allow us to predict a bit more accurately, and a bit farther out in time, to allow us to control the changes we know must occur. All this serves our W&M internal economy, as it serves the Commonwealth we belong to.

A: Over the past few years, we have seen 2D and 3D artists come over to our Makerspaces to use our scanning electron microscopes to visualize things that are too small to see in order to render representations of them. Creative artists use whatever tools they can to bring impressions of the world into their own head, to interpret it, redefine it and refine it to create a perception they can share with other people. That's the very function of the tool we call a language.

Exascale computers provide yet another such tool, one that people in sociology, political science, art, music and religion might be able to use. Such sharing of modeled human experience is not intended to be enjoyed only by the people who create the technology.

The coming data hub will evolve to all good uses in ways that are independent of the mathematics, computer science, applied science and physics, and also independent of the communication device manufacturing.

So, people encountering technology will always find their own purpose, their own enjoyment and their own reactions to it. This is why I am confident that William & Mary is better positioned than most other schools to take full advantage of this hub. Yes, like others, W&M has technologists, but our technologists genuinely appreciate the work of W&M non-technologists.

A: There are very many we could mention. Jefferson Lab has been terribly important in providing opportunities for summer, part-time and full-time research and intern work. So far, JLab engagements have been with graduate students or upper-division undergraduate students, working on accelerator technology or nuclear physics problems. We have done very interesting student work on detector development. One example is the development of high-precision, position-sensitive gamma-ray detectors, which led to medical applications for breast cancer diagnosis and treatment.

Jefferson Lab has long had this peripheral mission of reaching out and making its work relevant to the local community. So, we also have had students involved in educational outreach and K-12 teaching missions to excite the imagination of people who are not themselves inclined to be scientists. As we get closer to the greater variety of applications associated with the coming data science center, broader student and faculty outreach will create much larger communities of interest.

Another W&M strategic initiative involves training and practice in entrepreneurship and job creation. We hope to use internship opportunities for students at W&M to create virtuous cycles of economic uplift for businesses that want to access the untapped talent of our first-generation students from Virginia.

We want to absorb students from every disciplinary interest to have them engage with these advanced tools and learn to use them to maximal advantage for themselves and their employers. That's our goal. We hope that no one anywhere will think that the JLab data center technology is not available to them. In fact, we want people to know that the less engaged you are with technology now, the more the coming AI tools can be helpful for you.

A: These are questions that many of our recently hired nuclear physics faculty are answering. The key issues will be to continue to challenge the Standard Model and to look for subtle deviations or variations that might show we've been missing some fundamental physics along the way.

There also will be connections to be made between people who are interested in matter under extreme conditions of exceedingly high pressures or densities, or under exceedingly high or low temperatures. Folks interested in creating high magnetic fields will find kindred spirits with others who are interested in the creation of constrained matter, that is, matter in 1-dimension or 2-dimensions, where a small dislocation in curvature can cause electrons to behave in ways that are indistinguishable from very high magnetic fields. Much of this will come from connections to other national or international laboratories through the new center, which is to be the flagship for all data/computational matters in this realm. All of these scientific interests will translate into engineering improvements for the Electron-Ion Collider.

There is a lockstep in the progress that is self-evident but seems to escape some of the people who invest in research: the more you know, the more you will know. The more you do, the more you can do. So, these areas of fundamental science and engineering run step by step with nuclear physics: As you improve one thing, you automatically improve another. And that's why the new hub will not simply be a repository for nuclear physics information. It will be a generator of a new approach to all information.

Antonella Di Marzio, Senior Research Writer


The Art of Data and Information Visualization in Data Science – Medium

In the realm of data science, where data is king, making sense of the vast and complex information at our disposal is a formidable challenge. The ability to extract meaningful insights from data is the essence of data science, and one of the most powerful tools in this endeavor is data visualization. Data and information visualization is both a science and an art, offering a bridge between raw data and comprehensible insights. In this article, we will explore the importance of data and information visualization in data science, its various techniques, and its role in aiding decision-making processes.

Data visualization is the graphical representation of data and information. It leverages the power of visual perception to help individuals and organizations make sense of data. Here are some reasons why data visualization is vital in the field of data science:

Raw data can be overwhelming and difficult to understand. Visualization simplifies complex data by representing it in a graphical form that is easy to grasp. Charts, graphs, and dashboards provide a clear picture of trends, patterns, and relationships within the data.

Visualizing data helps in identifying trends, outliers, and patterns that might not be apparent when examining raw data. These insights can inform data-driven decision-making.

Data visualization is a powerful tool for communication. It allows data scientists to convey their findings to non-technical stakeholders, such as business leaders and policymakers, in a way that is easily understood.

Visualization aids in the initial exploration of data. By plotting data in various ways, data scientists can quickly gain an understanding of its characteristics and distribution.
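To make the exploratory role of visualization concrete, here is a minimal Python sketch using pandas and matplotlib. The file name, dataset and column names (sales.csv, date, region, revenue) are hypothetical stand-ins for illustration, not anything referenced in the article.

```python
# Exploratory visualization sketch (hypothetical dataset and column names).
import pandas as pd
import matplotlib.pyplot as plt

# A hypothetical sales dataset with date, region and revenue columns.
df = pd.read_csv("sales.csv", parse_dates=["date"])

fig, axes = plt.subplots(1, 3, figsize=(15, 4))

# Trend: monthly revenue over time reveals seasonality and drift.
df.set_index("date")["revenue"].resample("M").sum().plot(
    ax=axes[0], title="Monthly revenue")

# Distribution: a histogram exposes skew and extreme values.
df["revenue"].plot.hist(bins=30, ax=axes[1], title="Revenue distribution")

# Comparison: totals by category highlight where the volume actually is.
df.groupby("region")["revenue"].sum().plot.bar(
    ax=axes[2], title="Revenue by region")

plt.tight_layout()
plt.show()
```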


Skills required to excel in a business analytics career … – Data Science Central

In the contemporary business landscape, where data is heralded as the new oil, Business Analytics has emerged as a pivotal domain, steering organizations towards informed decision-making and strategic planning. Business analytics encompasses the utilization of data, statistical algorithms, and machine learning techniques to comprehend the business context, forecast future trends, and facilitate optimal decision-making. The multifaceted nature of business analytics necessitates a blend of various technical, analytical, and soft skills, each contributing uniquely to deciphering the complex tapestry of data and deriving actionable insights.

Statistical analysis, the bedrock upon which business analytics is built, involves scrutinizing data, identifying patterns, and interpreting results to facilitate informed decision-making. It is not merely about crunching numbers but understanding the story they tell and the implications thereof. Tools like SPSS, renowned for its user-friendly interface, and R, celebrated for its statistical packages, are instrumental in performing intricate analyses, from regression to hypothesis testing.
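As a concrete illustration of those statistical workhorses, the sketch below runs a two-sample t-test and an ordinary least squares regression in Python. The scenario (a promotion's effect on basket size, and ad spend and price as drivers of units sold) and all data are simulated for illustration only.

```python
# Simulated-data sketch of a hypothesis test and a regression (illustrative only).
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(seed=42)

# Hypothesis test: did a promotion shift average basket size? (two simulated groups)
control = rng.normal(loc=50, scale=10, size=200)
treated = rng.normal(loc=53, scale=10, size=200)
t_stat, p_value = stats.ttest_ind(treated, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Regression: how do ad spend and price relate to units sold? (simulated drivers)
ad_spend = rng.uniform(0, 100, size=300)
price = rng.uniform(5, 15, size=300)
units = 200 + 1.5 * ad_spend - 8.0 * price + rng.normal(0, 20, size=300)

X = sm.add_constant(np.column_stack([ad_spend, price]))
model = sm.OLS(units, X).fit()
print(model.params)  # intercept and coefficients recovered from the simulated data
```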

Data, in its raw form, is often messy and unstructured. Data management involves cleaning, transforming, and organizing this data to ensure accuracy and consistency, thereby ensuring that the subsequent analyses and insights derived are reliable and valid. This involves handling missing data, detecting outliers, and transforming variables to create a clean, usable dataset.
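A minimal pandas sketch of the cleaning steps just described, handling missing values, flagging outliers, and transforming variables; the file, columns and thresholds are hypothetical.

```python
# Data-cleaning sketch: missing values, outlier flags, transformations
# (hypothetical file, columns and thresholds).
import numpy as np
import pandas as pd

df = pd.read_csv("transactions.csv")

# Missing data: drop rows lacking the key identifier, impute a numeric field.
df = df.dropna(subset=["customer_id"])
df["order_value"] = df["order_value"].fillna(df["order_value"].median())

# Outliers: flag values more than three standard deviations from the mean
# rather than silently dropping them, so the decision stays auditable.
z = (df["order_value"] - df["order_value"].mean()) / df["order_value"].std()
df["is_outlier"] = z.abs() > 3

# Transformations: log-scale a skewed amount and parse dates consistently.
df["log_order_value"] = np.log1p(df["order_value"])
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
```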

Data visualization transcends the mere representation of data and ventures into the realm of making data comprehensible and accessible. Tools like Tableau and Power BI enable analysts to create compelling, interactive visualizations, ensuring that the insights are not confined to the technical team but permeate throughout the organization, facilitating data-driven decision-making at every echelon.

In the realm of business analytics, programming languages like Python, celebrated for its simplicity and robust libraries like Pandas and Seaborn, and R, with its unparalleled statistical packages, are indispensable. SQL, with its capability to retrieve, manipulate, and manage data stored in relational databases, is another pivotal skill, ensuring analysts can efficiently interact with and extract data.
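To show the SQL side of that toolkit, here is a short sketch that pulls an aggregated result set from a relational database into pandas. The database file, table and columns are hypothetical, and sqlite3 simply stands in for whatever SQL backend an organization actually uses.

```python
# Pulling an aggregated result set from SQL into pandas
# (hypothetical database, table and columns; sqlite3 stands in for any backend).
import sqlite3
import pandas as pd

conn = sqlite3.connect("analytics.db")

query = """
    SELECT region,
           strftime('%Y-%m', order_date) AS month,
           SUM(order_value)              AS revenue
    FROM orders
    WHERE order_date >= '2023-01-01'
    GROUP BY region, month
    ORDER BY month, region
"""

monthly_revenue = pd.read_sql_query(query, conn)
conn.close()
print(monthly_revenue.head())
```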

Business Intelligence tools like Tableau and Power BI facilitate the creation of interactive, shareable dashboards, ensuring that insights derived from analyses are accessible and actionable across the organization. These tools, with their intuitive interfaces and powerful visualization capabilities, bridge the gap between technical analysts and non-technical stakeholders, ensuring that data-driven insights permeate throughout the organizational structure.

Understanding the business context, including the operations, challenges, and strategic objectives, is paramount for ensuring that the analyses and insights are relevant and actionable, and it is one of the essentials of business analytics. This involves not merely understanding the data but comprehending the broader business ecosystem, ensuring that the insights derived align with the organizational objectives and facilitate strategic decision-making.

In the intricate world of business analytics, the ability to translate complex data into comprehensible insights is paramount. Analysts must not only decipher data but also communicate their findings in a manner that is accessible to non-technical stakeholders, ensuring that insights are not lost in translation and facilitating informed, data-driven decision-making across the organization.

Effective communication in business analytics is not monolithic but must be tailored to cater to diverse audiences. This involves adapting the language, medium, and format to ensure that the insights are not merely communicated but are also understood and actionable, whether it be a technical team, managerial personnel, or executive leadership.

Critical thinking in business analytics involves not merely accepting data at face value but scrutinizing it, questioning assumptions, and validating findings. It is about navigating through the myriad of data, identifying patterns and anomalies, and ensuring that the insights derived are robust, reliable, and valid.

Analytical skills involve dissecting problems, identifying underlying patterns, and deriving insights, while creative problem-solving involves thinking outside the box, devising innovative solutions, and navigating through challenges in a manner that is not merely effective but also efficient and innovative.

Domain expertise ensures that the analyses and insights are not merely technically sound but are also relevant and applicable in the specific industry context. It involves understanding the unique challenges, opportunities, and nuances of the industry, ensuring that the business analytics practices are aligned with the industry-specific context.

Different industries, from healthcare to finance, present unique challenges and opportunities. Adapting analytical approaches to cater to these unique demands ensures that the insights derived are not merely theoretically sound but are also practically applicable and facilitate industry-specific strategic decision-making.

The dynamic, evolving realm of business analytics necessitates continuous learning and adaptability, ensuring that practices, tools, and methodologies are not obsolete but are in tandem with the latest market trends, technologies, and industry standards.

This involves pursuing further education, certifications in data science and business analytics, and training, not as a mere formality but as a commitment to continuous learning, ensuring that skills and knowledge are not stagnant but are continuously evolving and adapting to the dynamic business analytics landscape.

Engage in workshops and seminars, not merely as passive participants but as active learners, networking with peers, engaging with experts, and continuously exploring, learning, and adapting to the ever-evolving realm of Business Analytics.

In an era where data breaches are rampant, upholding data privacy, ensuring that data is handled, processed, and stored securely and is in compliance with legal and ethical standards, is paramount.

Ensuring that data is utilized ethically, avoiding biases, ensuring fairness and transparency, and ensuring that the insights and practices are not merely legally compliant but are also ethically sound, is crucial in the responsible practice of business analytics.

Excelling in a Business Analytics career is not merely about mastering a specific tool or technology but involves a holistic blend of various technical, analytical, and soft skills. It is about navigating through complex, dynamic data, deriving insights, and ensuring that these insights are communicated effectively, are ethically sound, and facilitate strategic, informed decision-making.


ESnet Turns On 400G Circuits to Four DOE National Labs … – Lawrence Berkeley National Laboratory (.gov)

Today's world-changing scientific research is being conducted by collaborators at far-flung national laboratories who require high-speed, low-latency access to high performance computing facilities and specialized instruments. The Energy Sciences Network (ESnet) is proud to announce that it has supercharged the current and future bandwidth for four of the Department of Energy's (DOE's) national laboratories and user facilities, unleashing 400 Gigabit per second (400G) capability for Argonne National Laboratory, the National Energy Research Scientific Computing Center, Oak Ridge National Laboratory, and Pacific Northwest National Laboratory. With this boost in capacity, scientists can process, analyze, visualize, share, and store the enormous quantities of research data at speeds up to four times faster than previously possible.

It's of vital importance that scientific researchers not be hindered by where they or their projects' instruments, computational resources, and data might be located, said Inder Monga, executive director of ESnet. Enabling 400G, which represents the networking industry's current gold standard, will help facilitate that kind of seamless collaboration. We look forward to turning on 400G for more sites and to upgrading to 800G as the technology begins to be available.

The 400G circuit installations were made possible by the 2022 launch of ESnet6, the sixth iteration of ESnet's critical data circulatory system for the DOE Office of Science research complex. ESnet6 was specifically designed to support multi-facility collaborations aligned with the DOE's new Integrated Research Infrastructure (IRI) initiative, to help DOE researchers and their international collaborators effectively harness the barrage of data generated by artificial intelligence, high-resolution instrument imagery, complex long-term global studies, and more. ESnet's traffic is increasing by a factor of 10 every 5.5 years; in 2022, the total exceeded 1.36 exabytes. (An exabyte is equal to 1,000 petabytes or 1 billion gigabytes.)
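To put those traffic figures in perspective, here is a small back-of-envelope Python sketch that converts the quoted 2022 total into petabytes and gigabytes and projects the trend forward, assuming the "factor of 10 every 5.5 years" growth rate simply continues; the projection is illustrative, not an ESnet forecast.

```python
# Back-of-envelope projection from the figures quoted above (illustrative only).
EB_2022 = 1.36              # exabytes of ESnet traffic in 2022
TENFOLD_PERIOD_YEARS = 5.5  # traffic grows ~10x every 5.5 years

def projected_traffic_eb(year: int) -> float:
    """Projected annual traffic in exabytes if the quoted trend simply continues."""
    return EB_2022 * 10 ** ((year - 2022) / TENFOLD_PERIOD_YEARS)

for year in (2025, 2028, 2033):
    eb = projected_traffic_eb(year)
    # 1 EB = 1,000 PB = 1 billion GB, per the article's conversion.
    print(f"{year}: ~{eb:,.1f} EB  (~{eb * 1_000:,.0f} PB, ~{eb * 1e9:,.0f} GB)")
```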


Argonne has been involved in several collaborations that demonstrate the efficacy of integrating its supercomputers at the Argonne Leadership Computing Facility (ALCF) with experiments to accelerate scientific discoveries. Under the lab's Nexus initiative, Argonne researchers are working with the DIII-D National Fusion Facility to enable on-demand access to ALCF supercomputers for experiment-time data analysis and predictive simulations that can be used to inform the parameters of the facility's fast-paced plasma physics experiments. Similarly, Argonne researchers are demonstrating IRI capabilities through its ongoing efforts to tightly couple ALCF computing resources with experiments at Argonne's Advanced Photon Source (APS), which is undergoing an upgrade that is expected to increase the volume of data generated by APS instruments by multiple orders of magnitude.

In recent years, we've seen a surge in the near experiment-time analysis workflows being employed by the DOE light sources, fusion research facilities, and other large-scale experiments, said ALCF Director Michael Papka. As these facilities continue to evolve and improve, they'll produce greater data volumes faster than ever before, increasing the demand for high-speed networking to the computing facilities. The ESnet upgrade is essential to keep pace with this growing scientific data deluge and meet future data-intensive research challenges.

Oak Ridge National Laboratory (ORNL) leads the Earth Systems Grid Federation project (ESGF2) to improve the discovery, access, and storage of data used for Earth systems models and simulations and climate change research. The collaborative project, which includes Argonne and Lawrence Livermore National Laboratory, is already taking advantage of ESnet to move the planets largest collection of Earth System model output data between DOE high-performance computing and data facilities. The integration of ESnet is dramatically enhancing data-intensive science, facilitating data sharing, and improving access to simulation and AI platforms.

The ESnet bandwidth upgrades at ORNL will simplify and accelerate data-integration intensive campaigns across DOE and collaborator facilities, further enabling and supporting integrated research infrastructures and projects like ESGF2, said Mallikarjun Shankar, section head for Advanced Technologies at ORNL.

And at the Environmental Molecular Sciences Laboratory (EMSL), an Office of Science user facility located at PNNL, researchers have embarked on a project known as the Molecular Observation Network, a nationwide effort to understand the processes that govern what happens to carbon in soil. More carbon resides in Earth's soil than in the atmosphere and vegetation combined, and its fate plays a huge role in our climate. It's an incredibly active environment, with scientists collecting reams of data about soil processes. The upgrade makes it possible to transfer hundreds of gigabytes of project data in just a few minutes, not the hours previously required. The near real-time, constant data feed makes the autonomous experimentation planned at EMSL easier to implement.
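A rough, illustrative calculation of why faster circuits turn hours into minutes: the sketch below assumes a 500 GB dataset and several effective end-to-end transfer rates, neither of which is specified in the article.

```python
# Transfer-time arithmetic (assumed dataset size and effective end-to-end rates).
def transfer_minutes(size_gb: float, effective_gbps: float) -> float:
    """Minutes to move `size_gb` decimal gigabytes at a given effective rate."""
    return (size_gb * 8) / effective_gbps / 60

dataset_gb = 500  # "hundreds of gigabytes" of project data (assumed value)
for rate in (1, 10, 100, 400):  # assumed effective end-to-end rates, in Gb/s
    print(f"{rate:>3} Gb/s effective: ~{transfer_minutes(dataset_gb, rate):.1f} min")
```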

This ESNet6 upgrade presents a tremendously exciting opportunity for pursuing big science at DOE user facilities such as EMSL and at the other laboratories, said Douglas Mans, EMSL director. Accessing, moving, and storing massive data sets more effectively ensures U.S. scientific leadership and economic benefit.


Lecturer Will Discuss the Transformative, Global Power of AI in … – American College of Surgeons

As the influence of artificial intelligence (AI) continues to grow in medicine and healthcare, discussion of the topic can sometimes reach a fever pitch that obfuscates some of the real impact that AI is making today and what it can do tomorrow.

In today's Distinguished Lecture of the International Society of Surgery: Artificial Intelligence and the Future of Global Surgery, Ewen Harrison, MBChB, MSc, PhD, FRCS, will explore three areas of AI: enhancing precision care through prediction, democratizing surgical expertise, and thinking deeply about ethical and socioeconomic implications.

Professor Harrison, a professor of surgery and data science and honorary consultant surgeon at the University of Edinburgh in Scotland, will discuss both the strengths and weaknesses of AI based on his experience and what he calls the amazing excitement in this area with some unvarnished truths about expectations.

My takeaway is simple. Clinical staff and AI developers must collaborate more effectively to improve quality and relevance, and we must see more global collaboration in AI and data science to improve the delivery of surgical care everywhere, Professor Harrison said, which builds on the Surgeons United theme of Clinical Congress.

The technology is already making an impact in preoperative planning, intraoperative assistance, and postoperative monitoring, and it could have important implications for global surgery.

As transformative as AI can be, though, it is not a panacea for improving care in surgery, he suggests. There are fundamental issues to understand, tackle, and mitigate against in all aspects of AI, including biases that exist in much of the extant data. The contributing voices in how AI is used need to be equitable.

Across the world, there is great potential for remote assistance in clinical care and in the operating room, for better training and simulation, and for the standardization of care to ensure best practice is adopted broadly, Professor Harrison said. However, as I will show, the Global South has little voice in these issues, something that has to change if we are to see equity in the benefits of these tools.

Such technology as ChatGPT can have profound implications in global healthcare, both positive and negative, Professor Harrison said. He will share his GPT-4 systematic review of AI in surgery, in which more than 7,700 abstracts were processed by the model, and discuss its power and limitations.

To take full advantage of AI in surgery, surgeons, clinicians, and other stakeholders need to get involved at the outset, which is already starting to happen with postoperative wearable sensors, for example. Global nursing teams are setting priorities, adapting ideas for different cultures and contexts, and working out how the new AI tool integrates with the existing health system, he said.

Understanding the prevailing attitudes about AI in medicine, Professor Harrison said he wants to impart to the audience and his colleagues that they can and should play a role in seeing what comes next.

People often see AI in healthcare in one of three ways: as a curiosity, a savior, or a threat. But it is, of course, none of those things, Professor Harrison said. Many of the tasks AI will contribute to in healthcare are not yet defined. We must respond to this undifferentiation and co-create what comes next. It is malleable, it is iterative, and it is in our control.

The Distinguished Lecture of the International Society of Surgery was established in 1990 and endowed by the US Chapter of the International Society of Surgery to recognize the Society's important activities by honoring distinguished international surgeons.

For those who are unable to attend the lecture at 8:00 am in Room 104ABC of the Boston Convention & Exhibition Center, it also will be made available for on-demand viewing shortly after the live presentation.


Preparing for Modernization in the Transportation Workforce – Government Technology

The U.S. government is spending hundreds of billions of dollars to modernize the nation's transportation infrastructure, making it safer and more sustainable. But making the best use of modern digital technologies requires a properly trained transportation workforce.

Digital technologies are now woven into the evolution of transportation.

The car of the future is a computer on wheels, says Doug Couto, senior fellow with the Center for Digital Government* (CDG) and formerly Michigan's chief information officer. In the years ahead, intelligent technology will play a bigger role in how all vehicles operate.

The main reason for this shift is safety. In 2021, we had 43,000 people die on highways, Couto says. A computer that runs an autonomous vehicle doesn't stay at the bar until 2 a.m. and drive home drunk. It's that simple.

TRENDS IN TRANSPORTATION

Major trends in the digitization of transportation include:

Connected vehicles. Drivers are already using smartphones and onboard navigation systems to optimize travel. The next phase could help vehicles communicate with external sources such as sensors, vendors and other vehicles.

Predictive analytics. Sensor data and learning algorithms will increasingly help transportation planners optimize travel patterns, reducing congestion and improving the travel experience.

Electrification. The rising popularity of battery-electric vehicles will require a network of public chargers that must be implemented, maintained and optimized.

Autonomy. Experiments are producing intriguing results in self-driving cars, trucks, taxis and other vehicles. While full autonomy is probably several years away, vehicle manufacturers will continue to add driver-assistance features that will influence how people travel.

Travel as a service. Bike-sharing, scooter rentals and other micromobility options will require regulatory oversight and safety reviews.

Biometrics. Facial recognition and fingerprint scanning can enhance the security of transportation systems.

PREPARING FOR THE FUTURE

Transportation agency executives and managers must prepare their workforces for technologies that are transforming transportation.

Making sure employees understand the safety protocols and best practices for adoption of these technologies will be paramount to success, says Kristin Hempstead, North American business development manager for data science with Z by HP. She suggests data science training programs that include:

Transportation agency workforces will increasingly deal with sophisticated vehicle systems, as well as applications that automate internal tasks and workflows.

Process automation. Transportation staffers often perform time-consuming manual tasks like data entry and regulatory reviews that can be simplified and streamlined through robotic process automation. Moreover, low-/no-code software applications can help managers and staff create their own automations to remove bottlenecks.

Security. Every vehicle, camera, sensor and networking device is a potential entry point for cyber attacks. Transportation staff will need to understand the threats to their networks and implement effective defense techniques.

AI/ML. Artificial intelligence and machine learning (AI/ML) algorithms are the true frontier of transportation planning and operations. With AI/ML, vehicles will need less human intervention. We're already seeing driverless taxi programs in select cities and experiments with long-haul trucking on interstates.

Transportation agencies must prepare for the implications of AI/ML, Hempstead says. That includes creating infrastructure to accommodate automated vehicles, developing safety regulations and creating policies for the safe integration of AI/ML into the transportation landscape.

MAINTAINING NEXT-GENERATION EQUIPMENT AND HARDWARE

The devices that automate and electrify transportation will require new competencies for transportation professionals.

Sensors. Traffic cameras detect when drivers run red lights. Sensors along expressways can monitor traffic patterns and send real-time alerts for congestion and accidents. These devices are prone to wear and tear and must be kept secure.

Vehicles. The adoption of electric vehicles is a promising green initiative, Hempstead says. But she notes many questions are unanswered. Electrifying our vehicle fleet will require a massive investment in charging stations. The $7.5 billion in federal funding for these stations is a good start, but it does pose new challenges. For instance, how do we ensure charging access in rural areas?

Wireless networking. With 5G mobile networks expanding wireless bandwidth, vehicles will become increasingly interconnected. Sensor networks will generate data that allow vehicles to operate more safely, and vehicles will transmit telemetry data to networks and nearby vehicles. This abundance of real-time data will help leaders make better decisions that improve travel outcomes.

Technology infrastructure and devices. The three tiers of a data center (compute, storage and networks) will remain the bedrock of transportation technology. Many workloads will move to the cloud for agility, economy and scale. Virtualization will make device management more flexible and efficient.

FUTURE-PROOFING THE TRANSPORTATION WORKFORCE

Transportation agencies must stay ahead of evolving technologies. Workforces will need to acquire new skills as agencies automate everyday tasks and increase their use of data science and analytics-driven decision-making.

Agencies also must ensure employees have the right computing devices to get their work done now and in the future. Hempstead's advice: Make sure they have enough performance and power, whether it be with a GPU or a higher-core processor or more memory or more storage. Make sure they are as productive as possible throughout a three- to five-year refresh cycle.

This article is excerpted from the new Government Technology thought leadership paper, Modernizing the Transportation Workforce: Enhancing Public Sector Skills for Automation and Electrification. Click here to download the full paper.

*Note: The Center for Digital Government is part of e.Republic, Government Technology's parent company.


Data Governance Concerns in the Age of AI – RTInsights

As more organizations make use of generative AI, there is growing concern about data governance issues that may emerge with the use of the technology.

A recent survey conducted by Komprise, a leader in analytics-driven unstructured data management, highlights the increasing emphasis on data governance in the context of artificial intelligence (AI) adoption. The third annual Komprise 2023 State of Unstructured Data Management survey reveals that while organizations are embracing generative AI tools, a majority still express significant concerns.

The survey, which gathered insights from 300 global enterprise storage IT and business decision-makers at organizations with over 1,000 employees in the United States and the UK, highlights several key findings:

See also: Automating Data Governance: Leverage AI as Your Digital Doorman

These key themes collectively emphasize the pivotal role of data governance, preparedness for AI adoption, and efficient data management in organizations' strategies as they navigate the evolving landscape of generative AI technologies.

The survey underscores the critical role of data governance in the era of AI. As organizations embrace generative AI tools, they are increasingly focused on implementing robust data governance strategies to ensure AI technologies' ethical, secure, and efficient use.


Data science education mkt in India to rise 58 % to $1.4 billion by 2028: Report – Moneycontrol

According to a study, India's data science education market is expected to grow 57.5 per cent to USD 1.391 billion (about Rs 11,569 crore) by 2028. The size of the sector in 2023 was USD 204.23 million (about Rs 1,698 crore) and is likely to see a compound annual growth rate (CAGR) of 57.5 per cent over the next five years, the report by ed-tech platform Imarticus Learning and Hyderabad-based tech portal Analytics Insight said.

The report projected data jobs to go up 57 per cent in the next five years from 2.1 lakh this year to 3.3 lakh by 2028. Top recruiters for these roles include Amazon and AWS, Bain and Company, Deloitte, EY and Google, The Data Science Education Report 2023 said. The global data science education market is projected to reach USD 378.7 billion by 2030, growing at a CAGR of 16.43 per cent from 2022 to 2030, driven by increasing demand for data science skills in healthcare, finance and retail sectors. The report stated that the demand for data scientists will increase by 25% in the next few years.

"The Data Science Education Report 2023 underscores the rising wave of interest in data science education across India and predicts remarkable growth in the data science education industry," Imarticus Learning Founder and CEO Nikhil Barshikar said. This report estimates that the on-campus data science education market will grow at a CAGR of 56.73 per cent from USD 128.03 million in 2022 to USD 857.57 million in 2027. Meanwhile, the market for online programmes will grow from USD 76.20 million to USD 533.69 million at a CAGR of 58.82 per cent during this period.



RepRisk and Carahsoft Partner to Bring World-Leading Business Conduct Risk Data to the Public Sector – Yahoo Finance

Public Sector Now Able to Harness the Power of RepRisk's Daily Updated Business Conduct and Supply Chain Risk Dataset

ZURICH, Switzerland & RESTON, Va., October 26, 2023--(BUSINESS WIRE)--RepRisk, a data science company that provides transparency on business conduct risks, and Carahsoft Technology Corp., The Trusted Government IT Solutions Provider, today announced a partnership. Under the agreement, Carahsoft will serve as Public Sector Distributor, making RepRisk's industry-leading dataset on business conduct and supply chain risks available to the Public Sector through Carahsoft's reseller partners and its NASA Solutions for Enterprise-Wide Procurement (SEWP) V, Information Technology Enterprise Solutions Software 2 (ITES-SW2) and OMNIA Partners contracts.

"True to their public mandate, Government agencies and other public entities are keen to manage their business conduct risks," said Jenny Nordby, Head of Business Development at RepRisk. "Our partnership with Carahsoft and its resellers enables Public Sector organizations to access RepRisks unique, world-leading dataset in order to identify, assess and monitor risks throughout their value chains, thus promoting financially sound decisions."

RepRisk provides the world's largest dataset on business conduct and supply chain risks, powered by a unique combination of artificial and human intelligence, delivering speed and driving scale without sacrificing data quality or granularity. RepRisk intentionally excludes company self-disclosures to produce comprehensive and actionable insights on business conduct risks. These risks can result in financial losses and reputational damage for a company and its stakeholders.

RepRisk and Carahsoft are working together to help Public Sector entities make risk-informed business decisions and drive innovation by leveraging RepRisk's unique event- and issues-driven ESG research methodology. RepRisk's innovative approach offers users an outsider's perspective when assessing an organization's actual on-the-ground performance and enables effective scalability, risk assessment and governance. By making RepRisk data available to the Public Sector through Carahsoft's extensive IT ecosystem and reseller partners, this partnership facilitates the integration and reporting of business conduct risks within operations and across supply chains.


"With the addition of RepRisk to our offerings, Carahsoft is now able to support our Public Sector customers with uniquely comprehensive and critical insights into their operation conduct risks," said Alec Wyhs, Sales Director who leads the RepRisk Team at Carahsoft. "This partnership will connect RepRisk to our extensive reseller network and provide access to a one-of-a-kind data collection and analytics service that can both streamline operations and limit risks for our customers."

RepRisk's data-as-a-service is available through Carahsoft's SEWP V contracts NNG15SC03B and NNG15SC27B, ITES-SW2 Contract W52P1J-20-D-0042 and OMNIA Partners Contract #R191902. For more information, contact the Carahsoft team at (888) 662-2724 or RepRisk@carahsoft.com.

Carahsoft is helping Government agencies connect technology and industry partners with best-of-breed artificial intelligence, machine learning and high-performance computing capabilities to meet mission needs. Learn more about Carahsoft's AI and Machine Learning solutions here.

About RepRisk

Founded in 1998 and headquartered in Switzerland, RepRisk is a data science company that provides transparency on business conduct risks like deforestation, human rights abuses, and corruption. RepRisk enables efficient decision-making for clients and supports alpha generation and value preservation for their organization, investments, and business interests. RepRisk is trusted by 80+ of the world's leading banks, 17 of the 25 largest investment managers, corporates, and the world's largest sovereign wealth funds for their due diligence processes. RepRisk uses human curation and cutting-edge artificial intelligence to generate the world's most comprehensive business conduct and biodiversity risk datasets on public and private companies, real assets, and countries. Find out more on reprisk.com.

About Carahsoft

Carahsoft Technology Corp. is The Trusted Government IT Solutions Provider, supporting Public Sector organizations across Federal, State and Local Government agencies and Education and Healthcare markets. As the Master Government Aggregator for our vendor partners, we deliver solutions for Artificial Intelligence, Cybersecurity, MultiCloud, DevSecOps, Big Data, Open Source, Customer Experience and Engagement, and more. Working with resellers, systems integrators and consultants, our sales and marketing teams provide industry leading IT products, services and training through hundreds of contract vehicles. Visit us at http://www.carahsoft.com.

View source version on businesswire.com: https://www.businesswire.com/news/home/20231026877301/en/

Contacts

Gina Walser, +41 43 300 54 40, media@reprisk.com

Mary Lange, (703) 230-7434, pr@carahsoft.com


Easing access to satellite data – Harvard School of Engineering and Applied Sciences

There are so many of the SDGs that can be helped with environmental and planetary monitoring, and satellite imagery is a huge source of information for that, said Rolf, one of five winners out of more than 450 submissions. If you look at the award winners and runners-up from the other categories, it was a surprise to get that recognition as an academic team. This speaks to the amount of effort and consistent work our broader team has put into making this actually helpful and accessible.

Rolf developed MOSAIKS as part of her Ph.D. research in computer science at the University of California, Berkeley, and her research was first published in Nature Communications in 2021. She worked on an interdisciplinary team whose interests included computer science, environmental economics, public policy and statistics, and that same interdisciplinary approach drew her to Harvard for her postdoctoral fellowship. She now uses MOSAIKS frequently in her research with faculty advisor Milind Tambe, Gordon McKay Professor of Computer Science and CRCS Director.

I mostly focus on geospatial problems such as environmental monitoring, she said. The CRCS and HDSI are places where this type of interdisciplinary conversation and development is happening. It's exciting for computer scientists who want their research to be more interdisciplinary or focused on social impact to be able to have these hubs of people. Now there's a network of people that can help and have conversations, and it's a network I've relied on and am now happy to be part of.

Rolf will finish her two-year fellowship this spring, and next fall will become a computer science professor at the University of Colorado-Boulder. She has always wanted to use her computer science training to help the world, and she'll soon be able to help impart that passion onto the next generation of computer scientists.

I love math, I love statistics, and I love computing, she said. I find them all very powerful, that we can use math and statistics to describe the world and then combine that with optimization in computer science to make solutions that work fast.
