
Research helps Blugold discover how computer science, health care intersect – University of Wisconsin System

Thanks to his undergraduate research, Nichol He found a way to bring his passions for computer science and health care together in his future career. After He earns his degree in software engineering, the Blugold hopes to go to medical school while also continuing his research in bioinformatics. (Photo by Jesse Yang)

Coming into college, Nichol He knew he wanted to study either computer science or pre-med, two very different academic areas that he assumed could only lead to two very different careers.

Since the University of Wisconsin-Eau Claire has strong programs in both areas, He decided he'd take a few classes in each discipline and then decide which path to follow.

"I was really interested in computer science and I knew I didn't want to give that up," He says, noting that while his longtime goal is to be a physician, the logic of computer science "just really clicks with me." "So, I decided to major in software engineering and minor in pre-professional health sciences. That way I could experiment with them both and decide which one I absolutely want to do."

This summer, thanks to his work on multiple research projects, He realized he doesn't have to choose between his two passions because computer science and health care intersect in ways he never imagined.

"All in all, it was a very powerful experience and has had a big influence on what I want to do post-graduation," He says of his research. "Coming into the university, my hopes were to go to medical school and become a physician or to graduate and be a computer scientist. I thought I would have to give one of them up. I never thought my interests in computer science and the medical field would ever intersect."

"Now, because of my research, I've found bioinformatics, which really is a combination of my two passions. I really like the intersection between software engineering/computer science and the biology/medical field."

So, his future plans now include going to medical school and specializing in radiology, and also continuing his bioinformatics research during his postgraduate studies and/or after he becomes a physician.

This summer, He worked alongside two faculty research mentors, one in biology and one in computer science.

He worked with Dr. Bradley Carter, an assistant professor of biology, to study the effects of chemicals on the development of zebrafish. And he collaborated with Dr. Rahul Gomes, assistant professor of computer science, on research involving deep learning and bioinformatics.

"Both research projects are very different; they are almost complete opposites," says He, a native of Medina, Minnesota. "One is really observing behaviors in fish and working with animals and the other is working with data and programs. They were completely different, but together they really gave me an enriching experience."

Through his research with Gomes, He began to understand the powerful intersection between computer science and the natural sciences.

Working with researchers at North Dakota State University, He and Gomes explored methylation markers, values associated with gene expression that may be related to, and could serve as predictors of, pancreatic cancer.

"So, the deep learning I'm doing on this project is looking at all this data and predicting whether or not a person might have pancreatic tumors or if they are healthy," He says.
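The article does not describe the model itself, but a minimal sketch of the general approach, a binary classifier trained on per-sample methylation values, might look like the following. The array shapes, marker count, and network size are illustrative assumptions, not details from the study.

```python
import torch
import torch.nn as nn

# Illustrative assumption: each sample is a vector of methylation values
# (one value per marker), labeled 1 for tumor and 0 for healthy tissue.
n_samples, n_markers = 200, 5000
X = torch.rand(n_samples, n_markers)            # stand-in methylation data
y = torch.randint(0, 2, (n_samples,)).float()   # stand-in tumor/healthy labels

# A small feed-forward classifier over the methylation profile.
model = nn.Sequential(
    nn.Linear(n_markers, 128), nn.ReLU(),
    nn.Dropout(0.3),
    nn.Linear(128, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(20):
    optimizer.zero_grad()
    logits = model(X).squeeze(1)
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()

# Predicted probability that a profile comes from a tumor sample.
with torch.no_grad():
    prob_tumor = torch.sigmoid(model(X[:1])).item()
print(f"Predicted tumor probability: {prob_tumor:.2f}")
```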

Gomes and He also are working with Mayo Clinic Health System on a project that uses CT scans and data to help predict outcomes for patients with pancreatic cancer who are being treated with chemotherapy.

"My research over the summer was a unique experience for me because I was working in a real-world work environment," He says. "And it showed me that I really want to explore deep learning in bioinformatics more."

"I went into this research without knowing anything about deep learning or bioinformatics. But I've found this passion for it and that is really, really cool."

Bioinformatics is the science of storing, extracting, organizing, analyzing, interpreting and using biological information. It incorporates data and analytical approaches from the biological sciences, computer science, data science and mathematics.

The field of bioinformatics grew out of the need to organize and analyze the increasingly large amounts of biological data being generated, including data critical to advances in health care. Bioinformatics analyses are increasingly necessary to address many biological questions.

"The need for bioinformatics is now greater than ever," Gomes says. "As modern technology enables us to collect more and more clinical as well as biological data, we require experts who are not only capable of processing the data to gain information but do so in the most optimized fashion to enable a faster and more accurate response."

A new bioinformatics major, an interdisciplinary program that draws on expertise in the university's biology, computer science and mathematics departments, will be available at UW-Eau Claire beginning in fall 2022. It will be the only bioinformatics program of its kind in the UW System.

Gomes describes He as an outstanding student and researcher, who has made valuable contributions to their research despite having no prior background in deep learning or bioinformatics.

"Nichol is very perceptive about the challenges while developing a deep learning model and how we can overcome them," Gomes says. "During the pancreatic cancer research project with Mayo Clinic, he introduced and implemented a patch-based deep learning model to divide the CT scans into 3D patches, making model training more feasible on our GPU nodes at the Blugold Center for High-Performance Computing."

"The fact that Nichol can take in information and also engage in discussions to modify or improve the proposed workflow is simply astounding considering that he is an undergraduate student with no prior experience in deep learning."
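For readers unfamiliar with the patch-based idea Gomes mentions, here is a minimal sketch of how a volumetric CT scan can be split into fixed-size 3D patches so that each training example fits in GPU memory. The volume dimensions and patch size below are arbitrary illustrations, not values from the Mayo Clinic project.

```python
import numpy as np

def extract_3d_patches(volume, patch_size=(64, 64, 64), stride=(64, 64, 64)):
    """Slide a 3D window over a CT volume and return non-overlapping patches."""
    patches = []
    pz, py, px = patch_size
    sz, sy, sx = stride
    for z in range(0, volume.shape[0] - pz + 1, sz):
        for y in range(0, volume.shape[1] - py + 1, sy):
            for x in range(0, volume.shape[2] - px + 1, sx):
                patches.append(volume[z:z + pz, y:y + py, x:x + px])
    return np.stack(patches)

# Stand-in CT volume (depth x height x width); real scans come from DICOM series.
ct_volume = np.random.rand(128, 256, 256).astype(np.float32)
patches = extract_3d_patches(ct_volume)
print(patches.shape)  # (32, 64, 64, 64): the model trains on patches, not whole scans
```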

He came to UW-Eau Claire primarily because of the many internship and research opportunities he knew he'd find as a Blugold.

Among the opportunities he values most is being a Karlgaard Fellow, a scholarship program that, among other things, supports undergraduate student researchers who are studying computer science or software engineering.

Karlgaard scholars collaborate with computer science faculty on research and produce scholarly publications and/or formal presentations outside of the UW-Eau Claire campus.

"I was fortunate enough to be chosen for this scholarship, and it's been a big motivator for me to explore different areas of computer science," He says. "Knowing there is someone who supports me financially makes me feel motivated to work harder, but it also makes me even more thankful for the opportunities."

David and Marilyn Karlgaard met while both were students at UW-Eau Claire. David Karlgaard graduated in 1967 with degrees in math and physics. He was co-founder, CEO and president of PEC Solutions Inc., an internet technology-consulting firm, which was acquired by Nortel Networks in 2005. Marilyn Karlgaard, who attended UW-Eau Claire from 1965-1968, is a retired human resource manager.

The Karlgaards have made substantial donations to the UW-Eau Claire Foundation, gifts that support multiple scholarships and various other university initiatives.

He says the fellowship program's ties to undergraduate research are especially meaningful because computer science majors often come into the program with a set idea of what they want to do with their degree. Research can help them discover career paths they may not have considered or otherwise known about.

"It's a very wide field and a very fun field, so students need to look around and find the niches that fit them," He says of computer science and software engineering. "It was being a Karlgaard Scholar that got me into this bioinformatics research. Without that, I wouldn't even know what I want to do in the future."


University of Queensland Professor David Abramson Wins High Performance Computing Award – HPCwire

Oct. 13, 2021: ACM, the Association for Computing Machinery, and the IEEE Computer Society have named David Abramson, a professor at the University of Queensland, as the recipient of the 2021 ACM-IEEE CS Ken Kennedy Award. Abramson is recognized for contributions to parallel and distributed computing tools, with application from quantum chemistry to engineering design. He is also cited for his mentorship and service to the field.

Technical Contributions

Abramson has performed pioneering research in the design, implementation, and application of software tools for parallel and distributed systems. He has conducted foundational research in distributed and parallel middleware, addressing programmer productivity and software correctness, and has influenced multiple generations of researchers. His papers have been cited more than 12,000 times.

Two highly-regarded tools developed by Abramson include Nimrod, a family of software systems that support the execution of distributed parameter sweeps, searches, and workflows; and Guard, a performance tuning and debugging tool.

The Nimrod template is common in many fields and is well suited to execution in distributed environments. Nimrod makes it possible to concisely specify complex parameter sweeps, which entail executing an algorithm repeatedly with varying parameters, and supports advanced searches that integrate optimization algorithms, design-of-experiments methods, and scientific workflows. Additionally, the Nimrod project spawned a family of tools that make it easy to specify complex computational experiments and has resulted in a spinoff commercial product called EnFuzion, which has been widely adopted for power grid and simulation applications.
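Nimrod has its own declarative plan-file syntax, which is not reproduced here. Purely as an illustration of what a parameter sweep means, a generic sketch in Python, with made-up parameter names and a placeholder simulation command, could look like this:

```python
import itertools

# Hypothetical sweep: every combination of these parameter values is one run.
parameters = {
    "temperature": [300, 350, 400],
    "pressure": [1.0, 2.5],
    "seed": [1, 2, 3],
}

jobs = [dict(zip(parameters, values))
        for values in itertools.product(*parameters.values())]

for job in jobs:
    # In a real sweep each job would be dispatched to a different compute node;
    # here we only print the command line one run would receive ("simulate" is
    # a placeholder executable, not part of Nimrod).
    args = ["simulate"] + [f"--{k}={v}" for k, v in job.items()]
    print(" ".join(args))

print(f"Total runs: {len(jobs)}")  # 3 * 2 * 3 = 18 parameter combinations
```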

Abramson designed Guard with a hybrid debugging scheme that tests new versions of a program against reference versions known to be correct. Guard greatly enhances programmers' ability to locate and fix errors in new software versions. The technology was licensed to Cray Inc. (now HPE) and is distributed on Cray supercomputers. As a result, it has been deployed at major international supercomputing centers, including the US National Energy Research Scientific Computing Center (NERSC) and the Swiss National Supercomputing Centre (CSCS).
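Guard itself is a commercial tool whose interface is not described in the article. As a rough sketch of the relative-debugging idea it embodies, running a new implementation side by side with a trusted reference and comparing intermediate state at chosen checkpoints, one could imagine something like the following; the function names and tolerance are illustrative only.

```python
import numpy as np

def reference_step(state):
    """Trusted implementation (e.g., the validated version of a solver step)."""
    return state * 0.5 + 1.0

def new_step(state):
    """New, optimized implementation under test."""
    return state / 2.0 + 1.0

# Relative debugging: advance both versions together and compare the
# intermediate state at each checkpoint instead of only the final output.
state_ref = state_new = np.linspace(0.0, 1.0, 8)
for step in range(10):
    state_ref = reference_step(state_ref)
    state_new = new_step(state_new)
    if not np.allclose(state_ref, state_new, rtol=1e-9):
        print(f"Divergence first detected at step {step}")
        break
else:
    print("New version matches the reference at every checkpoint")
```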

Mentorship

Abramson has been an advisor to two dozen graduate students in computer science, as well as countless undergraduate and high school students.

Among his most important initiatives, Abramson has been an international driver of the PRIME initiative, an NSF-funded University of California San Diego program that enables undergraduate students to take research internships abroad. Inspired by the success of PRIME, he has introduced similar programs for Australian undergraduates to travel abroad for internships, and he has organized travels for Australian students to top research centers in the US and the UK annually for over 12 years. Since 2011, he has run a unique program that supports Australian high school students attending SC, the leading high performance computing conference.

Also in the mentoring arena, Abramson started streaming video HPC seminars that have allowed Australian students to engage with world leaders, and he launched the Early Adopters PhD Workshop at SC09. Distinct from other doctoral showcases, the workshop specifically targets research students from fields outside of computer science who are applying HPC tools in their research.

Service to the Field

Over his career, Abramson has been General Chair, Program Committee Chair, or program committee member of many conferences related to performance and programmer productivity (on average about eight per year), including IPDPS, HiPC, HPC Asia, HPDC, ICPADS, Cluster, SC, CCGrid, Grid, and e-Science.

He is currently the Chair of the e-Science Steering Committee and has served in several senior roles in the IEEE/ACM SC series, including Chair of the Technical Papers Committee (2021) and Test of Time Award Committee (2018), Co-chair of the Invited Speakers Committee (2019), and More than HPC Plenary Committee (2020).

ACM and IEEE CS co-sponsor the Kennedy Award, which was established in 2009 to recognize substantial contributions to programmability and productivity in computing and significant community service or mentoring contributions. It was named for the late Ken Kennedy, founder of Rice University's computer science program and a world expert on high performance computing. The Kennedy Award carries a US $5,000 honorarium endowed by IEEE CS and ACM. The award will be formally presented to Abramson in November at The International Conference for High Performance Computing, Networking, Storage and Analysis (SC21).

About ACM

ACM, the Association for Computing Machinery, is the world's largest educational and scientific computing society, uniting computing educators, researchers and professionals to inspire dialogue, share resources and address the field's challenges. ACM strengthens the computing profession's collective voice through strong leadership, promotion of the highest standards, and recognition of technical excellence. ACM supports the professional growth of its members by providing opportunities for life-long learning, career development, and professional networking.

Source: Association for Computing Machinery


Researchers receive grant to predict the mechanics of living cells – EurekAlert

Image: (From left) Anuj Karpatne, Department of Computer Science and Sanghani Center for Artificial Intelligence and Data Analytics; Amrinder Nain and Sohan Kale, both in the Department of Mechanical Engineering, meet in the STEP Lab. Photo by Peter Means for Virginia Tech.

Credit: Virginia Tech

With advances in deep learning, machines are now able to predict a variety of aspects about life, including the way people interact on online platforms or the way they behave in physical environments. This is especially true in computer vision applications where there is a growing body of work on predicting the future behavior of moving objects such as vehicles and pedestrians.

"However, while machine-learning methods are now able to match and sometimes even beat human experts in mainstream vision applications, there are still some gaps in the ability of machine-learning methods to predict the motion of shape-shifting objects that are constantly adapting their appearance in relation to their environment," said Anuj Karpatne, assistant professor of computer science and faculty at the Sanghani Center for Artificial Intelligence and Data Analytics.

"This is a problem encountered in many scientific fields," Karpatne said. "For example, in mechanobiology, cells change their shape and trajectory as they move across fibrous environments in the human body, constantly tugging or pushing on the fibers and modifying the background environment, which in turn influences the movement of cells in a perpetual loop."

This is fundamentally different from mainstream applications in computer vision where changes in the background caused by pedestrians and vehicles are far less accelerated than those possible by the movement of living cells governed by the laws of mechanics and biology, he said.

To address this challenge, the National Science Foundation has awarded a team of Virginia Tech scientists a $1 million grant to create a new avenue of research in physics-guided machine learning. The project will, for the first time, systematically integrate the mechanics of cell motion available as biological rules and physics-based model outputs to predict the movement of shape-shifting objects in dynamic physical environments.

As principal investigator, Karpatne will team with co-principal investigators Amrinder Nain, associate professor, and Sohan Kale, assistant professor in the Department of Mechanical Engineering, combining his expertise in machine learning with their specialties in cell mechanobiology and computational modeling, respectively.

"The work we are doing at the STEP Lab is a natural overlap," said Nain, who founded the lab and pioneered research in designing nanofiber network platforms and experimental imaging to study cell motion.

"Cell shapes are highly dynamic and undergo limitless transformations as they sense and react to their environment. In addition, cell motion is constrained by the forces exerted by the cells on the background environment and the complex nature of cell-cell and cell-fiber interactions," Nain said. "While conventional methods for studying cell motion require manual tracking of images' features or running computationally expensive tools, our project will take advantage of our ability to create well-defined suspended nanofiber nanonets and advancements in machine learning to open a new frontier to automatically describe new rules of cell behavior."

Kale said his Mechanics of Living Materials Lab has already developed a computational method to estimate the forces exerted by cells from the deformed shapes of underlying fibers.

"This, combined with the deep learning framework from Anuj's group, provides a framework to measure forces directly from experimental images of cells moving on nanofiber networks. Our tool enables the study of cell mechanobiology in fibrous environments in a radically different way than existing approaches in the field," said Kale.

"We are fully leveraging the principles of 'convergence research' in our project by integrating data, knowledge, and methodologies from our three different disciplines: machine learning, experimental cell imaging, and computational modeling," said Karpatne. "The ultimate goal is to accurately predict and explain how cells move, interact with each other, and change their appearance in physiological environments inside our body."

"The project will contribute foundational innovations by going far beyond current standards of black-box machine learning for motion prediction in scientific problems. By anchoring our deep learning patterns with scientific theories, our work advances the frontiers of explainable machine learning by discovering new rules of cell behavior that are physically consistent and scientifically meaningful," Karpatne said.

"The research has potential impact on several scientific disciplines that routinely involve predicting the trajectories of shape-shifting objects in dynamic physical environments, for example, hurricane prediction, bird migration, and ocean eddy monitoring," he said.

The project will also lead to novel advances in mechanobiology.

"Studying cell migration is a major research frontier in the study of embryo development, wound closure, immune response, and cancer metastasis," Nain said. "We expect that this research will also serve as a drug discovery, diagnostics, and testing platform in the context of cancer and wound healing biology, where the spread of disease or repair of a wound results from the constant change of cell and fibrous network shapes."

The research team is committed to supporting Virginia Tech's education and workforce development goals, especially toward training a diverse cadre of students who can address complex problems requiring interdisciplinary skills. These students include those majoring in computer science, mechanical engineering, physics, and biological sciences.

Three Ph.D. students will also be working on this project. They are Arka Daw in computer science, advised by Karpatne; Abinash Padhi in mechanical engineering, advised by Nain; and Maahi Tulukder in mechanical engineering, advised by Kale.

In conjunction with their research, Karpatne, Nain, and Kale will collaborate with the Center for Educational Networks and Impacts to create a hands-on exhibition on Artificial Intelligence for Observing Cells for the annual Virginia Tech Science Festival and Hokie for a Day field trip event.



These neural networks know what they’re doing – MIT News

Neural networks can learn to solve all sorts of problems, from identifying cats in photographs to steering a self-driving car. But whether these powerful, pattern-recognizing algorithms actually understand the tasks they are performing remains an open question.

For example, a neural network tasked with keeping a self-driving car in its lane might learn to do so by watching the bushes at the side of the road, rather than learning to detect the lanes and focus on the road's horizon.

Researchers at MIT have now shown that a certain type of neural network is able to learn the true cause-and-effect structure of the navigation task it is being trained to perform. Because these networks can understand the task directly from visual data, they should be more effective than other neural networks when navigating in a complex environment, like a location with dense trees or rapidly changing weather conditions.

In the future, this work could improve the reliability and trustworthiness of machine learning agents that are performing high-stakes tasks, like driving an autonomous vehicle on a busy highway.

"Because these brain-inspired machine-learning systems are able to perform reasoning in a causal way, we can know and point out how they function and make decisions. This is essential for safety-critical applications," says co-lead author Ramin Hasani, a postdoc in the Computer Science and Artificial Intelligence Laboratory (CSAIL).

Co-authors include electrical engineering and computer science graduate student and co-lead author Charles Vorbach; CSAIL PhD student Alexander Amini; Institute of Science and Technology Austria graduate student Mathias Lechner; and senior author Daniela Rus, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science and director of CSAIL. The research will be presented at the 2021 Conference on Neural Information Processing Systems (NeurIPS) in December.

An attention-grabbing result

Neural networks are a method for doing machine learning in which the computer learns to complete a task through trial-and-error by analyzing many training examples. And liquid neural networks change their underlying equations to continuously adapt to new inputs.

The new research draws on previous work in which Hasani and others showed how a brain-inspired type of deep learning system called a Neural Circuit Policy (NCP), built by liquid neural network cells, is able to autonomously control a self-driving vehicle, with a network of only 19 control neurons.

The researchers observed that the NCPs performing a lane-keeping task kept their attention on the road's horizon and borders when making a driving decision, the same way a human would (or should) while driving a car. Other neural networks they studied didn't always focus on the road.

"That was a cool observation, but we didn't quantify it. So, we wanted to find the mathematical principles of why and how these networks are able to capture the true causation of the data," he says.

They found that, when an NCP is being trained to complete a task, the network learns to interact with the environment and account for interventions. In essence, the network recognizes if its output is being changed by a certain intervention, and then relates the cause and effect together.

During training, the network is run forward to generate an output, and then backward to correct for errors. The researchers observed that NCPs relate cause-and-effect during forward-mode and backward-mode, which enables the network to place very focused attention on the true causal structure of a task.

Hasani and his colleagues didn't need to impose any additional constraints on the system or perform any special setup for the NCP to learn this causality; it emerged automatically during training.

Weathering environmental changes

They tested NCPs through a series of simulations in which autonomous drones performed navigation tasks. Each drone used inputs from a single camera to navigate.

The drones were tasked with traveling to a target object, chasing a moving target, or following a series of markers in varied environments, including a redwood forest and a neighborhood. They also traveled under different weather conditions, like clear skies, heavy rain, and fog.

The researchers found that the NCPs performed as well as the other networks on simpler tasks in good weather, but outperformed them all on the more challenging tasks, such as chasing a moving object through a rainstorm.

"We observed that NCPs are the only network that pay attention to the object of interest in different environments while completing the navigation task, wherever you test it, and in different lighting or environmental conditions. This is the only system that can do this causally and actually learn the behavior we intend the system to learn," he says.

Their results show that the use of NCPs could also enable autonomous drones to navigate successfully in environments with changing conditions, like a sunny landscape that suddenly becomes foggy.

"Once the system learns what it is actually supposed to do, it can perform well in novel scenarios and environmental conditions it has never experienced. This is a big challenge of current machine learning systems that are not causal. We believe these results are very exciting, as they show how causality can emerge from the choice of a neural network," he says.

In the future, the researchers want to explore the use of NCPs to build larger systems. Putting thousands or millions of networks together could enable them to tackle even more complicated tasks.

This research was supported by the United States Air Force Research Laboratory, the United States Air Force Artificial Intelligence Accelerator, and the Boeing Company.


Faculty Highlights: Recent Grants and Awards | Now – Drexel Now

Last term, Drexel University faculty were recognized for their scholarly research and professional contributions. This update offers a snapshot of recent activity, courtesy of the Office of the Provost.

Sponsored Research

Gwen Ottinger, PhD, associate professor in the College of Arts and Sciences, was awarded a $220,428 grant from the Sloan Foundation to support her project "Open Science Hardware Practice: Transforming the Politics of Scientific Knowledge Production."

Ezra Wood, PhD, associate professor in the College of Arts and Sciences, received a two-year grant from the National Oceanic and Atmospheric Administration for his project "Quantification of Ozone Formation Rates in Upper Manhattan." This project is part of a multi-investigator study on air quality in densely populated coastal cities.

College of Computing & Informatics faculty Christopher MacLellan, PhD, assistant professor; Rosina Weber, PhD, associate professor; and Edward Kim, PhD, associate professor, received $999,999 from Defense Advanced Research Projects Agency to study sparse coding and extraction of ultrasound knowledge for explainable point-of-care ultrasound Artificial Intelligence.

Vasilis Gkatzelis, PhD, assistant professor in the College of Computing & Informatics, received a prestigious National Science Foundation Faculty Early Career Development Program Award of $599,782 under the title "Optimal Mechanism Design without Monetary Transfers."

Rajashi Ghosh, PhD, associate professor and chair for Policy, Organization and Leadership in the School of Education, received a grant of $307,000 from the National Science Foundation to support her research project titled "Towards a Theory of Engineering Identity Development & Persistence of Minoritized Students with Imposter Feelings: A Longitudinal Mixed-methods Study of Developmental Networks."

School of Education's Toni May, PhD, associate professor, and Kristin Koskey, PhD, visiting scholar, received a grant from the National Science Foundation to support their project, "Collaborative Research: Developing and Evaluating Assessments of Problem-Solving in Computer Adaptive Testing Environments."

Ivan Bartoli, PhD, associate professor in the College of Engineering, was awarded a Federal Highway Administration (FHWA) grant as part of the Accelerated Marketing Readiness program. The FHWA Cooperative Agreement will provide $499,835 in funding to develop, and eventually commercialize, wireless sensors to be used for enhancing routine and special bridge inspections which ensure the safe operation of our transportation network and performance of bridges. Such information will be critical in prioritizing repairs and maintenance of our aging transportation infrastructure.

James Tangorra, PhD, professor and department head for Engineering Leadership and Society in the College of Engineering, was part of a team to receive a three-year, $1.5 million grant from the Office of Naval Research titled "Locomotion and Transitions of an Amphibious System: Biologic to Robotic." The proposed work will build on and extend fundamental studies of the California sea lion's swimming mechanism and thrust production capabilities.

Peter Baas, PhD, professor in the College of Medicine, received a two-year, $833,000 NIH grant for "Role of Tau in Microtubule Stability in Adult Neurons."

Christian Sell, PhD, associate professor in the College of Medicine, received a one-year, $310,000 NIH grant for "Novel Longevity Enhancing Pathways Regulated by mTOR."

The Andrew W. Mellon Foundation has committed a $500,000 grant to Brandywine Workshop & Archives with Drexel as sub-grantee. Faculty members from the Westphal College of Media Arts and Design's Department of Arts & Entertainment Enterprise (Julie Goodman, associate professor; Neville Vakharia, associate professor; and Brea Heidelberg, associate professor) will collaborate on succession, business planning and evaluation. School of Education Associate Dean of Research and Associate Professor Jennifer Katz-Buonocontro, PhD, along with graduate students, will conduct ethnographic artist interviews. The Lenfest Center for Cultural Partnerships (led by Associate Director Melissa Clemmer) will fund co-ops, manage the collaboration, and produce a white paper on the expanded digital resource Artura.

Guy Diamond, assistant professor in the College of Nursing and Health Professions, received $147,000 for an 18-month contract from CADEkids to develop an electronic survey to assess the needs of Philadelphia school-age youth and families, related to substance use, behaviors and attitudes.

A collaboration between College of Nursing and Health Professions PI Minjung Shim, PhD, assistant research professor, and co-investigators from the Dornsife School of Public Health (Kathleen Fisher, PhD, professor; Sungchul Park, PhD, assistant professor), College of Arts & Sciences (Fengqing Zhang, PhD, associate professor) and College of Nursing and Health Professions (Arun Ramakrishnan, PhD, director of research labs) and others received $74,000 for "At-Home Telehealth Mindfulness-based Dance/Movement Therapy for Older Adults with Mild Cognitive Impairment: A Feasibility Study" from the Commonwealth Universal Research Enhancement 2021 Formula Grant Program.

Alexis Roth, PhD, associate professor in the Dornsife School of Public Health, received a NIH R01 grant for $4.9 million to conduct a randomized control trial to assess HIV prevention interventions over the next five years.

Jana Hirsch, PhD, assistant research professor in the Dornsife School of Public Health, was awarded a five-year, $4.4 million R01 National Institute on Aging grant for "Contribution of Longitudinal Neighborhood Determinants to Cognitive Health and Dementia Disparities within a Multi-Ethnic Cohort." This study aims to identify actionable, community-level interventions to address and remediate racial and socioeconomic inequalities derived from the unequal distribution of environmental supports for healthy aging.

The following faculty were named Louis & Bessie Stein Family Fellowship recipients. The endowed Fellowship supports research, exchange, teaching and collaboration with partners in Israel.

Major Gifts, Honors, Recognition

Myrna Shure, PhD, professor emeritus in the College of Arts and Sciences, was awarded a Lifetime Achievement Award from the Center for the Promotion of Social & Emotional Learning.

Rebecca Clothey, PhD, associate head of global studies and modern languages in the College of Arts and Sciences and associate professor in the School of Education, was elected to the board of the Comparative and International Education Society.

Sharon Walker, PhD, dean of the College of Engineering, has been named Executive Director of ELATES at Drexel, a national leadership development program designed to advance senior women faculty in academic engineering, computer science, and other STEM fields into effective institutional leadership roles within their schools and universities. Walker will assume this new role in addition to her responsibilities as dean.

College of Medicine faculty members Leon McCrea II, MD, associate professor and senior associate dean for diversity, equity and inclusion, and Dennis Novack, MD, professor and associate dean of medical education, received a three-year, $300,000 grant from the Josiah Macy Jr. Foundation to spearhead creation of an online learning module on antiracism.

Several faculty from the College on Nursing and Health Professions have been recognized this year for their commitment to diversity, equity and inclusion. Highlights include:

Michael LeVasseur, PhD, assistant teaching professor in the Dornsife School of Public Health, was recognized by the Office of the New York State Governor for co-founding COVIDoutlook.info and for his commitment to providing accurate, scientific information to policymakers about the pandemic.

The City Council of Philadelphia honored and congratulated Sharrelle Barber, PhD, assistant professor in the Dornsife School of Public Health, for her appointment as director of The Ubuntu Center on Racism, Global Movements and Population Health Equity at Dornsife with a resolution.


How an AI finished Beethoven’s last symphony and what that means for the future of music – BBC Focus Magazine

When he died in 1827 aged 56, Ludwig van Beethoven left his 10th symphony unfinished. Only a few handwritten notes briefly detailing his plans for the piece have survived, with most just being incomplete ideas or fragments of themes or melodies.

Now, a multidisciplinary team of computer scientists at Rutgers University-based start-up Playform AI has trained an artificial intelligence to mimic the great composer's style and used it to write a complete symphony based on these initial sketches.

We spoke to the lead researcher on the project, Professor Ahmed Elgammal, to find out more.

Beethoven left sketches in different forms, mainly musical sketches, but also some written notes with some ideas in them as well. Previously, in 1988, [English musicologist] Barry Cooper used the majority of these sketches, about 250 bars of music that were meant for a first movement, [in his attempt to complete the symphony].

But what was left behind is really very little. So basically, like three bars of music here and four bars of music there and some rough sketches, which sound like basically the starting points of the main themes in the movements that he [Beethoven] wanted to write.

When you look at Beethoven and other classical composers, that's usually the case. I mean, usually they work with a main theme and develop it into a sequence of a couple of minutes and then another theme comes. That's the traditional way of composing, and that's exactly what the AI needed to learn: how Beethoven and other classical composers start with a theme and develop it. Like in the Fifth Symphony: da da da dah. And then take that and evolve a whole movement around it.


The way AI generates music in general is very similar to the way your email, for example, tries to predict the next word for you. So, when you write an email, you find it jumps into suggesting what you might want to write next.

It's the same concept, basically the AI has to learn from a lot of musical data. It asks what would be the next note given what you just wrote? And if you can predict the next note, then you can predict the next note and the next note and so on. That's the main concept.
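As a loose illustration of this next-note idea (not the Playform AI system itself, whose architecture is not described in the interview), an autoregressive model keeps feeding its own predictions back into itself. The tiny toy model below, with made-up MIDI-like pitch tokens and no training, only shows that loop:

```python
import torch
import torch.nn as nn

VOCAB = 128          # toy assumption: one token per MIDI pitch
model = nn.LSTM(input_size=VOCAB, hidden_size=64, batch_first=True)
head = nn.Linear(64, VOCAB)   # maps the LSTM state to a next-note distribution

def predict_next(notes):
    """Given a list of pitch tokens, return the most likely next token."""
    x = torch.nn.functional.one_hot(torch.tensor([notes]), VOCAB).float()
    hidden, _ = model(x)
    logits = head(hidden[:, -1])          # use only the last timestep
    return int(logits.argmax(dim=-1))

# Start from a short theme and let the (untrained) model extend it note by note.
theme = [60, 60, 60, 55]                  # "da da da dah" as MIDI-like pitches
generated = list(theme)
for _ in range(8):
    generated.append(predict_next(generated))
print(generated)
```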

But what we soon realise is that if you start picking up the suggestions from the phone for the next word and start writing just based on the AI's suggestions, it doesn't really hold for a long time. And that's what happens with music. If you just give it a starting point and leave it to predict, yes, it can predict a couple of notes. But then after that, it becomes nonsense more or less, and is no longer faithful to the main theme.

So that was the main challenge. How can we let the AI stick to the main theme and develop it? So this is where the role of the human expert working with the AI comes in. So we had to work with human experts to annotate and label a lot of music for us to tell the AI what the theme was and where the development of the theme was in a lot of pieces of music. So basically, the AI learnt as a student. That made a big difference because then the AI could really keep sticking to the theme.

Also, the AI had to compose the music in a specific musical form. So if you are composing for a scherzo movement or a trio part of the movement or a fugue, etc., each of these musical forms has a certain specific structure. The AI also had to learn how to write a fugue, how to write a trio, and how to write a scherzo.

It was very challenging because Beethoven only wrote nine symphonies. That's a very small dataset compared to the scale of what the AI needed to do. So, the way we approached this was to first imagine ourselves like a young Beethoven learning about music. What would he have listened to?

So, we trained our first version of the AI as if it was somebody living in the 18th Century listening to baroque music like Bach, as well as Haydn and Mozart. And so that was the first version of the AI, which basically would be the kind of music anyone living in that era would study to compose. And then we took that and trained it specifically on Beethoven, on Beethoven sonatas, concertos, string quartets and the symphonies as well, so not only symphonies.

We first trained the AI to generate the composition as two lines of music, not as a full symphony, which is the typical way a composer works, by just composing first and then orchestrating. So then, we had another AI that would take that composition and learn how to orchestrate it. I believe this is very similar to the way humans learn: you cannot really master fourth-level college without going through the first and second and third levels first. It's always incremental.

Image: The symphony was premiered by the Beethoven Orchestra Bonn on 9 October 2021. Credit: Deutsche Telekom

The way we harmonise music is very similar to how we use AI to translate languages. Like when you use Google Translate or another AI to translate a sentence from one language to another. These kinds of models used in translation learn from a lot of background sentences. So, what is the sentence in German? What is the sentence in English? And from that, they try to learn how to translate them.

So basically, imagine you have these models [for harmonisation]. You put the melody in one side and on the other side you put in how Beethoven would harmonise it so the AI learns how to translate a melody line into harmonised music.

The thing about music is that it's very structured and follows a lot of rules. But this is very hard for us to capture and write down. You really have to have a PhD in musicology with a speciality in Beethoven to really understand that. But the machine is able to capture that statistically and mathematically in a very implicit way and be able to use that to give us this harmonisation.
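The translation analogy Elgammal draws maps onto a standard encoder-decoder (sequence-to-sequence) setup: the melody goes in on one side, the harmonised music comes out on the other. The sketch below, with arbitrary token vocabularies for melody and harmony, is only meant to show that structure and is not the model used for the symphony:

```python
import torch
import torch.nn as nn

MELODY_VOCAB, HARMONY_VOCAB, HIDDEN = 128, 256, 64

class MelodyToHarmony(nn.Module):
    """Encoder-decoder: read a melody sequence, emit a harmony sequence."""
    def __init__(self):
        super().__init__()
        self.embed_mel = nn.Embedding(MELODY_VOCAB, HIDDEN)
        self.encoder = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.embed_har = nn.Embedding(HARMONY_VOCAB, HIDDEN)
        self.decoder = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, HARMONY_VOCAB)

    def forward(self, melody, harmony_in):
        _, context = self.encoder(self.embed_mel(melody))   # summary of the melody
        dec_out, _ = self.decoder(self.embed_har(harmony_in), context)
        return self.out(dec_out)                            # next-harmony-token logits

model = MelodyToHarmony()
melody = torch.randint(0, MELODY_VOCAB, (1, 16))       # stand-in melody tokens
harmony_in = torch.randint(0, HARMONY_VOCAB, (1, 16))  # teacher-forced harmony tokens
logits = model(melody, harmony_in)
print(logits.shape)   # (1, 16, 256): a distribution over harmony tokens per step
```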

You got it right. That decision is just an extension of the harmonisation. We wanted the machine to translate the composition into multi-track instrumentation, which we also did by training the AI based on how Beethoven and other composers would do so.

Their response is really mixed. There are people who loved this very much, and love the idea of having an AI that understands music and can help you finish your composition or have you explore different musical ideas.

But on the other side of the spectrum, there are people who just reject even the concept of being able to complete a Beethoven symphony using AI. They are afraid of AI taking their jobs and think that it has nothing to do with this kind of thing.

Yeah. I have no doubt about that, we did that in visual art a couple of years ago where we developed an almost autonomous AI artist that we had look at, let's say, the last 500 years of western art. The task was basically to generate new artworks that didn't follow any existing style.

If the AI generated an impressionist or a Picasso kind of art or a Renaissance-style artwork, it could realise that, and so it would have to learn how to create something new.

The challenge with this project was actually the constraints: the fact that the AI was not generating music by itself but generating music that is based on Beethoven's genius and also following the sketches. This makes it even more difficult. The high bar, of course, of expectation was due to the sketches coming from Beethoven. But when it comes to generating music autonomously, I think that's an easier task.



HSE University researchers explain behaviour of chaotic systems – EurekAlert

Image: Illustration of the sandpile cascade.

Credit: S. Shapoval et al.

Researchers of the Laboratory of Complex Systems Modeling and Control have proposed a missing component of the mechanism of self-organized criticality, which will enable the reproduction of power-law patterns observed in the real world. According to the researchers, this can be used to improve our understanding of the processes leading to strong earthquakes, forest fires, financial market crashes, and a sudden synchronization of social networks. The results of the research were published in the Scientific Reports journal. The study was conducted with the support of the Russian Science Foundation.

Complex systems are all around us. From microscopic processes in the human brain to large-scale water flows in the ocean, science can describe the state of each part of a system, but it is much more difficult to describe its behaviour as a whole. In complex systems, the interactions between particular subsystems are so complex that the overall system acquires completely new and unexpected properties that cannot be reduced to those of its individual components.

By controlling such parameters as temperature and magnetization, it is possible to bring a complex system to a critical point that causes a phase transition. During a phase transition, the basic properties of a system fundamentally change: for example, water changes from a liquid state into steam, and metal melts and turns into a liquid. The critical point itself is characterized by power laws. However, there are various examples of processes and systems characterized by power laws that occur without making any adjustments: seismic activity in destructive earthquakes, neural and social networks, financial markets, forest fires, etc.

In 1987, scientists Bak, Tang, and Wiesenfeld discovered the phenomenon of self-organized criticality by constructing a mechanism that explains how a system evolves to a critical state without any adjustments to its parameters. Their model, known as the sandpile or BTW model, is constructed on a square lattice that contains integers interpreted as grains. Once a locally large pile of grains is formed, an avalanche occurs: grains propagate over the lattice and fall out of it when they reach the edge. The figure shows an avalanche that begins when four grains of sand appear in a cell and are then transferred to four neighbouring cells, one grain per neighbour. Four new grains are then distributed according to the same pattern. The video below shows the evolution of the model on a 16x16 lattice; the darker the cell, the more grains it contains. The discovery of self-organized criticality has had a huge impact on the development of entire fields of statistical physics, biophysics, astrophysics, optimization and topology.
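A minimal sketch of the BTW toppling rule described above, written in Python purely for illustration (the lattice size and the number of dropped grains are arbitrary choices), looks like this:

```python
import numpy as np

def add_grain_and_relax(grid, i, j):
    """Drop one grain at (i, j) and topple until every cell holds fewer than 4 grains."""
    grid[i, j] += 1
    avalanche_size = 0
    while True:
        unstable = np.argwhere(grid >= 4)
        if len(unstable) == 0:
            return avalanche_size
        for x, y in unstable:
            grid[x, y] -= 4           # the unstable cell sheds four grains
            avalanche_size += 1
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < grid.shape[0] and 0 <= ny < grid.shape[1]:
                    grid[nx, ny] += 1  # grains that leave the lattice edge are lost

rng = np.random.default_rng(0)
grid = np.zeros((16, 16), dtype=int)
sizes = []
for _ in range(20000):
    i, j = rng.integers(0, 16, size=2)
    sizes.append(add_grain_and_relax(grid, i, j))

# In the self-organized critical state, the avalanche sizes follow a power law.
print("largest avalanche:", max(sizes), "mean size:", np.mean(sizes))
```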

Researchers can propose countless implementations of the BTW mechanism. However, out of a wide class of models, it is possible to achieve only a small number of power-laws that arise in a critical state. This stability of the exponents of power laws makes it difficult to apply models of self-organized criticality to real-life problems.

In a paper published in the Scientific Reports journal, researchers of the Laboratory of Complex Systems Modeling and Control established that the clustering of events in space and time, together with the core principles revealed by Bak, Tang, and Wiesenfeld, leads to an approximately 1/x power law in the size-frequency distribution of model events.

"The 1/x size-frequency relationship has long attracted the attention of researchers with its simplicity, which borders on elegance. The hunt for it has finally ended. The proposed mechanism manifests a fundamental property of the observed systems: the clustering of events in space and time. Therefore, it is natural to think that the mechanism will be in demand in applications, laying the foundation for future research," said Alexander Shapoval, co-author of the paper and Professor of the HSE Faculty of Computer Science.

Video illustrating evolution in a lattice. Different colours correspond to different numbers that appear in the cells. (Author: Dayana Savostyanova (GSSI, Italy)).

Journal: Scientific Reports. Article: "1/x power-law in a close proximity of the Bak-Tang-Wiesenfeld sandpile." Published: 13-Sep-2021.



It’s Time for Real-World AI on the Edge – HPCwire

Artificial Intelligence has been a key talking point in the high-performance computing and computer science communities for decades now.

First it was a thought experiment and theoretical discussion point, but in recent years, it has become a practical area of focus for many scientists, researchers, and engineers, as well as businesses, universities, and government agencies. As the shift from theory to practice accelerated, so too did the excitement and scope of the expected impact of this technology on our society.

Autonomous vehicles, just-in-time maintenance, and real-time image processing are just a few of the ways you can apply AI at the edge. These and many other use cases are considered realistic deployments of complex deep learning training and inference models.

Until recently, these technologies have relied on large datacenters and inefficient algorithms to build, train, and run these models. For example, a decade ago, IBM's Watson became famous as one of the first modern AI systems. It required ninety 4U servers to generate 80 TFLOPs of performance. But today, you can beat that performance with just a handful of white box GPU-accelerated servers.

This growth in system efficiency and performance has created an environment in the datacenter where machine learning and deep learning models can thrive. Simultaneously, the barrier to entry into AI workflows is being lowered, opening the doors to new use cases, algorithms, and benefits from AI.

Still, we haven't seen the world-altering effects of AI outside of digital experiences like social media and search engines. What is keeping more tangible use cases like autonomous vehicles from wide-scale adoption? Challenges in edge computing.

Historically, datacenter environments have had the distinct advantage of having large amounts of equipment, power, cooling, physical space, data storage, and high-performance networking available to provide ever-increasing levels of performance.

Trying to create similar performance at the edge has been a key challenge because engineers lose the luxuries of large, power-hungry, and loud clusters. Edge devices have environmental, operational, and practical limitations that are not present in the controlled environment of a datacenter.

A well-trained and efficient AI model allows lower-power edge devices to handle inference tasks, but they are unable to also train and improve their own algorithms. Instead, workflows must rely on communication between edge devices and the datacenter.

Machine learning and deep learning rely on huge volumes of data that must be stored and processed. To use these technologies at the edge requires a tiered processing system. This allows data to be uploaded to the datacenter for processing, where algorithms can be further refined and downloaded to the edge device. That device, in turn, becomes better able to act or output in a desired fashion, without having to wait for the datacenter to execute a decision. This positive feedback loop is key to building effective and powerful AI applications that can perform on the edge.
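As a schematic illustration of that tiered loop (the class names, sync cadence, and retraining trigger below are invented for the example, not taken from the article), the edge side might look roughly like this:

```python
class Datacenter:
    """Aggregates data uploaded from many devices and retrains the shared model."""
    def __init__(self):
        self.data, self.version = [], 1

    def ingest(self, samples):
        self.data.extend(samples)
        if len(self.data) >= 10:       # retraining trigger (illustrative threshold)
            self.version += 1          # a refined model becomes available for download
            self.data = []

    def latest_model_version(self):
        return self.version

class EdgeDevice:
    """Runs inference locally, buffers observations, and periodically syncs."""
    def __init__(self, model_version=1):
        self.model_version = model_version
        self.buffer = []

    def infer(self, sample):
        # Local, low-latency decision; no round trip to the datacenter required.
        return f"decision(v{self.model_version}) for {sample}"

    def collect(self, sample):
        self.buffer.append(sample)

    def sync(self, datacenter):
        # Upload buffered observations, then pull down the latest refined model.
        datacenter.ingest(self.buffer)
        self.buffer = []
        self.model_version = datacenter.latest_model_version()

dc = Datacenter()
device = EdgeDevice()
for t in range(25):
    sample = f"frame-{t}"
    device.collect(sample)
    device.infer(sample)
    if t % 5 == 4:                     # periodic sync instead of per-sample round trips
        device.sync(dc)
print("device model version:", device.model_version)
```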

Multiple forces are converging to allow this model to operate practically.

Processing power for AI has always been a limiting factor. Now, however, thanks to GPU acceleration and great advances in CPU performance and efficiency, that has changed. The state of the art has moved so much that traditional chip manufacturers are creating processors that are much faster and more efficient than chips from just a few years ago, making them well-suited for edge applications. Some manufacturers are going so far as to make processors designed specifically for AI on the edge.

Because of these new capabilities, the overall cost of performance has come down substantially in recent years. This is a critical issue for organizations looking to deploy AI on the edge, since the sheer number of potential edge devices that can run AI far exceeds the number of datacenters.

Another area where historical attempts at AI on the edge have fallen short is in the speed and reliability of communication between the datacenter and the edge. For instance, an autonomous vehicle is responsible for the safety of passengers, pedestrians, and other vehicles. It cannot depend on a slow or unreliable network to operate properly. Fortunately, new 5G wireless, wide-area networks will enable workloads that will drive demand for AI and Compute at the edge.

Just because something is possible does not mean it is practical. Organizations looking to implement AI at the edge often face an uphill battle. Without the proper expertise, developing optimized datacenter systems for machine learning training and designing the necessary edge equipment for inference can be unrealistic for most teams.

Ensuring you have a properly balanced, high-ROI cluster for your AI workload is difficult. But the problem becomes even more challenging when planning for the large amounts of data coming in from hundreds or thousands of edge devices.

Then, if you're able to build that system efficiently, you still need to solve for a completely different technological problem: edge device design. Commercial, off-the-shelf hardware is rarely capable of providing the optimal solution for specific workload needs, let alone the additional environmental and form factor challenges edge clusters face.

For example, what COTS system can operate in extreme temperature or weather conditions? What about safety and security aspects like operating temperatures or data encryption? If you find you need to design a custom device, how do you ensure it has everything you need while meeting regulatory requirements?

These are important things to consider as you plan for an edge AI or inference project. Silicon Mechanics and our partner Comark have many decades of combined experience in datacenter and ruggedized edge solution design. Together, we've put together a consideration guide on preparing for 5G edge computing that covers many of the key decisions organizations must make when deploying AI on the edge.

Get the white paper here.


First lecture in new series to focus on computing for more sustainable future | Penn State University – Penn State News

UNIVERSITY PARK, Pa. Using computing power to create a better, more sustainable future will be the topic at the first Center for Artificial Intelligence Foundations and Scientific Applications (CENSAI) distinguished seminar. The event will be held online at 4 p.m. on Monday, Oct. 18.

Carla Gomes, the Ronald C. and Antonia V. Nielsen Professor of Computing and Information Science and the director of the Institute for Computational Sustainability at Cornell University, will present a lecture on her work in the fields of artificial intelligence (AI) and sustainability.

Gomes and her research group are investigating how AI can advance scientific discovery for a sustainable future and, in particular, computational sustainability. Computational sustainability has the overarching goal of developing computational models and methods to help manage the balance between environmental, economic and societal needs for a sustainable future.

The talk will include examples of computational sustainability problems, including biodiversity and wildlife conservation, multicriteria strategic planning of hydropower dams in the Amazon basin, and materials discovery for renewable energy materials. Gomes also will discuss cross-cutting computational themes and challenges for AI at the intersection of constraint reasoning, optimization, machine learning, multiagent reasoning and citizen science.

Gomes received a doctoral degree in computer science in the area of artificial intelligence from the University of Edinburgh. Her research area is AI with a focus on large-scale reasoning, optimization and learning. Gomes is a Fellow of the Association for the Advancement of Artificial Intelligence, a Fellow of the Association for Computing Machinery, and a Fellow of the American Association for the Advancement of Science.

CENSAI is housed in the Institute for Computational and Data Sciences (ICDS), which enables Penn State researchers to explore the use of artificial intelligence as a tool to dramatically accelerate the scientific process.





University of Texas Mourns Passing of Peter J. O’Donnell, Jr. – UT News | The University of Texas at Austin

O'Donnell was born near Dallas in 1924. He received a bachelor's degree in mathematics from The University of the South in Sewanee, Tennessee, followed by an MBA from the Wharton School of the University of Pennsylvania.

Upon graduation, his early path led him into the world of commerce, where he found financial success in the securities industry.

O'Donnell's formidable intellect and business acumen led him naturally into politics, where he is credited with reshaping the Texas Republican Party during his tenure as the state party's chairman from 1962 to 1969.

During his early 30s, he and Edith founded the O'Donnell Foundation, a philanthropic organization focused on higher education support in Texas, which grew to become the fifth largest independent Texan foundation. Over the years they contributed hundreds of millions of dollars to public and private educational institutions, mostly anonymously.

The O'Donnell Foundation focused on four areas: math; science and engineering education; medicine, arts and music education; and K-12 education.

Edith O'Donnell was a graduate in psychology from UT Austin. While the two made their impact felt across higher education in Texas, they had a special connection to the Forty Acres.

The O'Donnell Foundation and O'Donnell family are among the largest donors to UT Austin. Their support included a challenge grant that created 32 one-million-dollar chairs in science and engineering at UT Austin. In 2013, the university announced the naming of the O'Donnell Building for Applied Computational Engineering and Sciences, in recognition of their support throughout the years.

The Oden Institute and TACC

Peter O'Donnell enjoyed a long friendship with J. Tinsley Oden, professor of mathematics, computer science, mechanical engineering and aerospace engineering and engineering mechanics at UT Austin, and widely considered to be the father of computational mechanics.

O'Donnell realized how important high performance computing (HPC) and computational science were to become in higher education, and in Oden he found a partner with the kind of specialized expertise in mathematics, science and engineering to help put both on the map at UT Austin.

O'Donnell and Oden worked closely to create the Institute for Computational Engineering and Sciences in 2002, now called the Oden Institute for Computational Engineering & Sciences and recognized as one of the top computational research institutes in the world.

Thanks to the unwavering generosity of the O'Donnells, Oden was able to recruit some of the most talented computational scientists in the field and build a team that could not only expand the mathematical agility of computational science and engineering as a discipline, but also grow the number of potential real-world applications.

O'Donnell also helped UT Austin get ahead in HPC. In 1985, he encouraged university leadership to acquire the powerful CRAY X-MP system, making UT Austin the first university in Texas to have a supercomputer. This led to the establishment in 2001 of a dedicated advanced computing entity, the Texas Advanced Computing Center (TACC).

Encouragement was followed by the foundation's generous support of TACC, which now operates two of the most powerful university supercomputers in the world: Frontera and Stampede2. O'Donnell noted, "That's leverage. High performance computing is changing everything."

O'Donnell renewed his commitment to TACC in 2012, and to the advancement of data-driven science, by supporting the acquisition of high-performance data analysis and storage systems, efforts that helped UT Austin become a leader in data science, machine learning and artificial intelligence. The foundation also contributed to the construction of the Advanced Computing Building that houses the bulk of TACC's staff.
