
Organic chemists should place their trust in machine learning’s black … – Chemistry World

Classically, a black box is a system whose inputs are controlled or known, and whose outputs can be harvested, but whose internal workings remain a mystery. Take Google search: we may know roughly how it works, but the details of the search algorithm are kept secret from the public. But when organic chemistry meets computing, we sometimes feel we want to know everything; black boxes can be seen as a frustrating and distrusted tool.

It's fair to say that sometimes, comprehensive understanding lets us control all variables to avoid problems. As a student, I expressed concern over the results of a computational exercise, only to be dismissed with "but the computer says this is what you have to do". Three months of hard work later, we synthetic chemists were vindicated when it was found that, thanks to a computing error in a system we couldn't access directly, we had indeed been working on the wrong compounds for all that time. I have had a rigorous dose of scepticism for methods out of our control ever since!

Although caution is very well advised, we should also remember undergraduate thermodynamics, when we learn to deliberately treat chemical systems as black boxes, their complexity reduced to just a few fundamental parameters; otherwise we are unable to compute their properties. For the most complex systems, a clear understanding of the system's workings needs to be a sufficient substitute for knowing the exact pathways to reach our answer. I am particularly thinking of machine learning methods: systems whose contents are for practical purposes unknowable, and whose reasoning may not make sense to human users. However, making a leap of faith is highly uncomfortable for organic chemists, who are used to having authority and reasoning even over atomic structure. Although we will never hold every detail of an individual neural net in our mind's eye, we can learn how it works, how it was created, and which parameters it has been allowed to exploit. An elementary understanding of the tools and some trust in expert collaborators allow us to reduce our concerns about abstracted methods.

On some level, humans abstract almost everything we use. Every time you use an LCMS as a synthetic chemist, you don't need to mentally run through a back-to-basics understanding of the relative polarities, UV absorption and ionisability of your substrates. Simply referring to your compound as a tertiary aniline provides sufficient information for an experienced user to expect a certain outcome. These abstractions might even be directly hard-coded; for example, you may have polar and apolar generic methods set up on the instrument. There are countless popular examples of more readily recognising a concept when we give a name to it (perhaps name reactions are one case) as well as negative examples, such as seeing someone who looks like a yob and falsely making a mental connection to troublemaking. The audience's capability for abstraction is also a useful tool when presenting complex results: data storytelling allows a presenter to build individual bricks of data into conceptual structures, helping the audience to feel they have fewer individual concepts to wrap their heads around.

Abstractions leap to human non-interpretability when they involve computer-speed calculations or too many variables. Luckily, computers excel at these, but it can be a shock when the methods no longer fit inside a human brain. I visualise these superhuman helpers as being another layer on top of the brain, much like a laptop farming out calculations to a supercomputer cluster and then retrieving the results.

We organic chemists are not actually capable of understanding everything

And this is the advantage: some systems we don't understand really are better than us at what they do. Although machine learning is still an emerging tool when applied to organic chemistry, particularly due to our relatively small datasets, its power is clear from our everyday use, from facial recognition on our devices to voice recognition on our home assistants. (That said, machine learning is subject to the same biases as its training: it can overuse a go-to catalyst or, more alarmingly, struggle more with darker skin tones in images of humans.) Something that may not be clear to those outside large companies is how frequently machine learning is found useful within chemistry, too. At the end of the day, what matters is whether the results verifiably work, rather than how we arrived there, although that may be hard to swallow. We have to make a leap of faith and remember that we organic chemists are not actually capable of understanding everything.

The famously not-so-humble world of organic chemistry is being damaged by our egos and our lack of willingness to submit to the higher power of abstraction. We could make our field stronger and more useful, and as any computational chemist will tell you, black box processes need not come at the cost of overall insights.


Abbott and New Global Consortium Partnership Address Viral – CSRwire.com

Published 04-21-23

Submitted by Abbott

ABBOTT PARK, Ill., April 21, 2023 /CSRwire/ - Abbott (NYSE: ABT) announced today that it is partnering with the Climate Amplified Diseases and Epidemics (CLIMADE) consortium, a group of more than 100 global scientists in public health agencies, academia and industry focused on using data science technology and diagnostic testing to assess and potentially mitigate the impact climate change has on disease outbreaks.

A changing climate, such as warmer temperatures and a rise in extreme weather events like droughts and floods, has the potential to accelerate the spread of disease, which could fuel a new era of pandemics. Research has found that climate change could impact more than half of known infectious diseases, which commonly spread via water or animals carrying diseases, such as West Nile virus and malaria.[1]

As part of the consortium, scientists trained in infectious diseases, bioinformatics and data science will develop technologies that can aggregate environmental, weather and viral sequencing data sets to predict if conditions could cause a disease outbreak. If a potential outbreak is identified, resources and rapid surveillance testing can be sent to that location to prevent further spread.

"Imagine being able to track weather patterns to determine if rising floods may lead to a water-borne disease outbreak," said Gavin Cloherty, Ph.D., head of infectious disease research and the Pandemic Defense Coalition in Abbott's diagnostics business. "Abbott's work with CLIMADE is focused on tracking and predicting events so testing and medical resources can be deployed to prevent the spread of disease making a real impact in communities and people's lives."

The CLIMADE consortium will be focused on improving surveillance tools and expanding access to resources to decrease the impact of climate amplified diseases and epidemics. The global group of scientists is led by Tulio de Oliveira, Ph.D., a professor at Stellenbosch University and Director of the Centre for Epidemic Response and Innovation (CERI) in South Africa, as well as Luiz Carlos Alcantara, Ph.D., a professor at the Fundação Oswaldo Cruz (FIOCRUZ) in Brazil, and Edward Holmes, Ph.D., an evolutionary biologist and virologist and professor at the University of Sydney. CLIMADE members include public health agencies, like the Africa Centres for Disease Control and Prevention (Africa CDC), that bring decades of experience in genomics surveillance and epidemic response, as well as academic organizations such as the Broad Institute, University of Washington and University of Oxford.

Abbott and its partners in the Abbott Pandemic Defense Coalition will provide viral sequencing and testing data as part of the technology being developed and can provide diagnostic testing for potential outbreaks.

"We are bringing together the best minds in the medical, scientific and public health communities to help the world create a robust surveillance system that quickly identifies pathogens and tracks their evolution and spread," said Oliveira. "This collaboration across the private and public sectors is critical to pandemic preparedness and to our ability to go from responding to outbreaks to predicting them before they occur."

CLIMADE's initial work will start with disease surveillance in Africa and expand to countries around the world that are often impacted by infectious disease outbreaks.

Protecting Health in an Evolving Climate

Safeguarding a healthy environment is a longstanding part of Abbott's purpose to help people live fuller lives through better health. Building on our longstanding commitment to minimize our environmental footprint and protect precious resources, we're also focused on taking action to protect people's health in the face of climate change. At Abbott, our work focuses on two areas: tracking and finding solutions for emerging health threats, and preparing frontline systems and communities. Across our business and in collaboration with others, we'll work to identify and address emerging health issues, strengthen underlying health systems and help build more resilient communities in a warming world. For more information, visit abbott.com/sustainability.

About Abbott

Abbott is a global healthcare leader that helps people live more fully at all stages of life. Our portfolio of life-changing technologies spans the spectrum of healthcare, with leading businesses and products in diagnostics, medical devices, nutritionals and branded generic medicines. Our 115,000 colleagues serve people in more than 160 countries.

Connect with us at http://www.abbott.com, on LinkedIn at http://www.linkedin.com/company/abbott-/, on Facebook at http://www.facebook.com/Abbott and on Twitter @AbbottNews.

About CERI

The Centre for Epidemic Response and Innovation (CERI) is an academic and research entity located within the School for Data Science and Computational Thinking in the Faculty of Science at Stellenbosch University; its laboratories are situated in the state-of-the-art facilities of the Faculty of Medicine and Health Sciences on the Tygerberg Medical Campus. CERI's goal is to strengthen Africa's capacity to quickly identify and control its own epidemics and pandemics before they become a global problem.

Connect with us at http://www.ceri.org.za and on Twitter @ceri_news

References:

1. Mora, Camilo, et al. Nature Climate Change, 8 Aug 2022. https://www.nature.com/articles/s41558-022-01426-1

About Abbott and the Abbott Fund

The Abbott Fund is a philanthropic foundation established by Abbott in 1951. The Abbott Fund's mission is to create healthier global communities by investing in creative ideas that promote science, expand access to health care and strengthen communities worldwide. For more information, visit http://www.abbottfund.org.

Abbott is a global, broad-based health care company devoted to the discovery, development, manufacture and marketing of pharmaceuticals and medical products, including nutritionals, devices and diagnostics. The company employs nearly 90,000 people and markets its products in more than 130 countries. Abbott's news releases and other information are available on the company's website at http://www.abbott.com.


What is a Machine Learning Engineer? Salary & Responsibilities – Unite.AI

The world of artificial intelligence (AI) is growing exponentially, with machine learning playing an instrumental role in bringing intelligent systems to life. As a result, machine learning engineers are in high demand in the tech industry. If you're contemplating a career in this captivating domain, this article will give you a comprehensive understanding of a machine learning engineer's role, their primary responsibilities, average salary, and the steps to becoming one.

A machine learning engineer is a specialized type of software engineer who focuses on the design, implementation, and optimization of machine learning models and algorithms. They serve as a link between data science and software engineering, working in close collaboration with data scientists to transform prototypes and ideas into scalable, production-ready systems. Machine learning engineers play a vital role in converting raw data into actionable insights and ensuring that AI systems are efficient, accurate, and dependable.

Machine learning engineers have a wide range of responsibilities, from designing and training models to optimizing, deploying and maintaining them in production.

The average salary of a machine learning engineer can vary based on factors such as location, experience, and company size. According to Glassdoor, as of 2023, the average base salary for a machine learning engineer in the United States is approximately $118,000 per year. However, experienced professionals and those working in high-demand areas can earn significantly higher salaries.

The steps to becoming a machine learning engineer are summarized at the end of this article. First, consider the key traits that contribute to the success of a machine learning engineer.

Machine learning engineers often face complex challenges that require innovative solutions. A successful engineer must possess excellent analytical and problem-solving skills to identify patterns in data, understand the underlying structure of problems, and develop effective strategies to address them. This involves breaking down complex problems into smaller, more manageable components, and using a logical and methodical approach to solve them.

A solid foundation in mathematics and statistics is crucial for machine learning engineers, as these disciplines underpin many machine learning algorithms and techniques. Engineers should have a strong grasp of linear algebra, calculus, probability, and optimization methods to understand and apply various machine learning models effectively.

Machine learning engineers must be proficient in programming languages such as Python, R, or Java, as these are often used to develop machine learning models. Additionally, they should be well-versed in software engineering principles, including version control, testing, and code optimization. This knowledge enables them to create efficient, scalable, and maintainable code that can be seamlessly integrated into production environments.

Successful machine learning engineers must be adept at using popular machine learning frameworks and libraries such as TensorFlow, PyTorch, and Scikit-learn. These tools streamline the development and implementation of machine learning models, allowing engineers to focus on refining their algorithms and optimizing their models for better performance.
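
For illustration, here is a minimal sketch (not from the article) of what working with one of these libraries looks like in practice: training and evaluating a simple scikit-learn classifier on a built-in demo dataset. The dataset, model and parameters are arbitrary choices, not recommendations.

```python
# Minimal, illustrative scikit-learn workflow: load data, split, fit, score.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)  # small built-in demo dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)  # learn from the training split

print(f"Test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```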

The field of machine learning is constantly evolving, with new techniques, tools, and best practices emerging regularly. A successful machine learning engineer must possess an innate curiosity and a strong desire for continuous learning. This includes staying up-to-date with the latest research, attending conferences and workshops, and engaging in online communities where they can learn from and collaborate with other professionals.

Machine learning projects often require engineers to adapt to new technologies, tools, and methodologies. A successful engineer must be adaptable and flexible, willing to learn new skills and pivot their approach when necessary. This agility enables them to stay ahead of the curve and remain relevant in the fast-paced world of AI.

Machine learning engineers frequently work in multidisciplinary teams, collaborating with data scientists, software engineers, and business stakeholders. Strong communication and collaboration skills are essential for effectively conveying complex ideas and concepts to team members with varying levels of technical expertise. This ensures that the entire team works cohesively towards a common goal, maximizing the success of machine learning projects.

Developing effective machine learning models requires a high degree of precision and attention to detail. A successful engineer must be thorough in their work, ensuring that their models are accurate, efficient, and reliable. This meticulous approach helps to minimize errors and ensures that the final product meets or exceeds expectations.

Becoming a machine learning engineer requires a strong foundation in mathematics, computer science, and programming, as well as a deep understanding of various machine learning algorithms and techniques. By following the roadmap outlined in this article and staying current with industry trends, you can embark on a rewarding and exciting career as a machine learning engineer.

In summary, the steps: develop an understanding of data preprocessing, feature engineering, and data visualization techniques; learn about different machine learning algorithms, including supervised, unsupervised, and reinforcement learning approaches; gain practical experience through internships, personal projects, or freelance work; and build a portfolio of machine learning projects to showcase your skills and knowledge to potential employers.


Study shows how machine learning can identify social grooming behavior from acceleration signals in wild baboons – Phys.org

This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, peer-reviewed publication, trusted source, proofread.

Scientists from Swansea University and the University of Cape Town have tracked social grooming behavior in wild baboons using collar-mounted accelerometers.

The study, published in the journal Royal Society Open Science, is the first to successfully calculate grooming budgets using this method, which opens a whole avenue of future research directions.

Using collars containing accelerometers built at Swansea University, the team recorded the activities of baboons in Cape Town, South Africa, identifying and quantifying general activities such as resting, walking, foraging and running, and also the giving and receiving of grooming.

A supervised machine learning algorithm was trained on acceleration data matched to baboon video recordings and successfully recognized the giving and receiving of grooming with high overall accuracy.

The team then applied their machine learning model to acceleration data collected from 12 baboons to quantify grooming and other behaviors continuously throughout the day and night-time.
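
The paper's exact features and model are not reproduced here, but the general recipe for this kind of task is well established: slice the tri-axial acceleration stream into fixed windows, summarize each window with simple statistics, and train a supervised classifier on windows labelled from the synchronized video. A hedged sketch under those assumptions, with random stand-in data and arbitrary feature and model choices:

```python
# Illustrative sketch of behavior classification from tri-axial acceleration.
# NOT the authors' pipeline: features, labels and model are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc, window=40):
    """Summarize each fixed-size window of an (n, 3) trace with per-axis
    mean and standard deviation, yielding one 6-dimensional row per window."""
    n_windows = len(acc) // window
    rows = []
    for i in range(n_windows):
        w = acc[i * window:(i + 1) * window]
        rows.append(np.concatenate([w.mean(axis=0), w.std(axis=0)]))
    return np.array(rows)

rng = np.random.default_rng(0)
acc = rng.normal(size=(4000, 3))   # stand-in for real collar data
X = window_features(acc)
# Stand-in labels; in the study these would come from matched video.
y = rng.choice(["groom_give", "groom_receive", "rest", "walk"], size=len(X))

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict(X[:5]))          # predicted behavior per window
```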

Lead author Dr. Charlotte Christensen of the University of Zurich said, "We were unsure whether a sensor on a collar would be able to detect a behavior that involves such subtle movements, but it has worked. Our findings have important implications for the study of social behavior in animals, particularly in non-human primates."

Image: Two baboons holding on to each other, sitting on a rock surrounded by grass. Credit: Charlotte Christensen

Social grooming is one of the most important social behaviors in primates and, since the 1950s, has become a central focus of research in primatology.

Previously, scientists have relied on direct observations to determine how much primates groom each other; while direct observations provide systematic data, they are sparse and non-continuous, with the added limitation that researchers can only watch a few animals at a time.

Technology like that used in this study is revolutionizing the field of animal behavior research and opening up exciting new areas of investigation.

Image: Two baboons sitting on a rock and looking out at the land below. Credit: Charlotte Christensen

Senior author Dr. Ines Fürtbauer of Swansea University said, "This is something our team have wanted to do for years. The ability to collect and analyze continuous grooming data in wild populations will allow researchers to re-examine long-standing questions and address new ones regarding the formation and maintenance of social bonds, as well as the mechanisms underpinning the sociality-health-fitness relationship."

More information: Charlotte Christensen et al, Quantifying allo-grooming in wild chacma baboons (Papio ursinus) using tri-axial acceleration data and machine learning, Royal Society Open Science (2023). DOI: 10.1098/rsos.221103

Journal information: Royal Society Open Science


Using machine learning to find reliable and low-cost solar cells … – eeNews Europe



Student Showcase Preview: Customizing the ChatGPT Artificial … – CMUnow

Discussions about the benefits and risks associated with artificial intelligence (AI) are everywhere right now, and college campuses are grappling with how to address the rise of chat-based AI software like ChatGPT. At this year's Student Showcase, several research projects related to machine learning and AI will be on display. The Student Showcase is a celebration of the creativity, research, innovation, entrepreneurship and artistic performance of Colorado Mesa University and Western Colorado Community College students at both the undergraduate and graduate level. This year will mark the 14th anniversary of the event, and there will be 377 sessions, a near-record number. Those curious to learn more about how students in the Computer Science Program are working with cutting-edge AI technologies are invited to come learn about their work and ask questions during one of the many sessions focused on AI.

One of these groups, comprised of CMU computer science students Sullivan Frazier, Zackary Mason and Axel Garces, is going under the hood to develop their own machine learning software program, as well as experimenting to make the popular ChatGPT chatbot platform more user-friendly and approachable. A chatbot is a computer program that simulates human conversation and allows humans to engage with digital devices as if they were speaking with a real person.

Working from the premise that many people find AI intimidating, this group has collaborated to build an interactive web application that allows users to customize the characteristics of the chatbot they interact with. For example, you can choose to have your chatbot assume the characteristics, speech patterns and knowledge of Yoda from Star Wars. In addition to making the chatbot experience more playful and fun, this feature can also allow users to select a chatbot based on their personal language and culture preferences allowing for a chatbot experience that reflects the individual using it.
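
The article does not show the students' code, but one common way to implement this kind of persona customization is to prepend a user-selected system message to each conversation. The following is a hypothetical sketch using the OpenAI Python client; the students' actual stack is not described, and the model name and persona strings are placeholders:

```python
# Hypothetical persona-customization sketch; not the students' implementation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PERSONAS = {
    "yoda": "You are Yoda from Star Wars. Use Yoda's speech patterns.",
    "default": "You are a friendly, helpful assistant.",
}

def chat(user_message: str, persona: str = "default") -> str:
    # The system message steers the model's character for the whole exchange.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": PERSONAS[persona]},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(chat("Explain machine learning in one sentence.", persona="yoda"))
```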

Through their work, the group has grappled with some of the deeper issues that AI presents. Mason explained, "Machine learning has been around since the 90s, but now we have the computing power to make products that people find useful, and it's not behind closed doors anymore. ChatGPT isn't creating new things, but it is quickly and accurately sorting through the huge repository of human knowledge that people have put into it, which is something new."


The team is specifically concerned about AI applications in which the programs are forced to make tough decisions where serious tradeoffs have to be considered. They believe that AI is great at collecting and organizing data, but the group argues there still needs to be a human element when the stakes are so high. "Sometimes you need an ethical line, you need a moral line, you need a human with a heartbeat making those big decisions," said Frazier. Mason agreed, "I don't think AI is going to take all our jobs, but we need to find the balance between humans and technology." The group is excited about the future of computer science, and they are optimistic that humanity will be resilient in the face of the changes and challenges that AI presents.

Frazier is excited to present their research and bring this discussion to the larger CMU community at the Student Showcase. "Sometimes it feels like I'm a bit cooped up in Confluence Hall in my daily life. I don't talk to a lot of people outside of computer science, and a lot of people don't have a clue as to what we're doing and what's going on in here. Going to showcase allows people to come see what you're up to, and you get to learn about things happening in totally different fields," said Frazier.


Frazier, Mason and Garces' group received guidance and support from their faculty mentor, Associate Professor of Computer Science and Co-Director of the Cyber Security Center Ram Basnet, PhD. Basnet, along with other CMU computer science faculty, is looking to expand the AI program offerings in coming years; the department currently offers professional certificates in cybersecurity, data science and web application development for students pursuing a degree in computer science.

This year's Student Showcase will kick off at 12pm on Friday, Apr. 28 at the Love Recital Hall in the Moss Performing Arts Center. Presentations, performances, demonstrations and exhibits will then take place throughout the day across campus. The day will wrap up with a celebration event at 4:30pm in the University Center Meyer Ballroom.

This event is free and open to the public, and more information about this year's sessions and parking details is available on the Student Showcase website.


Intel v-p: ‘research needed to open machine learning black box’ – Times Higher Education

Academic research is needed to give an ethical foundation to machine learning, according to the head of the AI engineering team at Intel, one of the world's largest semiconductor chip manufacturers.

Speaking at the Digital Universities UK event, held by Times Higher Education in partnership with the University of Leeds, Wei Li, vice-president and general manager of artificial intelligence and analytics at Intel, said: "Machine learning today is a black box. You get what you get, and you don't really know why."

He added: "In some applications [such as healthcare], you will want to know why that system gave you that answer."

Academic research can help to expose fundamental ethical issues in AI, such as in-built bias around gender, Dr Li told the event.

"There are a lot of unknowns and open questions in machine learning today, and it really demands fundamental research that universities can do and industry can't," he said.

Dr Li said his team at Intel was working to develop technology that would help to make fair and inclusive AI systems.

But speaking with THE, he warned that industry was more focused on building AI systems than on addressing the ethical questions they provoke. "These problems have roots in how the overall machine learning works," he said. "These things, I hope people in academia can do something deeper than what we're doing."

Despite a recent open letter signed by AI experts and industry executives, including Elon Musk, calling for a pause in the development of AI until "we are confident that their effects will be positive and their risks will be manageable", Dr Li does not expect AI advancements to slow.

"It's not realistic," he said. "It's a risk in terms of commercialisation, and it's a race to be the fastest and the first in the industry. That's enough motivation for people to go for these things."

So, with the rapid advancement in AI systems, will academic research into machine learning models quickly become obsolete? No, said Dr Li, who argued that universities could influence the way that future systems are built. "I don't expect them [researchers] to dig into ChatGPT and explain ChatGPT; that's impossible to do given the state of the art we have today. But if people have a better foundation for machine learning, then maybe the next generation can be a safer and less biased model."

When it comes to university teaching, Dr Li said, higher education has a challenge and an opportunity to better train the next generation of students in the new AI environment. Institutions should teach students to be "more than just simply a messenger for ChatGPT", he added.

"The products of a university are the students you're producing. If they're not better than ChatGPT, then why do we bother to send them to university?" Dr Li asked.

sara.custer@timeshighereducation.com


M.Sc. in machine learning & data science in the Efi Arazi School of Computer Science at Reichman University – The Times of Israel

The M.Sc. program in Machine Learning & Data Science at the Efi Arazi School of Computer Science aims to provide a deep theoretical understanding of machine learning and data-driven methods, as well as strong proficiency in using these methods. As part of this unique program, students with solid exact-science backgrounds, but not necessarily computer science backgrounds, are trained to become data scientists. Headed by Prof. Zohar Yakhini and PhD candidate Ben Galili, the program gives students the opportunity to become skilled and knowledgeable data scientists, preparing them with fundamental theoretical and mathematical understanding and endowing them with the scientific and technical skills necessary to be creative and effective in these fields. The program offers courses in statistics and data analysis, machine-learning courses at different levels, as well as unique electives such as a course in recommendation systems and one on DNA and sequencing technologies.

PhD candidate Ben Galili, Academic Director of Machine Learning and Data Science Program

In recent years, data science methodologies have become a foundational language and a main development tool for science and industry. Machine learning and data-driven methods have developed considerably and now penetrate almost all areas of modern life. The vision of a data-driven world presents many exciting challenges to data experts in diverse fields of application, such as medical science, life science, social science, environmental science, finance, economics, and business.

Graduates of the program go on to become data scientists in Israeli hi-tech companies. Lior Zeida Cohen, a graduate of the program, says: "After earning a BA degree in Aerospace Engineering from the Technion and working as an engineer and later leading a control systems development team, I sought out a graduate degree program that would allow me to delve deeply into the fields of Data Science and Machine Learning while also allowing me to continue working full-time. I chose to pursue the Machine Learning & Data Science Program at Reichman University. The program provided in-depth study in both the theoretical and practical aspects of Machine Learning and Data Science, including exposure to new research and developments in the field. It also emphasized the importance of learning the fundamental concepts necessary for working in these domains. In the course of completing the program, I began work at Elbit Systems as an algorithms developer in a leading R&D group focusing on AI and Computer Vision. The program has greatly contributed to my success in this position."

Prof. Zohar Yakhini, Head of Machine Learning and Data Science Program. Photo: Gabriel Baharlia

As part of the curriculum, the students carry out collaborative research projects with both external and internal collaborators, in Israel and around the world. One active collaboration is with the Leibniz Institute for Tropospheric Research (TROPOS) in Leipzig, Germany. In this collaboration, the students, led by Prof. Zohar Yakhini and Dr. Shay Ben-Elazar, a Principal Data Science and Engineering Manager at Microsoft Israel, as well as Dr. Johannes Bühl from TROPOS, are using data science and machine learning tools to infer properties of stratospheric layers from sensory device data. The models developed in the project provide inference from simple devices that achieves accuracy close to that obtained through much more expensive measurements. This improvement is enabled through the use of neural network models (deep learning); a schematic sketch of the idea follows the figure description below.

Results from the TROPOS project: a significant improvement in inference accuracy. Left panel: actual atmospheric status as obtained from the more expensive measurements (Lidar + Radar). Middle panel: predicted status as inferred from Lidar measurements using physical models. Right panel: status determined by the deep learning model developed in the project.
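
The project's data and models are not public in this article, but the underlying idea, learning a mapping from cheap-sensor features to a quantity normally obtained from expensive instruments, can be sketched with a small neural network regressor. Everything below (synthetic data, network size) is an assumption for illustration only:

```python
# Illustrative sketch: a small neural network learns to predict an
# "expensive-instrument" target from "cheap-sensor" features.
# Synthetic data only; not the TROPOS project's real setup.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 16))                             # cheap-sensor features
y = X @ rng.normal(size=16) + 0.1 * rng.normal(size=2000)   # stand-in target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=1)
net.fit(X_tr, y_tr)
print(f"R^2 on held-out data: {net.score(X_te, y_te):.3f}")
```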

Additional collaborations include a number of projects with Israeli hospitals such as Sheba Tel Hashomer, Beilinson Hospital, and Kaplan Medical Center, as well as with the Israel Nature and Parks Authority and with several hi-tech companies.

Several research and thesis projects led by students in the program address data analysis questions related to spatial biology: the study of molecular biology processes in their larger spatial context. One project, led by student Guy Attia and supervised by Dr. Leon Anavy, addressed imputation methods for spatial transcriptomics data. A second, led by student Efi Herbst, aims to expand the inference scope of spatial transcriptomics data to molecular properties that are not directly measured by the technology.

According to Maya Kerem, a recent graduate, "the MA program taught me a number of skills that would enable me to easily integrate into a new company based on the knowledge I gained. I believe that this program is particularly unique because it always makes sure that the learnings are applied to industry-related problems at the end of each module. This is a hands-on program at Reichman University, which is what drew me to enroll in this MA program."


Optimization could cut the carbon footprint of AI training by up to 75% – University of Michigan News

Deep learning models that power giants like TikTok and Amazon, as well as tools like ChatGPT, could save energy without new hardware or infrastructure.

Study: Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training

Study: Chasing Low-Carbon Electricity for Practical and Sustainable DNN Training

Open-source software: Zeus on GitHub, Chase on GitHub

A new way to optimize the training of deep learning models, a rapidly evolving tool for powering artificial intelligence, could slash AIs energy demands.

Developed at the University of Michigan, the open-source optimization framework studies deep learning models during training, pinpointing the best tradeoff between energy consumption and the speed of the training.

"At extreme scales, training the GPT-3 model just once consumes 1,287 MWh, which is enough to supply an average U.S. household for 120 years," said Mosharaf Chowdhury, an associate professor of electrical engineering and computer science.

With Zeus, the new energy optimization framework developed by Chowdhury and his team, figures like this could be reduced by up to 75% without any new hardware, and with only minor impacts on the time it takes to train a model. Zeus was presented at the 2023 USENIX Symposium on Networked Systems Design and Implementation (NSDI) in Boston.

Mainstream uses for hefty deep learning models have exploded over the past three years, ranging from image-generation models and expressive chatbots to the recommender systems powering TikTok and Amazon. With cloud computing already out-emitting commercial aviation, the increased climate burden from artificial intelligence is a significant concern.

"Existing work primarily focuses on optimizing deep learning training for faster completion, often without considering the impact on energy efficiency," said Jae-Won Chung, a doctoral student in computer science and engineering and co-first author of the study. "We discovered that the energy we're pouring into GPUs is giving diminishing returns, which allows us to reduce energy consumption significantly, with relatively little slowdown."

Deep learning is a family of techniques making use of multilayered, artificial neural networks to tackle a range of common machine learning tasks. These are also known as deep neural networks (DNNs). The models themselves are extremely complex, learning from some of the most massive data sets ever used in machine learning. Because of this, they benefit greatly from the multitasking capabilities of graphical processing units (GPUs), which burn through 70% of the power that goes into training one of these models.

Zeus uses two software knobs to reduce energy consumption. One is the GPU power limit, which lowers a GPU's power use while slowing down the model's training until the setting is adjusted again. The other is the deep learning model's batch size parameter, which controls how many samples from the training data the model works through before updating the way the model represents the relationships it finds in the data. Higher batch sizes reduce training time, but with increased energy consumption.

Zeus is able to tune each of these settings in real time, seeking the optimal tradeoff point at which energy usage is minimized with as little impact on training time as possible. In examples, the team was able to visually demonstrate this tradeoff point by showing every possible combination of these two parameters. While that level of thoroughness won't happen in practice with a particular training job, Zeus will take advantage of the repetitive nature of machine learning to come very close.
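
To make the two knobs concrete, here is a hedged sketch of the kind of sweep Zeus automates, using NVIDIA's NVML bindings to set the GPU power limit and read energy counters. This is not Zeus's actual API (see its GitHub repository for that); setting power limits typically requires administrator privileges, and the training step below is a placeholder:

```python
# Hedged sketch of a (power limit, batch size) sweep; NOT Zeus's real API.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)

def train_one_epoch(batch_size):
    time.sleep(0.1 * 512 / batch_size)   # placeholder for a real training loop

results = []
for limit_mw in range(min_mw, max_mw + 1, max((max_mw - min_mw) // 4, 1)):
    pynvml.nvmlDeviceSetPowerManagementLimit(gpu, limit_mw)  # needs admin rights
    for batch_size in (32, 64, 128, 256):
        e0 = pynvml.nvmlDeviceGetTotalEnergyConsumption(gpu)  # millijoules
        t0 = time.monotonic()
        train_one_epoch(batch_size)
        energy = pynvml.nvmlDeviceGetTotalEnergyConsumption(gpu) - e0
        results.append((energy, time.monotonic() - t0, limit_mw, batch_size))

# Toy selection rule: weigh energy against time. (Zeus itself uses a
# principled energy/time cost, refined online across recurring jobs.)
best = min(results, key=lambda r: 0.5 * r[0] + 0.5 * r[1])
print("best (energy_mJ, seconds, power_limit_mW, batch_size):", best)
```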

"Fortunately, companies train the same DNN over and over again on newer data, as often as every hour. We can learn about how the DNN behaves by observing across those recurrences," said Jie You, a recent doctoral graduate in computer science and engineering and co-lead author of the study.

Zeus is the first framework designed to plug into existing workflows for a variety of machine learning tasks and GPUs, reducing energy consumption without requiring any changes to a system's hardware or datacenter infrastructure.

In addition, the team has developed complementary software that they layer on top of Zeus to reduce the carbon footprint further. This software, called Chase, privileges speed when low-carbon energy is available, and chooses efficiency at the expense of speed during peak times, which are more likely to require ramping up carbon-intensive energy generation such as coal. Chase took second place at last year's CarbonHack hackathon and is to be presented May 4 at the International Conference on Learning Representations Workshop.

"It is not always possible to readily migrate DNN training jobs to other locations due to large dataset sizes or data regulations," said Zhenning Yang, a master's student in computer science and engineering. "Deferring training jobs to greener time frames may not be an option either, since DNNs must be trained with the most up-to-date data and quickly deployed to production to achieve the highest accuracy.

"Our aim is to design and implement solutions that do not conflict with these realistic constraints, while still reducing the carbon footprint of DNN training."

The study was supported in part by the National Science Foundation grants CNS-1909067 and CNS-2104243, VMWare and the Kwanjeong Educational Foundation, and computing credits provided by CloudLab and Chameleon Cloud.


Can Solana (SOL) overtake Ethereum (ETH)? HedgeUp (HDUP … – Analytics Insight

Is Solana (SOL) the Ethereum (ETH) killer that so many believe it to be, or will Buterin remain king of the Layer 1s? HedgeUp is offering its token, HDUP, for presale, along with the ability to invest in alternative assets like diamonds, watches and fine art. HedgeUp (HDUP) is in its second round of seed funding.

Most are familiar with the basics of Ethereum (ETH) and Solana (SOL), but for those wanting a recap: Ethereum (ETH) is essentially the first digital computer of crypto, the first chain to offer smart contracts, thus greatly expanding the functionality of the blockchain. Ethereum (ETH) is primarily used for DeFi, NFTs and other on-chain dapps, and is programmed in Solidity.

Solana (SOL) is a different, newer Layer 1 that is programmed in Rust, a different language that offers some benefits and some downsides: it is ostensibly easier to use, but known by fewer developers. Solana (SOL) is most famous for being one of, if not the, fastest blockchains in real transaction speed. Ethereum (ETH) currently processes around 25-30 TPS, while Solana (SOL) can do an exciting 700 TPS, with the potential to go much faster.

Still, Vitalik Buterin, creator of Ethereum (ETH), says that the future of ETH lies in Layer 2s and zk-rollup solutions such as Polygon or Arbitrum, which greatly increase speed and scalability and decrease fees, while still making use of the safety and decentralization of Ethereum (ETH). If he is right about that, then we may not need Solana (SOL), especially as Solana (SOL) is much more centralized than Ethereum (ETH). This is important, since decentralization prevents 51% attacks and helps to keep chains accountable, unlike, say, Terraform Labs, whose centralisation ultimately made it possible for founder Do Kwon to get away with shady practices.

Solana (SOL) is known for being unreliable and the blockchain has frozen and had to be restarted on more than one occasion, with outages lasting several hours.

Now let's ask this: how much money would it take for Solana to overtake Ethereum in terms of value? Solana's Fully Diluted Value (FDV) is around $11bn and Ethereum's is over $220bn; that's approximately 1,845% more! Solana (SOL) is currently trading at around $21.70, so if this growth were to happen (and ETH's price didn't change at all), then Solana (SOL) would be worth around $320 a coin. However, for this to happen we would need to see far bigger adoption, as it would take over $200bn of new investment into Solana (SOL) to make this possible, and that isn't on the horizon any time soon as far as we know.

High percentage increases in established coins are unusual, meaning that it is harder to make very high returns with coins like SOL and ETH. Although they make a great addition to a crypto portfolio, to benefit from the really crazy growth that we have seen, it's best to add some new and upcoming cryptos to your portfolio, such as HedgeUp (HDUP).

In a world where inflation is rising, the stock market is not doing well, and there are global macroeconomic problems, where can you put your money so that it will bring you long term returns? HedgeUp (HDUP) has a solution for that.

HedgeUp (HDUP) enables investors to buy fractions of alternative assets, backed by F-NFTs. Maybe you always wished to own a piece of a Banksy, or even a Gauguin; or perhaps you predict that gold and diamonds are going to go up in value but you can't afford a large amount, or perhaps don't have access at all. HedgeUp (HDUP) will value, tokenize and safely store alternative assets, and then sell the NFT fragments to investors for as little as $1.

HedgeUp (HDUP) is on sale for $0.013, and the HedgeUp (HDUP) team has raised almost half a million dollars in just a few days. With a predicted minimum launch price of $0.09, big gains are likely, especially for PG holders of HedgeUp (HDUP).
