Category Archives: Computer Science

Dunkirk native receives honor | News, Sports, Jobs – Evening Observer

Dunkirk native David Van Wey graduated from Dunkirk High School in the class of 1985. His principal was Mr. John Mancuso, and his computer science teacher was Mr. James Will (a DHS class of 1961 graduate). Both encouraged Van Wey to continue his education in the computer science field, and after graduation he began his college career at the Rochester Institute of Technology.

While pursuing a bachelor's degree in computer science at RIT, he simultaneously worked in the office of the National Technical Institute for the Deaf (NTID). Having made many new friends and colleagues, he was referred for a position at the University of Rochester's Laboratory for Laser Energetics. Working in the laser lab sparked an interest that led him to continue his education, ultimately earning a master's degree in computer science from the University of Rochester.

In April 2023, he received a letter from the president of the University of Rochester, Sarah C. Mangelsdorf, stating that he had been chosen to receive one of the three Witmer Awards for Distinguished Service. Mangelsdorf stated in the letter:

For over 30 years, your commitment to the Laboratory for Laser Energetics (LLE) and University has been invaluable. From creating relational databases to modernizing operations years ago to standing watch over the laser system one shift per week for over 20 years as a trained power conditioning operator to becoming a highly respected and appreciated human resources professional today, your contributions exemplify our Meloria values.

He was presented with the award and congratulated by the University of Rochester's Board of Trustees. Van Wey plans to continue the rest of his career at the University of Rochester, eventually retiring in his hometown of Dunkirk.


Artificial intelligence and us: Marquette's Intersection faculty roundtable | Marquette Today

Intersection is a recurring feature in Marquette Magazine that brings together faculty members from different disciplines to share perspectives on a consequential topic. This time, with large language models such as ChatGPT and AI-driven image and video generators exploding into our lives, schools and economy, three professors reflect on the changes AI is making us grapple with. Following are key excerpts from a conversation with them.

What's the most promising or exciting thing you've seen from the world of AI over the past year?

MZ: When I study how users interact with these technologies, there's a lot of utility. I'm not even thinking about ChatGPT or similar platforms, but about how smart devices are becoming better at processing voice commands and predicting my needs and wants. I joke in my classes that we critique the issues and challenges posed by AI, but I rely on Apple Maps. I rely on Grammarly to autocorrect my grammar. So, there are definitely benefits. AI is helping to make a lot of these tools better for a lot of people.

NY: Generative AI is the most promising thing that I have seen. I have been using it in a lot of useful ways in my research to address challenges that had been hard to solve. I can generate synthetic data to compensate for a lack of enough data. I can remove the biases in my data set. This is something that has amazed me in the last few years.

JSL: The way AI can handle huge sets of data has wonderful implications for our approaches to major ethical problems, such as climate change, because AI can handle calculations that an individual person can't. Same with medicine. So on those issues AI could be really promising, especially when we make accountability and transparency priorities, so people understand, at least somewhat, what's going on behind what AI has generated for them.

What has been the most concerning development or the AI-related challenge that most urgently needs addressing?

JSL: For professors in the humanities, what's going on in our circles is the question of what we're trying to teach our students to do: composing an essay, most obviously. Is that still perceived as useful, or is it in fact not even useful to them anymore? How can we convince our students that the creative process, which is something we've been given by God, and the act of writing and the rigor it entails, is actually important for them? It's like a basketball player practicing their shots. There are some things you just can't have a machine do for you, for your own growth. (Saint-Laurent also cited concerns about the use of deepfakes by malicious actors or states, and about AI potentially worsening inequalities.)

NY: There are a few things I should mention: ethical and privacy risks, and the use of generative AI for creating fake content (audio, video, images and even assignments at school) that can put people at risk. My other concern is our need to learn how to have human-AI collaboration, because it's crucial to know what you need from AI and how to use it correctly. A final point in terms of using these algorithms is their explainability. Do their processes make sense? The algorithms are getting better and better, but the interpretability of the AI is another concern.

MZ: It's hard to add to the points already being made. But a broader concern that I share with students is an overall kind of quantification bias that's emerged, which assumes AI or anything data-driven will be inherently correct and better than having a human make a decision or a prediction. We rely on algorithmic and AI-driven systems without having that explainability, as Nasim was saying. And now that data, the things we can compute and put into a model, are what matters most, that could have an impact on things like humanness or imperfection or broader humanistic qualities that we're losing because of the reliance on these models.

How dramatically do you see AI impacting higher education? How should Marquette prepare for that impact?

MZ: Part of me asks, why are we treating this differently than the emergence of the calculator or an online encyclopedia? Students have always been able to copy or find shortcuts for their work, but it does seem to be at a different scale now. Does that require us to change our mode of instruction or what we're expecting of students? I suspect we expect different things in a math classroom than we did 30 years ago because memorizing the multiplication tables is just not as necessary.

Still, in our computer science classrooms, we're struggling because there are tools out there that can write code for our coding assignments. Students can ace the assignments, but when there's an exam and they're forced to do it on their own, they're suddenly struggling, realizing what they're not learning. That puts pressure on us as instructors to help students see that difference.

JSL: It's kind of hard to overstate the impact. Frankly, I'm actually worried about the divisions among the faculty and the students that this might cause. This has to be an interdisciplinary endeavor for us as a university, so we are talking to one another in important ways and not bringing our own bias against other disciplines, saying, "We know this better than you do." It's impacting all of us, and it's all of our responsibility.

We're not going to have a unified voice, but we must be able to identify what our concerns are, what is in keeping with the Jesuit mission of the university, and how those are still really important questions for us all, so we're not becoming like the Luddites over here and the techies over there. No, we have to put our students at the center, and also us as professors, to really try to get the human heart back in there. We can see what AI can do. Yes, it is amazing, but it also can give us greater awe about what the human person is too, what our capacities are, and help our students not lose sight of that.

NY: Most of the faculty are now struggling with assignments and things that are given to them by students who may be using AI tools. It's all part of an adjustment process. It reminds me of how people were not ready to use elevators when they were first introduced; people were still using stairs. We need to get ready. As faculty, we need to learn how to use these tools and teach students how to use them correctly. There is now software to help us detect fakes and copied information. So, we need to get ready and define rules. Then we can probably even benefit in the classroom. AI may help us find personalized content for the students of the future based on their needs and GPA, and recommendations for custom course content. We could benefit as people did when they began using elevators.


The tentacles of retracted science reach deep into social media. A simple button could change that. – EurekAlert

Image: How the interface showing more information about retracted science would work. Credit: Judy Kay and authors, University of Sydney

In 1998, a paper linking childhood vaccines with autism was published in the prestigious journal The Lancet, only to be retracted in 2010 when the science was debunked.

Fourteen years since its retraction, the paper's original claim continues to flourish on social media, fuelling misinformation and disinformation around vaccine safety and efficacy.

A University of Sydney team is hoping to help social media users identify posts featuring misinformation and disinformation arising from now-debunked science. They have developed and tested a new interface that helps users discover further information about potentially fraught claims on social media.

They created and tested the efficacy of adding a "more information" button to social media posts. The button links to a drop-down that allows users to see more details about claims or information in news posts, including whether that news is based on retracted science. The researchers say social media platforms could use an algorithm to link posts to details of retracted science.
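The paper does not specify how that linking would be implemented; one plausible back-end approach is to extract paper identifiers (such as DOIs) from a post's text and look them up in a retraction database such as the Retraction Watch dataset. The sketch below is an illustration of that idea only, not the team's system; the `RETRACTED` mapping, the regex, and `check_post` are all hypothetical names and sample data.

```python
import re

# Hypothetical in-memory slice of a retraction database keyed by DOI,
# e.g. as could be loaded from the Retraction Watch dataset.
RETRACTED = {
    "10.1016/s0140-6736(97)11096-0": "The Lancet, retracted 2010",
}

# Loose DOI pattern: "10.", a registrant prefix, a slash, then a suffix.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/[^\s\"<>]+")

def check_post(text):
    """Return (doi, retraction details) for retracted DOIs cited in a post."""
    hits = []
    for doi in DOI_PATTERN.findall(text):
        doi = doi.rstrip(".,;").lower()  # trim trailing punctuation, normalize case
        if doi in RETRACTED:
            hits.append((doi, RETRACTED[doi]))
    return hits

post = "New study says vaccines cause autism! doi:10.1016/S0140-6736(97)11096-0"
print(check_post(post))
```

A real deployment would need much more, such as matching news URLs to papers when no DOI appears in the post, but the lookup itself is cheap enough to run at posting time.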

Testing of the interface among a group of participants showed that when people understand the idea of retraction and can easily find out when health news is based on a claim from retracted research, they are less likely to share it, which helps reduce the impact and spread of misinformation.

"Knowledge is power," said Professor Judy Kay from the School of Computer Science, who led the research. "During the height of the COVID-19 pandemic, myths around the efficacy and safety of vaccines abounded. We want to help people to better understand when science has been debunked or challenged so they can make informed decisions about their health," she said.

"The ability to read and properly interpret often complex scientific papers is a very niche skill; not everybody has that literacy or is up to date on the latest science. Many people would have seen posts about now-debunked vaccine research and thought: it was published in a medical journal, so it must be true. Sadly, that isn't the case for retracted publications."

"Social media platforms could do much better than they do now," said co-author and PhD student Waheeb Yaqub. "During the height of the COVID-19 pandemic, myths around the efficacy and safety of vaccines spread like wildfire."

"Our approach shows that when people understand the idea of retraction and can find when health news is based on a retracted science article, it can reduce the impact and spread of misinformation," he said.

Tool boosts literacy of processes behind scientific research

The research was conducted with 44 participants who started with little or no understanding of scientific retraction. After completing a five-minute tutorial, they rated how various reasons for retraction make a paper's findings invalid.

The researchers then studied how participants used the "More Information" button. They found the new information altered the participants' beliefs about three health claims based on retracted papers shared on social media.

These claims were: that masks are effective in limiting the spread of coronavirus; that the Mediterranean diet is effective in reducing heart disease; and that snacking while watching an action movie leads to overeating.

The first claim was based on two papers, one of which had been retracted and one of which hadn't. The other two claims were based on retracted papers. The researchers specifically chose papers about which participants would have differing knowledge.

Participants were confident that masks were effective. Most didn't know about the Mediterranean diet and so were unsure whether the claim was true. Many believed the snacking claim because of their personal experience of snacking during films.

The button influenced participants when they knew little about a topic to begin with. When participants discovered a post was based on a retracted paper, they were less likely to like or share it.

On social media, both misinformation (the inadvertent spread of false information) and disinformation (false information deliberately spread with malicious intent) are rising.

Papers can be retracted when problems with methodology, results or experiments are found.

The researchers say it would be feasible for social media platforms to develop back-end software that links posts to databases of retracted papers.

"If social media platforms want to maintain their quality and integrity, they should look to implement simple methods like ours," Professor Kay said.

The study was published in Proceedings of the ACM on Human-Computer Interaction.

DECLARATION

The authors declare no conflicts of interest. Waheeb Yaqub is the recipient of a research scholarship.

Journal: Proceedings of the ACM on Human-Computer Interaction

Method of Research: Survey

Subject of Research: People

Article Title: Foundations for Enabling People to Recognise Misinformation in Social Media News based on Retracted Science

Article Publication Date: 26-Apr-2024


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.

The rest is here:

The tentacles of retracted science reach deep into social media. A simple button could change that. - EurekAlert

AI for more caring institutions – Harvard School of Engineering and Applied Sciences

More and more public services, such as affordable housing, public school matching and child welfare, are relying on algorithms to make decisions and allocate resources. So far, much of the work that has gone into designing these systems has focused on workers' experiences using them or communities' perceptions of them.

But what about the actual impact these programs have on people, especially when the decisions the systems make lead to denial of services? Can you design algorithms to help people make sense of, and contest, decisions that significantly impact them?

Naveena Karusala, a postdoctoral fellow at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), together with Krzysztof Gajos, the Gordon McKay Professor of Computer Science at SEAS, and a team of researchers, is rethinking how to design algorithms for public services.

"Instead of only centering the worker or institution that is using the tool to make a decision, can we center the person who is affected by that decision in order to work towards more caring institutions and processes?" asked Karusala.

In a paper being presented this week at the Association for Computing Machinery's conference on Human Factors in Computing Systems, Karusala and her colleagues offer recommendations to improve the design of algorithmic decision-making tools, making it easier for people impacted by those decisions to navigate all the steps in the process, especially when they are denied.

The researchers aimed to learn from areas where algorithms currently aren't being used but could be deployed in the future. They looked specifically at public services for land ownership in rural South India and affordable housing in the urban Northeast United States, and at contestation processes after applicants are denied services.

Governments in the U.S. and India as well as around the world recognize the right to contest a denial of public services, and increasingly so when denied by an algorithm. But contestation processes can be complex, time consuming and difficult to navigate, especially for people in marginalized communities.

Intermediaries like social workers, lawyers and NGOs play an important role in helping people navigate these processes and understand their rights and options. In public health, this concept is known as accompaniment, where community-based aid workers assist people in under-resourced communities to navigate complex healthcare systems together.

"One of the takeaways of our research is the clear importance of intermediaries and embedding the idea of accompaniment into the algorithm design," said Karusala. "Not only should these intermediaries be involved in the design process, but they should also be made aware of how the decision-making process works, because they're the ones that bridge communities and public services."

The researchers suggest that algorithmic decision-making systems should be designed to proactively connect applicants to those intermediaries.

"Today, many AI researchers are focused on improving an algorithm's ability to explain its decision, but that isn't useful enough to the people who have been denied service," said Karusala.

"Our findings point to the fact that rather than focusing only on explanations, there should be a focus on other aspects of algorithm design that can prevent denials in the first place," said Karusala.

For example, if a background check turns up information that puts a person on the boundary between approval and disapproval for housing, algorithms need to be able to ask for additional information to either make a decision or ask a human reviewer to step in.
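One common way to express that idea is a "reject option" around the decision threshold: instead of a single approve/deny cutoff, scores in a gray zone trigger a request for more information or review by a human caseworker. This is a generic sketch of that pattern, not the system the researchers studied; the threshold values and function name are made up for illustration.

```python
def route_application(score, deny_below=0.4, approve_above=0.6):
    """Three-way routing with a human-review band around the cutoff.

    Scores between the two (hypothetical) thresholds are not auto-denied;
    they trigger a request for more information or escalation to a
    human reviewer, rather than a borderline automatic rejection.
    """
    if score >= approve_above:
        return "approve"
    if score < deny_below:
        return "deny"  # applicant could still be connected to contestation support
    return "request more info / human review"

print(route_application(0.85))  # clear case: approve
print(route_application(0.52))  # borderline case: escalate instead of deny
```

The width of the gray zone is a policy choice: a wider band sends more borderline applicants to human review at higher staffing cost, while a narrow band preserves automation but denies more edge cases outright.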

"These are some concrete ways that the burden often placed on marginalized communities could be shared with not only intermediaries, but also public service administrators and algorithmic tools," said Karusala.

"This research is particularly significant because it challenges an assumption held deeply in the computing community: that the most effective way to provide people with grievance redressal mechanisms is for algorithms to provide explanations of their decisions," said Gajos. "Instead, this research suggests that algorithms could be used throughout the process: from identifying individuals who may not apply on their own and may need to be encouraged to do so, to helping applicants prepare and contextualize information to make applications relevant and informative, to navigating contestation strategies."

The research was co-authored by Sohini Upadhyay, Rajesh Veeraraghavan and Gajos.


76% of weighted Career and Technical Education course enrollment is male. Here’s why. – Central Times

For seniors like Luoxu Chen, being surrounded by men has become the norm in advanced courses like AP Computer Science A and Multivariable Calculus. These classes prepare students for industries that are dominated by men. But even at the high school level, classes are also majority male with low female enrollment.

Women take weighted Career and Technical Education (CTE) courses at a disproportionately lower rate compared to men, according to 2021-22 data from District 203.

District 203 records a student's gender as female, male or neither, based on how the student identifies. The CTE department's weighted courses are 23.37% female and 76.63% male, numbers substantially disproportionate to the school's 52.5-47.3-0.2% split along those same gender groups.

Classrooms with different groups and types of students have different perspectives, which benefits their curricular and social-emotional learning, according to CTE teacher Derek Miller, who completed his doctoral dissertation on gender disparities in computer science.

"If you don't have a diverse group, you're going to miss things in your software," Miller said. "This wasn't software, but in terms of gender inequity, very early on when airbags were first developed, they ended up killing women and children. The reason that happened is because all of the [male] engineers who were working on airbags modeled crash test dummies after themselves, so they didn't think to test it out on people who might be built more like women or children."

Currently, most weighted classes in CTE are male-dominated.

"I think perhaps the minority gender might feel like they're singled out in a way because they're unique in that classroom," CTE department chair Lynn Andrees said. "I could see where there could be some social-emotional questions that they might have, like, 'Why am I the only male in a typically female classroom?' or 'Why am I the only female in a male-dominated classroom?'"

Senior Athena Chen has taken a multitude of weighted CTE courses in her time at Central. The differences by gender usually ranged from somewhat noticeable to super apparent.

"In the beginning, it definitely made me uncomfortable going into my accounting class and seeing it was only guys," Chen said. "I just felt like I was surrounded by finance bros. But I feel like it's just like that in the workforce, and it's kind of something you just have to get used to. If anything, it's more of a motivation to keep going."

The issue of female enrollment has not gone unnoticed by the CTE department. What makes this problem difficult to solve is the same industries they are preparing students for are also often perpetuating gender norms, according to Andrees.

However, this does not mean the CTE department hasn't made strides to help remedy the issue.

"We are aware of [the enrollment disparities], especially as we look at the student population sitting in front of us. And I think it extends beyond gender as well. I think it crosses all lines of equality in a way. It's hard to change because typically the industry that we're preparing people for that's what [the courses] align with" - CTE department chair Lynn Andrees

"We did a lot of equity training," Andrees said. "I think it was having teachers, first of all, look at their own implicit bias and trying to learn about that, then understanding where students are and [understanding] their feelings socially [and] emotionally so that we could work with students in our classrooms better and in the whole school."

Through his dissertation, Miller discovered different ways to help combat the female enrollment problem. One way to do this is through changing the curriculum.

"Part of it is showing the connection of computer science to the real world, not just focusing on puzzles and math type of stuff," Miller said. "Showing that computer science can apply to just about any other field."

The CTE department tries to implement this real-world connection in its courses as much as possible, but it is mostly seen once students reach later courses like software engineering, where students are involved in big projects that apply to the world.

While Central still has room to improve, it is also better than many surrounding communities.

"I think we have a lot of parents, like mothers, in the district who are in computer science," Miller said. "So I think that is part of the reason why we don't have as big a disparity as some other schools might."

Chen said that the difference between the number of males and females in her AP Computer Science class wasn't too drastic, but it still impacted her.

"I noticed it when you have to pick partners because normally, it's easier for two [of the same] gender to pair up," Chen said. "I don't know why, but [it just is]. And so then it's difficult when you're trying to pick a partner because there's so many boys."

Unlike weighted CTE courses, the science department's weighted courses are among the most gender-balanced at the school, with 49.42% female and 50.58% male enrollment.

Science department chair Dan Olandese said the science and CTE departments are like apples and oranges, and they are not comparable in this way.

"Some other departments have offered a lot more specialized [courses where] the number of electives is greater," Olandese said. "The variety of classes that CTE offers is far greater than [science's] and probably always will be. Because you offer such a great variety, you have lower individual enrollment in classes because you have so much more to pick from, as opposed to science for, like, freshman and sophomore year."

The gender disparities that exist in weighted courses extend beyond the classes and into extracurriculars.

"I think it doesn't even have to be curricular, because I used to be part of Wall Street Society, which was like an investing club, and I think it might have died because I quit last year, but it was literally all guys, and I was the only one," Chen said. "But doing more extracurricular stuff, like sponsoring girls-who-invest or girls-in-finance clubs, just like [Girls in Engineering, Math and Science] for CTE."

Jake Pfeiffer and Nolan Shen contributed to this story


Healey-Driscoll Administration Announces $195,000 in Grants During Computer Science Workforce STEM Summit – Mass.gov

BRIDGEWATER – The Healey-Driscoll Administration yesterday hosted the 2024 Massachusetts STEM Summit at Bridgewater State University that focused on the evolving computer science education and workforce landscape. As the computer science workforce continues to grow, the administration is promoting opportunities that enable more diverse and inclusive pathways for interested students.

At the Summit, the Administration announced that Massachusetts STEM Week will take place October 21-25, 2024. To gear up for STEM Week, the Administration also announced $195,000 in grants going to five educational organizations to support STEM-related hands-on learning opportunities for students in the fall.

"In Massachusetts, we want to open doors for our young people, especially to STEM pathways. I want to encourage students, regardless of their circumstance, zip code, or socioeconomic status, to explore every option available to them. By empowering them to see STEM as a viable, exciting pathway to their future, we will inspire the next generation of innovators," said Governor Maura Healey. "I am looking forward to next year's STEM Week and all of the exciting opportunities our STEM Design Challenge Awardees will provide for students."

"It was wonderful to be with students, educators, workforce partners, colleagues and more at our STEM Summit to highlight the importance of computer science and the impact educational opportunities in this field can have on students and our state. We will continue to lift up these careers for all students, strengthening our tech workforce pipeline and economy, driving our competitiveness and building a brighter future for all," said Lieutenant Governor Kim Driscoll, co-chair of the STEM Advisory Council.

At the Summit, Lieutenant Governor Driscoll, Education Secretary Tutwiler, and other state officials first toured Bridgewater State's Cybersecurity Training Center and biology labs. Then students from Southeastern Regional Vocational Technical High School shared their perspectives on computer science education and the impact of meaningful computer science and IT career exposure opportunities. This was followed by a panel discussion with the Mass Cyber Center on the role of mentoring and career exposure in advancing workforce development and generating awareness about the growing workforce needs in the cybersecurity industry.

"I want to thank Bridgewater State University and President Clark for hosting the STEM Summit. Their Cybersecurity Training Center and biology labs were impressive examples of the STEM opportunities available for students across the Commonwealth. I am leaving the Summit energized and optimistic about the future for so many reasons. Among them are the Southeastern Regional Vocational Technical High School students who shared their stories and how the state's investments are setting them up for success," said Secretary of Education Dr. Patrick Tutwiler. "I also want to congratulate the 2024 STEM Design Challenge Awardees and look forward to how they continue to engage Massachusetts students in STEM."

"As a computer science major in undergrad myself, I was fortunate to be surrounded by incredible educators, peers, mentors and workforce development opportunities," said Secretary of Technology Services and Security Jason Snyder. "The STEM Summit is illustrative of what these partnerships mean to today's students, who represent the future of our statewide workforce, and I am so glad these opportunities and resources, like Bridgewater State's Cyber Range, exist right here in our backyard. We know these rapidly growing fields in emerging tech need highly skilled, diverse, workforce-ready individuals to step up in the coming years, and it is clear the STEM Advisory Council is a major driver toward that goal."

The STEM Summit is organized by the Executive Office of Education and the STEM Advisory Council. The STEM Advisory Council was established to expand access to high-quality STEM education for students across Massachusetts and is currently co-chaired by Lieutenant Governor Driscoll, U.S. Congressman Jake Auchincloss and Chairman, President and CEO of Vertex Pharmaceuticals Dr. Jeffrey Leiden.

"Massachusetts continues to have all the essential pieces and partnerships to sustain a robust workforce pipeline to support companies and transform lives across the Commonwealth," said Massachusetts Life Sciences Center Acting CEO and Vice President of Economic Development and Partnerships Jeanne LeClair. "We are incredibly proud of our support for Bridgewater State University as it demonstrates our shared mission to grow our life sciences ecosystem and broader STEM workforce on a regional basis."

2024 STEM Design Challenge Awardees

Museum of Science – In the year of the EarthShot, the Museum of Science is launching an environmental engineering challenge for students in grades 3-5 to raise awareness about the prevalence of plastic pollution. Students will consider the effects of plastics on ecosystems and communities as they engineer filters to reduce plastic waste entering bodies of water. This challenge can also be modified for students in grades 6-8.

Wade Institute for Science Education – This Design Challenge, Extreme Zoo Makeover: A STEM Approach to Habitat Design, will engage students in grades 5-8 in a unique experience that integrates science and engineering concepts in a series of inquiry-based investigations that lead up to a student-driven challenge. The Wade Institute is partnering on this design challenge with the Lloyd Center for Environment and Buttonwood Park Zoo.

Kids In Tech – STEM Goes Green: The STEM Challenge introduces students to a range of environmental topics, emphasizing how human activities impact water resources. Through hands-on activities, like a simulation demonstrating overfishing with goldfish crackers and a model of ocean gyres in a bottle, students gain practical insights into the issues facing marine ecosystems. It also covers plastic pollution and its journey through watersheds, encouraging students to brainstorm solutions to reduce environmental impact. Students learn about the water cycle, exploring concepts like evaporation, condensation, and precipitation, and they perform water quality tests to understand the importance of clean water for human health. By the end of the unit, students will be better equipped to think critically about water sustainability and contribute to positive environmental change. This challenge is geared towards elementary/middle school students.

PBLWorks: The Future of Work is a high-quality, meaningful applied project-based learning experience created by PBLWorks for Mass STEM Week, aligned to Massachusetts standards and designed for middle and high school students. Students have the opportunity to investigate in-demand careers and dialogue with industry experts. For a final product, student teams use what they learn to develop a website that will prepare and inspire students in their community to pursue the career they have chosen to explore. They engage in peer critique and use it to revise their designs. Students combine their team websites into one class website to share with an authentic audience of students in their community. This website becomes a resource for college and career counseling services at their school site and others in the district.

WPI STEM Education Center: For STEM Week 2024, the STEM Education Center will expand the I Am STEM Lesson Library (PK-7) with up to eight additional lessons. In addition, a new computer science category will be established, and some of the newly created lessons will be aligned with digital learning and computer science (DLCS) standards. Following teachers' requests, the Center will create letters to families and caregivers to be added to all lessons in the library. Lastly, it will conduct a series of teacher trainings on the lessons. As in previous years, the Center will hire a team of expert STEM teachers to develop the additional lessons.

###

View post:

Healey-Driscoll Administration Announces $195,000 in Grants During Computer Science Workforce STEM Summit - Mass.gov

Berks Best 2024 computer science winner Vanesa Aguay a bridge between generations – Reading Eagle

Read this article:

Berks Best 2024 computer science winner Vanesa Aguay a bridge between generations - Reading Eagle

Stanford AI Projects Greenlighted in National AI Research Resource Pilot – Stanford HAI

On May 6, the U.S. National Science Foundation and the Department of Energy awarded grants to 35 research teams for access to advanced computing resources through the National Artificial Intelligence Research Resource (NAIRR) pilot. This initial wave of awarded projects includes scholars from across the U.S. who are working in clinical medicine, agriculture, biochemistry, computer science, informatics, and other interdisciplinary fields. Two Stanford AI projects from the School of Engineering and School of Medicine were selected to participate in the pilot.

Part of the 2023 Executive Order on the Safe, Secure, and Trustworthy Development and Use of AI, the NAIRR pilot launched in January 2024 with four stated goals: spur innovation, increase diversity of talent, improve capacity, and advance trustworthy AI. Stakeholders in academia, industry, and government see this program as a critical step toward strengthening U.S. leadership in AI and democratizing AI resources for public sector innovation.

"The NAIRR pilot is a landmark initiative that supports applied AI research and will benefit the entire nation," said Stanford Institute for Human-Centered AI Deputy Director Russell Wald. "No AI scholar should be constrained by the high cost of compute resources and access to data to train their models."

Most of the awarded projects are given computational time on NSF-funded supercomputer systems at the University of Illinois Urbana-Champaign, University of Texas at Austin, and Pittsburgh Supercomputing Center; additionally, the DOE will allocate resources at its Summit supercomputer at Oak Ridge National Laboratory and AI Testbed at Argonne National Laboratory to a few of the research teams.

A team from the Stanford Intelligent and Interactive Autonomous Systems Group (ILIAD), led by HAI Faculty Affiliate Dorsa Sadigh, an assistant professor of computer science and of electrical engineering, submitted a proposal to continue groundbreaking work in the domain of human-robot and human-AI interactions. The project will focus on learning effective reward functions for robotics using large datasets and human feedback.

Reward functions are key to a machine learning technique called reinforcement learning, which works by training a large language model to maximize rewards. When humans provide feedback as part of the training process, the model learns how to make decisions that are aligned with human priorities. Stanford computer science PhD student Joey Hejna says that applying this technique to real-world robotics presents new challenges because it requires understanding the visual world, which is captured by modern vision-language models. Another challenge is that it's not enough for the model to get the right result; how it arrives at that answer also matters. Researchers will want to make sure the robot operates safely and reliably around people, and they may need to personalize how certain robots interact with humans in a home-care setting, for example.
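The core idea of learning a reward function from human feedback can be sketched with a toy Bradley-Terry preference model, a standard formulation in this area. The sketch below is purely illustrative and is not the ILIAD team's code; all data, sizes, and parameters are invented. Given pairs of candidate trajectories and a human's choice of which one is better, it fits reward weights so that the preferred trajectory scores higher.

```python
# Illustrative sketch: fit a linear reward function from pairwise human
# preferences via the Bradley-Terry model, where the probability that
# trajectory a is preferred over b is sigmoid(r(a) - r(b)).
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each trajectory is a feature vector; the hidden "true"
# reward is a linear function of those features.
true_w = np.array([1.0, -2.0, 0.5])
features = rng.normal(size=(200, 3))  # 200 candidate trajectories

# Simulate human feedback: for random pairs, the rater prefers the
# trajectory with the higher true reward.
pairs = rng.integers(0, len(features), size=(500, 2))
prefs = features[pairs[:, 0]] @ true_w > features[pairs[:, 1]] @ true_w

# Gradient ascent on the Bradley-Terry log-likelihood.
w = np.zeros(3)
lr = 0.1
for _ in range(300):
    diff = features[pairs[:, 0]] - features[pairs[:, 1]]
    p = 1.0 / (1.0 + np.exp(-(diff @ w)))          # predicted preference prob
    grad = diff.T @ (prefs.astype(float) - p) / len(pairs)
    w += lr * grad

# The learned reward should rank trajectories like the true one does.
```

In practice the linear scorer is replaced by a neural network (here, a large vision-language model), which is why the training runs described in the article need so much compute, but the preference-fitting objective is the same.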

"Training robot models that can work in the real world will require a massive amount of compute power," Hejna explains. "High-performing VLMs usually have at least 7 billion parameters. This project would not be possible without access to the GPU hours from the National Science Foundation."

The second Stanford project to receive NSF support comes out of the School of Medicine's Clinical Excellence Research Center (CERC), dedicated to reducing the cost of patient care. Part of a multiyear initiative to enhance healthcare environments by integrating smart sensors and AI algorithms, the awarded project seeks to develop computer vision models that can collect and analyze comprehensive video data from ICU patient rooms to help doctors and nurses better track patients' health.

A key aspect of the research is to address potential biases in the AI models used for predicting patient status and monitoring clinical activities. By analyzing demographic data from electronic health records, the team aims to identify and correct algorithmic biases that might affect predictions across different ethnicities and sexes. "The ultimate goal is to develop bias-free algorithms and propose interventions to ensure fair and accurate patient monitoring and care in ICUs," said the team's lead scholar, Dr. Kevin Schulman.
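One common first step in the kind of bias audit the team describes is comparing a model's error rates across demographic groups. The sketch below is a generic illustration of that idea, not the Stanford team's method; the groups, labels, and error rates are all synthetic.

```python
# Illustrative bias audit: compare a model's error rate across two
# demographic groups on synthetic data. A large gap between groups
# flags a potential algorithmic bias worth investigating.
import numpy as np

rng = np.random.default_rng(1)

# Toy predictions for 1,000 patients, each with a recorded group.
groups = rng.choice(["A", "B"], size=1000)
y_true = rng.integers(0, 2, size=1000)

# Simulate a model that is systematically worse for group "B":
# its predictions are flipped 30% of the time vs. 10% for group "A".
flip = np.where(groups == "B", 0.30, 0.10)
y_pred = np.where(rng.random(1000) < flip, 1 - y_true, y_true)

def error_rate(group):
    """Fraction of wrong predictions within one demographic group."""
    mask = groups == group
    return float(np.mean(y_pred[mask] != y_true[mask]))

gap = error_rate("B") - error_rate("A")  # the disparity to investigate
```

Real audits extend this to richer metrics (false-positive and false-negative rates, calibration) and to interventions that shrink the gap, but per-group comparisons like this are the usual starting point.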

HAI's leadership team has been a driving force behind the creation of a National AI Research Resource since the founding of the institute in 2019. Co-Directors Fei-Fei Li and John Etchemendy started to organize universities and tech companies in 2020, and they initiated the call for a government-led task force to establish the program.

"From our earliest conversations with universities, industry executives, and policymakers, we felt that American innovation was at stake," said Li. "We knew that support from Congress and the president could have a meaningful impact on the future of AI technology."

According to Etchemendy, "The start of this pilot program marks a historic moment for U.S. researchers and educators. It will rebalance the AI ecosystem by supporting mission-driven researchers who want AI to serve the public good."

Reflecting on the years of strategic planning and dedication that have led to this milestone, Wald added, "John and Fei-Fei's vision, combined with the extraordinary support of the Stanford community and our country's policymakers, is leading to greater access to AI research not just at Stanford but at all of America's universities."

Immediately following the May 6 announcement of initial awards, the NAIRR pilot opened the application window for a second wave of projects. With contributions from industry partners, a wider range of technical resources will be available to applicants this round, including access to advanced computing systems, cloud computing platforms, foundation models, software and privacy-enhancing tools, collaborations to train models, and education platforms.

Researchers and educators can apply for access to these resources and view descriptions of the first cohort projects on the NAIRR pilot website.

Stanford HAI's mission is to advance AI research, education, policy and practice to improve the human condition. Learn more.

Read more here:

Stanford AI Projects Greenlighted in National AI Research Resource Pilot - Stanford HAI

$60 Million Gift to Noyce School of Computing Helps Students Solve ‘Challenges of Tomorrow’ – Cal Poly

Cal Poly is home to the first interdisciplinary school of its kind thanks to a transformative $60 million gift from the Robert N. Noyce Trust.

The Noyce School of Applied Computing combines three departments (Electrical Engineering, Computer Science and Software Engineering, and Computer Engineering), with Statistics joining as an affiliate, paving the way for students and faculty to use computing principles, concepts and technologies to address real-world problems.

Currently, the demand for graduates with an applied computing degree is far outpacing supply, with the U.S. Bureau of Labor Statistics predicting jobs in computing and information technology will climb 15% between 2021 and 2031, much faster than the average for all occupations.

Software Engineering, Computer Science, Computer Engineering and Electrical Engineering accounted for 15% of recent applications to Cal Poly, and the number of applicants grows every year. The Noyce School will allow Cal Poly to increase the number of qualified students accepted to these programs.

"Our students are going out into the leading-edge industrial companies," said Amy Fleischer, dean of Cal Poly's College of Engineering. "They're going to change the world, and the education that we're going to provide here in the Noyce School will help them do that."

The $60 million gift was made in honor of Robert N. Noyce, a co-founder of Intel and inventor of the integrated circuit, which fueled the personal computer revolution and gave Silicon Valley its name. Nicknamed "the Mayor of Silicon Valley," Noyce had an impact on the field of computing and society at large that cannot be overstated.

"We are thrilled that Dr. Noyce's legacy will be recognized and appreciated by the students and faculty at Cal Poly for generations to come," said Michael Groom, a trustee of the Robert N. Noyce Trust, when the gift was established. "We believe the establishment of The Noyce School of Applied Computing comes at a pivotal time, when there is a major deficit of new graduates in the fields of computing and computer sciences, and the need and demand for these skilled workers remains very high."

Led by Founding Director Chris Lupo, the Noyce School of Applied Computing will have a transformational impact on the university, allowing for the establishment of an endowment that will fund the Noyce School's operations in perpetuity and enable Cal Poly to offer state-of-the-art facilities, access to new interdisciplinary research projects and curricular and co-curricular opportunities for faculty and students.

Thanks to this generous gift, Cal Poly is already investing in state-of-the-art equipment for upgraded labs. Students will also have more opportunities to further their interests in teaching and learning, along with paid industry internships and mentors to provide guidance and counseling along the way. In addition, faculty will be provided with additional resources for teaching and applied research, professional development and innovative and collaborative curriculum design.

"Dr. Noyce's legacy will inspire students and faculty to grow and be the next industry trailblazers," said Cal Poly President Jeffrey D. Armstrong. "Through this generous gift from the Robert N. Noyce Trust, Cal Poly will be able to educate more students to solve the challenges of tomorrow."

Want more Learn by Doing stories in your life? Sign up for our monthly newsletter, the Cal Poly News Recap!

Subscribe to the Recap

View post:

$60 Million Gift to Noyce School of Computing Helps Students Solve 'Challenges of Tomorrow' - Cal Poly

Why we need computer science in every California school – Capitol Weekly

Opinion

by MARC BERMAN & PRINCESS CHOI-CARLSON posted 05.16.2024

OPINION When teachers and lawmakers team up to prioritize California students, the outcomes benefit everyone. That's why we are working together to ensure that every student has the computer science skills they need to succeed in today's job market and in life.

Computers and technology are an integral part of our everyday life. Nearly every industry across California, from tech and pharmaceutical to agriculture and automotive, relies on employees with computational skills. And every student, no matter where they are from, deserves to learn the basic computer skills needed to decipher and decode life in the 21st century.

Yet the data clearly shows that far too often, students from disadvantaged communities are not offered the same access to computer science and technology classes as their wealthy peers. We're working together to change that.

Despite being home to Silicon Valley, the global cradle of innovation and technology, California trails behind 31 other states in prioritizing computer science education. Half of all high schools in California don't offer any computer science courses. The disparities are more pronounced in rural and low-income communities, further widening the educational divide and depriving students of the chance to learn skills that are crucial to almost any career they may choose.

As a teacher at a Title 1 school in Riverside County, Princess Choi-Carlson saw first-hand how computer science courses engage students by empowering them to create their own projects with real-life applications, building critical problem-solving skills. Students taking introductory computer science courses in Riverside experienced the benefits of learning computer science and petitioned the local school board to offer AP Computer Science Applications when no such course was offered.

That's why we must pass AB 2097, the Computer Science for All Act, and ensure that every high school student in California has access to the training and knowledge necessary to thrive in our digital age. This legislation will require all public high schools to adopt a plan to offer at least one computer science course by the 2028-29 school year.

The bill will expand enrollment of underrepresented groups, including women, students with disabilities, and students in low socio-economic households. Prioritizing equity is critical for a more representative and inclusive tech workforce reflective of California's rich diversity.

The Computer Science for All Act is a commitment to our children's future and to California's economy. By ensuring every student has access to computer science education, California can lay the groundwork for a more equitable and informed society that gives all kids a chance to succeed.

Assemblymember Berman is the author of AB 2097, the Computer Science for All Act. Princess Choi-Carlson is an AP Computer Science teacher in Riverside.

Want to see more stories like this? Sign up for The Roundup, the free daily newsletter about California politics from the editors of Capitol Weekly. Stay up to date on the news you need to know.

Sign up below, then look for a confirmation email in your inbox.

Original post:

Why we need computer science in every California school - Capitol Weekly