
World University Rankings 2024 by subject: computer science – Times Higher Education

The computer science subject table uses the same trusted and rigorous performance indicators as the Times Higher Education World University Rankings 2024, but the methodology has been recalibrated to suit the discipline.

This year's table includes 1,027 universities, up from 974 last year.

View the World University Rankings 2024 by subject: computer science methodology

The University of Oxford leads the computer science ranking for the sixth consecutive year. Stanford University and Massachusetts Institute of Technology (MIT) switch places to take second and third place respectively.

The UK's Imperial College London and Princeton University in the US move into the top 10 at eighth and ninth position respectively. The National University of Singapore and Germany's Technical University of Munich fall out of this elite group.

China, Germany and Australia have seven universities each in the top 100. The highest-ranking among these is Tsinghua University in China, in 12th place.

Read our analysis of the subject rankings 2024 results

View the full results of the overall World University Rankings 2024

To raise your university's global profile with Times Higher Education, contact branding@timeshighereducation.com

To unlock the data behind THE's rankings and access a range of analytical and benchmarking tools, click here

Go here to see the original:

World University Rankings 2024 by subject: computer science - Times Higher Education


AI might disrupt math and computer science classes – in a good way – The Hechinger Report

For as long as Jake Price has been a teacher, Wolfram Alpha, a website that solves algebraic problems online, has threatened to make algebra homework obsolete.

"Teachers learned to work around and with it," said Price, assistant professor of mathematics and computer science at the University of Puget Sound, in Tacoma, Washington. But now, they have a new homework helper to contend with: generative artificial intelligence tools, such as ChatGPT.

Price doesn't see ChatGPT as a threat, and he's not alone. Some math professors believe AI, when used correctly, could help strengthen math instruction. And it's arriving on the scene at a time when math scores are at a national historic low and educators are questioning if math should be taught differently.

AI can serve as a tutor, giving a student who is floundering with a problem immediate feedback. It can help a teacher plan math lessons, or write a variety of math problems geared toward different levels of instruction. It can even show new computer programmers sample code, allowing them to skip over the boring chore of learning how to write basic code.

As schools across the country debate banning AI tools, some math and computer science teachers are embracing the change because of the nature of their discipline.

Related: How can schools dig out from a generation's worth of lost math progress?

"Math has always been evolving as technology evolves," said Price. "A hundred years ago, people were using slide rules and doing all of their multiplication with logarithmic tables. Then, along came calculators."

Sluggish growth in math scores for U.S. students began long before the pandemic, but the problem has snowballed into an education crisis. This back-to-school season, the Education Reporting Collaborative, a coalition of eight newsrooms, will be documenting the enormous challenge facing our schools and highlighting examples of progress. The three-year-old Reporting Collaborative includes AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.

Price teaches with these capable technologies in mind, making sure to first give students the skills in class by hand. Then, he discusses with them the limitations of the technologies they might be tempted to use when they get home.

"Computers are really good at doing tedious things," Price said. "We don't have to do all the tedious stuff. We can let the computer do it. And then we can interpret the answer and think about what it tells us about the decisions we need to make."

He wants his students to enjoy looking for patterns, seeing how different methods can give different or the same answers and how to translate those answers into decisions about the world.

"ChatGPT, just like the calculator and just like the slide rule and all the technology before, just helps us get at that core, real part of math," Price said.

Still, ChatGPT has its limits. It can show the right steps to solving a math problem and then give the wrong answer.

"This is because it's not actually doing the math," Price said. "It's just pulling together pieces of the sentences where other people have described how to solve similar problems."
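That failure mode is easy to demonstrate, and also easy to guard against. Below is a minimal sketch, assuming Python with the SymPy library, of how a student might check a chatbot's suggested answer against an actual solver; the equation and the suggested answer are invented for the example.

```python
# A minimal sketch: checking a chatbot's algebra answer with a computer
# algebra system, since a language model predicts text rather than computing.
# The equation and the "suggested" answer are illustrative placeholders.
from sympy import Eq, solve, symbols

x = symbols("x")
equation = Eq(2 * x + 3, 11)   # the problem posed to the chatbot
suggested_answer = 5           # the (wrong) answer the chatbot returned

solutions = solve(equation, x)  # [4] -- the verified result
if suggested_answer in solutions:
    print("The suggested answer checks out.")
else:
    print(f"Suggested {suggested_answer}, but solving gives {solutions}.")
```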

Min Sun, a University of Washington education professor, thinks students should use ChatGPT like a personal tutor. If students get lost in class and don't understand a mathematical operation, they can ask ChatGPT to explain it and give them a few examples.

The Khan Academy, an educational nonprofit that provides a collection of online learning tools and videos and has long been a go-to for math homework, has created exactly that.

The tutor is called Khanmigo. Students can open it while completing math problems and tell it that they are stuck.

"They can have a conversation with the AI tutor, telling it what they don't understand, and the AI tutor helps to explain," said Kristen DiCerbo, the chief learning officer at Khan Academy.

"Instead of saying, 'Here's the answer for you,' it says things like, 'What's the next step?' or 'What do you think might be the next thing to do?'" DiCerbo said.

Related: The science of reading swept reforms into classrooms nationwide. What about math?

Sun, the UW education professor, wants teachers to use ChatGPT as their own assistant: to plan math lessons, give students good feedback and communicate with parents.

Teachers can ask AI, "What is the best way to teach this concept?" or "What are the kinds of mistakes students tend to make when learning this math concept?" or "What kinds of questions will students have about this concept?"

Teachers can also ask ChatGPT to recommend different levels of math problems for students with different mastery of the concept, she said. This is particularly helpful for teachers who are new to the profession or who have students with diverse needs, such as special education students or English language learners, Sun said.

"I'm amazed by the details that sometimes ChatGPT can offer," Sun said. "It gives you some initial ideas and possible problem areas for students so I can get myself more prepared before walking into the classroom."

And, if a teacher already has a high-quality lesson plan, they could feed that to ChatGPT and ask it to create another lesson in a similar teaching style, but for a different concept.

Sun hopes ChatGPT can also help teachers write more culturally appropriate word-problem questions to make all their students feel included.

"The current technology is really a technical assistant to support them, empower them, amplify their creative abilities," Sun said. "It is really not a substitute to their own agency, their own creativity, their own professionalism. They really need to keep that in mind."

Related: Teachers conquering their math anxiety

A year ago, if you asked Daniel Zingaro how he assesses his introductory computer science students, he would say: "We ask them to write code."

But if you ask him today, the answer would be far more complex, said Zingaro, an associate professor at the University of Toronto.

Zingaro and Leo Porter, a computer science professor at University of California San Diego, authored the book Learn AI-Assisted Python Programming with GitHub Copilot and ChatGPT. They believe AI will allow introductory computer science classes to tackle big-picture concepts.

A lot of beginner students get stuck writing very simple code, Porter and Zingaro said. They never move on to more advanced questions, and many still can't write simple code after they complete the course.

"It's not just uninteresting, it is frustrating," Porter added. "They are trying to build something and they forgot a semicolon and they'll lose three hours trying to find that missing semicolon or some other bit of syntax that prevents a code from running properly."

AI doesn't make those mistakes, and it allows computer science professors to spend more of their time teaching higher-level skills.

The professors now ask their students to take a big problem and break it down into smaller questions or tasks the code needs to do. They also ask students to test and debug code once it is already written.

"If we think bigger picture about what we want our students to do, we want them to write software that is meaningful to them," Porter said. "And this process of writing software is taking this fairly big, often not-well-defined problem and figuring out, how do I break them into pieces?"
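That decompose-then-verify workflow is easy to picture in code. Here is a hedged sketch in Python: one small task carved out of a bigger program, with tests a student can run against any implementation, hand-written or AI-generated. The function and its tests are invented for illustration.

```python
# A sketch of the workflow described above: specify a small, testable piece
# of a larger problem, then check the code (yours or an AI assistant's)
# against tests. word_count stands in for one such carved-out task.

def word_count(text: str) -> int:
    """Count whitespace-separated words in a string."""
    return len(text.split())

# Tests written alongside (or before) asking an assistant for the body:
assert word_count("") == 0
assert word_count("one") == 1
assert word_count("to be or not to be") == 6
print("all checks passed")
```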

Magdalena Balazinska, director of the University of Washington's Paul G. Allen School of Computer Science and Engineering, embraces the progress AI has made.

"With the support of AI, human software engineers get to focus on the most interesting part of computer science: answering big software design questions," Balazinska said. "AI allows humans to focus on the creative work."

Not all professors in the field think AI should be integrated into the curriculum. Some interviewed for a UC San Diego research paper and in an Education Week survey would prefer to block or limit the use of ChatGPT and similar tools like Photomath, at least in the short term.

Zingaro and Porter argue that reading a lot of code generated by AI doesn't feel like cheating. Rather, it's how a student is going to learn.

"I think a lot of programmers read a lot of code, just like how I believe the best writers read a lot of writing," Zingaro said. "I think that is a very powerful way to learn."

This story about AI and math was produced by The Seattle Times in cooperation with the Education Reporting Collaborative, a coalition of eight newsrooms that is documenting the math crisis facing schools and highlighting progress. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.



Read the rest here:

AI might disrupt math and computer science classes - in a good way - The Hechinger Report


Professor wins NSF grant to improve real-time data services … – Binghamton

Having real-time data at our fingertips has become such an expected thing in the internet age that most of us don't think about how it happens and how to make it better.

Professor KD Kang, Department of Computer Science

That's where computer science researchers like Professor KD Kang can help.

Kang, a faculty member at Binghamton University's Thomas J. Watson College of Engineering and Applied Science, recently received a $599,084 grant from the National Science Foundation to improve real-time data services that support smart transportation, healthcare, manufacturing and other key functions.

To be most useful to users, real-time data services need to fulfill three important mandates: The data must be as fresh as possible, it needs to be processed in an efficient way, and the system should not consume too much energy.

"My research is trying to support these three requirements, but sometimes they conflict with each other," Kang said. "For example, if you want to maximize freshness, you can update the data very frequently, but that takes more resources. Other tasks or queries that need to be processed in a timely manner may miss deadlines, and it also increases power consumption. I want to strike a balance between those three conflicting requirements, and that could have important applications in the real world."

Calling it an underexplored area of research, Kang will leverage advanced memory hardware features and real-time data characteristics to enhance timeliness and reduce processor and memory power consumption.

For instance, he wants to use machine learning to better prioritize which data is most needed and how it is processed, as well as aggregating tasks with similar data needs and consolidating them on one CPU core. That allows idle cores to be turned off to reduce power consumption.
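The consolidation half of that idea can be sketched with standard operating-system calls. The example below, assuming Linux and Python's standard library, pins the current process to a single core; actually powering down the now-idle cores is left to the OS and hardware, and none of this reflects Kang's actual system.

```python
# A minimal sketch of task consolidation on Linux: restrict the current
# process to CPU core 0 so related work shares one core and other cores
# can go idle. Power-gating idle cores is handled by the OS/hardware;
# this only illustrates the affinity half of the idea described above.
import os

PID = 0  # 0 means "the calling process"
print("allowed cores before:", os.sched_getaffinity(PID))

os.sched_setaffinity(PID, {0})  # schedule this process on core 0 only
print("allowed cores after: ", os.sched_getaffinity(PID))
```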

Much of the work for the NSF grant will be trying different combinations of freshness, processing speed and power requirements to find the right balance among all three that provides optimum performance.

"Near the end, I will have a prototype system that can target a specific application such as vehicle sensors, something that has a nice set of sensors and a lot of real-time data analysis requirements, as well as tight power-consumption requirements," Kang said. "Compared to current systems, I can hopefully enhance timeliness and freshness but reduce power consumption by, say, 20%. That's my goal. Then we can take what we learn and apply it in other contexts."

Read the rest here:

Professor wins NSF grant to improve real-time data services ... - Binghamton


How microelectronics will take computing to new heights – Argonne National Laboratory

You're seeing the story on your screen right now thanks to tiny switches known as transistors. The microchips in computers contain billions of them, each one sending electrical signals based on what you want the computer to do.

Microelectronics like these have become both essential and amazingly minuscule in the push to extract more computing power from less space. A single red blood cell dwarfs today's average transistor, which is about a thousand times smaller.

That's still not quite small enough.

In a future driven by data and artificial intelligence (AI), even the most micro of microelectronics will need to shrink further and use less energy to boot. Scientists at the U.S. Department of Energy's (DOE) Argonne National Laboratory are inventing the next generation of these computing building blocks.

The impact extends beyond our phones and desktops. Everything from how we travel to how we fight disease to our understanding of the universe and Earth's climate depends on microelectronics-based devices.

"Microelectronics are so embedded in everything we do that they're really critical to the way we run our lives these days," said Argonne Distinguished Fellow and Materials Science Division Director Amanda Petford-Long, citing examples such as cars, online banking and the electric grid. "I don't think anyone can get by without them."

Current microchips in smartphones can perform 17 trillion calculations per second, making them millions of times more powerful than their predecessors from several decades ago.

That astonishing leap in computing power seems unimaginable, except that someone did imagine it: engineer Gordon Moore. Moore, who went on to co-found Intel, predicted in 1965 that the number of transistors placed on microchips would double every year for the next decade. He was right, and in 1975, when the number of transistors on a microchip surpassed 65,000, he revised the pace of doubling to every two years.

What's now known as Moore's law has held true, more or less. Today, a laptop might contain tens of billions of transistors embedded on a fingernail-size chip. The upcoming Aurora exascale supercomputer at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science user facility, runs on tens of thousands of state-of-the-art processors that each pack 100 billion transistors into less than 4 square inches.

The Aurora processors' power will enable more than 2 quintillion calculations per second. Those calculations will fuel the discoveries we need to generate clean energy, fight viruses, build more efficient engines, explore space and, of course, explore the next frontier of microelectronics, among other endeavors.

At some point, though, Moore's law begins to run into physical barriers. Materials start to behave differently as device sizes vanish to the atomic level. Devices might trap too much heat when stacked together, and they hit walls on memory and speed. Even more pressing, our growing need for microelectronics devours energy and is projected to consume a quarter of the world's supply by 2030.

Argonne researchers are tackling these problems through a combination of new materials, hardware and software designs, and manufacturing methods. In the process, they are also pushing the boundaries of where computers can go, making electronics that stretch like skin or operate in scorching temperatures beyond your oven's hottest setting.

Silicon has been the workhorse material used to make each successive wave of powerful microchips. It's abundant, and its conductivity, the ability to transmit electricity, can be adjusted by adding impurities. However, the size of microelectronics is now measured in single-digit nanometers, a nanometer being a billionth of a meter. The smaller they get, the harder it is to pattern these impurities reliably.


Researchers at Argonne and other national laboratories are exploring alternative materials and designs to get around this limitation, as well as to increase energy efficiency. One possibility is to incorporate materials that have reversible responses to electric fields. Another project focuses on using metal oxides to create printable transistors. Diamond thin films are being developed using the Center for Nanoscale Materials, a DOE Office of Science user facility, to remove the heat that builds up when microelectronics are placed on top of each other.

Honing these concepts requires time-consuming steps to determine the optimal conditions for creating a particular material. Thin films, for example, are grown on surfaces atom by atom in a process called atomic layer deposition. Researchers use a special apparatus to make these films, evaluating conditions in cycle after cycle. It's not unlike testing a baking recipe, where you have to wait for the final product to come out of the oven and then determine what went right or wrong before starting over again.

To move faster, researchers are experimenting with self-driving labs that bring AI into the loop. The process may start with computer simulations that inform actual experiments. The data from experiments then strengthens the simulations, and so on.

Angel Yanguas-Gil, a principal materials scientist at Argonne, is applying the self-driving lab concept to atomic layer deposition for faster, and thus lower-cost, discovery of materials for microelectronics. In this case, AI is helping to drive the experiment.

"The computer is making its own decision and exploring a new condition in real time, and then you end up with the optimum condition much faster," said Yanguas-Gil. "It takes a matter of minutes instead of, say, days."
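As a toy illustration of such a closed loop, the sketch below, in Python, replaces the real deposition apparatus with a simulated "experiment" and narrows the search around the best condition found so far. The quadratic quality model and every number in it are invented; no real instrument or lab API is implied.

```python
# A toy closed-loop search: propose a condition, run a (simulated)
# experiment, keep the best result, and narrow the search window.
# The "experiment" is a noisy quadratic standing in for a real
# atomic-layer-deposition run; all constants are illustrative.
import random

def run_experiment(temperature_c: float) -> float:
    """Simulated film quality, peaking near 210 C (made up for the demo)."""
    return -(temperature_c - 210.0) ** 2 + random.gauss(0.0, 5.0)

low, high = 150.0, 300.0
best_temp, best_score = None, float("-inf")
for trial in range(20):
    candidate = random.uniform(low, high)
    score = run_experiment(candidate)
    if score > best_score:
        best_temp, best_score = candidate, score
        # Narrow the window around the best condition seen so far.
        low = max(150.0, best_temp - 30.0)
        high = min(300.0, best_temp + 30.0)

print(f"best condition after 20 simulated runs: {best_temp:.1f} C")
```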

In a space where materials are designed down to the atom, even the most ingenious innovation will have trouble moving beyond the lab if it is not fit for a specific purpose.

"We spend many years developing materials with very interesting properties, but the connection with the applications sometimes isn't strong," Yanguas-Gil said.

Scientists are addressing this gap in the Threadwork project. The name nods to weaving together different strands of microelectronics development with specific uses in mind, particularly those for future detectors in high energy physics and nuclear physics. Those strands include the materials, the devices built with those materials, the system architecture and more.

"In Threadwork, we're constantly looking at all these different levels of system design and what they mean in terms of the end use," said Mathematics and Computer Science Division Director Valerie Taylor, who leads the project. "It's not that one person's working independently of what everybody else is doing. You're working on designs together, from different perspectives."

Many of these designs for microchips, along with the software they power, attempt to mimic the brain.

"The brain consists of approximately 86 billion neurons, but only uses about 20 watts of power. This is great energy efficiency," Taylor said. "So, we are developing brain-inspired or neuromorphic devices."

A traditional computer architecture has separate areas for operations and for memory. That layout requires information to constantly flow between the two areas, limiting how fast computers can process it. Future neuromorphic processors might be set up more like your brain, where memories and actions combine in sets of neurons that are networked together. A more compact, brain-like network reduces the amount of travel for electric signals, boosting efficiency. Transistors similar to neurons, or memristors, also might be able to communicate not just a binary value, zero or one, but a range of values within one signal, the way spikes from neurons do.
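A toy model makes that last point concrete. The sketch below implements a leaky integrate-and-fire neuron in Python, a textbook simplification used throughout neuromorphic research rather than a description of Argonne's devices: stronger input yields a higher spike rate, so a single signal stream carries a graded value instead of one bit. All constants are illustrative.

```python
# A toy leaky integrate-and-fire neuron, the kind of unit neuromorphic
# hardware emulates: input accumulates on a "membrane" potential, leaks
# away each step, and emits a spike when a threshold is crossed.
# Spike *rate* then encodes a graded value rather than a single 0/1.

def simulate_lif(input_current: float, steps: int = 100) -> int:
    leak, threshold = 0.9, 1.0      # illustrative constants
    potential, spikes = 0.0, 0
    for _ in range(steps):
        potential = potential * leak + input_current
        if potential >= threshold:
            spikes += 1
            potential = 0.0          # reset after firing
    return spikes

# Stronger input -> higher spike rate; the weakest input never fires.
for current in (0.08, 0.15, 0.3):
    print(current, "->", simulate_lif(current), "spikes per 100 steps")
```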

Human brains are not the only models for Argonne's interdisciplinary design approach. Yanguas-Gil and colleagues have designed and simulated a neuromorphic chip inspired by the brains of bees, fruit flies and ants. The concept relies on a unique material developed at Argonne and was tested using the ALCF's Theta supercomputer.

"It sounds trivial, but one of the biggest challenges in going smaller with microelectronics is that you have to be able to make the thing," said Petford-Long. To use another baking analogy, if you're making a cookie that is 10 atoms across, it's extremely hard to get a cookie cutter that can churn out absolutely identical shapes. Even minute variations matter, Petford-Long added: "Because these structures are so small, one inconsistency can make a big difference in how they behave."

Another challenge is that the actual shape being stamped out may change from a flat plane with transistors in rows to more neuron-like shapes that also may be stacked on top of one another.

"Current lithography for semiconductors is highly successful, but it is set up for a 2D world where the chip is flat," said Supratik Guha, senior advisor to Argonne's Physical Sciences and Engineering directorate and a professor at the Pritzker School of Molecular Engineering at the University of Chicago. "In the coming decades, chips will become increasingly 3D. This requires new ways of fabricating chips."

One of those new ways might be to use 3D printers. One Argonne project aims to create low-power, high-performance printable transistors. Other work is focused on even finer and more precise tools for atomic layer deposition that can chisel increasingly intricate features onto devices.

As scientists explore new materials, configurations and manufacturing techniques for microelectronics, the Advanced Photon Source (APS), a DOE Office of Science user facility, will be key. With an upgrade that is currently in progress, the APS's X-ray beams will become up to 500 times brighter. The increased brightness will also lend itself to smaller spot sizes that can zero in on the smallest of features in a material or semiconductor device.

The powerful new tools at Argonne will contribute to a sort of flywheel effect. The APS and other observational tools will generate more data than ever. That data will feed the innovation of the very processors that will analyze it at the Aurora supercomputer and beyond.

"Some of this work is in the very initial stages, but this is why we do continuous basic research," Guha said. "The questions we are trying to answer will lead us to design new classes of materials and devices and new ways of connecting them so that, moving forward, we can meet the extreme energy efficiency challenges that we face."

This research is funded, in part, by the DOE Office of Basic Energy Sciences and the DOE Advanced Scientific Computing Research program.

The rest is here:

How microelectronics will take computing to new heights - Argonne National Laboratory


GHC approved to offer bachelor's in computer science – The Daily World

Grays Harbor College took a step toward offering students a bachelor of science degree in computer science after a state board approved its proposal to do so earlier this month.

After the Washington State Board of Community and Technical Colleges gave the go-ahead for GHC to add the computer science offering at an Oct. 18 meeting, the community college in Aberdeen will now seek approval from the Northwest Commission on Colleges and Universities, and in the meantime work to organize curriculum and advising for the program, a spokesperson for the college said in an email to The Daily World.

The college said it will identify a start date for the program after receiving final approval.

GHC's proposal to offer the bachelor's in computer science degree, as well as the framework for the program, was developed in agreement with South Puget Sound Community College in Olympia, which was also approved for the degree by the state board Oct. 18. The agreement will allow students to begin a computer science degree at GHC and, upon completion of associate-level courses, transfer to the community college in Olympia to complete the bachelor's degree.

That framework could change in the long run as GHC gains the necessary capacity and demand for the degree. "Ultimately, the two colleges will seek to transition into a formal collaboration where a student could start and finish the (bachelor of science in computer science) degree at either institution," the college said in an email.

With the collaboration, the colleges will serve students and employers in the Pacific Mountain Workforce Development Region of Washington, which includes Grays Harbor County, Pacific County, southern Mason County, and northern Lewis and Thurston counties.

GHC's work toward building a computer science program was triggered by the Washington state Legislature's 2021 passage of Senate Bill 5401, which authorized community and technical colleges to offer bachelor of science degrees in computer science. The bill aimed to equip more Washington students, especially low-income students and students of color, with necessary credentials for high-demand jobs in the technology sector.

According to the Washington Technology Industry Association, the technology sector accounts for 22% of the states economy, a higher percentage than any other state in the union. The tech sector grew by 33% from 2019 to 2022, when tech employers added 89,000 new workers, bringing the industry total to about 361,000 jobs many of which require bachelors degrees.

The tech giant Amazon contributed funding to the state board for the development of computer science programs across the state. The state board also requested a $9 million 2024 supplemental budget resolution from the state Legislature for expansion of computer science programs.

A survey conducted by Grays Harbor and South Puget Sound colleges revealed high interest in computer science locally. Currently, The Evergreen State College and St. Martin's University are the two options for students in the Grays Harbor and South Sound regions seeking a bachelor's degree in computer science.

GHC's computer science offering would be the college's first bachelor of science degree, and fourth bachelor's degree overall, joining three applied science degrees in teacher education, forest resource management, and organizational management.

"While there is still work to be done before GHC will be ready to offer the Bachelor of Science in Computer Science, the State Board's decision to approve our degree proposal with South Puget Sound Community College is a big step forward," GHC President Dr. Carli Schiffner said in a statement. "I am proud of the faculty and staff at GHC who have had a hand in this work, especially our faculty members including Jamie Reino, Terri Bell, Alison Criswell and Tom Kuester, and our team in the office of Instruction including Evi Buell, Paulette Lopez, Marjie Stratton and Nicole Lacroix."

Contact reporter Clayton Franke at 406-552-3917 or clayton.franke@thedailyworld.com.

Read the original:

GHC approved to offer bachelors in computer science - The Daily World


Computer Science Program Once Again Ranked Among Nation’s Best – Rose-Hulman Institute of Technology

For the fourth straight year, Rose-Hulman has been ranked as having one of the nation's top undergraduate computer science programs, according to rankings featured in the 2024 U.S. News & World Report's College Guide.

Rose-Hulman tied for 56th out of 554 programs nationally this year, higher than last year's ranking. This places Rose-Hulman within the top 10 of Midwest institutions, and among a select group of private colleges across the nation.

Deans and senior faculty familiar with U.S. computer science departments were surveyed to assess programs at ABET-accredited bachelor's degree-granting colleges and universities based upon the academic quality and training in areas of programming languages, computer systems, theory, data analysis, and data science.

"The program's continued high national ranking is based on our strong, innovative, up-to-date, and expanding curriculum, the expertise of our faculty and staff, and the quality of contributions our graduates are making in their career fields," said Sriram Mohan, PhD, head of Rose-Hulman's Department of Computer Science and Software Engineering.

A new minor in cybersecurity is allowing students to gain the skills necessary to meet future high-tech challenges and become familiar with cybersecurity issues. A minor in Artificial Intelligence is helping students gain expertise in an area with potential to reshape the modern world. Meanwhile, data science is offered as a second academic major, and a bachelor's degree in international computer science features spending a year living and learning in Germany to earn a dual degree from Rose-Hulman and Hochschule Ulm University of Applied Sciences.

Mohan asserts that the computer science program provides the fundamental skills, theoretical underpinnings and the practical knowhow in a hands-on and caring educational environment that allows a growing number of alumni to impact the computing industry.

Rose-Hulman's Class of 2022 computer science majors had a near 100% placement rate within six months of Commencement, with an average starting salary of $95,801 and a high salary offer of $165,000. These opportunities included career employment with companies such as Amazon, Cisco Systems, Cummins, DMI, Edgile, Google, Microsoft, Raytheon, Software Engineering Professionals, Telemetry Sports, and Zotec Partners. Students also went on to attend graduate schools at Brown, Carnegie Mellon, Indiana University, Northwestern, Princeton, Rice, University of Illinois, and University of Southern California.

Software engineering graduates had a 100% placement rate in 2022, with an average starting salary of $101,997 and high salary offer of $125,000 from such companies as Groupon, Lexmark International, and Toyota.

Provost and Vice President for Academic Affairs Rick Stamper points out that the ability of Rose-Hulman computer science and software engineering students to quickly adapt and learn about new technology environments has made the programs' alumni highly respected and coveted by employers and graduate/doctoral degree program leaders. Students have showcased their computing and problem-solving skills in national and regional programming contests and hackathons for several years.

"Our computer science program continues to be highly respected for providing academic and extracurricular opportunities that prepare our graduates for an ever-changing technology landscape. That's what brings a record number of companies from throughout the country each year to recruit our students for full-time, internship, and co-op work opportunities," said Stamper, pointing out that Rose-Hulman graduates, no matter the academic major, are lifelong learners.

Rose-Hulman was recently ranked No. 17 in the nation in the Wall Street Journal's Best Colleges in America guide, as well as No. 1 in both learning opportunities and learning facilities; second in the nation in the likelihood of students recommending the college to others; and fourth in career preparation. U.S. News and World Report ranked Rose-Hulman first for the 25th consecutive year among U.S. engineering colleges that are focused on bachelor's- and master's-level education. Rose-Hulman's computer engineering program was judged first by engineering deans and senior engineering faculty, along with civil engineering, electrical engineering, and mechanical engineering.

Learn more about Rose-Hulman's rankings and national distinction here.

Rose-Hulman's Early Action deadline to apply for the 2024-25 school year is Nov. 1, 2023.

Originally posted here:

Computer Science Program Once Again Ranked Among Nation's Best - Rose-Hulman Institute of Technology


Google Bard asked Bill Nye how AI can help avoid the end of the world. Here’s what ‘The Science Guy’ said – CNBC

You may not know this, but Bill Nye, "The Science Guy," has professional experience overseeing new and potentially dangerous innovations. Before he became a celebrity science educator, Nye worked as an engineer at Boeing during a period of rapid change in aviation control systems, when it was essential to make sure that the outputs from new systems were understood. And going all the way back to the days of the steamship engine, Nye says that "control theory" has always been key to the introduction of new technology.

It will be no different with artificial intelligence. While not an AI expert, Nye said the basic problem everyone should be concerned about with AI design is that we can understand what's going into the computer systems, but we can't be sure what is going to come out. Social media was an example of how this problem already has played out in the technology sector.

Speaking last Tuesday at the CNBC Technology Executive Council Summit on AI in New York City, Nye said that the rapid rise of AI means "everyone in middle school all the way through to getting a Ph.D. in comp sci will have to learn about AI."

But he isn't worried about the impact of the tech on students, referencing the "outrage" surrounding the calculator. "Teachers got used to them; everyone has to take tests with calculators," he said. "This is just what's going to be. ... It's the beginning, or rudiments, of computer programming."

More important in making people who are not computer literate understand and accept AI is good design in education. "Everyone already counts on their phone to tell them what side of the street they are on," Nye said. "Good engineering invites right use. People throw around 'user-friendly' but I say 'user figure-outtable.'"

Overall, Nye seems more worried about students not becoming well-rounded in their analytical skills than personally thinking AI is going to wipe out humanity. And to make sure the risk of the latter can be minimized, he says we need to focus on the former in education. Computer science may become essential learning, but underlying his belief that "the universe is knowable," Nye said that the most fundamental skill children need to learn is critical thinking. It will play a big role in AI, he says, due to both its complexity and its susceptibility to misuse, such as deep fakes. Noting the influence of Carl Sagan on his own philosophy, Nye said, "We want people to be able to question. We don't want a smaller and smaller fraction of people understanding a more complex world."

During the conversation with CNBC's Tyler Mathisen at the TEC Summit on AI, CNBC surprised Nye with a series of questions that came from a prompt given to the Google generative AI Bard: What should we ask Bill Nye about AI?

Bard came up with about 20 questions covering a lot of ground:

How should we ensure AI is used for good and not harm?

"We need regulations," Nye said.

What should we be teaching our children about AI?

"How to write computer code."

What do you think about the chance for AI to surpass human intelligence?

"It already does."

What is the most important ethical consideration for AI development?

"That we need a class of legislators that can understand it well enough to create regulations to handle it, monitor it," he said.

What role can AI play in addressing some of the world's most pressing problems such as climate change and poverty?

Nye, who has spent a lot of time thinking about how the world may end (he still thinks giant solar flares are a bigger risk than AI, which, he reminded the audience, "you can turn off"), said this was an "excellent question."

He gave his most expansive responses to the AI on this point.

Watch the video above to see all of Bill Nye's answers to the AI about how it can help save the world.

See the article here:

Google Bard asked Bill Nye how AI can help avoid the end of the world. Here's what 'The Science Guy' said - CNBC


Computer science led Browning to SSCC – Hillsboro Times Gazette

The Southern State Community College Foundation has announced Thomas Browning of Franklin County as a recipient of the Sara M. Barrere Memorial Scholarship for the 2023-24 academic year.

Browning, a 2003 high school graduate, is a working husband and father of three as well as an Air Force veteran. He is pursuing a degree in computer information technology at Southern State Community College, under the direction of associate professor Dr. Joshua Montgomery.

"I first heard about Southern State when Dr. Montgomery came to my workplace to talk about getting the A-plus certification. He was approachable and took time to talk to me about his course and how it would compare to the A-plus certification I already had," Browning said.

Browning decided to enroll in Montgomery's course at Southern State. "While taking his course, I was struck by how much Dr. Montgomery genuinely cares for his students. I already hold two associate degrees, and I've not had a professor who has such a passion for their students," added Browning.

At the end of the course, Browning spoke with an academic advisor about whether he should complete the associate of applied science degree program at Southern State with Montgomery or move on to his bachelor's degree, since he already had the majority of the credits from his prior two degrees.

Although the academic advisor recommended going straight into a bachelor's degree program, Browning knew that Southern State had a professor from whom he had more to learn.

"I enrolled in the computer science program and modified my future BS/MS degree plans to accommodate the additional technology-based associate of applied science degree. I've also been given the opportunity to join Phi Theta Kappa, an international college honor society, since coming to Southern State," said Browning.

He added, "I am grateful for the opportunity to set a good example for my children, and being awarded this scholarship allows me to focus on my studies and provide inspiration to help them find what they are passionate about."

Browning hopes to continue his education by obtaining his bachelor's and master's degrees in IT management, and to continue to work in the IT field, helping dispel the stigma that IT people are distant and unapproachable.

The Southern State Computer Science Program offers an associate of applied science degree in computer technology. Students can select two focus areas from four different pathway options, which include networking, programming, cybersecurity and robotics.

A degree in computer technology from Southern State will allow students to gain in-demand skills in a flexible, affordable and fun environment. Southern State helps students get connected with companies in the area for internships and job opportunities. To learn more, visit https://www.sscc.edu/academics/programs/computer-science.shtml.

Spring semester begins Jan. 8, 2024. Registration is underway.

Submitted by Elizabeth Burkard, director of marketing, Southern State Community College.

Here is the original post:

Computer science led Browning to SSCC - Hillsboro Times Gazette


Doctoral Researcher, Department of Computer Science job with … – Times Higher Education

The Department of Computer Science (CS) of the Faculty of Science is seeking a DOCTORAL RESEARCHER in AI for big scientific data processing and analytics for the project Artificial Intelligence systems for enhancing sensing technologies and scientific discoveries.

The Doctoral Researcher will work on a project funded by the Academy of Finland. The project will be carried out in close collaboration with the Institute for Atmospheric and Earth System Research (INAR). The project aims to develop Artificial Intelligence (AI) and data science methods to automate the data processing and analysis of big atmospheric and environmental data, mainly generated via the Stations for Measuring Ecosystem-Atmosphere Relations (SMEAR stations) and other related research infrastructure, such as ACTRIS. The developed methods will be deployed in various platforms, such as computing clusters operated by CSC IT Center for Science (CSC). The hired candidate will have an opportunity to interact and collaborate with world-class atmospheric scientists in INAR, data scientists and data engineers in CSC, top scientists at the Department of Computer Science (CS), and many other collaborators within the Atmosphere and Climate Competence Center (ACCC) and the Finnish Center for Artificial Intelligence (FCAI).

Position description

The doctoral researcher position will focus on researching, designing, and developing various feature engineering, machine learning (ML), and deep learning (DL) methods. The ML/DL models are developed mainly based on time-series measurements gathered at many research infrastructures, such as SMEAR stations. Other data sets can take the form of spatiotemporal databases, images and other unstructured data. The developed ML/DL models are expected to be deployed on our computing platforms (e.g., CSC) to process the gathered measurement data and generate the results automatically in near real-time.

The position will require the doctoral researcher to advance research in specific technological areas, such as supervised and unsupervised machine learning, deep learning, computer vision, and cloud computing. The doctoral researcher will be encouraged to design their own research project within the scope of the overall project in collaboration with the Principal Investigators (PIs). Close cooperation between the doctoral researcher and scientists in CS and INAR is a key requirement for the project's success. The successful candidate will be primarily supervised by Dr. Martha Arbayani Zaidan, Prof. Tuukka Petäjä and Prof. Sasu Tarkoma. The position is a fully funded contract for 3 years, with a possibility of extension for a fourth year.

Requirements and eligibility criteria

A successful candidate should have a master's degree in computer science, electrical engineering, or a related field. The candidate must have scientific curiosity and a meticulous work ethic. Based on the position of interest, prior experience in data sciences, AI, data engineering, cloud computing, and/or their deployments is desirable. A good track record of relevant scientific publications will be considered a plus. The candidate must have good organizational and time management skills and be able to work both independently and as part of a team. Excellent communication skills in English, both verbal and written, are expected. The University of Helsinki seeks to promote an equitable and inclusive working environment and welcomes applicants from diverse genders, linguistic and cultural backgrounds.

Applicants who do not currently hold a doctoral study right in the Doctoral Programme in Computer Science (DoCS) at the University of Helsinki are eligible to apply, but in the event of hiring, they are expected to acquire the status during the standard 6-month probationary period. Please check the admission periods and eligibility criteria to DoCS:https://www.helsinki.fi/en/admissions-and-education/apply-doctoral-programmes/doctoral-school-and-doctoral-programmes/doctoral-programmes-natural-sciences/doctoral-programme-computer-science/admissions-doctoral-studies.

Salary and benefits

The starting salary of a doctoral researcher is typically 2531-2662 euros/month, depending on previous qualifications and experience.

The University of Helsinki offers comprehensive services to its employees, including occupational health care and health insurance, sports facilities, and opportunities for professional development. The University provides support for internationally recruited employees with their transition to work and life in Finland. For more on the University of Helsinki as an employer, please seehttps://www.helsinki.fi/en/about-us/careers.

How to apply

Please submit your application in a single PDF file in English, which should include the following documents:

We aim to fill the position as soon as possible and therefore encourage early applications. However, the latest deadline for submitting applications is October 31st, 2023.

More information

For project and position related questions, please contact

For support with the recruitment system, please contact

The University of Helsinki (https://www.helsinki.fi/en) is an international scientific community of 40,000 students and researchers. It is one of the leading multidisciplinary research universities in Europe and ranks among the top 100 international universities in the world. We are an equal opportunity employer and offer an attractive and diverse workplace in an inspiring environment with a variety of development opportunities and benefits.

As a part of the Faculty of Science, the Department of Computer Science (https://www.helsinki.fi/en/computer-science) is a leading unit in Finland in its area and responsible for the teaching and research in computer science at the University of Helsinki. The number of professors at the department has grown in recent years, and there are now 32 professorships. The main research fields at the department are artificial intelligence, big data frameworks, bioinformatics, data analysis, data science, discrete and machine learning algorithms, distributed, intelligent, and interactive systems, networks, security, and software and database systems. The department has extensive international collaboration with companies and universities. Within teaching, the department's professors and staff are in charge of the Bachelor's, Master's, and Doctoral Programmes in Computer Science, as well as the separate Master's Programme in Data Science, in which other departments also participate.


Read more:

Doctoral Researcher, Department of Computer Science job with ... - Times Higher Education


JEE Main: NIT Hamirpur cut-offs for BTech in Computer Science and Engineering from last 5 years – The Indian Express

JEE Main 2024: The National Institute of Technology (NIT) Hamirpur admits candidates to its undergraduate engineering programmes on the basis of the Joint Entrance Examination Main (JEE Main) ranks. The institute offers undergraduate and postgraduate degree programmes and PhD courses.

All the NITs hold separate counselling processes through the Joint Seat Allocation Authority (JoSAA). While half of the seats are generally reserved for students from within the state in which the NIT is located, the remaining seats are open to candidates from other states. For open and unreserved seats, students from outside the state will require a higher JEE Main rank.


Every year, around six rounds of JoSAA counselling are held, and given below are the Round 1 opening and closing ranks at which NIT Hamirpur admitted students to its BTech in Computer Science and Engineering programme.

The NIT's cut-off in 2023 for the other-state open-category seats dropped to 8774, from 6332 in 2022 and 874 in 2021.

NIT Hamirpur 2023 CSE Cut-Off Round 1

NIT Hamirpur JEE Cut-Off CSE Round 1 from 2022

NIT Hamirpur JEE round-1 Cut-Off for CSE from 2021

NIT Hamirpur JEE Cut-Off for CSE Round 1 counselling from 2020

NIT Hamirpur JEE Cut-Off for CSE Round 1 counselling from 2019

NIT Hamirpur was set up on August 7, 1986, as a Regional Engineering College, a joint and cooperative enterprise of the Indian government and the Himachal Pradesh government. At the time of its inception, the institute had only two departments, Civil and Electrical Engineering, with an intake of 30 students in each.

IE Online Media Services Pvt Ltd

First published on: 30-10-2023 at 09:59 IST

Read this article:

JEE Main: NIT Hamirpur cut-offs for BTech in Computer Science and Engineering from last 5 years - The Indian Express
