
Jabra releases Elite 4 everyday earbuds with ANC for long-lasting … – iTWire

Audio, video, and collaboration solution provider Jabra has released its latest feature-packed earbuds, the Jabra Elite 4, bringing all-day ANC, multipoint Bluetooth, dust- and water-resistance, and many other features, all in a great package at a great price.

iTWire has long admired Jabra's extensive and excellent range of audio equipment, from conference-room setups to headphones and earbuds, and the company continues to build on its 150-year history of audio engineering with this latest release, the Jabra Elite 4.

This tiny set of earbuds is the latest addition to Jabra's Elite line-up and is a step up from the previous Elite 3 model. It brings an incredible 22-hour battery life using the charge case - and that's with ANC turned on! It's 28 hours with ANC off. And, as you might have guessed from that statement, it includes active noise cancellation, or ANC, which takes you out of the hustle and bustle of everyday life and into your own personal cocoon of peace as you immerse yourself in music or podcasts. You can take calls with friends or colleagues, too. Whether you're using these earbuds for work or play, Jabra has your back.

They also offer Bluetooth Multipoint, so you can be connected to two different devices simultaneously. Listen to music on your laptop while connected to your phone for calls, or the other way around: play music through your phone while connected to your laptop for Teams or Zoom calls. Or pair your tablet and phone, or whatever combination you want. iTWire has been trying the Elite 4 out and found the earbuds not only comfortable and easy to fit, but the multipoint worked effortlessly too. The initial pairing was a cinch, with Jabra imbuing the Elite 4 with extra smarts in the form of Fast Pair and Swift Pair, which link your earbuds instantly to your Android 6.0 or higher phone or tablet, or Windows 10/11 laptop or desktop. It was quite magical how quickly that worked. iOS, macOS, and Linux users aren't left out in the cold, though; regular Bluetooth pairing still works exactly as normal.

The ANC is impressive for such a tiny device. Jabra implements it as feedforward ANC that filters out unwanted sounds. The buds include four microphones and 6mm speakers to ensure you're heard loud and clear when making calls. The companion Jabra Sound+ app allows you to further personalise the sound to your own liking.

The battery gives 5.5 hours of playtime by itself and extends to 22 hours with the charging case. As before, that's with ANC on; it extends to 28 hours with ANC off. Fast charging gives you an hour of playback from 10 minutes of charge time. What's more, you can also go solo: the earbuds don't have to work as a pair, and in fact allow you to use one while the other charges, so you can keep listening with essentially zero downtime.

The earbuds are also rated IP55 against dust and water; "IP" stands for ingress protection, with the first digit covering solids such as dust and the second covering water. The dust scale tops out at 6 and the water scale at 8, so a 5 on each puts the Elite 4 near the top: serious protection for pretty much any scenario short of deep-sea diving, underground coal mining, or the most seriously dusty and wet conditions. In short, you can drop them on your floor, use them in the gym, or get caught in the rain, and your Jabra Elite 4 earbuds will effortlessly shake it off.
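For the technically curious, here is a quick sketch of how an IEC 60529 IP code such as the Elite 4's IP55 decodes. This is an illustrative snippet of our own, not anything from Jabra, and the rating descriptions are abridged summaries of the standard's levels.

```python
# Decoding an IEC 60529 ingress-protection code such as "IP55".
# First digit: protection against solids (0-6); second digit:
# protection against water (0-8). Descriptions are abridged.

SOLID = {
    5: "dust-protected (limited ingress, no harmful deposit)",
    6: "dust-tight",
}
LIQUID = {
    5: "protected against low-pressure water jets",
    8: "protected against continuous immersion",
}

def describe_ip(code: str) -> str:
    """Return a human-readable summary of an IP code like 'IP55'."""
    solid, liquid = int(code[2]), int(code[3])
    solid_desc = SOLID.get(solid, f"solid-protection level {solid} of 6")
    liquid_desc = LIQUID.get(liquid, f"water-protection level {liquid} of 8")
    return f"{code}: {solid_desc}; {liquid_desc}"

print(describe_ip("IP55"))
# IP55: dust-protected (limited ingress, no harmful deposit);
# protected against low-pressure water jets
```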

They come in Dark Gray, Navy, Lilac, and Light Beige, and are designed with premium durable materials. All in all, this is an impressive package for everyday true wireless earbuds with ANC; iTWire found them comfortable and effortless to use, delivering crisp, clear audio with pumping sound when we wanted to enjoy our music.

Jabra SVP Calum MacDougall said, "The modern earbud user is looking for tech that's ready for work and play at their fingertips, whilst not compromising on key features. The Elite 4 offers a solution to this and is the perfect all-rounder, designed to help users to concentrate, connect, and call without distractions, and is the ideal companion to balance work and life."

The Jabra Elite 4 is available at selected retailers at a retail price of $139 (NZ$159) and includes a two-year warranty.

And, hey, if you're a Pokémon player, you know you need earbuds called the Elite 4 while you smash the other Elite 4.


Exxact Joins Supermicro’s Test Drive Program to Enable Remote … – HPCwire

FREMONT, Calif., March 21, 2023 – Exxact Corporation, a leading provider of high-performance computing (HPC), artificial intelligence (AI), and data center solutions, announced its participation in Supermicro's GPU Test Drive program to trial accelerated development on NVIDIA H100 GPU-powered workloads. Exxact's customers can remotely test drive the capabilities of the newest flagship data center GPU, the NVIDIA H100 80GB.

Potential customers can apply for the program through Exxact's Test Drive website. Once approved, customers can obtain remote access to a high-performance solution platform to test, benchmark, and qualify their advanced workloads with dual NVIDIA H100 80GB PCIe Tensor Core GPUs.

"Exxact Corporation is collaborating with Supermicro to provide remote access to a powerful system, providing an excellent opportunity to prove its capabilities in accelerating workloads across a wide variety of applications with NVIDIA GPUs," said Jason Chen, Vice President, Exxact Corporation.

More about the Supermicro H100 Test Drive Platform

Experience the unprecedented boost in performance delivered by Supermicro's next-generation 4U server with the new NVIDIA H100 Tensor Core GPUs through the Test Drive program. The new Supermicro system delivers exponential performance gains over the current generation of systems and is optimized for large language models, HPC, and various AI training workloads.

About Exxact Corporation

Exxact develops and manufactures high-performance computing platforms and solutions that include workstation, server, cluster, and storage products developed for Deep Learning, Life Sciences, HPC, Big Data, Cloud, and more. With a full range of engineering and logistics services, including consultancy, initial solution validation, manufacturing, implementation, and support, Exxact enables its customers to solve complex computing challenges, meet product development deadlines, maintain a competitive edge, and fuel the innovative minds of the world.

Source: Exxact


Bob Metcalfe ’68 wins $1 million Turing Award | MIT News | Massachusetts Institute of Technology – MIT News

Robert "Bob" Metcalfe '68, an MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) research affiliate and MIT Corporation life member emeritus, has been awarded the Association for Computing Machinery (ACM) A.M. Turing Award for his invention of Ethernet. Often referred to as the "Nobel Prize of computing," the award comes with a $1 million prize provided by Google.

Metcalfe, the founder of 3Com Corp., the company that designed, developed, and manufactured computer networking equipment and software, was cited by the ACM for the "invention, standardization, and commercialization of Ethernet," one of the earliest and most extensively utilized networking technologies. 3Com, Metcalfe's Silicon Valley startup founded in 1979, helped to increase the commercial viability of Ethernet by selling network software, Ethernet transceivers, and Ethernet cards for minicomputers and workstations. 3Com introduced one of the first Ethernet interfaces for IBM PCs and their clones when IBM launched its personal computer.

A current research affiliate in computational engineering at MIT, Metcalfe is also an emeritus professor of electrical and computer engineering after 11 years at The University of Texas at Austin. Metcalfe graduated from MIT in 1969 with bachelor's degrees in electrical engineering and industrial management.

"Bob's work has profoundly impacted computer science and the world; its importance cannot be overstated," says Daniela Rus, director of CSAIL and the Andrew and Erna Viterbi Professor in the MIT Department of Electrical Engineering and Computer Science (EECS). "With the invention of Ethernet, he enabled seamless communication and information sharing, paving the way for countless applications that have become integral to our daily lives. From the internet to online video streaming and beyond, Ethernet has formed the foundation of modern technology and transformed how we connect. It's hard to fathom life without the connectivity that Ethernet has made possible."

Ethernet, commence

Metcalfe's renowned 1973 memo on a "broadcast communication network" proposed linking the first personal computers, PARC's Altos, within a single building, paving the way for the devices to talk to each other and share information in a local area network. The first Ethernet clocked in at 2.94 megabits per second, approximately 10,000 times faster than the terminal networks it replaced. The memo suggested that the network should be adaptable to new technologies such as optical fiber, twisted pair, Wi-Fi, and power networks, treating the original coaxial cable as just one possible "ether" that could later be swapped out as the primary means of communication. The contribution was later immortalized in the 1976 Communications of the ACM article "Ethernet: Distributed Packet Switching for Local Computer Networks."

But what came before Ethernet? Metcalfe likes to call it some version of luck. "My first task at Xerox was to put it on the ARPANET, which I had already done for Project MAC. I built the necessary hardware and software and Xerox got it connected. Later, the Computer Science Lab at Xerox PARC aimed to create the first modern personal computer and have one on every desk, imagine that," he says. "So they asked me to design a network for this purpose. I was given a card with 60 chips for the network. That was the second-biggest stroke of luck in my life. The first one was being born to my parents."

Metcalfe was not alone in pursuing efficient network communication; he worked with David Boggs, the co-inventor of Ethernet, who passed away last year. Metcalfe and Boggs set out to avoid using wires, an idea that proved short-lived. "We went to a place that had a packet radio network, at the University of Hawaii, and saw immediately that we couldn't have zero wires. The radios were just too big and expensive and slow. We'd have to have more than zero wires, so we decided to have one. And this one wire would be shared among the attached PCs."

Making things standard

Metcalfe and Boggs cooked up a recipe of Jerrold taps, Manchester encoding, and ALOHA randomized retransmissions to bring Ethernet to life. The first would puncture the coaxial cable and attach to the center conductor without cutting the cable. Manchester encoding allowed the clock to travel inside the packet. Finally, ALOHA-style randomized retransmissions allowed for turn-taking on the shared wire. The two then built and attached many stations to the Ethernet and wrote network protocols to use it.
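To make the last two ingredients a little more concrete, here is a minimal sketch in Python of what each idea buys you. It is our illustration of the concepts, not Metcalfe and Boggs's implementation, and the function names are ours.

```python
# Two of the Ethernet "recipe" ingredients, sketched for illustration.
import random

def manchester_encode(bits):
    """Manchester encoding, IEEE 802.3 convention: 0 becomes high-low
    half-cells, 1 becomes low-high. Every bit cell therefore contains a
    mid-cell transition, so the receiver can recover the sender's clock
    from the signal itself: the clock travels inside the packet."""
    out = []
    for b in bits:
        out += [1, 0] if b == 0 else [0, 1]
    return out

def backoff_slots(collisions, cap=10):
    """Randomized retransmission, the ALOHA idea that evolved into
    Ethernet's truncated binary exponential backoff: after the nth
    collision, wait a random number of slot times drawn from a window
    that doubles each time, so colliding stations take turns
    probabilistically rather than re-colliding forever."""
    return random.randrange(2 ** min(collisions, cap))

print(manchester_encode([1, 0, 1, 1]))  # [0, 1, 1, 0, 0, 1, 0, 1]
print(backoff_slots(3))                 # some value in 0..7
```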

"We had to make Ethernet standard. I served as the so-called 'marriage broker' connecting Digital Equipment Corporation, then the second-largest computer company in the world; Intel Corporation, a brand-new semiconductor company; and Xerox Corporation, a large systems vendor," says Metcalfe. "We created the DIX Ethernet standard and submitted it to the IEEE. A couple of painful years later, it got standardized. Then a big fight ensued between the Ethernet companies and IBM and General Motors, a three-way battle. General Motors lost quickly, and IBM hung on for 20 years. All of them wanted their technology to be the standard everybody used to connect computers. Ethernet won."

Today, Ethernet is the main conduit of wired network communications worldwide, handling data rates from 10 megabits per second to 400 gigabits per second (Gbps), with 800 Gbps and 1.6 terabits per second technologies emerging. As a result, Ethernet has also become an enormous market, with revenue from Ethernet switches alone exceeding $30 billion in 2021, according to the International Data Corp.

Life at MIT

MIT has long been a second home to Metcalfe, one he could never stay away from for long.

From 1970 to 1972, he worked on putting MIT on the ARPANET in J.C.R. Licklider's dynamic modeling systems group. "This was all on the ninth floor of Tech Square, where everything interesting in the world was happening," says Metcalfe. "I returned to MIT in 1979, after I left Xerox, and became a consultant to Michael Dertouzos. I was peddling Ethernet, and Mike's people were peddling something called a 'token ring.' He and I were on opposite sides of a big argument. He was pro-token ring, and I was pro-Ethernet. Dertouzos wanted me to join that fight, which I did through most of 1979, during which I founded the 3Com Corporation while at LCS."

Metcalfe has been a member of the MIT Corporation since 1992. In 1997-98 he served as president of the MIT Alumni Association. He served as MIT Visiting Innovation Fellow with the Innovation Initiative and Department of Electrical Engineering and Computer Science during the 2015-16 academic year, during which he mentored students on all things entrepreneurship. In 2022, Metcalfe joined CSAIL as a research affiliate to pursue research in computational engineering.

"Ethernet is the foundational technology of the internet, which supports more than 5 billion users and enables much of modern life," says Jeff Dean, Google Senior Fellow and senior vice president of Google Research and AI, in the official ACM announcement. "Today, with an estimated 7 billion ports around the globe, Ethernet is so ubiquitous that we take it for granted. However, it's easy to forget that our interconnected world would not be the same without Bob Metcalfe's invention and his enduring vision that every computer must be networked."

Past Turing Award recipients who have taught at MIT include Sir Tim Berners-Lee (2017), Michael Stonebraker (2014), Shafi Goldwasser and Silvio Micali (2013), Barbara Liskov (2008), Ronald Rivest (2002), Butler Lampson (1992), Fernando Corbató (1990), John McCarthy (1971), and Marvin Minsky (1969).

Metcalfe will be formally presented with the A.M. Turing Award at the annual ACM Awards Banquet, held this year on June 10 at the Palace Hotel in San Francisco.


Computer Science Professor Cynthia Dwork Delivers Annual Ding … – Harvard Crimson

Harvard Computer Science professor Cynthia Dwork discussed the shortcomings of risk prediction algorithms at the Center of Mathematical Sciences and Applications' annual Ding Shum lecture Tuesday evening.

During the talk, titled "Measuring Our Chances: Risk Prediction in This World and its Betters," Dwork presented her research to a crowd of nearly 100 attendees. The Ding Shum lecture, funded by Ding Lei and Harry Shum, covers an active area of research in applied mathematics. For the previous three years, the series had been canceled due to the Covid-19 pandemic.

Tuesday's discussion, which was also livestreamed online, was moderated by CMSA postdoctoral fellow Faidra Monachou.

Dwork began her talk by presenting what she described as a fundamental problem in how algorithms are applied.

"I'm claiming that risk prediction is the defining problem of AI, meaning that there's a huge definitional problem associated with what we commonly call risk prediction," Dwork said.

She said that although predictions may assign a numerical probability to an event happening, these predictions are very difficult to interpret for one-time events, which either happen or do not.

"You have, maybe, intuitive senses of what these mean. But in fact, it's quite mysterious. What is the meaning of the probability of an unrepeatable event?" Dwork asked.

In addition, it can be difficult to tell whether a prediction function is accurate based on observing binary outcomes, Dwork said.

"If I predict something with three-quarters probability, both yes and no are possible outcomes. And when we see one, we don't know whether I was right about that three quarters," Dwork said.

"How do we say: is that a good function, or is that a bad function? We don't even know what it's supposed to be doing," she added.
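Dwork's worry can be made concrete with a short simulation. The sketch below is our illustration of the general calibration problem, not her methodology, and the probabilities in it are made up: a single binary outcome is consistent with almost any stated probability, and only by aggregating many comparable events can a forecast be checked, which is exactly what an unrepeatable event does not allow.

```python
# Why a single binary outcome cannot validate a stated probability.
import random

random.seed(0)
STATED = 0.75  # the forecaster's claimed probability of "yes"
TRUE_P = 0.60  # a hypothetical true chance, hidden from the audience

# One unrepeatable event: either outcome is consistent with the claim.
print("single outcome:", random.random() < TRUE_P)

# Many events that all received the same prediction: the empirical
# frequency (about 0.60) now exposes the gap from the stated 0.75,
# but only because the events can be treated as repeatable draws.
outcomes = [random.random() < TRUE_P for _ in range(10_000)]
print(f"stated {STATED}, empirical {sum(outcomes) / len(outcomes):.3f}")
```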

To illustrate the complexity of the issue, Dwork asked the audience to consider two worlds. In one, there is no uncertainty in what's going to happen in the future, but current predictors may lack enough information to make an accurate guess. In the other, there's real inherent uncertainty, meaning that outcomes may change even if prediction processes are perfect.

The issue, Dwork said, is that these two worlds are really indistinguishable.

Since all algorithms take in inputs to predict a yes or no output, Dwork said the predictions will not reveal whether real life is arbitrary and real-valued or binary.

But Dwork said that in her research, she has been able to draw on the field of pseudorandom numbers to help detect the difference between these two situations.

"So now, instead of trying to determine whether a sequence is random or pseudorandom, our distinguishers are trying to determine whether an outcome is drawn from real life," she explained.

Dwork, who is also a faculty affiliate at Harvard Law School and a distinguished scientist at Microsoft Research, is known for her contributions to the areas of differential privacy and algorithmic fairness. In 2020, she received the Institute of Electrical and Electronics Engineers Hamming Medal for her work in privacy, cryptography, and distributed computing, and for leadership in developing differential privacy.

To conclude her talk, Dwork presented a roadmap of her future research.

"The next step is to try to develop an understanding of what it is that transformations that satisfy this technical property can actually accomplish," Dwork said.


Is a master in computer science worth it even with ChatGPT? – Study International

In an age where powerful artificial intelligence (AI) tools such as ChatGPT exist, is a master's in computer science worth it?

The short answer? Yes, a master's degree in computer science is still worth it.

Although the media has sensationalised what AI tools are capable of, human experts are still required to design, develop, and maintain these systems.

Students and graduates of computer science degrees shouldn't have to worry about having their jobs replaced by AI.

Instead, they should embrace these new technologies that'll help increase their productivity.

In fact, the demand for computer science professionals is expected to grow as technology advances, creating new job opportunities and challenges that require human expertise.

The US Bureau of Labor Statistics (BLS) projects that job outlooks for computer science professionals will remain positive despite the introduction of advanced AI tools.

Employment in this industry is projected to grow about 21% from 2021 to 2031, much faster than the average for all occupations.

In a society increasingly dominated by the use of AI, computer science degrees remain relevant and are among the most-searched degrees. Source: Lionel Bonaventure/AFP

Unsurprisingly, computer science is among the most-searched degrees in the US, given its status as one of the fastest-growing subjects in the world, according to Times Higher Education (THE).

Similarly, a survey by Forbes of the high school graduating class of 2022 showed that computer science was among the most popular areas of interest, mainly due to potentially lucrative careers and high salaries.

The salary ranges of computer science graduates vary according to their chosen speciality.

To paint a picture, Payscale.com reports that a software engineer in the US can earn an average of US$88,946, while a Software Quality Assurance (SQA) Engineer can earn an average of US$125,873.

Before we can answer the question, "Is a master's in computer science worth it?", let's first understand what pursuing this degree entails.

Computer science is the study of computers and computational systems. Unlike electrical and computer engineers, computer scientists deal mostly with software and software systems.

In general, graduate classes in computer science are very different from undergraduate studies. Where undergraduate courses introduce you to the theory of a subject, graduate classes expect you to develop a deep working understanding of the material.

A master's of computer science involves a range of advanced courses across the field.

ChatGPT is a conversational AI chatbot that serves as a powerful tool for many tasks. Source: Jason Redmond/AFP

Imagine a Magic 8-Ball, but one that can answer far more complicated questions. That's ChatGPT.

What makes this chatbot unique is its ability to learn quickly.

ChatGPT's ability to produce entire sentences comes from its underlying language model, making it a powerful tool that anyone can ask to do almost anything, from translation to writing essays.

OpenAI, the company behind the AI tool, launched a new version of the programme in March 2023.

The newly launched GPT-4 includes updates such as accepting images as input, handling over 25,000 words of text, and advanced reasoning capabilities.

The company has also collaborated with organisations such as Duolingo, Khan Academy, and even the government of Iceland to preserve the Icelandic language.

GPT-4 is significantly safer to use than the previous generation, the AI company shares: it can reportedly produce 40% more factual responses in OpenAI's own internal testing, while also being 82% less likely to respond to requests for disallowed content.

All in all, is a master's in computer science worth it? Yes.

While AI undoubtedly has a significant impact on the field of computer science, it is unlikely to replace the need for skilled human professionals.

Looking at it with a glass-half-full perspective, AI will continue to complement and enhance the work of computer science graduates, further allowing them to tackle new challenges.


Is a computer science degree worth it during a period of rampant … – Study International

Is a computer science degree worth it?

That's the question most computer science graduates would ask as they read news of massive layoffs in the tech industry.

Amazon deepened the tech-sector gloom on Monday when it announced it would axe another 9,000 roles, Reuters reported.

The move piles on a wave of layoffs that have swept the tech sector as an uncertain economy forces companies to get leaner.

Other tech giants like Microsoft and Facebook have made similar moves.

Meta has announced that it will be laying off another 10,000 people and will institute a further hiring freeze as part of the company's "Year of Efficiency," according to a Facebook post by Mark Zuckerberg.

Microsoft is in its third round of layoffs. The company has axed jobs affecting employees in roles related to supply chain, artificial intelligence, and the Internet of Things, Business Standard reports.

Tech layoffs are casting a gloom over the industry. As a computer science graduate, you might be wondering: is a computer science degree worth it? Source: Chris Delmas/AFP

Approximately 174 tech companies have laid off over 56,000 employees in 2023, according to Forbes.

The tech sector has long been one of explosive growth, capitalising on the latest trends, and companies within it are constantly readjusting their priorities and resources for growth.

That includes allocating the necessary workforce for the right jobs or channelling resources into new tech that may benefit a company in the long run.

What's more, the economy has been anything but stable in the last year.

The cost-of-living crisis has made headlines in the UK, and economists have forecast a recession to hit the US as early as January 2023.

Naturally, tech companies are trimming their expenses by cutting jobs.

Michael Cusumano, deputy dean at MIT's Sloan School of Management, however, thinks that massive tech layoffs have more to do with investors than the companies' bottom lines.

Investors focus more on revenue per employee. With so many hires during the COVID-19 pandemic, that metric has declined dramatically for major tech firms.

Jeffrey Pfeffer, a professor at the Stanford Graduate School of Business, thinks otherwise. He believes that companies are copying each other.

"Oftentimes, companies don't have a cost problem. They have a revenue problem. And cutting employees will not increase your revenue. It will probably decrease it," he told The Verge.

So, why do tech companies lay off workers?

"People do all kinds of stupid things all the time," Pfeffer shares. "I don't know why you'd expect managers to be any different."

If you dreamed of working at Microsoft, there's still light at the end of the tunnel. Source: Fabrice Coffrini/AFP

It depends.

According to the US Bureau of Labor Statistics, the median usual weekly wage for full-time employees who have obtained a degree was US$1,525 in the first quarter of 2022, compared to US$827 for those with only a high school diploma and no college.

This suggests that the highest-paying jobs usually require a degree.

Data from Payscale's Online Salary Survey show that those who graduate with a computer science degree have a median starting salary of US$77,300.

When they graduate, they can work in a range of roles.

But at what cost?

There's no better place to study computer science than in a country that gave birth to the world's first computer. Source: Spencer Platt/Getty Images North America/Getty Images/AFP

Depending on the uni you've chosen, the cost to pursue a computer science degree varies.

At Stanford University, the cost of attendance for the 2022-23 term is US$79,540. This price includes tuition, housing, food, and estimated miscellaneous expenses.

At the University of Florida, students can expect to pay US$45,428 for an education in computer science. This number includes transportation costs, living expenses, miscellaneous personal expenses, and federal student loan fees.

There's also the option to pursue your computer science degree online, a golden opportunity for those who want to pursue this lucrative career without going through the trouble of living in a new country.

Do also consider that inflation in the US is currently around 8.6%, the highest figure since the early 1980s, according to the latest report from the Bureau of Labor Statistics.

It will affect the cost of most commodities, from food and gasoline to housing.

With this, universities across the country have unfortunately had to raise the costs of tuition after holding out throughout the pandemic.

Seattle University and Syracuse University, for example, have raised their tuition by 3.75% and 4.5%, respectively.

Robert Manning, Board of Trustees chair of the University of Massachusetts, says the 2.5% tuition fee increase across all its campuses is unavoidable.

Pair this with the increase in groceries and rent, and international students will need to fork out more money to cover their cost of living over the next few years.


Computer engineering research prompts bug fixes, updates to major … – University of California, Santa Cruz

Graphical processing units, or GPUs, are the workhorses behind some of the biggest topics in computer science today, powering the innovations behind artificial intelligence and machine learning. As major tech companies develop their own GPUs, which are used in devices such as computers and phones, tests need to be put in place to ensure software can run safely and efficiently across the various processors.

That's where UC Santa Cruz Assistant Professor of Computer Science and Engineering Tyler Sorensen and his team of colleagues and student researchers step in. Sorensen's team creates tests to ensure that programming languages run correctly and safely across the diverse range of processors that different companies are producing. This contributes to the overall stability of the processors deployed in our computers and phones as they are tasked with increasingly important jobs such as facial recognition.

A new paper details a suite of tests to assess how GPUs implement programming languages. This work was led by Sorensen's Ph.D. student Reese Levine along with UCSC undergraduates Mingun Cho and Tianhao Guo, UCSC Assistant Professor Andrew Quinn, and collaborators at Google. Levine will present the work at the 2023 ASPLOS conference, a premier computer systems conference.

In developing and running these tests they discovered significant bugs in a major GPU, leading to changes to an important GPU framework for programming web browsers.

"If you're a company and you want to implement this language so that people can program your GPU, we're giving you a really good way to test it, and even a scorecard on how well it was tested," Sorensen said. "People are always saying this is a very difficult part of the programming language to reason about; some people have even called it the rocket science of computer science."

In this paper, the researchers tested GPUs specifically on desktop devices from major companies such as Apple, Intel, NVIDIA, and AMD.

Through these tests, the team found a bug in an AMD compiler, a program that translates code written in one programming language into another language. This discovery led AMD to confirm the bug and fix the problem on their devices.

"This behavior was so unexpected that they changed the programming language to adapt to our observations," Sorensen said.

Moreover, this led to a change in a major GPU programming framework called WebGPU, an important tool used by programmers to ensure that web browsers can accelerate web pages using new GPU technologies.

"Every time you run Chrome, you know you're running a version that's passed our tests," Levine said.

The tests developed by the team also uncovered a GPU bug on the Google Pixel 6. That bug has been confirmed, and Google has committed to fixing it. These results are discussed in another paper from Sorensens group, which is currently under submission. In their ongoing research, they recently deployed their tools and methodology to test over 100 different GPU devices.

In order to surface these bugs, the researchers use mathematical models of the programming languages to guide their tests toward interesting areas of the GPU where bugs have historically lurked.

"How do you know your tests are working, and how do you know they're actually testing the right parts of the system?" Levine said. "We use mathematical models to provide confidence that these tests are performing as they should."
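For readers unfamiliar with this style of testing, the sketch below shows the shape of one classic test of the genre, the "message passing" litmus test, written with Python threads purely for illustration. The team's actual tests run as GPU compute programs against the language's memory model; CPython will not exhibit the weak behavior, so this conveys only the structure.

```python
# Structure of a "message passing" litmus test (illustrative only).
import threading

data = flag = 0   # shared memory locations
r1 = r2 = 0       # the reader thread's observations

def writer():
    global data, flag
    data = 1      # write the payload first
    flag = 1      # then raise the flag

def reader():
    global r1, r2
    r1 = flag     # read the flag first
    r2 = data     # then read the payload

t1 = threading.Thread(target=writer)
t2 = threading.Thread(target=reader)
t1.start(); t2.start(); t1.join(); t2.join()

# A litmus-testing campaign runs this pattern millions of times and
# tallies the outcomes. On weakly ordered hardware, r1 == 1 with
# r2 == 0 can appear unless the language mandates synchronization;
# observing an outcome the memory model forbids points to a compiler
# or hardware bug, which is how such suites surface defects.
print(f"r1={r1}, r2={r2}")
```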

Going forward, the researchers plan to use their tests on more devices, particularly on mobile phones, to ensure programming languages can be executed safely and efficiently.

This research was supported through a gift from Google.



Computing networking pioneer Metcalfe wins top industry prize – Reuters

March 21 (Reuters) - Computing networking pioneer Bob Metcalfe on Wednesday won the industry's most prestigious prize for the invention of the Ethernet, a technology that half a century after its creation remains the foundation of the internet.

The Ethernet is the standard connection for everything from servers inside data centers to telecommunications networks.

The Association for Computing Machinery credited Metcalfe, 76, with the Ethernet's "invention, standardization, and commercialization" in conferring its 2022 Turing Award, known as the Nobel prize of computing. It comes with a $1 million prize thanks to backing from Alphabet Inc's (GOOGL.O) Google.

The Ethernet got its start when Metcalfe, who later went on to co-found computing network equipment maker 3Com, was asked to hook up the office printer.

In the early 1970s, he worked at Xerox's Palo Alto Research Center, which had invented the personal computer and also a laser printer. Metcalfe sketched out a networking approach that would excel at connecting them together in a way that could expand smoothly as the number of computers in the network rose - which helped pave the way for the internet.

Metcalfe, who graduated from the Massachusetts Institute of Technology in 1969 and earned a doctorate in computer science from Harvard in 1973, told Reuters in an interview that there is still much research to be done in connecting computers, especially in artificial intelligence.

Metcalfe said previous generations of AI "died on the vine because of a lack of data." That is no longer a problem thanks to the billion-plus people generating data by using the internet, but the challenge now is to better connect the computers that process that data through artificial neural networks.

Those networks loosely approximate the human brain, except that in a human brain, neurons have more than 10,000 connections each, while their artificial counterparts have far fewer.

"You can either increase the compute power of the neurons, or you can connect them better. And the brain teaches us that connecting them is where it's at," Metcalfe said.

The vast room for improvement in connecting neural networks "is cause for optimism on the future of AI, which I think will continue scaling," he added.

Reporting by Stephen Nellis in San Francisco; Editing by Edwina Gibbs



MIT-led teams win National Science Foundation grants to research … – MIT News

Three MIT-led teams are among 16 nationwide to receive funding awards to address sustainable materials for global challenges through the National Science Foundation's Convergence Accelerator program. Launched in 2019, the program targets solutions to especially compelling societal or scientific challenges at an accelerated pace, by incorporating a multidisciplinary research approach.

"Solutions for today's national-scale societal challenges are hard to solve within a single discipline. Instead, these challenges require convergence to merge ideas, approaches, and technologies from a wide range of diverse sectors, disciplines, and experts," the NSF explains in its description of the Convergence Accelerator program. Phase 1 of the award involves planning to expand initial concepts, identify new team members, participate in an NSF development curriculum, and create an early prototype.

Sustainable microchips

One of the funded projects, "Building a Sustainable, Innovative Ecosystem for Microchip Manufacturing," will be led by Anuradha Murthy Agarwal, a principal research scientist at the MIT Materials Research Laboratory. The aim of this project is to help transition the manufacturing of microchips to more sustainable processes that, for example, can reduce e-waste landfills by allowing repair of chips, or enable users to swap out a rogue chip in a motherboard rather than tossing out the entire laptop or cellphone.

"Our goal is to help transition microchip manufacturing towards a sustainable industry," says Agarwal. "We aim to do that by partnering with industry in a multimodal approach that prototypes technology designs to minimize energy consumption and waste generation, retrains the semiconductor workforce, and creates a roadmap for a new industrial ecology to mitigate materials-critical limitations and supply-chain constraints."

Agarwal's co-principal investigators are Samuel Serna, an MIT visiting professor and assistant professor of physics at Bridgewater State University, and two MIT faculty affiliated with the Materials Research Laboratory: Juejun Hu, the John Elliott Professor of Materials Science and Engineering; and Lionel Kimerling, the Thomas Lord Professor of Materials Science and Engineering.

The training component of the project will also create curricula for multiple audiences. "At Bridgewater State University, we will create a new undergraduate course on microchip manufacturing sustainability, and eventually adapt it for audiences from K-12, as well as incumbent employees," says Serna.

Sajan Saini and Erik Verlage of the MIT Department of Materials Science and Engineering (DMSE), and Randolph Kirchain from the MIT Materials Systems Laboratory, who have led MIT initiatives in virtual reality digital education, materials criticality, and roadmapping, are key contributors. The project also includes DMSE graduate students Drew Weninger and Luigi Ranno, and undergraduate Samuel Bechtold from Bridgewater State University's Department of Physics.

Sustainable topological materials

Under the direction of Mingda Li, the Class of 1947 Career Development Professor and an associate professor of nuclear science and engineering, the "Sustainable Topological Energy Materials (STEM) for Energy-efficient Applications" project will accelerate research in sustainable topological quantum materials.

Topological materials are ones that retain a particular property through all external disturbances. Such materials could potentially be a boon for quantum computing, which has so far been plagued by instability, and would usher in a post-silicon era for microelectronics. Even better, says Li, topological materials can do their job without dissipating energy, even at room temperature.

"Topological materials can find a variety of applications in quantum computing, energy harvesting, and microelectronics. Despite their promise, and a few thousands of potential candidates, discovery and mass production of these materials has been challenging. Topology itself is not a measurable characteristic, so researchers have to first develop ways to find hints of it. Synthesis of materials and related process optimization can take months, if not years," Li adds. "Machine learning can accelerate the discovery and vetting stage."

Given that a best-in-class topological quantum material has the potential to disrupt the semiconductor and computing industries, Li and team are paying special attention to the environmental sustainability of prospective materials. For example, some potential candidates include gold, lead, or cadmium, whose scarcity or toxicity rules out mass production; such candidates have been disqualified.

Co-principal investigators on the project include Liang Fu, associate professor of physics at MIT; Tomas Palacios, professor of electrical engineering and computer science at MIT and director of the Microsystems Technology Laboratories; Susanne Stemmer of the University of California at Santa Barbara; and Qiong Ma of Boston College. The $750,000 one-year Phase 1 grant will focus on three priorities: building a topological materials database; identifying the most environmentally sustainable candidates for energy-efficient topological applications; and building the foundation for a Center for Sustainable Topological Energy Materials at MIT that will encourage industry-academia collaborations.

At a time when the size of silicon-based electronic circuit boards is reaching its lower limit, the promise of topological materials, whose conductivity increases with decreasing size, is especially attractive, Li says. In addition, topological materials can harvest wasted heat: imagine using your body heat to power your phone. "There are different types of application scenarios, and we can go much beyond the capabilities of existing materials," Li says. "The possibilities of topological materials are endlessly exciting."

Socioresilient materials design

Researchers in the MIT Department of Materials Science and Engineering (DMSE) have been awarded $750,000 in a cross-disciplinary project that aims to fundamentally redirect materials research and development toward more environmentally, socially, and economically sustainable and resilient materials. This socioresilient materials design will serve as the foundation for a new research and development framework that takes into account technical, environmental, and social factors from the beginning of the materials design and development process.

Christine Ortiz, the Morris Cohen Professor of Materials Science and Engineering, and Ellan Spero PhD '14, an instructor in DMSE, are leading this research effort, which includes Cornell University, the University of Swansea, Citrine Informatics, Station1, and 14 other organizations in academia, industry, venture capital, the social sector, government, and philanthropy.

The team's project, "Mind Over Matter: Socioresilient Materials Design," emphasizes that circular design approaches, which aim to minimize waste and maximize the reuse, repair, and recycling of materials, are often insufficient to address negative repercussions for the planet and for human health and safety.

Too often society understands the unintended negative consequences long after the materials that make up our homes and cities and systems have been in production and use for many years. Examples include disparate and negative public health impacts due to industrial scale manufacturing of materials, water and air contamination with harmful materials, and increased risk of fire in lower-income housing buildings due to flawed materials usage and design. Adverse climate events including drought, flood, extreme temperatures, and hurricanes have accelerated materials degradation, for example in critical infrastructure, leading to amplified environmental damage and social injustice. While classical materials design and selection approaches are insufficient to address these challenges, the new research project aims to do just that.

"The imagination and technical expertise that goes into materials design is too often separated from the environmental and social realities of extraction, manufacturing, and end-of-life for materials," says Ortiz.

Drawing on materials science and engineering, chemistry, and computer science, the project will develop a framework for materials design and development. It will combine powerful computational capabilities (artificial intelligence and machine learning coupled with physics-based materials models) with rigorous methodologies from the social sciences and the humanities to understand what impacts any new material put into production could have on society.
