
Vulnerability exploitation volumes up over 50% in 2022 – ComputerWeekly.com

Driven by significant cyber security disclosures affecting supply chain dependencies, such as Log4j and Realtek, threat actors have vastly increased their use of vulnerabilities as a means to work their way inside their victims' systems, with vulnerability exploitation attempts per customer up by 55% year on year (YoY) over the course of 2022, according to data compiled by Palo Alto Networks' Unit 42 threat intelligence experts.

Presented in the latest edition of its Network threat trends research report, Unit 42's data was drawn from across its parent's portfolio of network monitoring and cloud products and services, including its next-generation firewalls, extended detection and response (XDR), and secure access service edge (SASE) offerings, as well as external feeds and sample exchanges among its peers in the industry.

Unit 42's research team described a race between suppliers and threat actors to uncover and seal off new avenues of exploitation, which is creating a process of constant churn and piling pressure on end-user security teams.

Their findings tally with elements of Verizon's annual Data Breach Investigations Report (DBIR), which was also released this week, revealing that Log4j may be the most exploited vulnerability in history.

"Attackers are using both vulnerabilities that are already disclosed and ones that are not yet disclosed, aka exploiting zero-day vulnerabilities," the research team wrote. "We continue to find that vulnerabilities using remote code execution (RCE) techniques are being widely exploited, even ones that are several years old."

"While using old vulnerabilities might seem counterproductive, they still have significant value to attackers. In some cases, vulnerabilities discovered years ago have not been patched. This could be either because the company failed to fix the issue, or they didn't provide the patch in a way that customers could easily find. In other cases, the product could lack a patch because the product is at the end of its supported lifespan."

However, they argued, the weight of responsibility for fixing this problem should not fall solely on the security supplier community: end-user organisations must have appropriate processes in place for remediating vulnerabilities safely and quickly, paying particular attention to acquiring, testing and applying patches, but also accounting for issues that might not immediately spring to mind, such as the network bandwidth needed to rush a patch out across a large enterprise's entire IT estate.

Others also lack awareness of available patches, effectively rendering old, well-known vulnerabilities (a category into which Log4j must soon fall, if it has not done so already) as dangerous as a newly discovered zero-day.

"Threat actors know these problems exist, and they continue to try these old vulnerabilities because they're counting on organisations to fail at some point in the process of applying patches," they said.

The full report contains insight into a great many security trends, but perhaps the most notable statistics are a 910% increase in monthly registrations for domains related to OpenAI's ChatGPT tool, and a 17,818% increase in attempts to mimic ChatGPT through domain squatting.
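
Domain squatting of this kind is usually caught by fuzzy-matching newly registered domain names against the targeted brand. As a rough illustration of the idea only, and not anything from the Unit 42 report, a minimal Python sketch might look like the following; the domain list and similarity threshold are invented for the example.

```python
# Hypothetical sketch: flag ChatGPT-themed squatting domains in a feed of
# newly registered names by fuzzy-matching each label against the brand.
from difflib import SequenceMatcher

BRAND = "chatgpt"

def is_suspicious(domain: str, threshold: float = 0.75) -> bool:
    # Compare the registrable label (hyphens stripped) against the brand.
    label = domain.lower().split(".")[0].replace("-", "")
    if BRAND in label:                        # e.g. "chat-gpt-app.net"
        return True
    # Near-miss spellings such as "chatgtp.com" score high on similarity.
    return SequenceMatcher(None, label, BRAND).ratio() >= threshold

new_domains = ["chatgtp.com", "chat-gpt-app.net", "example.org"]
print([d for d in new_domains if is_suspicious(d)])
# -> ['chatgtp.com', 'chat-gpt-app.net']
```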

While these percentage increases of course start from a base of zero, given that ChatGPT only launched in 2022, they nonetheless highlight some of the more realistic risks of tools driven by artificial intelligence (AI). Although much has been written about how ChatGPT might be used to create malicious activity, Unit 42's team said it had not seen any noticeable rise in attributable, real-world activity in this regard.

However, they said, many threat actors are using more traditional techniques to take advantage of the AI trend, and it is this that is leading to a boom in fraud attempts and scams.

"The speed with which scammers used traditional techniques to profit off the AI trend underscores that organisations need to exercise caution around internet activity and software that are getting attention in popular culture," the team wrote.

"At the same time, it remains possible that threat actors could find ways to take advantage of the unique technological capabilities of AI. For the time being, the main way that organisations can prepare for this possibility is to continue to employ defence-in-depth best practices. Security controls that defend against traditional attacks will be an important first line of defence against any developing AI-related attacks going forward," they said.

View original post here:
Vulnerability exploitation volumes up over 50% in 2022 - ComputerWeekly.com


The impact of generative AI on the datacentre – ComputerWeekly.com

Questions remain about the potential impact of generative artificial intelligence (AI) adoption on datacentres, even when it comes to the need for more processing, storage and power. One thing that is certain is that there will be an impact.

Slawomir Dziedziula, application engineering director at Vertiv, warns that no one has fully calculated power consumption for individual applications, so how such large volumes of requests will affect software and hardware requirements remains uncertain.

"It's still early days to say precisely," he agrees, pointing out that countries that banned crypto mining had similar concerns about infrastructure impacts and sustainability.

"One side is how much you can trust generative AI, although you can definitely use it to enhance your knowledge and also your skills," Dziedziula says.

"The other thing is you need many servers, GPUs, data storage devices and so on, and then your engineers. If they're using value scripts for use in applications, they'll need customisation."

It can already be difficult to pinpoint use of a large language model (LLM). Experienced programmers use generative AI to come up with fresh ideas and perspectives, yet some may not spot objectively poor results, he notes.

"Everyone can believe they're really good at something by using generative AI," Dziedziula points out.

Working with generative AI entails a tremendous amount of verification. New skillsets and applications may be required. Cyber security pressures may intensify too: ChatGPT can produce vast volumes of believable phishing emails, for example.

"There will be increased dependency on skilled workers," Dziedziula warns. "Yet instead of 10 people, I need just two people and smart software to do the rest."

Chris Anley, chief scientist at IT security, assurance and software escrow provider NCC Group, says the datacentre may need a fresh look at resource consumption, infrastructure management and security.

Emerging network infrastructures, architectures, and data storage and retrieval models will need to be secured, so the impacts are not simply about scale and capacity. Provisioning in new ways will entail internet-scale distributed storage mechanisms, going beyond relational databases to achieve the throughput needed for training AI and machine learning (ML) systems.

"You can't just have a single cluster doing it; you've got to spread the load between lots of GPUs," Anley says. New requirements will change datacentres, from cooling and power to the physical and logical structure of networks. A datacentre optimised for AI can look very different to one optimised for typical corporate operations.
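
To make the "spread the load" point concrete, here is a toy sketch of the data-parallel pattern Anley describes, with a NumPy linear model standing in for a real network and four workers standing in for GPUs; every name and number here is an illustrative assumption.

```python
# Data parallelism in miniature: shard the batch across workers, compute
# gradients locally, then average them (the "all-reduce") before updating.
import numpy as np

def local_grad(w, X, y):
    # Gradient of mean squared error for a linear model on one shard.
    return 2 * X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(1)
X, y = rng.normal(size=(1024, 8)), rng.normal(size=1024)
w = np.zeros(8)

n_workers = 4  # each shard would live on its own GPU in a real cluster
for step in range(100):
    shards = zip(np.array_split(X, n_workers), np.array_split(y, n_workers))
    grads = [local_grad(w, Xs, ys) for Xs, ys in shards]
    w -= 0.01 * np.mean(grads, axis=0)  # average, then take one step
```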

Yet ML tools have been gradually penetrating the market for years despite alarmist media hype about generative AI eating the world, notes Anley.

He confirms using ChatGPT for security code review. However, while it can help pinpoint or triage issues, he feels the results aren't entirely trustworthy: it can invent facts, either missing bugs completely by focusing on something else, or hallucinating fictional bugs. Both are bad for security.

He hastens to add that mostly there is little threat from this. Programmers who need generative AI to code aren't typically going to be working on critical corporate applications. Also, although subtle bugs do happen, bad code is usually immediately apparent because it just does not do what you want.

"Code isn't one of those things where it can be mostly right, like a song or a theatrical production or a piece of prose or whatever," Anley says.

Generative AI is likely to remain mainly about making skilled staff more efficient and productive. Even a 10% productivity improvement can slash costs at an organisational level, he says.

Generative AI is already good at the small stuff, such as library code where a programmer might not be quite familiar with the library or does not know the name of a specific function, or certain technical tasks such as converting data from one format to another.

"It'll autocomplete something, saving you a trip to the web browser or the documentation," Anley continues. "I think most of our customers are now using AI in one form or another, whether for customer support, chatbots, or just optimising internal processes."

However, with complex AI or ML development and hosting technologies pushed into corporate networks, caution is required. For instance, aggregating lots of training data across security boundaries can remove important controls on what can be seen.

Training data can be retrieved from trained models simply by querying them, using attacks such as membership inference and model inversion. The result is a situation similar to the familiar SQL injection data breach attacks.
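
To see why query access alone can leak training data, consider the toy membership-inference demonstration below: an overfit classifier is visibly more confident on records it was trained on, and that confidence gap is what an attacker reads. The model, the random data and the 0.9 cut-off are all illustrative assumptions, not a reconstruction of any specific attack.

```python
# Toy membership-inference intuition: overfit models are more confident
# on their own training records, so confidence reveals membership.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_members = rng.normal(size=(200, 10))    # records used for training
X_outsiders = rng.normal(size=(200, 10))  # records never seen
y = rng.integers(0, 2, 200)

# bootstrap=False lets every tree memorise the full training set.
model = RandomForestClassifier(bootstrap=False, random_state=0)
model.fit(X_members, y)

def confidence(X):
    return model.predict_proba(X).max(axis=1)  # top-class probability

print(confidence(X_members).mean())    # ~1.0 on training members
print(confidence(X_outsiders).mean())  # markedly lower on outsiders

# Attacker's rule of thumb: high confidence => probably a training member.
predicted_member = confidence(X_outsiders) > 0.9
```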

He notes that at least one supplier recently banned generative AI because developers were adding sensitive corporate code into a third-party policy engine just to help them write. Yet not doing this should be common sense, and many firms already have policies forbidding code-sharing with third parties.

Matt Hervey, partner and head of AI law at Gowling WLG, says that while it's still tough to train these models to generate and categorise data perfectly, the quality looks to have jumped dramatically in the past six to 12 months. With ML techniques being baked into standard tools, profound impacts can be expected, but these may mostly represent business opportunity.

"I suspect this is good news for the datacentre business...and there are movements to achieve similar results with smaller training sets," Hervey says.

However, certain bad activity may end up in the private space, he adds, and questions remain as to whether datacentres will be entirely shielded when it comes to legal risk.

With a massive rise in ML use entailing ramp-ups in processing and power beyond what has been previously seen, some will also be moving cloud applications or services to the edge. On-board processing on mobile phones, for example, presents potential privacy or other regulatory compliance issues.

Views on the economic value of certain activities or roles are set to change, with some areas or activities becoming more or less cost-effective, rippling across various industries and sectors, including datacentres, Hervey says.

Jocelyn Paulley, partner and co-head of UK retail, data protection and cyber security sectors at Gowling WLG, adds that datacentre expansion and connectivity in locations where there are already capacity issues, such as London, could pose a challenge, but one that is perhaps soluble with infrastructure and cooling rethinks and increased server densities.

Careless or non-compliant customer use of ChatGPT, for example, will not affect colocation providers that have zero access to customer software and environments and that do not host applications or other people's content. And where that can be an issue, legislation is already evolving, Paulley says.

Jaco Vermeulen, chief tech officer at consultancy BML Digital, points out that generative AI does not really do anything more advanced than search, which means brute-force in terms of cyber attack. While LLMs might require greater human intervention in interpretation or joining up certain factors in analysis, for example, the latest AI iteration is not really a threat in itself.

"It needs to be directed first and then validated," he says.

Datacentre access already requires physical, biometric or possibly double biometric identification, plus a second party. Two people are typically needed to access a building, each with three elements of identification and then verification.

"For AI to extract all of that, it needs a lot of access to personal information, which is just not available on the internet, and if it's drawing data it's not meant to access, that's down to the organisations and individuals using it," says Vermeulen.

"Using more complex prompts to achieve greater sophistication will only result in responses failing more miserably...because it's going to try to give you actual intelligence without real context on how to apply it. It's only got a narrowband focus," Vermeulen says.

"You're going to have bad or lazy actors any place. This machine does not go beyond the box. And if in future it does turn into Skynet, let's unplug it."

Further, Vermeulen says most agents will be deployed where an organisation has full control over them. He also pours cold water on the need for any unique datacentre-related proposition.

"Generative AI is mostly more of the same, unless there's a real business case in actual product," Vermeulen says. "It's just pattern recognition with output that picks up variations. The commercial model will remain about consumption, support and capacity."

Rob Farrow, head of engineering at Profusion, adds that most AI models simply retrain on the same inputs. Although developments such as an ability to self-architect could make AI enough of a threat to require some failsafe or kill-switch option, this seems unlikely within the next 10 years or so.

"There's no real valid level of complexity or anything even like human intelligence," Farrow points out. "There's a whole bunch of technical problems. When it does happen, we need to think about it."

That brings us back to the computational expense of running ML. Further uncertainties remain, stemming from increased software complexity, for instance: more things can go wrong. That suggests there is value in developing transparency around the software and how it operates or makes decisions.

Writing less code and simplifying where possible can help, but platforms for this often do not supply enough nuance, Farrow says.

While warning against organisations leaping into generative AI or ML projects without sufficiently strong data foundations, he suggests that the impacts on power, processing and storage might be countered by using AI or ML to develop greater predictability, achieving savings across systems.

"Some Amazon datacentres have solar panels with thousands of batteries, making huge amounts of heat, but actually using ML to take solar energy based on circadian rhythms," he says.

"But a lot of businesses jump the gun, chasing an AI or ML model they want. You are building a house on sand if you cannot retrain it, you cannot go and get new data, you have no visibility, and you cannot audit it. It might work for a short time and then fail," Farrow warns.

Read the rest here:
The impact of generative AI on the datacentre - ComputerWeekly.com


How the Internet of Things Can Facilitate an Enhanced Passenger … – TechNative

Technology is at the heart of everything we do today, and mobility is no exception.

Since the term "Internet of Things" (IoT) was coined in 1999 by computer scientist Kevin Ashton, we've come a long way in developing the concept. Put simply, the Internet of Things is a network of objects and people connected through technology.

As passengers seek seamless, convenient, and fast transport experiences, the mobility sector has been on the hunt for the latest technology that can take the passenger experience to the next level.

Here, we discuss how the latest innovations in the IoT are enhancing the passenger experience.

Safe and reliable railway infrastructure

In today's competitive world, being able to make decisions quickly is key. Long gone are the days when a driver was pretty much fully responsible for operating a vehicle. Today, cutting-edge computing and machine learning are driving not just the train forwards but also the future of railway connectivity.

Utilising computing power, railway operators can collect data and process it within milliseconds to enable near-real-time decision-making and responsiveness. This then opens the door to improving the safety and reliability of railway infrastructure.

Track and train part failures are common issues that cause delays and reduced passenger satisfaction. Big data collection and analysis through sensors can help streamline business processes and generate insights that can reduce downtime by predicting maintenance issues and allowing better management of staff and security. In turn, railway operators can increase capacity, improve reliability, and reduce maintenance costs.

Data is collected through sensors that are placed on critical parts of the trains, such as brakes, wheels, and engines, or on the actual tracks. They can measure variables that have predictive value to maintenance teams, such as track condition and air and track temperatures.
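
To give a flavour of the predictive step, the sketch below implements a deliberately simple monitoring rule: flag a bearing whose temperature jumps well above its recent rolling baseline. The window size, the 3-sigma threshold and the readings are invented for illustration and are not drawn from any railway specification.

```python
# Simplistic anomaly rule for a trackside/on-train temperature sensor:
# alert when a reading exceeds the rolling mean by three standard deviations.
from collections import deque
import statistics

class BearingMonitor:
    def __init__(self, window: int = 60):
        self.readings = deque(maxlen=window)  # most recent temperatures

    def update(self, temp_c: float) -> bool:
        """Record a reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.readings) >= 10:  # need some history first
            mean = statistics.fmean(self.readings)
            stdev = statistics.pstdev(self.readings) or 0.1
            anomalous = temp_c > mean + 3 * stdev
        self.readings.append(temp_c)
        return anomalous

monitor = BearingMonitor()
for t in [41.0, 41.2, 40.8] * 5 + [55.0]:  # sudden spike at the end
    if monitor.update(t):
        print(f"alert: bearing at {t} °C, schedule an inspection")
```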

Seamless passenger experience

The IoT is providing an enhanced passenger experience not only through optimising the rail infrastructure and train operations to offer safety and reliability but also by modernising day-to-day passenger service.

What that means is that the passenger journey from A to B is seamless and informed by digital services that promote connectivity, efficiency, and convenience. Here are the latest passenger service tools.

Real-time passenger information (RTPI)

For example, big data can provide accurate scheduling information. With the advancement of real-time passenger information (RTPI), details about service updates, timetables, accurate bus locations, and destination data can be shown on both passenger information displays and mobile apps. That way, connectivity allows riders to plan their journeys well in advance, improving passenger satisfaction.

Wi-Fi connectivity

Many IoT devices make use of Wi-Fi to connect to the internet. A high-capacity Wi-Fi service can also be used to elevate the passenger experience significantly. Outfitting the train with onboard Wi-Fi connectivity means that passengers can optimise their travel times, especially commuters. A recent study at the University of Glasgow investigated the relationship between internet use while commuting and travelling and modes of transport. It became evident that internet use on public transport impacts the value of travel time. As a result, ridership increased.

Infotainment

Wi-Fi connectivity also facilitates infotainment: commercial content and useful travel information displayed on onboard screens. Infotainment is a vital communication link between a transit agency and its passengers. It connects to the train's ecosystem through a network connection and displays commercial content through scheduled programming technology that can be controlled based on prime times and locations.

The displays showcase information about schedule updates, safety measures, and the company's policies, and are also a great way to promote the company's services and offers. They can also be used as a monetisation tool by allowing advertising content from other parties.

Smart ticketing and automated fare collection

In order to eliminate queues at ticket machines, operators can implement automated fare collection that fuses cloud-based technologies with cutting-edge computing. Through sensors on platforms or trains, specific smartphone apps can be detected as passengers enter the station or train. That way, they're automatically charged the correct fare.

This is beneficial in terms of optimising the passenger experience and operations and collecting data about passenger behaviour to inform future optimisations.
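
At the back end, a be-in/be-out system of this kind reduces to pairing detection events and looking up a tariff. The sketch below shows that logic only; the station names, fare table and fallback fare are invented for the example.

```python
# Toy be-in/be-out fare logic: the first detection of a device opens a
# journey, the second closes it and charges the fare for the station pair.
FARES = {("Central", "Harbour"): 2.40, ("Airport", "Central"): 4.10}
DEFAULT_FARE = 1.80

open_journeys: dict[str, str] = {}  # device_id -> boarding station

def on_detect(device_id: str, station: str) -> float | None:
    """Called by a platform sensor; returns the fare when a trip closes."""
    if device_id not in open_journeys:
        open_journeys[device_id] = station   # journey starts
        return None
    origin = open_journeys.pop(device_id)    # journey ends
    pair = tuple(sorted((origin, station)))
    return FARES.get(pair, DEFAULT_FARE)

on_detect("device-123", "Central")         # passenger enters the network
print(on_detect("device-123", "Harbour"))  # 2.4 charged automatically
```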

With innovations in the Internet of Things, we're enjoying a more connected, frictionless, and convenient passenger experience that benefits not only the riders but also the transit companies. We are excited to see what the future of mobility holds!

About the Author

Paul Vaclik is Head of R&D Architecture at Nomad Digital. Nomad Digital is a world-leading provider of passenger, fleet management and monitoring solutions to the transport industry. We offer a broad solutions portfolio to both transport operators and builders that facilitates a significantly enhanced passenger experience with seamless connectivity, real-time journey information and on-board entertainment.

Featured image: allvision

Go here to see the original:
How the Internet of Things Can Facilitate an Enhanced Passenger ... - TechNative


What is Dark Web Monitoring and How Does It Work? – Trend Micro News

Reports of identity theft and fraud have skyrocketed over the last decade, and with them, the huge losses incurred by American consumers and employees who have fallen victim. At the same time, the so-called dark web, venue for much of this criminal activity, lurks silently beneath the internet. With the above in mind, we wanted to turn our attention to a service our readers may have seen advertised by cybersecurity companies: dark web monitoring. What is it? Read on for the low-down.

The dark web is the hidden part of the internet. It makes up approximately 5% of internet content and is part of a much larger area known as the deep web, which makes up a huge 90% of the internet. The dark web, which can only be accessed via specific browsers, is a series of websites that require specific authorization to enter. Dark websites allow users unparalleled anonymity due to encryption software such as the Tor (short for The Onion Router) browser. Unlike the surface web, the dark web does not use information available on search engines like Google or Bing; instead, it utilizes content from individual sources: forums, email, social media, and company databases. These features are why the dark web is an enticing place for criminals to do business.

Dark web monitoring is a cybersecurity service that involves scanning the dark web for your sensitive information and PII. It's a central element in identity theft protection, in that it allows you to proactively respond to leaked data before damage is done. Dark web monitoring works by scanning many thousands of websites every day for evidence of your information; if this is found, you'll be alerted. Examples of the kinds of websites that the scanner will search include marketplaces, forums, and chat rooms.
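
Under the hood, the matching step of such a service can be as simple as comparing fingerprints of enrolled data against each newly ingested leak. The sketch below is a generic illustration of the idea, not Trend Micro's actual implementation; the hashing scheme and sample data are assumptions.

```python
# Generic sketch of dark-web-monitoring matching: store only hashes of the
# customer's enrolled values, then alert whenever a leaked record matches.
import hashlib

def fingerprint(value: str) -> str:
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

# Values the customer asked the service to watch, kept only as hashes.
watched = {fingerprint("jane.doe@example.com"),
           fingerprint("4111111111111111")}

def scan(leaked_records):
    """Return the leaked values matching something the customer enrolled."""
    return [r for r in leaked_records if fingerprint(r) in watched]

dump = ["bob@example.com", "Jane.Doe@example.com ", "5500000000000004"]
for hit in scan(dump):
    print(f"ALERT: enrolled data found in a leak: {hit!r}")
```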

Trend Micro's ID Security, which currently offers a one-month free trial, offers precisely this service. Its dark web monitoring tool will scan the dark web for your personal information, including up to 5 email addresses, 5 bank account numbers, and 10 credit card numbers. If your data is ever leaked, you'll know about it instantly. Give it a go today and enjoy 24/7 comprehensive personal data monitoring.

The dark web is not illegal, and accessing it is completely lawful. Nonetheless, the dark web does have a well-earned reputation for illegal content and activity taking place within it. For example, it is the go-to place for cybercriminals to buy and sell stolen credentials, such as credit card numbers, email addresses, passwords, and Social Security numbers.

Aside from identity theft, it is also a venue for many other criminal ventures, including:

In summary, the dark web is not illegal but most activities that people use it for are illegal.

We are all to some extent at risk from the dangers of the dark web, regardless of whether we use it or not. Aside from the abundance of viruses, trojans, and ransomware due to lax security provisions, the dark web is the go-to marketplace for stolen credentials and PII. Last year, the FBI estimated that losses from cybercrime reached almost $7 billion, much of which relates to activity on the dark web. Your data is of great value and at great risk: dark web monitoring will reduce that risk.

Compromised personal data can have serious consequences, including identity theft, financial fraud, and job losses. The best thing you can do is a) have reliable cybersecurity protection, and b) ensure you will find out ASAP in the event of being affected. We would encourage readers to head over to our new FREE ID Protection platform, which has been designed to meet these challenges.

With ID Protection, you can:

All this for free, so why not give it a go today? As always, we hope this article has been an interesting and/or useful read. If so, please do SHARE it with family and friends to help keep the online community secure and informed, and consider leaving a like or comment below. Here's to a secure 2023!

Read this article:
What is Dark Web Monitoring and How Does It Work? - Trend Micro News


Bitcoin and Tether Dominance Surge as Investors Seek Shelter – BeInCrypto

Bitcoin, the top cryptocurrency by market cap, and Tether, the most popular stablecoin, have boosted their share of the total crypto market. Measured as a percentage of the total crypto market capitalization, Bitcoin dominance spiked to nearly 50% early on Saturday morning.

At around the same time, USDT dominance surged to over 8.2%.

As is often the case with rises in Bitcoin dominance, Saturday's increases coincided with a major altcoin market crash.

The reasons behind the surge in Bitcoin dominance are twofold.

On one hand, the price of BTC has remained resilient compared with many altcoins. With the prices of tokens like Solana and Polygon plummeting by over 20% overnight, their market caps shrank rapidly.

As a result, Bitcoin's relative share of the total crypto market cap has risen.
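
The arithmetic behind this is simple: dominance is BTC's market cap divided by the total crypto market cap, so falling altcoin valuations lift dominance even when BTC's own cap is flat. The figures below are invented purely to show the effect.

```python
# Toy dominance calculation with made-up market caps (in $bn).
btc_cap, alt_cap = 500.0, 550.0
print(btc_cap / (btc_cap + alt_cap))  # ~0.476 -> ~47.6% dominance

alt_cap *= 0.80                       # altcoins fall 20%, BTC unchanged
print(btc_cap / (btc_cap + alt_cap))  # ~0.532 -> ~53.2% dominance
```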

At the same time, bitcoin's comparatively stable price offers investors a degree of protection. This drives more funds out of altcoin markets and into bitcoin as investors seek refuge from falling prices.

BTC's dominance rate has been steadily rising since November, and it increased rapidly during the March U.S. banking crisis. Following Saturday's surge, bitcoin dominance is at its highest level since 2021.

While bitcoin's strong liquidity makes it an attractive way for crypto traders to exit altcoin markets, stablecoins offer an alternative risk-aversion strategy. As the most popular dollar-pegged stablecoin, USDT provides an easy way to exit crypto market volatility without fully off-ramping into fiat currency.

Another factor that could have influenced the rise in USDT dominance is the worrying price deviation of fellow dollar-denominated stablecoin TrueUSD (TUSD).

On Saturday morning, the coin became temporarily depegged from the dollar, falling as low as $0.9951.

The sudden dip in value was followed by an announcement from TrueUSD stating that TUSD mints via Prime Trust would be paused until further notice.

"TUSD minting and redemption services remain unaffected and will continue to operate as usual. We want to assure you that our partnerships with other banking institutions remain intact, allowing for seamless transactions," the company tweeted.

In adherence to the Trust Project guidelines, BeInCrypto is committed to unbiased, transparent reporting. This news article aims to provide accurate, timely information. However, readers are advised to verify facts independently and consult with a professional before making any decisions based on this content.

View post:
Bitcoin and Tether Dominance Surge as Investors Seek Shelter - BeInCrypto


'Shortage of faculty with PhDs in engineering institutes' – Deccan Herald

Only 44.51% of the faculty in engineering colleges in India have a PhD, while the remaining 55.49% have only a master's degree, data from the Ministry of Education shows.

Engineering institutes that rank in the top 100 of the National Institutional Ranking Framework (NIRF) rankings have a concentration of faculty with doctoral qualifications. Across all engineering colleges, only 34.65% of faculty have 15 years of experience or more.

However, among the engineering institutions that ranked in the top 100 in the NIRF rankings, 81.20% of faculty had a doctoral degree.

There are 1,27,296 professors and teachers across 1,138 engineering institutes in the country, in addition to 33,891 faculty in the top 100 institutes. Pan-India, only 53,527 educators (33.21%) in these institutes who were not in the higher ranks had 8 to 15 years of experience. "This is a serious handicap, since mentorship received during doctoral training can play a vital role in preparing the faculty for a teaching career in higher education," the ministry said.

In technical colleges, which come under the aegis of the All India Council for Technical Education (AICTE), a BTech or MTech degree was enough until a few years ago; now the government requires teachers to go through a training period.

Similar to the trend in engineering colleges, in degree colleges 50,160 (77.79%) of 64,484 educators in the top 100 ranked colleges had a doctoral degree. In the remaining 1,166 colleges, only 51.36% of faculty had a doctoral degree. The average number of faculty in the top institutions was 645, as opposed to 162 in the remaining institutions.

For non-technical colleges, the University Grants Commission (UGC) said this year that it is not mandatory for Assistant Professors in Central universities to have a Doctor of Philosophy (PhD) degree. Instead, candidates can qualify for the position through the National Eligibility Test (UGC-NET).

UGC chairperson M Jagadesh Kumar said that more faculty members in engineering colleges of universities have a doctorate as they have access to many research facilities, and therefore focus not only on undergraduate programmes but also on training master's and research students, and can attract faculty members with a doctorate. "Whereas standalone engineering colleges, most of which are in this category, focus more on undergraduate programmes and very little on carrying out research. This may be why engineering colleges cannot attract PhD faculty members," he said.

Follow this link:

Shortage of faculty with PhDs in engineering institutes' - Deccan Herald


Top 3 altcoins bleed in double-digits as Bitcoin dominance rises: MATIC, Cardano and Solana – FXStreet

Bitcoin dominance has been increasing consistently since Friday. It is likely that market participants are rotating capital from Polygon (MATIC), Cardano (ADA), and Solana (SOL) into Bitcoin. These assets rank among the cryptocurrencies labeled as securities by the US Securities and Exchange Commission (SEC).

Also read: Vitalik Buterin proposed this roadmap for Ethereum as the altcoin battles intense selling pressure

The US SEC's regulatory crackdown on cryptocurrencies in its lawsuits against Binance and Coinbase has set a series of events in motion, triggering a bloodbath in crypto. Altcoins that were labeled as securities by the US financial regulator have suffered double-digit drops in their prices as traders shed their holdings.

Altcoin prices decline under selling pressure

Alongside the altcoin bloodbath, Bitcoin's dominance has increased from 44.89% to 45.91% since Friday. Based on data from CoinGecko, the largest asset by market capitalization continues to dominate the ecosystem as altcoins crumble under selling pressure amid uncertainty among traders.

Binance's Changpeng Zhao (CZ) attempted to explain the narrative driving the decline in altcoin prices and the mass sell-off experienced by MATIC, SOL and ADA. CZ assured market participants that Binance is not converting its holdings to fiat.

The exchange's crypto reserves have increased over the past few weeks. CZ asks crypto traders to manage their fear, greed and risks while the exchange focuses on ensuring smooth operations of the trading platform.

If altcoin market capitalization continues to decrease and Bitcoin dominance rises, Bitcoin could begin its recovery in the short term.

See original here:
Top 3 altcoins bleed in double-digits as Bitcoin dominance rises: MATIC, Cardano and Solana - FXStreet


University of Arizona launching computer science and engineering B.S. – EurekAlert

Image: The University of Arizona College of Engineering will offer a bachelor's degree in computer science and engineering beginning with the fall 2023 semester.

Credit: University of Arizona College of Engineering

Right now, United States employers are unable to fill around 1 million computer science-related jobs because of a lack of qualified candidates, as estimated by the Bureau of Labor Statistics. And the demand isn't going away: the bureau projects employment in the field to grow much faster than average through 2031, while the number of graduates will continue to lag behind job openings.

This workforce need is the primary reason the College of Engineering will soon offer a bachelor's degree in computer science and engineering, said Michael Wu, head of the Department of Electrical and Computer Engineering, which houses the degree. Students can access the program at the main campus and online, beginning in the fall 2023 semester.

The program's educational model distinguishes it from other degree programs and provides an option for students who are interested in the field and want an interdisciplinary engineering education.

"The market is so large that we're not competing with other institutions or other programs within the University of Arizona," said Wu. "Instead, we're joining with other educators to develop a qualified workforce for the computer industry."

Employers seeking these graduates vary from the biggest companies, such as Meta and Google, to small startups hiring computer scientists to develop apps and websites.

The Regional Industry View

Karla Morales is vice president of the Arizona Technology Council's Southern Arizona regional office. Among other goals, the Arizona Technology Council is working to develop Arizona as a national tech hub.

Morales is also a member of the Dean's Advisory Council, and her ongoing discussions with industry representatives inform her service with the college. She believes the new program will address important needs.

"This is such a great opportunity for our community and really for anybody who wants to take advantage," Morales said. "It will elevate the College of Engineering's programs and the workforce abilities within Arizona."

Members of the trade association have expressed a strong need for more workers with computer science engineering skills. They also want to hire recent graduates who demonstrate the ability to collaborate, communicate and solve problems, Morales said.

"It's so important for our graduates to start developing those skills at the very beginning of their academic career, and not just as an add-on. We feel it is important to have that simultaneous experiential and theoretical learning," she said.

A Unique, Experiential Model

The computer science and engineering curriculum plan will engage students in an applied education model that's likely to extend beyond courses specifically for the degree, said Wu.

"A student could work with a faculty member in aerospace and mechanical engineering, for example, to apply computer science techniques to working with autonomous vehicles," he said. "Just being part of this very interdisciplinary engineering college will allow our students to not only be exposed to computer science knowledge but find applications in a wide variety of engineering domains."

Each student's education will culminate with a senior year Interdisciplinary Capstone, in which they will work with students from other majors to complete a sponsored project as part of the Craig M. Berge Engineering Design Program.

"Students will work with theories, but we stress the hands-on part. They will get opportunities to solve real-world problems in every course. In the capstone, they will form teams to solve even bigger problems," he said.

These aspects, and the overall engineering focus, make the new program distinct from traditional computer science degrees and from the program offered by the UA College of Science. However, Wu is looking forward to collaborating with the College of Science: the colleges are sharing some courses and planning joint research projects.

Computer science and engineering is also different from other college degree programs such as software engineering. The former is a broader discipline teaching mastery of computer principles and algorithms as well as concepts including machine learning, artificial intelligence, cybersecurity, quantum computing, data analytics, human-computer interaction, virtual reality, robotics, and hardware and software co-design. Software engineering is a more focused specialty, said Wu.

Next Steps for Computer Science and Engineering

College leaders have submitted proposals to launch master's and doctoral degree programs in the same subject to the Arizona Board of Regents. They also will propose program extensions to university campuses in Yuma and Chandler in the near future.

In the meantime, the ECE department has begun recruiting students and faculty. The department will hire 20 to 25 new faculty members within the next five years to teach the new program and build a research program in computer science and engineering.

"The department will have electrical engineering, computer engineering and computer science, giving us the full spectrum of expertise. This will significantly enhance our national reputation and rankings in the area of computing. Our faculty are extremely excited about this opportunity," said Wu.

The department and college will focus on recruiting and supporting both faculty and students with diverse perspectives, experiences and backgrounds, said Wu. On the student front, planned measures include visiting K-12 schools, especially those that are under-resourced, to spread the word about the new computer science and engineering program. The department will also invite speakers from underrepresented groups to give seminars, providing diverse role models and perspectives for students.

Local tech leaders want to collaborate with the university to build a more diverse and inclusive workforce and close industry gaps in representation for women and minority populations, said Morales.

Addressing these gaps is seen as crucial for promoting innovation and achieving better problem-solving outcomes, she said.

Continue reading here:

University of Arizona launching computer science and engineering B.S. - EurekAlert


Sibley School celebrates 150 years of mechanical engineering … – Cornell Chronicle

The Sibley School of Mechanical and Aerospace Engineering is celebrating 150 years of mechanical engineering at Cornell with a year of festivities that reflect on the school's distinguished past and look forward to its promising future.

The Sibley150 celebration formally kicked off during Cornell Reunion, June 8-11. Alumni in attendance may notice Sibley150 banners decking light posts surrounding Upson and Rhodes Halls as well as the Duffield Hall atrium. Virtual and in-person events are being planned for the rest of the year through the first half of 2024.

While engineering and mechanic arts students were studying at Cornell as early as 1868, it wasn't until the 1873-74 academic year that a four-year Bachelor of Mechanical Engineering degree was offered by what was then the Sibley College of the Mechanic Arts, according to a history written by Francis Moon, professor emeritus and historian in the Sibley School.

"We've come so far in the last 150 years," said David Erickson, the S.C. Thomas Sze Director of the Sibley School. "The first Sibley graduates with mechanical engineering degrees invented milling machines and new techniques for grinding ball bearings. Our recent graduates are sending rockets into space and developing biomaterials to fight infectious diseases."

In the late 19th century, the Sibley College distinguished itself from other colleges by offering an education that incorporated more than simply technical training, according to Moon.

"The Sibley College brought together the practical aspects of building machines along with the new engineering sciences and the testing of engineering materials," said Moon, who added that by the end of the century, Cornell was producing nearly one in five mechanical engineers in the U.S. "Cornell was ahead of its time, and today the Sibley School continues to produce leaders for academia, industry and government."

In the early 20th century, the automobile and airplane would change the Sibley School's curriculum, and as new technologies were introduced over the years, research and education continued to evolve.

"Mechanical engineering has changed a lot in the last 150 years, and that's in part because we try to adapt to society's changing needs," said Atieh Moridi, an assistant professor who joined the Sibley School in 2019. "My students and I are using additive manufacturing to make novel materials for demanding engineering applications, from fusion reactors to biomedical devices."

As part of the Sibley150 celebration, the school has established the Sibley 300 Future Fund, an initiative aimed at shaping the future of mechanical engineering at Cornell for the next 150 years by supporting state-of-the-art research facilities, developing innovative interdisciplinary programs, and fostering collaborations that address society's grand challenges.

Go here to read the rest:

Sibley School celebrates 150 years of mechanical engineering ... - Cornell Chronicle


Trader Predicts Ethereum-Based Altcoin Will Plummet by Over 50%, Updates Outlook on Binance Coin and Chainlink – The Daily Hodl

A closely followed crypto strategist says that an Ethereum (ETH)-based altcoin is flashing signs of bullish exhaustion as he details his forecast for Binance Coin (BNB) and Chainlink (LINK).

Pseudonymous trader Altcoin Sherpa tells his 195,900 Twitter followers that the recent uptrend of the blockchain-based rendering protocol Render (RNDR) appears to be over for now.

According to the crypto strategist, RNDR's anemic volume over the last few weeks suggests that traders have lost interest in the project running on the Ethereum network.

"RNDR: I think this one looks tired, expecting much lower prices.

Very few trends are sustainable. Most will go all the way back down and very few alts outperform ETH over the long run.

Am looking for shorts

You can see the one last exhaustion push-up. I think that this one goes much lower. $1 or lower in time."

Render is trading at $2.27 at time of writing.

Next up is Binance Coin (BNB), the utility token of the world's largest crypto exchange. Altcoin Sherpa says that traders should think twice about becoming overly bearish on the fourth-largest altcoin by market cap, noting that BNB is still trading within its yearly range.

"BNB: I don't see a true trend change for BNB overall. It's still in the same range it's been in for the last year, and I think a test around $220-250 is probably coming.

I still think Binance is ok and will be the leader for the foreseeable future

I don't think this is a good area to short. It's a decent support area still and the EQ (equilibrium) of the entire range. Would wait for further breakdown or a move up to low $300s before shorting personally."

BNB is trading at $260 at time of writing.

Turning to the decentralized oracle network Chainlink (LINK), the crypto trader says that the altcoin could ignite a short-term rally should it hit a key support level.

"LINK: Expecting this to bounce around $5.50. Range lows and should be fine.

This current area is also a support level, so anywhere here to the range lows seems decent for support."

At time of writing, LINK is trading for $5.90.

Generated Image: Midjourney

Link:
Trader Predicts Ethereum-Based Altcoin Will Plummet by Over 50%, Updates Outlook on Binance Coin and Chainlink - The Daily Hodl
