
Hidden Costs: The Energy Consumption of Machine Learning – EnergyPortal.eu

Machine learning has become an integral part of our lives, revolutionizing industries and transforming the way we interact with technology. From personalized recommendations on streaming platforms to advanced medical diagnostics, the applications of machine learning are vast and ever-growing. However, there is a hidden cost to this technological marvel that is often overlooked: the energy consumption of machine learning.

The energy consumption of machine learning is surprisingly high, and it is essential to understand the implications of this fact. With the increasing demand for more complex and powerful machine learning models, the energy required to train and run these models is also on the rise. This energy consumption not only contributes to the global energy crisis but also has a significant impact on the environment.

Machine learning models are developed through a process called training, where the model learns from a large dataset to make predictions or decisions. This training process is computationally intensive and requires a significant amount of energy. In fact, the energy consumption of training a single machine learning model can be equivalent to the energy consumed by multiple households in a year.
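
To put that claim in rough perspective, a back-of-the-envelope estimate can be made from hardware power draw and training time. The sketch below is illustrative only: the GPU count, per-accelerator power, training duration and household consumption figures are assumptions, not measurements of any particular model.

```python
# Rough, illustrative estimate of training energy (all figures are assumptions).
num_gpus = 64                    # assumed number of accelerators used for training
power_per_gpu_w = 300            # assumed average draw per accelerator, in watts
training_days = 30               # assumed wall-clock training time
household_kwh_per_year = 10_000  # assumed annual consumption of one household

training_kwh = num_gpus * power_per_gpu_w * training_days * 24 / 1000
print(f"Estimated training energy: {training_kwh:,.0f} kWh")
print(f"Equivalent household-years: {training_kwh / household_kwh_per_year:.1f}")
```

Under these assumptions the run uses roughly 14,000 kWh, on the order of what one to two typical households consume in a year; larger models trained on thousands of accelerators scale this figure up accordingly.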

A study conducted by researchers at the University of Massachusetts, Amherst, found that training a single natural language processing (NLP) model, which is used for tasks such as translation and sentiment analysis, can generate carbon emissions equivalent to nearly five times the lifetime emissions of an average car, including its manufacturing process. This startling revelation highlights the environmental impact of machine learning and the need for more sustainable practices in the field.

The energy consumption of machine learning is primarily driven by the hardware used for training and running the models. Graphics processing units (GPUs) and tensor processing units (TPUs) are commonly used for these tasks due to their high computational capabilities. However, these specialized processors consume a significant amount of energy, contributing to the overall energy consumption of machine learning.

Another factor contributing to the energy consumption of machine learning is the increasing complexity of models. As researchers and developers strive to create more accurate and sophisticated models, the number of parameters and computations required for training increases. This, in turn, leads to higher energy consumption.

Data centers, which house the servers and hardware required for machine learning, also play a significant role in the energy consumption of machine learning. These facilities consume vast amounts of energy to power the servers and maintain optimal operating conditions, such as cooling systems to prevent overheating. As the demand for machine learning services grows, so does the need for more data centers, further exacerbating the energy consumption issue.

To address the energy consumption of machine learning, researchers and developers are exploring various solutions. One approach is to develop more energy-efficient hardware, such as specialized processors designed specifically for machine learning tasks. Another strategy is to optimize machine learning algorithms to reduce the number of computations required for training, thereby reducing energy consumption.

Additionally, there is a growing interest in exploring alternative, more sustainable energy sources for powering data centers. For example, some companies are investing in renewable energy sources, such as solar and wind power, to reduce the environmental impact of their data centers.

In conclusion, the energy consumption of machine learning is a critical issue that must be addressed as the field continues to grow and evolve. By developing more energy-efficient hardware, optimizing algorithms, and exploring sustainable energy sources, the machine learning community can help mitigate the environmental impact of this groundbreaking technology. As we continue to reap the benefits of machine learning in various aspects of our lives, it is crucial to be aware of the hidden costs and strive towards a more sustainable future.


Dallas College and Texas Blockchain Council Join Forces To Offer … – Dallas College


Media Contact: Debra Dennis; DDennis@DallasCollege.edu

For immediate release June 15, 2023

(DALLAS) Dallas College and the Texas Blockchain Council (TBC) have announced a partnership that will make the college a leading innovator in technology education for the digital economy while encouraging students to seek careers in blockchain and cryptocurrency fields. The collaboration emphasizes hands-on learning and will allow participating students a chance to earn a new Blockchain and Cryptocurrency Advanced Technical Certificate through Dallas College.

"Our collaboration with the Texas Blockchain Council positions Dallas College as a leader in technology education for the digital economy," said Dallas College Chancellor Justin Lonon. "Dallas College has always been committed to providing our students with the most relevant and valuable educational experiences. This unique partnership with the TBC will allow us to stay at the forefront of technological innovation and prepare our students for the digital economy."

The partnership comes at a time when Texas is becoming a leader in bitcoin mining. Under the partnership, Dallas College will also host a unique bitcoin miner installation at Richland Campus.

Steve Kinard, director of bitcoin mining for the Texas Blockchain Council, said, "Our collaboration with Dallas College isn't just about installing a bitcoin miner; it's about creating an environment where students can immerse themselves in cutting-edge technology. The digital economy demands a workforce with a deep understanding of high-performance computing and blockchain concepts, and we're here to ensure that Dallas College students are ready to meet that demand."

As the future of the economy shifts towards digitalization, Dallas College is stepping up to ensure its students are prepared for the technological changes that are revolutionizing industries worldwide and working alongside industry partners.

The installation of a bitcoin miner at Richland Campus allows students to gain firsthand experience with the technology that powers the world's first and largest cryptocurrency. Key technical supporters of the initiative include Luxor Technologies and Bentaus Mining. The Texas Blockchain Council donated the hardware at no cost to Dallas College, and 100% of the bitcoin proceeds from the operation will go to the Dallas College Foundation to support its mission.

"It is exciting to team up with Dallas College and the Texas Blockchain Council to continue to bring education and awareness of how bitcoin really works," said Bob Davidoff, founder of Bentaus Mining. "It all starts at the academic level to provide real information regarding the technologies of the future."

Ethan Vera, COO of Luxor Technologies, said, "Dallas College is leading the way when it comes to forward-thinking adoption of bitcoin mining and the benefits it brings to the Texas grid and society. Luxor is pleased to support this institution with our full suite of software products."

The mining installation is being facilitated through a relationship with Coinbase Institutional. Anthony Basili, head of asset allocators for Coinbase Institutional, said, "Coinbase is a global leader in providing trusted and compliant access to digital assets and custody solutions. I am proud to be able to support this initiative and see my hometown of Dallas leading the way."

The Blockchain and Cryptocurrency Advanced Technical Certificate is available at all seven Dallas College campuses. For more information, visit the Blockchain Certificate webpage.

# # #


Environmental Evolutions: Environmental Sustainability Of … – Mondaq News Alerts


On this episode, Megan is joined by Partner Allison Watkins Mallick and Cryptocurrency Mining and Staking Sustainability Association President Cameron Rafati to discuss the future of sustainable digital currencies. Covering everything from energy sources, grid stability, and permitting, this episode dives into the regulations, impacts and innovations of cryptocurrency in today's world.

For more information, reach out to Allison or visit bakerbotts.com.

Environmental Evolutions explores emerging areas and recent developments in environmental law and policy. Click here to listen to prior episodes.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.


Progress of Artificial Intelligence and Tiny Machine Learning – Bisinfotech

Renesas Electronics Corporation has provided an update on its progress in providing artificial intelligence (AI) and tiny machine learning (TinyML) solutions one year after announcing its acquisition of Reality Analytics, Inc. (Reality AI), a leading embedded AI provider.

On June 9, 2022, Renesas announced that it was acquiring Reality AI in an all-cash transaction. Reality AI's wide range of embedded AI and TinyML solutions for advanced non-visual sensing in automotive, industrial and commercial products fits well with Renesas' embedded processing and IoT offerings. These solutions combine machine learning with advanced signal processing math, delivering fast, efficient machine learning inference that fits on small MCUs and more powerful MPUs. With Reality AI Tools, a software environment built to support the full product development lifecycle, users can automatically explore sensor data and generate optimized models. Reality AI Tools contains analytics to find the best sensor or combination of sensors and locations for sensor placement, supports automatic generation of component specs, and includes fully explainable model functions in terms of time/frequency domains.

In just one year since the announcement, Renesas has delivered a wide range of solutions based on Reality AI technology. The following products will be presented at Renesas Booth #945 at the Sensors Converge Tradeshow, June 20-22 at the Santa Clara Convention Center:

Reality AI Tools is now tightly integrated with Renesas compute products and supports all Renesas MCUs and MPUs natively with a built-in parts-picker engine. Support for automatic context switching between Reality AI Tools and e2Studio, Renesas' flagship embedded development environment, is also in place.

RealityCheck Motor Toolbox, an advanced machine learning software toolbox, uses electrical information from the motor control process to enable the development of predictive maintenance, anomaly detection, and smart control feedback all without the need for additional sensors. It enables early detection of small fluctuations in system parameters that indicate maintenance issues and anomalies, reducing downtime. The software works seamlessly with Renesas MCUs, MPUs, and motor control kits and is fully integrated with Reality AI Tools to create, validate, and deploy sensor classification or prediction models at scale. This functionality is a toolchain built with predictive models that can be easily accessed out of the box by using the Reality AI toolchains for developers.

RealityCheck HVAC Solution Suite is a vertically integrated solution suite for the HVAC industry. This solution is a comprehensive framework that includes a hardware and firmware reference design, a set of pre-trained ML models ready to leverage for product design, and a clearly outlined process for model training, customization, and field testing to meet specific product requirements. This advancement has significantly improved the efficiency of HVAC systems.

Automotive SWS Solution Suite uniquely combines both hardware and software to give passengers a new level of protection. The suite comes with a MEMS microphone array integrated into components or placed on the roof. Flexible geometry automotive MCUs run AI detection and localization software on inexpensive hardware. AI models detect and classify different threats accurately at 1.5km distance for sirens, 35m+ for cars, trucks, and motorcycles, and 10m for bicycles and joggers. Localization is provided through AI models that compute the angle of arrival, estimate distance, and detect whether threats are approaching or receding.

Customers in a wide range of industries have adopted Renesas AI solutions for a variety of applications. For example, ITT Goulds Pumps Inc. is implementing data analytics using Renesas AI technology. Brad DeCook, R&D Director, Monitoring and Controls for the company, said, "The unique capabilities of the Renesas AI technology enabled us to develop machine diagnostics that effectively identify equipment faults caused by high vibration and temperature."

"We believe the convergence of AI and IoT is creating a significant inflection point as customers increasingly move intelligence to the endpoint," said Sailesh Chittipeddi, Executive Vice President and General Manager of Renesas Embedded Processing, Digital Power and Signal Chain Solutions Group. "The addition of the unique and powerful technology from Reality AI into our portfolio enables our customers to process and react to information faster, more accurately, and with fewer compute and power resources than ever before."


Research on the establishment of NDVI long-term data set based on … – Nature.com

Data

This paper selects parts of China and surrounding areas as the research area. The research data comprise the NDVI data from the MODIS sensors on Terra and Aqua (NDVIm), the NDVI data from the AVHRR sensors (NDVIa), and the NDVI data from the VIRR sensors on the Fengyun satellites (NDVIv)31. (I) Compare NDVIv with NDVIa, and NDVIa with NDVIm. (II) Find the functional relationship between NDVIa and NDVIm, and the functional relationship between NDVIv and NDVIa, through this comparison. (III) Use NDVIa to correct the NDVIv data to a level equivalent to NDVIm.

The data used in this study include (see Table 1): NDVIa from 1982 to 2015, NDVIm from 2000 to 2019, and NDVIv from 2015 to 2020, all of which have a resolution of 0.05°. Because both NDVIa and NDVIm data exist for 2005, we use the data of that year to compare NDVIa and NDVIm and explore the correlation between the two. Because both NDVIv and NDVIa data exist for 2015, we use the data of that year to compare NDVIv and NDVIa and explore the correlation between the two. Finally, we compare the corrected NDVIv of 2019 with the NDVIm of 2019 to verify the model we constructed.

Figure 1 shows the spectral response function curves of different satellite sensors in the visible and near-infrared spectrum32. By comparison, it can be seen that in the visible band the spectral response function of MODIS is narrower than that of AVHRR, and that of AVHRR is narrower than that of VIRR. In the near-infrared band, MODIS still has the narrowest spectral response function, followed by VIRR, and AVHRR has the widest. The channel, wavelength range, corresponding spectrum and sub-satellite resolution of the MODIS, AVHRR and VIRR sensors are shown in Table 2.

Spectral response function curves of different satellite sensors in the visible and near-infrared spectrum29.

A linear model is a basic form of machine learning model; it is relatively simple and easy to fit, yet it embodies some important ideas in machine learning, and many more powerful nonlinear models can be obtained by introducing hierarchical structure or high-dimensional mappings on top of it. Linear models take many forms, of which linear regression is a common one. Linear regression tries to learn a linear model that predicts real-valued output labels as accurately as possible. A linear model is established on the data set, a loss function is defined, and the model parameters are determined by optimizing that cost function, yielding a model for subsequent prediction. The general linear regression workflow is presented in Fig. 2.

Schematic diagram of the linear regression algorithm flow.

The detailed procedure is as follows33:

The data is standardized and preprocessed. The preprocessing includes data cleaning, screening, organization, etc., so that the data can be input into the machine learning model as feature variables.

Different machine learning algorithms are selected and trained on a separate data set to find the best machine learning model, and a machine learning model is established for the normalized vegetation index product retrieved by the Fengyun satellite.

Verify and output the long-term series normalized vegetation index of the Fengyun satellite.
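
A minimal sketch of the fitting step in this workflow, assuming matched per-pixel NDVI samples have already been extracted into arrays (the variable names, the synthetic data and the use of scikit-learn are illustrative, not taken from the paper):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-ins for matched per-pixel samples from one overlap year.
rng = np.random.default_rng(0)
ndvi_a = rng.uniform(0.0, 0.9, size=(5000, 1))                       # AVHRR NDVI (predictor)
ndvi_m = 0.95 * ndvi_a + 0.02 + rng.normal(0, 0.03, size=(5000, 1))  # MODIS NDVI (target)

# Fit NDVIm = k * NDVIa + m and report the coefficients and goodness of fit.
model = LinearRegression().fit(ndvi_a, ndvi_m)
k, m = model.coef_[0][0], model.intercept_[0]
print(f"NDVIm ~= {k:.3f} * NDVIa + {m:.3f}, R^2 = {model.score(ndvi_a, ndvi_m):.3f}")
```

In practice this fit would be repeated for each overlap year and for the multi-year minimum, maximum and average, with the best coefficient pair selected by cross-comparison, as described below.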

For 2001 to 2005, both AVHRR NDVI data and MODIS NDVI data are available, so we used the data of these five years to compare NDVIa and NDVIm and explore the correlation between the two. Because 2015 has both VIRR NDVI data and AVHRR NDVI data, we used the data of that year to compare NDVIv and NDVIa and explore the correlation between the two. Finally, we compared the corrected NDVIv of 2019 with the NDVIm of 2019 to verify the model we constructed.

The linear machine learning model is used to construct the optimal functional relationship between the NDVIa and the NDVIm. The formula is as presented in formula (1):

$$Y_{\text{NDVIm}} = \left\{ k_{2001}, k_{2002}, k_{2003}, k_{2004}, k_{2005}, k_{\min}, k_{\max}, k_{\text{ave}} \right\} \times X_{\text{NDVIa}} + \left\{ m_{2001}, m_{2002}, m_{2003}, m_{2004}, m_{2005}, m_{\min}, m_{\max}, m_{\text{mean}} \right\}$$

(1)

In the formula, XNDVIa is the NDVI value of AVHRR, YNDVIm is the NDVI value of MODIS, and k is the coefficient of the linear relationship between NDVIa and NDVIm; k2001, k2002, k2003, k2004, k2005, kmin, kmax and kave are the coefficients for 2001, 2002, 2003, 2004 and 2005, the 5-year minimum, the 5-year maximum, and the 5-year average, respectively. m is the intercept of the linear relationship between NDVIa and NDVIm; m2001, m2002, m2003, m2004, m2005, mmin, mmax and mmean are the intercepts for 2001, 2002, 2003, 2004 and 2005, the 5-year minimum, the 5-year maximum, and the 5-year average, respectively.

Through multiple cross-comparison analysis, the optimal coefficient k and the optimal coefficient m are selected, and then the optimal functional relationship between NDVIa and NDVIm is determined.

Based on the above analysis, we continue to construct the functional relationship between NDVIa and NDVIv, according to formula (2).

$$X_{\text{NDVIa}} = a Z_{\text{NDVIv}} + b.$$

(2)

In formula (2), ZNDVIv is the NDVI value of VIRR, XNDVIa is the NDVI value of AVHRR, a is the coefficient of the linear function fitted between NDVIv and NDVIa, and b is the intercept of that fitted linear function.

Substituting the functional relationship between NDVIa and NDVIv into the optimal NDVIa-NDVIm functional relationship selected above yields the refitted NDVIv, denoted CNDVIcv in formula (3). The functional relationship of the simulated NDVIv is as follows:

$$C_{\text{NDVIcv}} = k X_{\text{NDVIa}} + m = k \left( a Z_{\text{NDVIv}} + b \right) + m = k a Z_{\text{NDVIv}} + k b + m.$$

(3)

In the formula, CNDVIcv is the corrected NDVIv (NDVIcv), k is the optimal coefficient of the correlation between NDVIa and NDVIm, and m is the optimal intercept of the correlation between NDVIa and NDVIm.
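
A short sketch of how the two fitted relationships chain together as in formula (3) to bring VIRR NDVI to the MODIS-equivalent level (the coefficient values below are placeholders, not the values derived in the paper):

```python
# Placeholder coefficients: k, m from the NDVIa-NDVIm fit; a, b from the NDVIv-NDVIa fit.
k, m = 0.95, 0.02   # NDVIm ~= k * NDVIa + m
a, b = 1.10, -0.05  # NDVIa ~= a * NDVIv + b

def correct_ndviv(ndvi_v):
    """Formula (3): NDVIcv = k * (a * NDVIv + b) + m = k*a*NDVIv + k*b + m."""
    return k * (a * ndvi_v + b) + m

print(correct_ndviv(0.5))  # MODIS-equivalent value for a VIRR NDVI of 0.5
```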

The data of 2005 were selected to compare NDVIm and NDVIa in some parts of China and surrounding areas. The data of 2015 were selected to compare NDVIv and NDVIa in some parts of China and surrounding areas. Through analysis, the correlation among NDVIv, NDVIa and NDVIm is found.


Campus Adds New Areas of Studies for Students to Choose From … – University of California, Merced

New students or those who have not yet chosen their majors will have an array of options before them.

Five new majors and several new emphases, ranging across all three schools, are all coming online in 2024 and are recruiting students now.

New bachelor of science degrees:

New bachelor of arts degrees:

New emphases:

In the mechanical engineering major:

In the political science major:

In the sociology major:

Students who enroll in the public health bachelor of science program can do so as part of the medical education pathway or as the basis for a multitude of other health care-related careers.

"The standard bachelor of science program has more biology, physiology and nutrition science than the bachelor of arts major," said Professor Nancy Burke, who led the development of the new major. "We also have a health professionals/pre-med track that incorporates all the preparation students would need to apply to medical school or for any other health professional degree."

Those who are interested in careers in aerospace engineering can now choose that subject as an emphasis within their mechanical engineering degrees. And those in cognitive science can now enroll in that major's new honors program.

Mechanical engineering Professor and Monya Lane and Robert Bryant Presidential Chair in Excellence in Engineering Ashlie Martini said aerospace engineering has been a subject of interest among students and faculty for quite a while.

"Many of our mechanical engineering undergraduate alumni go into aerospace companies already, so it has been something we have wanted to bring to UC Merced," she said. "We are starting with an emphasis, but if we see a lot of students signing up for this, that would encourage the creation of a major."

Faculty have created four new classes for the aerospace engineering emphasis: aerospace structures and materials; flight dynamics and control; aeroelasticity; and aerospace propulsion. Technically, the classes are electives, so anyone in the mechanical engineering major can take them.

Students interested in the growing field of data science have two choices: data science and analytics and data science and computing.

"Majors are really a prescribed set of courses of study, and data science is so big and so new, it's not possible to have a one-size-fits-all program," said Professor Suzanne Sindi, chair of the Department of Applied Math and co-author of the data science and computing major.

Data science (the availability, collection and analysis of data) has changed every field of study, Sindi said.

"It behooves us to make sure that our students understand not just a domain, or area of study, but that they understand the data that exists within the domain and what you can use it for," she said. "It also helps them understand the world we live in."

The data science and analytics major is also a choice for those interested in understanding this growing field of study.

"Data is at the core of modern-day systems-thinking and decision-making. The DSA major will not only provide an accessible bridge, but one that serves as a launchpad for clearing the digital divide, such that graduates will be well-equipped to think critically and tell stories with data," said management of complex systems Professor Alexander Petersen.

For students who lean more toward engineering, there is the new chemical engineering degree, offered by the Department of Materials Science and Engineering. Professor Kara McCloskey, who leads the chemical engineering program, said obtaining the degree opens a plethora of career paths for graduates.

"It is a popular engineering major, it's considered a traditional engineering major, and most of the other UC campuses offer it. And industry, including food and beverage industries in the Central Valley, has been asking us when we are going to offer it," she said. "They hire many of our students anyway, but chemical engineering is the right training for many other students they want to hire."

Chemical engineering includes a lot of mass separations, especially at a large scale, which are part of food- and wine-making processes, McCloskey explained.

"There has been a demand for chemical engineers, and especially those from the Valley, by Valley industry," department chair Professor Valerie Leppert said. "It helps them retain workers because people can remain closer to family and home."


A.I. is coming for the jobs you'd least expect, and it means … – Fortune

BY Sydney Lake, June 14, 2023, 6:11 PM

Jeff Maggioncalda, CEO of Coursera. Photo courtesy Coursera

ChatGPT may seem like a more savvy and interactive version of Google or another search engine, but this and other artificial intelligence tools could replace and displace workers in nearly every industry. During the next five years, 75% of organizations are expected to adopt A.I. practices, a recent study by the World Economic Forum shows, and companies are anticipating a job loss rate related to A.I. adoption of 25%.

What's potentially more alarming is that 49% of workers could have half or more of their tasks exposed to large language models like ChatGPT, according to a recent study from the University of Pennsylvania. A potential job loss rate that high, coupled with a major shift in job requirements, means that people whose jobs are taken over by A.I. will need to be reskilled.

That's where online education and upskilling companies like Coursera step in. Coursera offers nearly 6,000 online courses, professional certificates, and degree programs for the most in-demand industries, including data science and analytics, digital marketing, project management, and much more.

A.I. and other advanced technologies have already infiltrated jobs in the service industry, including waiters and freight movers, because they're more repetitive, repeatable, predictable jobs, as Jeff Maggioncalda, Coursera's CEO, puts it. But language-focused jobs, including teachers and lawyers, will become more vulnerable to advancements in ChatGPT and other generative A.I. tools, and credentials earned via Coursera can help workers upskill for the jobs most susceptible to A.I. development.

"If all these jobs become a lot more vulnerable, then everybody's in the reskilling world," Maggioncalda says. "If you don't know how to use A.I. for your job, you're in trouble. All employers want you to be able to use this if you've graduated."

Courses and certificates vs. degree programs

Advanced degree programs can serve as a launchpad to more job opportunities and higher pay. MBA programs, as well as master's degree programs in data science, can help students double their salary post-graduation and make major career moves.

But everything comes at a price, as Maggioncalda notes, and while there will almost certainly be value in degree programs, they come at a higher cost than other options. Through online learning, people can earn certificates and other credentials by completing individual courses to learn the skills they need, often at a lower cost and with more flexibility.

"I do think that there is a much higher level of competition and alternatives available," Maggioncalda adds. "For the learner, it looks like more options than alternatives. For the university, it looks like competition."

More universities could start offering different pathways to degree programs, meaning that there could be more lenient admissions thresholds. One example of this is the University of Colorado Boulder's new online master's degree program in computer science on Coursera, which requires prospective students to take three preliminary courses. Ball State University also recently launched computer science and data science master's programs with similar requirements.

Admissions for CU Boulder's and Ball State's respective programs aren't based on an applicant's prior academic history, and a bachelor's degree isn't required. Rather, prospective students choose a pathway in either algorithms or software architecture and must earn at least a B in the three courses associated with that pathway before they can qualify to enroll in for-credit courses. Plus, the CU Boulder program costs just $15,750 and has a pay-as-you-go model that only charges students for those classes in which they're currently enrolled. Some computer science master's programs can cost six figures to complete.

"Degrees are going to become more affordable and more flexible," Maggioncalda predicts.

Other advanced tech usage in online education

To give students a more realistic and immersive online learning experience, Coursera plans to release more virtual reality (VR) components for its online classes, from public speaking to Chinese language courses to physiology.

In an online public speaking course from the University of Washington on Coursera, students can practice giving speeches in front of a virtual audience and receive feedback in real time from their professor. Students can even adjust the venue type in which they want to practice, whether it be an auditorium or a conference room, and choose a calm or restless audience, Maggioncalda adds.

To practice more realistic interactions using a new language, a beginner's Chinese language course offered by Peking University places students in a VR setting in a Chinese market, which can help students practice their listening, speaking, pronunciation, and vocabulary. What's more, students enrolled in a physiology course from Duke University can even hop inside blood vessels via VR to learn more about measuring blood pressure and other physiological systems.

Why online education can be an accessible option for workers looking to reskill

Online programs can be a more suitable option for students who want lower costs and more flexibility in completing a program. More than one-third of online learners say that their top motivation for considering further education was a stalled career or stalled career search, according to a recent McKinsey & Co. report. But among the top concerns for online learners is a lack of access to one-on-one mentoring and coaching for post-grad opportunities.

To combat these concerns, Coursera this spring launched its coach function, which is powered by ChatGPT. Online students can use this function to ask specific and targeted questions about course material, Maggioncalda says.

Fortune sat down with Maggioncalda last month to see a demo of the new feature, and the bot will only answer questions related to the course, which can make it a more targeted and reliable source than turning to larger search engines like Google. "The coach uses the course to answer questions rather than searching the entire internet," he explains. The coach can also provide summaries of lectures and even point learners to recommended video clips to answer their questions.

"The main change in the way people learn, because of A.I., is that it's going to be more personalized. It's going to be more interactive," Maggioncalda says. "The easiest way to think about it is if every single student had a personal [teaching assistant]. You'll have someone to help you with your career coaching and help you with your studies."

Check out all of Fortune's rankings of degree programs, and learn more about specific career paths.


Big Data and Data Science: The Perfect Synergy – CityLife

Big Data and Data Science: The Perfect Synergy

Big data and data science are two buzzwords that have been dominating the technology landscape in recent years. While they may seem interchangeable to some, they are, in fact, two distinct concepts that complement each other perfectly. In this article, we will explore the synergy between big data and data science and how their collaboration is revolutionizing various industries.

Big data refers to the massive volume of structured and unstructured data generated by businesses, individuals, and machines every day. This data comes from various sources such as social media, sensors, digital images, and videos, among others. The challenge with big data lies in its sheer volume, velocity, and variety, making it difficult for traditional data management tools to process and analyze. This is where data science comes into play.

Data science is an interdisciplinary field that uses scientific methods, algorithms, and systems to extract knowledge and insights from structured and unstructured data. It involves the application of advanced analytics techniques, such as machine learning, artificial intelligence, and statistical modeling, to make sense of the vast amounts of data available. Data scientists are skilled professionals who can transform raw data into valuable information, enabling organizations to make data-driven decisions and gain a competitive edge.

The synergy between big data and data science is evident in the way they work together to unlock the full potential of data. Big data provides the raw material, while data science offers the tools and techniques to process and analyze it. Together, they enable organizations to harness the power of data and turn it into actionable insights that drive business growth and innovation.

One of the most significant benefits of this synergy is the ability to make more informed decisions. With the help of data science, organizations can analyze big data to identify patterns, trends, and correlations that were previously hidden. This information can be used to make better decisions, optimize processes, and improve overall efficiency. For example, retailers can use big data analytics to understand customer preferences and tailor their offerings accordingly, while healthcare providers can leverage data to predict disease outbreaks and develop targeted treatment plans.

Another advantage of the big data and data science synergy is the ability to develop new products and services. By analyzing customer data, companies can identify gaps in the market and develop innovative solutions to address them. For instance, streaming services like Netflix and Spotify use big data analytics to recommend personalized content based on user preferences, while financial institutions use data science techniques to develop sophisticated risk models and fraud detection systems.

Moreover, the combination of big data and data science has led to significant advancements in artificial intelligence and machine learning. These technologies rely on vast amounts of data to train algorithms and improve their accuracy over time. As a result, we have seen the development of self-driving cars, virtual assistants, and advanced robotics, among other innovations.

The synergy between big data and data science has also had a profound impact on various industries, including healthcare, finance, retail, and manufacturing. In healthcare, big data analytics has enabled the development of personalized medicine, where treatments are tailored to individual patients based on their genetic makeup and medical history. In finance, data science techniques have revolutionized risk management and fraud detection, while in retail, big data has transformed customer relationship management and supply chain optimization.

In conclusion, the perfect synergy between big data and data science has unlocked the true potential of data, enabling organizations to make more informed decisions, develop innovative products and services, and drive business growth. As technology continues to advance, the collaboration between these two fields will only become more critical, shaping the future of industries and the world as a whole.


Leveraging Big Data And AI For Disaster Resilience And Recovery – Texas A&M University Today

Researchers in the Urban Resilience.AI Lab are taking the lead in harnessing community-scale big data to develop artificial intelligence-based models with the potential to impact communities before, during and after a natural disaster or crisis.

Getty Images

In a world where natural hazards can strike at any time with devastating consequences, reducing their impacts in advance may seem impossible.

The unpredictability of these events and their effects makes it challenging to anticipate and respond appropriately, leaving individuals and communities vulnerable to their destructive effects.

In 2017, Hurricane Harvey made landfall along the Texas coast as a Category 4 hurricane and slowed to nearly 5 mph as it moved inland. Palacios experienced a storm surge exceeding 8 feet. The storm dumped 56 inches of rain in the Friendswood area and more than 60 inches near Nederland and Groves. The National Hurricane Center reported $125 billion in damage from Hurricane Harvey.

Now, researchers from the Zachry Department of Civil and Environmental Engineering at Texas A&M University have created models using big data and artificial intelligence (AI) to help communities prepare for future natural disasters, assess the impacts and monitor the recovery in near real time. They used data from Harvey to test these AI-centric solutions.

Led by Dr. Ali Mostafavi, Zachry Career Development Associate Professor, the Urban Resilience.AI Lab is leveraging AI and big data for predictive risk analysis, predictive infrastructure failure assessment, situational awareness during the event, monitoring of recovery and rapid assessment of the impacts of disasters.

"When facing hazards, there are certain capabilities that AI and data science approaches provide or enhance that can improve the resiliency of communities to these disasters," Mostafavi said. "Our vision over the past four or five years has been to improve disaster resilience by developing different classes of AI-based models that could provide foresights and insights critical for mitigation, preparedness, response and recovery."

The growth of data from different types of sensors from physical, social and other sensing technologies has given researchers tremendous information to work with in creating these models.

"These days, cities and communities are essentially data factories. You can evaluate sensor data related to the condition of the community facing hazards from the traffic network cameras, people's cell phone activities, power outage data, flood gauge data, satellite images and many other sources of technology that harness the heartbeat of the community," Mostafavi said. "As our cities and communities become smarter with information and communication technologies, the amount of data generated grows."

Mostafavi and his team in the Urban Resilience.AI Lab are taking the lead in harnessing community-scale big data to develop AI-based models with the potential to impact communities before, during and after a natural disaster or crisis.

Roads are essential in urban cities, allowing goods, information and people to move from place to place. But during times of disaster, such as floods, road networks can be damaged or blocked, which impacts access to services such as hospitals, shelters and grocery stores. During floods in urban areas, vehicle accidents resulting from driving on flooded roads have been identified as a leading cause of fatalities, highlighting the failures of road networks.

Researchers have developed a deep-learning framework to effectively predict near-future flooding of roads. They tested the framework using three models, and the results showed that it can accurately predict the flooding status of roads with 98% precision and recall values of 96%. Researchers validated the models using the 2017 Hurricane Harvey flooding.
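
For context, precision here is the share of road segments predicted as flooded that actually were flooded, and recall is the share of actually flooded segments the model catches. A toy computation with made-up labels (not the study's data) illustrates the two metrics:

```python
# Toy example of precision and recall for per-road flood predictions (1 = flooded).
actual    = [1, 1, 1, 0, 0, 1, 0, 1, 0, 1]
predicted = [1, 1, 0, 0, 0, 1, 0, 1, 0, 1]

tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)  # true positives
fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)  # false alarms
fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)  # missed floods

precision = tp / (tp + fp)   # predicted-flooded roads that really flooded
recall = tp / (tp + fn)      # flooded roads the model actually flagged
print(f"precision={precision:.2f}, recall={recall:.2f}")
```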

Knowing the flooding status of roads can help affected communities avoid flooded streets and aid emergency management agencies in planning evacuations and delivery of resources.

An aerial view of homes under water after Hurricane Harvey. The National Hurricane Center reported $125 billion in damage from the storm.

Getty Images

The National Oceanic and Atmospheric Administration reports that from 1980 to 2020, the damage caused by hurricanes in the United States reached $945.9 billion, with 6,593 deaths. Knowing what to do before a hurricane hits is essential to decreasing the adverse effects and disruptions it can bring. This means having enough food, water and other essentials on hand and making necessary home repairs.

Historically, surveys have measured how well households are prepared during hurricane season. Researchers used location-based big data from smartphone devices to proactively monitor hurricane preparation. They looked at three critical dimensions of hurricane preparedness: extent (how widespread is it?), timing (how early do people start preparing?) and spatial distribution (where are the hotspots?).

"We have developed metrics and models with which we can proactively monitor community hurricane preparedness. Which areas are preparing earlier? Which areas are preparing more based on how many trips they make to grocery stores, pharmacies and gas stations?" Mostafavi said. "We can identify areas that are underprepared. If an underprepared area hasn't evacuated, it's a recipe for disaster."

Researchers focused on smartphone data on visits to businesses, such as grocery stores and pharmacies, to indicate how prepared the local population was for Hurricane Harvey in 2017. The results labeled regions with a decrease in visits as underprepared and those with an increase in visits as highly prepared. They saw that low-income households were more likely to prepare for the hurricane than those with higher incomes, likely because they lacked the means to protect themselves in other ways. The study outcomes allow emergency response managers and public officials to identify underprepared areas and better allocate resources promptly.
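
A simplified sketch of this kind of labeling, comparing pre-storm visits to essential businesses against a normal baseline week for each region (the counts, region names and zero-change threshold are illustrative assumptions, not the study's actual method):

```python
# Illustrative only: label regions by the relative change in visits to grocery
# stores, pharmacies and gas stations in the week before landfall.
baseline_visits  = {"region_a": 1200, "region_b": 900, "region_c": 1500}
pre_storm_visits = {"region_a": 1700, "region_b": 860, "region_c": 2100}

for region, base in baseline_visits.items():
    change = (pre_storm_visits[region] - base) / base
    label = "highly prepared" if change > 0 else "underprepared"
    print(f"{region}: {change:+.0%} visits -> {label}")
```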

"If areas are impacted (and unprepared), they won't have power. They won't have water. They won't have food. They won't have medications," Mostafavi said. "But by using our models, we can identify the hotspots of various underprepared areas proactively before a hurricane lands."

Researchers proposed and tested an adaptive AI model that can learn how people typically move from place to place and adapt when an emergency, like flooding, a wildfire or a hurricane, happens. This model is helpful because there are not a lot of data available about how people travel during an emergency.

"Using reinforcement learning techniques we have developed, we can identify flood impacts on traffic patterns and disrupted access to critical facilities," Mostafavi said.

To test the model, millions of past travel trajectories were used. The researchers tested the model to see how flooding during Hurricane Harvey impacted traffic and congestion in Houston. Results showed the model could make accurate predictions with a mean percentage error of 4.26% and precision and recall at the learning stage of 76%.

Instead of merely choosing the shortest route, the model leverages the vehicles surroundings to anticipate its movement. Researchers say the model can identify which roads are affected by flooding and simulate mobility during different flood scenarios.

Researchers have created a novel deep-learning model that utilizes high-resolution satellite images to accurately categorize varying degrees of destruction following natural disasters.

Getty Images

After a natural disaster hits, the damage assessment of homes and buildings can take months.

To address this challenge, researchers (in collaboration with Simon Fraser University) developed a new deep-learning model called DAHiTrA, which uses high-resolution satellite images to classify different levels of destruction after natural disasters. This can be helpful after large-scale events like Hurricane Harvey.

The model recognizes the geographic features in different locations and captures the changes over time. It then compares images of a building taken before and after a disaster to determine the level of damage. It can also be applied to other types of civil infrastructure, such as roads and bridges.

"The satellite images are available within 24 hours, and our models are fast," Mostafavi said. "So, the day after an event, you can know how many buildings have been damaged, the extent of the damage and how many buildings have major damage."

One of the fundamental strengths of DAHiTrA is its ability to accurately determine the boundaries of a building, which allows for more precise damage assessment. It also detects multiple types of damage, including collapse, partial damage and water damage.

The model analyzes large volumes of satellite images in a short time, which is essential for rapid response and recovery efforts after a disaster. This allows for faster and more accurate damage assessments, which can help communities and governments allocate resources more effectively.

Mostafavi said the team is actively working with government agencies, emergency management offices and even international organizations to use their model to assess building damage. One recent example is a use-case project with the Texas Department of Emergency Management.

Researchers also examined how people recover after a disaster, specifically during the short-term period after a hurricane. The study focuses on when people return home after evacuating and how long they can stay in their homes without having to move out again. The study uses location-based data to see how quickly people in different areas can do these things.

The study found that more than half of the census tracts in Harris County returned from evacuation within five days after the landfall of Hurricane Harvey and stopped moving out after six weeks. Some areas take longer than others to recover, and there are differences among different groups of people.

Researchers examined how people of different socio-demographic statuses (such as income or rental status) responded to flooding. By looking at how quickly they evacuated or relocated, the team could understand different recovery patterns across various subpopulations. While the study shows it took longer to return in flooded areas than in nonflooded areas, there wasn't a significant difference between the two areas regarding evacuation and relocation for low-income populations.

The return times of high-income census tracts were longer than those of low-income census tracts when flooded, indicating the inability of low-income populations to evacuate and relocate. The study also found that areas with shorter return durations may not be more resilient to disaster, as they may indicate challenges faced by low-income and minority populations that require additional assistance and resources.

"Our vision is to create a future where data science and AI technologies are leveraged to predict, prepare for and equitably respond to natural hazards, enabling us to mitigate their impact on communities," Mostafavi said. "Our work so far shows the promising potential of big data and AI for augmenting different capabilities needed for disaster resilience. We are collaborating with various public and private sector organizations to develop, scale and deploy the AI technologies created in the lab."


Salesforce to Expand Data Cloud Connectivity with New Connectors … – Datanami

CHAPEL HILL, N.C., June 15, 2023 – CData Software, a leading provider of real-time data connectivity solutions, has announced that Salesforce Data Cloud customers will have access to select connectors from CData that can be leveraged to expand the Connectors Catalog and bring more data into Data Cloud from their SaaS, database and file-based sources.

In today's market, customers expect proactive, personalized, and connected experiences across digital channels. With CData Connectors, Salesforce customers will be able to streamline access to customer data across a wide collection of data sources and touchpoints, enabling organizations to better serve their customers and gain a competitive advantage.

"CData has been a vendor for Tableau for several years and has enhanced Tableau's ability to integrate with the data sources customers care about, so integrating CData connectors with Salesforce Data Cloud just made sense," said Chandrika Shankarnarayan, VP, Product, Salesforce Data Cloud. "CData Connectors will increase our data connectivity options for customers using Data Cloud."

From the Fortune 500 and the Global 2000 to SMEs worldwide, thousands of organizations rely on CData connectivity to overcome data fragmentation and unlock value from diverse, dispersed data assets. Leading vendors across SaaS, data management, integration, analytics & BI, data science, data testing, AI & machine learning, and data governance embed CData connectivity to solve data access, data management, and data integration challenges.

"Leading vendors across almost every facet of data management embed our high-performance connectivity to solve their data access and integration challenges," said Amit Sharma, CData co-founder and CEO. "We continue to expand that connectivity, bringing our services to one of the most popular platforms for businesses worldwide to improve our customers' ability to access and action their disparate data to produce truly exceptional customer experiences."

About CData Software

CData Software is the real-time data connectivity company. Our self-service data products and connectivity solutions provide universal access to live data from hundreds of popular on-premises and cloud applications. Millions of users worldwide rely on CData to enable advanced analytics, boost cloud adoption, streamline operations, and create a more connected business. Consumable by any user, accessible within any application, and built for all enterprises, CData is redefining data-driven business.

Source: CData
