
Introducing the ChatGPT Data Science Program at Konkuk University – Fagen wasanni

Konkuk University is set to launch the first-ever ChatGPT data science education program in Korea. This program aims to provide education in data science using the latest ChatGPT technology, with a focus on deep understanding of AI and data science to drive product and service development.

The introduction of ChatGPT is rapidly transforming various fields and completely changing the paradigm of data science education. While previous education focused on understanding data analysis and artificial intelligence, the new approach involves direct learning of data science techniques using ChatGPT and quickly implementing them in product and service development.

The education program will run from September 2nd to November 25th, 2023, every Saturday at Konkuk University Future Knowledge Education Center. It is open to non-computer science major university students (including masters and doctoral students) and working professionals, with a maximum capacity of 25 participants. Upon completion of the program, participants will receive a certificate from Konkuk University and Software Carpentry, as well as a ChatGPT data science book authored by the instructor.

The program includes hands-on practical exercises, where participants will work with real data and form project teams. Topics covered in the program include Python, R, SQL, data science languages and software carpentry, data science data structures and grammar, digital writing, dashboards, API development, functional and parallel programming, machine learning, ChatGPT and open-source, and prompt engineering.

The instructor of the program is an expert who has designed and lectured in data science courses at Yonsei University and GS Caltex, and holds a Software Carpentry instructor qualification. They have extensive industry experience from Hyundai, Webzen, and KPMG, and are currently the Technical Director of the Korea R User Group and an adjunct professor at Konkuk University.

The program is designed as a practical workshop-focused course, providing participants with the latest data science techniques incorporating ChatGPT. Through this program, participants will learn data science techniques using ChatGPT, gain hands-on experience applying them to real projects, and build confidence through foundational knowledge and mentoring.

This ChatGPT data science education program is expected to be a beneficial opportunity for those aspiring to enter the field of data science, those looking to add new insights to their existing careers, and those seeking to add unique value to their work.

For more information about the ChatGPT data science education program, please visit the following website: [website link removed]


Good-Paying Careers in Data Are Booming. But Schools Aren’t … – Education Week

This city has become one of the fastest-growing technology hubs in the country. The region's low cost of living and cheap real estate have drawn heavy hitters like Microsoft and Facebook, and that success has helped Utah acquire one of the highest rates of billion-dollar startups of any state.

But business leaders say the schools in this area, which has come to be known as Silicon Slopes, need to build a stronger foundation in data and statistics skills for their future workers if that growth is to be long-lived. Elizabeth Converse, the executive director of Utah Tech Leads, an industry group in Salt Lake City, said she sees national declines in K-12 math performance, particularly in data and statistics, as economic red flags for her state as well as the nation.

"Our companies are growing at a clip that is kind of unimaginable in a state our size. We just can't, we don't have enough talent to fill the jobs," Converse said. "For us, it's really important that Utah lead the pack when it comes to absorbing data standards into everyday curriculum so that students are taught this from the very beginning."


Converse's group is working with the state board of education to develop a data-science pathway in high school and integrate more data science throughout the Beehive State's K-12 math standards, which are up for renewal this fall.

Converse said industry groups like hers are working to change the image of data science as only useful for science, technology, engineering, and math careers.

"All the way from our state legislature down to the student level, the way we talk about math is like this isolated thing like a club," she said. "Instead, data science needs to be a seamless transition. It needs to be a part of [students'] education overall."

The efforts of these advocacy groups are part of a nationwide trend to expand how teachers, parents, and students consider the full range of possible careers that utilize math skills.

Data and statistics know-how has become one of the most sought-after skills for new employees, even in fields outside of STEM. From a social entrepreneur using housing statistics to investigate building sites to a YouTube vlogger analyzing his content views and audience demographics, technology tools have made data a bigger part of many jobs.

"It's important to keep in mind that ... most of us are probably using statistics in our work under the hood," said Geoff Hing, a data journalist at the Marshall Project, a nonprofit investigative news group, "and that's especially the case with generative AI [artificial intelligence] as ChatGPT becomes a part of more and more industries."

The U.S. Department of Labor estimates that over the next decade, 2 of the 10 fastest-growing career fields will be related to data and statistics. The number of jobs available for statisticians and data scientists (both of which boast annual incomes around $100,000) is expected to increase more than 30 percent, and most related careers also are growing faster than average.

"We see these effects cutting across sectors, and it's every entry-level job where data and technology and the basics of statistics are being used more frequently," said Zarek Drozda, the director of the nonprofit Data Science for Everyone, one of the groups helping Utah and other states.

Sheri Johnson, a math teacher at the independent Mount Vernon School in Sandy Springs, Ga., said schools across her state are expanding data and statistics standards across K-12 this fall, in part to broaden future job opportunities for students.

"There's a disconnect between what we learn in school and what employers want people to know. Employers really want employees who can use spreadsheets and data," Johnson said.

If schools begin to introduce data and statistics in elementary school, she noted, students are also likely to get earlier exposure to the kinds of jobs that use data.

While mathematics fields can seem abstract to students, statistics can give teachers a way to help students develop a personal stake in their careers, according to public-health researcher Kristin Baltrusaitis. For example, at Harvard University's Center for Biostatistics in AIDS Research, Baltrusaitis uses statistics from clinical trials to study differences between adults and children in effective doses and potential side effects for medicine used to treat HIV, the virus that causes acquired immunodeficiency syndrome.

"I look at infants and children and pregnant people, because these are populations that are typically not included in regular clinical trial designs. So we want to look at how effective are these drugs in these different populations," she said.

"When either teaching students statistics or guiding their career planning, I think there's a huge benefit of making those interdisciplinary connections of [students] seeing the goal and the purpose of what they're learning in their math course and where it could be useful," said Baltrusaitis, who previously taught high school math and science through the New York City Teaching Fellows program.

Utah has long integrated different strands of math, including calculus and statistics, in high school. But in the run-up to math standards discussions in 2021, Mark Tullis, a co-founder of the Salt Lake City-based TechBuzz, a local industry-news group, surveyed the area's business leaders about the kinds of math they had learned in high school and the math they most needed in employees.

"So I asked them, 'Did you take calculus in high school?' And most of them said yes. 'And are you applying it in your work, your career right now?' And they would say, 'Indirectly, I guess calculus helped me achieve a certain level of problem-solving skills.'"

"And I said, well, did you have any data science in high school? Any statistics? 'No, in college but not in high school' was generally the response," Tullis said.

"Generally, the applicability of calculus or even algebra to their daily work was very small; like 5 or 10 percent said it was relevant to their current careers. But what they did say was that if they could have learned more statistics, more data science, and machine-learning skills in high school, it would have prepared them to a much greater extent," Tullis said. "The results were pretty clear, that the companies that are hiring for jobs that are math-related want data science to be taught in high schools so that the workforce is better prepared."

In most states, statistics is a high school elective after students complete a traditional sequence of at least Algebra 1 and 2 and geometry by grade 11. But the vast majority of students never get that far.

A 2022 study by the University of Texas at Austin's Charles A. Dana Center found that across nine states, including Utah, only about 27 percent of students complete that course sequence by grade 11, and only 15 percent end up taking statistics in high school.

Low-income students and students of color, who are already underrepresented in calculus courses, likewise end up with less access to data and statistics courses, according to Josh Recio, a course program specialist in secondary mathematics at the Charles A. Dana Center at the University of Texas at Austin.

Back in Utah, nearly 40 districts have signed onto the state pilot to develop a data-science pathway.

"Because we have standards revisions coming up in the fall, the data that we collect from the pilot, I think, will make a compelling case for a data-science strand to be built," said Lindsey Henderson, secondary-math specialist for the Utah board of education.

Evidence of effectiveness will be critical because in other states, like California, standards changes have led to conflicts between advocates for calculus and those who favor statistics pathways, something San Antonio statistics teacher Dashiell Young-Saver called "weird and unproductive."

Instead, Young-Saver, who creates statistics lessons for teachers on the site Skew the Script, argued that schools would be better off infusing data and statistics education across the curriculum, both in math and in other subjects like science or civics, to encourage students to think more broadly about their applications.

"I think students are not fully aware that statistics is one of the most relevant maths for the professional world now," he said. "Ultimately, calculus is used by engineers, physicists, and a few other professions. Stats is used by everyone else, and also engineers and physicists."


Analytics and Data Science News for the Week of July 28; Updates … – Solutions Review

Solutions Review editors curated this list of the most noteworthy analytics and data science news items for the week of July 28, 2023.

Keeping tabs on all the most relevant analytics and data science news can be a time-consuming task. As a result, our editorial team aims to provide a summary of the top headlines from the last week, in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy analytics and data science news items.

AtScale now provides BI architects with a set of modeling utilities that simplify creation of common, but complex, model elements like time-relative measures, intelligent dimensions, and calculation groups.

Read on for more.

Stemma's solution was engineered to provide high-grade security, enhanced ease-of-use data search capabilities, and automated data intelligence. With 20 built-in data connectors, Stemma's robust data catalog solution will strengthen Teradata's data fabric and accelerate the analytic productivity of the Vantage platform.

Read on for more.

The tiles on the home page have been streamlined to facilitate the most common tasks that we see our customers doing in Databricks. The home page is now a single page for everyone that streamlines access to what users need to resume their journey.

Read on for more.

You can create a simple paginated report, such as an inventory report, with an image, text box, and a table. You can export, save, or share these reports. These reports can be created from a Power BI dataset by following a few simple steps.

Read on for more.

Adtech is a main use case for the combination of confidential computing and DCRs and this partnership enables adtech analysts to run advanced analytics and build machine learning models on sensitive data without ever being exposed to the raw data itself.

Read on for more.

Pyramid's Decision Intelligence Platform offers a low-code/no-code solution to data preparation, business analytics, data science, and machine learning, empowering the organization from C-suite leaders to frontline professionals to make faster, more intelligent decisions.

Read on for more.

SQream is a petabyte-scale database platform in the Samsung Cloud Platform ecosystem, allowing customers to leverage sophisticated analytics tools in a secure and scalable cloud environment to query massive datasets and gain critical business insights.

Read on for more.


Watch this space each week as Solutions Review editors use it to share new Expert Insights Series articles, Contributed Shorts videos, Expert Roundtable and event replays, and other curated content to help you gain forward-thinking analysis and remain on-trend. All to meet the demand for what its editors do best: bring industry experts together to publish the web's leading insights for enterprise technology practitioners.

With this Solutions Spotlight event, Solutions Review partnered with leading analytics, data science, and automation vendor Alteryx. Through case studies and practical examples, Alteryx's Field Chief Data & Analytics Officer, Heather Harris, helps you learn the keys to capturing the business impact of your analytic solutions.

Read on for more.

For consideration in future data science news roundups, send your announcements to the editor: tking@solutionsreview.com.

Tim is Solutions Review's Executive Editor and leads coverage on data management and analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in Data Management, Tim is a recognized industry thought leader and changemaker. Story? Reach him via email at tking@solutionsreview dot com.


Doctor AI: Healing Humans and Mother Earth Hand in Hand – Data Science Central

Let's imagine, with algorithms and a nerdy charm that could melt any data center, an AI in a lab coat and stethoscope patrolling hospital hallways, tirelessly monitoring patients. This digital doctor will also take the pulse of Mother Earth, reducing waste and cutting energy consumption! The artificial intelligence community is well aware that AI can crunch data with ease, but who knew it would turn into Mother Nature's best friend? It is as if the love child of Einstein and Captain Planet were your personal physician, optimizing resource usage, identifying our ailments, and prescribing planet-friendly solutions!

Using artificial intelligence in the medical field can contribute significantly to the development of green solutions. Although it is not the only factor that makes healthcare environmentally friendly and sustainable, it can play a vital role in advancing these practices. These are some ways in which artificial intelligence may contribute to green solutions in the medical field:

A computerized system can analyze large amounts of data to optimize resource utilization in healthcare facilities, reducing waste and energy consumption. Benefits of this process include better scheduling and inventory management as well as lower energy use.

Advanced technology has revolutionized healthcare, streamlining processes and customizing treatments to the point where medical professionals can say goodbye to the days of finding a needle in a haystack. Say goodbye to medical guesswork and adopt treatments as precise as an Olympic archer's bullseye. Artificial intelligence can deliver more targeted, effective treatment options for patients, eliminating unnecessarily costly and time-consuming trial-and-error methods. By reducing energy consumption and waste generation, this can reduce environmental impact.

Medical imaging analysis can be enhanced by AI algorithms, which lead to more accurate and earlier diagnoses. Thus, repeat tests can be reduced and treatment can be better targeted.

Drug discovery can be accelerated and made more cost-effective with the help of artificial intelligence. By identifying potential drug candidates more quickly, AI can cut down on the need for animal testing and reduce the environmental impact of traditional drug development.

You have probably never had a house call from an AI doctor, but thanks to telemedicine, one has made its way into your living room. The ultimate eco-friendly house-call hero is just a click away; no need to drag yourself out of bed when you are under the weather! As healthcare delivery becomes more telemedicine-based, the need for travel will decrease, reducing greenhouse gas emissions from transportation and the overall environmental impact.

It is possible to maximize the energy efficiency and environmental friendliness of medical devices by using artificial intelligence throughout their lifecycles.

In addition to reducing the environmental impact of large-scale medical response, artificial intelligence can help predict and prevent disease outbreaks by analyzing health and environmental data.

While artificial intelligence can play a major role in driving green solutions, it's also important to note that the technology itself consumes significant computing resources and energy. This is why AI systems must be developed and deployed responsibly, with an emphasis on energy efficiency and sustainability.

AI and machine learning models are most effective when they are built using large amounts of training data. Anolytics as a company provides error-free and accurate image annotation in healthcare services.

AI, innovative technologies, and sustainable practices will, ultimately, be the key to creating green solutions in healthcare, along with broader efforts to promote environmental consciousness. In order to achieve a greener and more sustainable medical sector, the collaboration of healthcare professionals, researchers, policymakers, and technology experts is essential.


How generative AI impacts your digital transformation priorities – CIO

Improving customer support is a quick win for delivering short-term ROI from LLMs and AI search capabilities. LLMs require centralizing an enterprise's unstructured data, including data embedded in CRMs, file systems, and other SaaS tools. Once IT centralizes this data and implements a private LLM, other opportunities include improving sales lead conversion and HR onboarding processes.
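As a rough illustration of the retrieval step such an "AI search" layer performs over centralized documents, the sketch below ranks snippets by keyword overlap with a query before handing the top hits to an LLM as context. The document names and texts are invented for illustration; a production system would use embeddings and an actual LLM rather than word counts.

```python
# Minimal sketch of retrieval over centralized unstructured data: score each
# document by word overlap with the query, then pass the top hits to an LLM
# as context. Document ids and texts are hypothetical.
from collections import Counter

documents = {
    "crm_note_17": "customer reported login failures after the last billing cycle",
    "kb_article_3": "reset a user password from the admin console settings page",
    "hr_onboarding": "new hires complete payroll and benefits forms in week one",
}

def score(query: str, text: str) -> int:
    """Count occurrences of query words in the document (a crude relevance score)."""
    words = Counter(text.lower().split())
    return sum(words[w] for w in set(query.lower().split()))

def top_documents(query: str, k: int = 2):
    """Return the k best-matching document ids, highest score first."""
    ranked = sorted(documents, key=lambda d: score(query, documents[d]), reverse=True)
    return ranked[:k]

print(top_documents("how do I reset a password"))  # kb_article_3 ranks first
```

The same shape scales up: swap the word-overlap score for cosine similarity over embeddings and the `print` for a prompt that quotes the retrieved passages.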

"Companies have been stuffing data into SharePoint and other systems for decades," says Gordon Allott, president and CEO of GetK3. "It might actually be worth something by cleaning it up and using an LLM."

The generative AI landscape has more than 100 tools covering text, image, video, code, speech, and other categories. What stops employees from trying a tool and pasting proprietary or other confidential information into their prompts?

Rodenbostel suggests, "Leaders must ensure their teams only use these tools in approved, appropriate ways by researching and creating an acceptable use policy."

There are three departments where CIOs must partner with their CHROs and CISOs in communicating policy and creating a governance model that supports smart experimentation. First, CIOs should evaluate how ChatGPT and other generative AIs impact coding and software development. IT must lead by example on where and how to experiment and when not to use a tool or proprietary data set.

Marketing is the second area to focus on, where marketers can use ChatGPT and other generative AIs in content creation, lead generation, email marketing, and over ten common marketing practices. With more than 11,000 marketing technology solutions available today, there are plenty of opportunities to experiment and make inadvertent mistakes in testing SaaS with new LLM capabilities.

CIOs of leading organizations are creating a registry to onboard new generative AI use cases, define a process for reviewing methodologies, and centralize capturing the impact of AI experiments.

One important area to consider is how generative AI will impact decision-making processes and the future of work.

Over the past decade, many businesses have aimed to become data-driven organizations by democratizing access to data, training more businesspeople on citizen data science, and instilling proactive data governance practices. Generative AI unleashes new capabilities, enabling leaders to prompt and get quick answers, but timeliness, accuracy, and bias are key issues for many LLMs.

"Keeping humans at the center of AI and establishing robust frameworks for data usage and model interpretability will go a long way in mitigating bias within these models and ensuring all AI outputs are ethical and responsible," says Erik Voight, VP of enterprise solutions at Appen. "The reality is that AI models are no replacement for humans when it comes to critical decision-making and should be used to supplement these processes, not take them over entirely."

CIOs should seek a balanced approach to prioritizing generative AI initiatives, including defining governance, identifying short-term efficiencies, and seeking longer-term transformation opportunities.


Are Students Getting All the Math They Need to Succeed? – Education Week

Lindsey Henderson hopes to change the conversation about math in her state.

As student math performance declined in Utah and states across the nation over the pandemic, most learning-recovery efforts have looked to shore up basic numeracy and algebra skills. But that strategy is likely to worsen declines in students' understanding of statistics and geometry that began long before COVID-19 became a household word, experts say.

These worsening declines come just as workforce needs for data analysis and graphics skills ramp up.

In Utah, which is set to update its math standards this fall, business leaders have warned that schools don't produce enough graduates capable of the data and statistical analyses needed for the technology and other jobs driving the state economy now.

"Data is the new literacy of 2023," said Henderson, the secondary-math specialist for the state board of education. "It's not that [students] just need to know how to calculate something or know the standard deviation of something off the cuff. We need a workforce able to take mathematical questions about the data and know where to go to find the solutions that help solve those questions. Those sorts of skills are the ones that our tech community wants our students to have experience with, specifically in a math classroom."


The building blocks of data literacy may include familiar concepts like understanding basic coordinates on a graph or visually representing information. But teachers emphasize that it's more than that; it's also a way of thinking. Students must learn how to collect and measure data to answer a question, how to describe them, and how to make predictions based on them.

Say, for example, students are asked to help the cafeteria director figure out how much milk to buy. They'd collect that data, perhaps by counting used containers, and from there discuss: Is one class an outlier, where everyone drinks juice? Do students' consumption patterns change?
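The cafeteria exercise above can be sketched in a few lines of code. The per-class carton counts here are invented, and the 1.5-standard-deviation cutoff is just one common, simple way to flag an unusual value:

```python
# The cafeteria milk question with invented per-class carton counts.
# A class is flagged as an outlier if its count sits more than 1.5
# population standard deviations from the mean (class D mostly drinks juice).
import statistics

cartons_per_class = {"A": 24, "B": 26, "C": 25, "D": 3, "E": 27}

mean = statistics.mean(cartons_per_class.values())      # 21 cartons
stdev = statistics.pstdev(cartons_per_class.values())   # ~9.06 cartons

outliers = [name for name, n in cartons_per_class.items()
            if abs(n - mean) > 1.5 * stdev]
print(outliers)  # -> ['D']
```

Exactly the reasoning the passage describes: before scaling the order up, students must decide whether class D belongs in the estimate at all.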

Developing this process of thinking about data prepares students for higher math even before they learn the technical terms like median versus mode or confidence intervals, experts say.

If data science does get incorporated into its revised standards, Utah would join California, Georgia, Oregon, and other states looking to rebalance math instruction across K-12. In some cases, though, adding more time for statistics will come only at the expense of geometry, where the push for change has been less insistent.

Statistics and geometry are hardly the only topics getting squeezed in a math curriculum that teachers say is often crowded with too many subjects to teach in any given school year. But the problem is compounded for those topics because they often get bumped to the end of the year. And teachers, particularly those in lower grades, often have less training in how to approach the concepts, making it more likely that students get superficial instruction, if any.

"One of the powers of mathematics is the interrelatedness of it, but we often segment pieces of it," said Trena Wilkerson, a professor of math education at Baylor University and a past president of the National Council of Teachers of Mathematics. "We'll teach this particular thing, and we'll teach this thing, and then this thing, rather than necessarily looking at the bigger picture. If we could view mathematics from its connectedness, then students as well as teachers would understand more deeply."

In statistics and geometry, student performance has been falling at every grade level, particularly among disadvantaged students. On the National Assessment of Educational Progress, the set of tests known as the nation's report card, the average 8th grader's performance fell 16 scale points from 2011 to 2022 in probability and statistics and 9 points in geometry. That equates to a year or more of lost learning, as a 5-point drop on the NAEP scale means students on average performed about six months behind their previous cohort.
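Using that rule of thumb (5 scale points is roughly six months of learning), the quoted drops convert directly:

```python
# Convert NAEP scale-point drops into approximate years of lost learning,
# per the article's rule of thumb: a 5-point drop is about six months.
def points_to_years(points: float) -> float:
    return points / 5 * 0.5

print(points_to_years(16))  # probability/statistics drop, 2011-2022 -> 1.6
print(points_to_years(9))   # geometry drop -> 0.9
```

So the 16-point statistics decline works out to about 1.6 years, consistent with the "year or more" framing above.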

Fourth and 12th graders' performances show similar slips. The declines in these two topic areas have significantly contributed to overall declines in math performance on the national assessment since 2013, particularly at the 4th and 8th grade levels, as well as to historically low math achievement in the most recent NAEP.

In more concrete terms, the results mean that more than a third of 12th graders participating in the most recent NAEP cannot read a basic scatter plot (a chart that shows the relationship between two variables, like height and weight), and 71 percent of all 8th graders cannot tell the difference among median, mean, and mode. Three out of 4 low-income 8th graders can't gauge the probability of an outcome or use similar triangles to solve a geometry problem.
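The median/mean/mode distinction the test probes can be shown on a small invented set of quiz scores, where a single high score pulls the mean above the median:

```python
# Mean, median, and mode on an invented set of quiz scores. The one
# high score (98) drags the mean above the median; 75 is the most common.
import statistics

scores = [70, 72, 75, 75, 80, 98]

print(round(statistics.mean(scores), 2))  # 78.33 (arithmetic average)
print(statistics.median(scores))          # 75.0  (middle value)
print(statistics.mode(scores))            # 75    (most frequent value)
```

Which "average" you report changes the story the data tells, and that judgment, not the arithmetic, is what the assessments find students missing.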

"After three decades of growth, there's been absolute flatness and decline in the last decade," said Zalman Usiskin, the director of the University of Chicago School Mathematics Project. "The prediction has to be that it gets worse from here."

In large part, student performance in data science and geometry on national and international tests has fallen as schools made space in the curriculum for traditional number operations and algebra.

"I'd probably put in statistics first, algebra second, geometry third if I had to stack them in terms of life applicability," said Sal Khan, the founder and head of the online tutoring program Khan Academy. "But in terms of what the [school system] will filter you out on, for better or worse, it has always been algebra skills, and that's only gotten worse."

Policymakers generally attributed the early drop in geometry and data performance on NAEP from 2013 to 2015 to a mismatch between the national test and the recently implemented Common Core State Standards, which moved much of the existing primary school geometry and statistics content to middle or high school grades.

"The shift of geometry and data-analysis content was made to allow a deeper focus on gateway topics such as fractions, place value, and decimals," concluded Gregory Camilli, an education psychology emeritus professor at Rutgers University who has studied drops in statistics and geometry performance on the NAEP following the implementation of the Common Core State Standards. Camilli suggested the drop in 4th graders' performance in geometry and statistics was balanced by a small increase in student achievement in number and operations, particularly in fractions, which has been a stumbling block for many students. Improving students' skills with fractions was a key goal of the common core.

Over the ensuing decade, data analysis and spatial reasoning got squeezed further in both elementary and middle school instruction, as teachers in upper grades increasingly contended with more students who didn't understand basic concepts in those areas.


"Those standards really are set up to be this very far out vertical progression of learning in [statistics]," said Josh Recio, the director of the data-focused Launch Years project at the University of Texas at Austin. "And so anytime you see units that get skimmed over, whether that's from COVID or whatever reason, that's going to affect how prepared those students are."

The timing of the content shift couldn't have been worse. The era of big data exploded in the 2010s. As a result, statistician and data scientist are two of the 10 occupations projected by the Bureau of Labor Statistics to be the fastest growing through 2031. And data-science capabilities more generally have become some of the most sought-after job skills across fields, including business, science, policy, and even creative arts. Geometry-related careers, such as architecture, computer-assisted-design engineering, and fashion design, by contrast, are growing slowly or at the average rate for the economy, which has made it more difficult to gain momentum to increase time for geometry.

"Geometry has not been at the center of areas that people studied, but it cuts across all mathematics in argumentation, reasoning, and sense-making that are important to many, many careers," Wilkerson said.

While common core math pushed data-science and geometry content to higher grades, the Next Generation Science Standards, which were released around the same time, called for students to be fluent and comfortable with data and said that, by the start of middle school, students should begin applying those math concepts in science class. By 2018, when 20 states had adopted the NGSS and another 24 used them as a foundation for revising their science standards, educators and researchers warned of a growing disconnect between middle school math and science.

"If our students are being assessed on spatial reasoning or probability at 4th grade and even 8th grade in the NAEP, they've only had like two years barely looking at it," said Christina Tondevold, a K-8 math teacher-trainer in Orofino, Idaho. "Their time to build their understanding hasn't really happened, and teachers aren't making it a priority."

In part, that's because math teachers, particularly in elementary and middle grades, say they are less comfortable with and have less training in how to approach probability and spatial concepts than they do with other kinds of math, like number sense or algebraic thinking. Of those who spent at least some time teaching K-12 math, nearly two-thirds said they had spent a semester or less of their preservice training learning probability and statistics, according to a nationally representative survey conducted this spring by the EdWeek Research Center.

"I think teachers who love math don't tend to go into early or middle school," said Wendy Lichtman, a math intervention specialist at MetWest High School in Oakland, Calif. "Teachers go into elementary [education] because they love teaching reading concepts."

In Utah, Henderson said, new educators need five or more credits in calculus for a math-teaching endorsement but only two statistics credits and one geometry credit, none of which has to focus on pedagogy. "And so when we're talking about what teachers are going to skip, it's probably the stuff that they're uncomfortable with, which I would suspect is statistics and geometry because they've had less experience with them," she said.

In an EdWeek Research Center survey last fall, 56 percent of teachers said they never cover probability concepts, and nearly 30 percent don't cover data representation, even in grades where the topics are part of state standards. Similarly, a quarter to 45 percent of teachers said they never cover geometry concepts such as two- or three-dimensional shapes and how they relate to each other or ways to measure capacity, area, volume, and angles.

This big hole in teacher training forces many teachers to rely heavily on textbooks, according to Dashiell Young-Saver, a high school statistics teacher in San Antonio. "Teachers are, for the most part, one chapter ahead of their students in the textbook and teaching them on the fly," said Young-Saver, the founder of Skew the Script, a nonprofit that creates interdisciplinary statistics lessons.

When the topics are covered, they tend to be shunted, by both textbooks and teachers themselves, to the end of the school year. This means statistics and geometry topics come after testing season for most students, and teachers run a greater risk of running out of time to cover them if students don't progress quickly, a common problem after the long months of pandemic school shutdowns.

"Even if you do have an ambitious teacher in 6th grade, perhaps, who does get to the geometry unit and wants to teach it, they are going to teach the 6th grade content without realizing that some of the students didn't get the 3rd grade content or the 5th grade content," said Julie Booth, a professor of STEM education and education psychology at Temple University and co-principal investigator of the GeometryByExample initiative at the Strategic Education Research Project. "So the students may get through that unit, but they don't have the appropriate prior knowledge to really be able to learn it."

Momentum has started to shift back toward data science at the state level. The newly revised California mathematics framework emphasizes statistics and data science, but its push to incorporate more real data and problem-oriented instruction has drawn significant criticism from those concerned it will politicize classrooms and pull attention from pure math concepts.

"It's about access and equity in mathematics. You can't wait for geometry and statistics until students get to high school, or only a select few can take it," Wilkerson, the Baylor University professor, said. "You have to integrate the geometry, the statistics, etcetera, from pre-K through elementary and middle school, so that students are getting a deep understanding of what they need in order to be able to make choices about the mathematics that they take in high school and that they pursue post-high school."

For example, under current graduation requirements, Utah students should complete high school with three years of integrated math courses, with the final course covering concepts from trigonometry, Algebra 2, precalculus, and data analysis. In practice, though, more than half of students opt out of their final course, even though in order to do so, they must submit a so-called "letter of shame" from their parents acknowledging that they will not be college-ready in the subject.

"It's a real problem when you're saying this math is important and students and parents aren't seeing it as relevant," Henderson said. "Only 18 percent of students in Utah take calculus in high school, so for the other 82 percent of students, what is their option to take an enriching mathematics course as either a junior or a senior in high school?"

That's why 60 high school math teachers across the state are piloting a new course pathway to prepare students for dual college and high school credit in data science and statistical reasoning. Zarek Drozda, the director of the nonprofit Data Science for Everyone, said the pilot courses take a more holistic approach to teaching data and statistics concepts, incorporating data collection, cleaning (or correcting errors in raw data sets) and analysis, technology use, and even discussions of ethics around the use of data and statistics. If the pilot proves successful, a new data-science strand could be added to Utah's graduation options and expanded in the state's math standards, which are up for renewal this fall.

Utah is among the states in the Launch Years Initiative, a project by the Dana Center at the University of Texas at Austin to develop broader math pathways, particularly in data science, from high school to college.

For example, Georgia rolls out new math standards this fall that integrate data science and statistics across K-12. This summer has brought a flurry of training for teachers on how to incorporate probability and data-modeling concepts into different grades and classes, from Algebra 2 and geometry to the sciences.

"While some content standards have been moved up or down a grade, probably about 85 percent of what's in algebra is still material that was in the previous course," said Kaycie Maddox, the director of 9-12 mathematics for northeast Georgia's regional education service agency in Winterville. "So teachers are not going to Mars to get new math stuff. We're doing what is good mathematics; we're just looking at it from a different point of view, math modeling instead of being up there lecturing."

Kyle Peterson, a secondary-math specialist for the 85,000-student Alpine school district, Utah's largest, said he hopes the state's data-science pilot can help push back against the tendency to focus on rote remedial work in response to students' math learning gaps.

"Especially with the pandemic [disruptions], as teachers are choosing their essentials to cover, there are still times where geometry and statistics standards get less of a priority," Peterson said. "Not just in this district, but I think around the nation, kids come in and they sit in rows and take notes, and math is a silent, spectator sport. We need to help teachers shift to focus on student thinking and then facilitate discussions around math."

Yet leading these discussions is one of the most challenging changes for teachers, Henderson said. Math teachers may have less experience than science or social studies teachers in wrangling student debates, particularly if they are grappling with real-life data and problems.

While such conversations can arise in any math class, statistics and geometry lend themselves more frequently to interdisciplinary projects and problems situated in live data. For example, California's revisions to its math standards dragged on for four years, in part due to the heated debates over how to discuss social issues, particularly in data-driven math classes.

In Utah, Henderson said professional development will be crucial to getting teachers comfortable with both integrating more statistics content and having more in-depth, organic conversations about math.

"Not teachers talking at kids, but kids talking to each other mathematically, that was something that was really unique and scary to me as a teacher because I was like, what if they talked about the mathematics wrong?" Henderson said. "And it turns out, that's important. Being wrong, having ideas, and having a safe space to try to make connections with your peers is super important in the learning of mathematics."

In a June teacher training in Pleasant Grove, Utah, Henderson watched as teachers divided up on either side of the room based on their definition of a trapezoid (does it have at least two or exactly two parallel sides?) and debated their reasoning.

"We want our students to be good citizens in their real life. Maybe if we get practice debating about trapezoids, then when we get to the political discussions, it could be a little more civil," said Anand Bernard, an 8th grade math teacher at Sunset Junior High in the Davis school district, another one of the school systems participating in the Utah pilot, who led the debate exercise.

"Historically, that's how math is developed: through talking with each other, sharing ideas, writing letters back and forth," Bernard said. "Sometimes, we'll be doing a task, and I see like six different methods and then two of them I've never even seen before. I can just kind of get blown away."

Geometry and statistics concepts provide good foundations for exercises like that, Henderson said. "If you're teaching algebra and geometry, algebra and statistics together, it's like double bang for your buck. You're teaching both standards and you're creating those rich connections between the two strands for students so that they can see that you don't just do algebra by itself."

Go here to see the original:

Are Students Getting All the Math They Need to Succeed? - Education Week


Unleashing the Future: How Data Science and AI Empower the … – Medium


Welcome to the forefront of a digital revolution where data science and artificial intelligence (AI) converge, unleashing transformative powers that shape the way we live, work, and innovate. In this blog, we embark on an exhilarating journey through the boundless potential of data science and AI, exploring their profound impact on various sectors and how they are reshaping decision-making and personalized experiences.

Data science and AI have emerged as the driving forces behind the Fourth Industrial Revolution, ushering in an era of unprecedented possibilities. With advanced algorithms, machine learning models, and vast troves of data at our disposal, we are witnessing an accelerated pace of innovation that touches every aspect of our lives.

Let's delve into how data science and AI are transforming industries:

AI-powered adaptive learning platforms are tailoring educational experiences to individual student needs, fostering a more effective and engaging learning environment.

Customer-centricity is at the heart of the retail revolution powered by data science and AI. Personalized recommendations, dynamic pricing strategies, and supply chain optimization are enhancing customer satisfaction and driving growth for businesses.

Financial institutions are leveraging data science and AI to detect fraud, assess risk, and optimize investments. Smart chatbots and virtual assistants are also streamlining customer service, providing personalized financial advice, and improving user experiences.

The healthcare sector is witnessing a paradigm shift, with AI-powered diagnostic tools revolutionizing disease detection and treatment planning. From medical imaging analysis to drug discovery, data-driven insights are enhancing patient outcomes and accelerating medical research.

As we draw to a close, let us reflect on the awe-inspiring journey we have taken through the transformative power of data science and AI. From their individual wonders to their seamless integration, these technologies have redefined what was once thought impossible. By harnessing their potential responsibly, we can continue to shape a future that thrives on innovation, personalization, and progress.

This blog is an invitation to embrace the limitless possibilities of the digital age, inspiring readers to delve deeper into the captivating world of data science and AI. Unleash your curiosity and embark on a voyage of continuous learning, where the only limit is your imagination. Together, we can unlock the true transformative power of data science and AI and usher in an era of unprecedented advancement. Welcome to the future!

We hope you enjoyed reading our latest blog on "Unveiling the Digital Renaissance: How Data Science and AI Revolutionize Our World." We're thrilled to witness your interest in this captivating world of innovation and digital transformation.

Stay tuned for more exciting updates, as we delve deeper into the realm of data science and AI, uncovering new breakthroughs and real-world applications. Our journey together has just begun, and we can't wait to explore the limitless possibilities that lie ahead. If you have any suggestions, questions, or topics you'd like us to cover in future updates, please feel free to share your thoughts. Your feedback is invaluable in helping us create content that resonates with you.

Thank you for being part of our community, and we look forward to continuing this exhilarating journey with you!

Best regards,
The OpraPedia Team


The rest is here:

Unleashing the Future: How Data Science and AI Empower the ... - Medium


How to Enable or Disable Bitlocker Encryption in Windows – Tom’s Hardware

Bitlocker is a feature of certain versions of Windows that encrypts your hard drive's contents. Without the right decryption key, it's virtually impossible to crack this protection. So, even if someone were to physically open your PC, take your internal drive out and attach it to another computer, they could not read the data without that key. If it's your drive and you lost the encryption key, see our article on how to find a BitLocker key.

Only some computers and some editions of Microsoft Windows can use the BitLocker feature. If you're using any of the Home versions of Windows, you're out of luck. Only Pro, Enterprise, and Education licenses are eligible.

There are some hardware requirements too, such as having a TPM 1.2 or better (Trusted Platform Module) chip in your system. If your computer does not have the required TPM, you'll have to use a USB drive that will be formatted with your encryption key to start up and run the computer. Your C drive must also be set as your first boot device, and not (for example) your USB or optical drives.

Don't worry too much about the requirements, since if your computer isn't ready for BitLocker, you won't find the option to enable it. If you're worried about the possibility that you could lose your files if you encrypt with BitLocker, first read our guide on how to find a BitLocker key and recover files from encrypted drives.

With that said, let's look at how to turn BitLocker on.

Assuming that your computer complies with the requirements, here's how to activate BitLocker on your Windows PC. We're using Windows 11 here, but the same steps apply to Windows 10:

1. Sign into Windows with an Administrator account. If this is your personal computer and you're the only user, you're most likely already the administrator. If not, you'll have to ask the administrator to activate BitLocker for you. On a work computer, this is almost certainly someone from the IT department.

2. Open the Start Menu, search for "Manage BitLocker," then click on it.

3. Select Turn on BitLocker. BitLocker is individually applied to each one of your drives. So if you have more than one drive, ensure that you turn it on for all of them, assuming you want all of them protected by encryption. Also, bear in mind that if you copy a file from your encrypted drive to a non-encrypted drive, the file will no longer be protected!

4. Select your backup key method and click Next. The backup key will let you decrypt the drive in case you forget your passcode. There are three options here and you can choose more than one. Since we've signed in with our Microsoft Account, we'll choose that as the backup method here, since that means the key can be recovered from Microsoft's servers.

5. Choose how much of the drive to encrypt and click Next. Simply choose the method that matches your circumstances. If this is a new PC, choose the first option. If it's a PC that's been in use, choose the second.

6. Choose your encryption mode and click Next. Choose the first mode for fixed drives, and the second for drives that will be used with other PCs.

7. (Optional) Tick the system check box. This makes sure that your encryption keys are readable. We recommend doing this, even though it's not strictly necessary.

8. Click Start Encrypting.

9. Restart your computer.

10. Open BitLocker Management again. Follow steps 1 to 3 again, and now you'll see a message that the drive is being encrypted. You can keep using your PC, but you might notice worse performance until the process is done.

That's it; once encryption is done, your PC's drive is now protected, and even if someone got their hands on it and plugged it into their own computer, it's impossible to decrypt the data without the key, or some sort of quantum supercomputer that doesn't exist yet.
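If you would rather check encryption progress from a script than from the BitLocker management window, Windows also ships the manage-bde command-line tool, and manage-bde -status prints each volume's conversion status and percentage encrypted. As a rough sketch (the field names below assume the English-language output of manage-bde, which can vary between Windows versions), a few lines of Python can pull out the interesting fields:

```python
import re

def parse_bitlocker_status(status_text):
    """Extract the conversion status and percent encrypted for one volume
    from the text output of `manage-bde -status`."""
    status = {}
    m = re.search(r"Conversion Status:\s*(.+)", status_text)
    if m:
        status["conversion"] = m.group(1).strip()
    m = re.search(r"Percentage Encrypted:\s*([\d.]+)\s*%", status_text)
    if m:
        status["percent"] = float(m.group(1))
    return status

# Abridged sample output for a drive that has finished encrypting
sample = """
Volume C: [Windows]
    Conversion Status:    Fully Encrypted
    Percentage Encrypted: 100.0%
"""
print(parse_bitlocker_status(sample))
# → {'conversion': 'Fully Encrypted', 'percent': 100.0}
```

On a real system you would feed the parser the stdout of subprocess.run(["manage-bde", "-status"], capture_output=True, text=True) instead of the hardcoded sample.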

If you no longer want to have your drive encrypted, you can turn BitLocker off as easily as you turned it on. However, do note that you don't have to decrypt your drive before formatting it. Formatting will erase all data on the drive, whether encrypted or not. To turn off BitLocker, do the following:

1. Repeat steps 1 to 3 above. This will take you back to the BitLocker Management Window.

2. Click Turn Off Bitlocker next to the drive in question.

3. Click Turn Off Bitlocker again in the confirmation window that pops up.

The drive is now decrypting.

Just as when you encrypted the drive, this process will take a while to complete, but you can keep using your computer as normal with the possibility of slightly worse performance. Most modern computers should have no noticeable performance differences with BitLocker switched on, so there's little downside to using the feature unless you lose your recovery key; in any case, your most important data should always be backed up in more than one location, such as cloud storage.

Originally posted here:
How to Enable or Disable Bitlocker Encryption in Windows - Tom's Hardware


Practical Applications of spaCy in Data Science | by Harshita Aswani … – Medium

spaCy is a powerful and efficient Python library for natural language processing (NLP). It provides pre-trained models, efficient tokenization, part-of-speech tagging, named entity recognition, and much more. In this blog post, we will explore the practical applications of spaCy and demonstrate how it can simplify and enhance your NLP tasks.

Before we dive into the examples, let's install spaCy using pip, the Python package installer.
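The install step itself is two commands: one for the library and one for the small English pipeline used throughout this post (the model download is part of spaCy's own command-line interface):

```shell
# Install spaCy, then download the small English model used in the examples
pip install spacy
python -m spacy download en_core_web_sm
```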

spaCy provides a simple and intuitive API for performing common NLP tasks. Let's consider an example of using spaCy to perform tokenization, part-of-speech tagging, and named entity recognition (NER) on a text.

import spacy

# Load the English language model
nlp = spacy.load('en_core_web_sm')

# Process a text
text = "Apple Inc. is planning to open a new store in New York City."
doc = nlp(text)

# Perform tokenization
tokens = [token.text for token in doc]
print("Tokens:", tokens)

# Perform part-of-speech tagging
pos_tags = [(token.text, token.pos_) for token in doc]
print("POS Tags:", pos_tags)

# Perform named entity recognition
entities = [(entity.text, entity.label_) for entity in doc.ents]
print("Named Entities:", entities)

spaCy allows you to customize and train your own models for specific NLP tasks. This includes training entity recognition models, part-of-speech taggers, and more. Let's consider an example of training a named entity recognition model using spaCy.

import random

import spacy
from spacy.training import Example

# Load the base English language model
nlp = spacy.load('en_core_web_sm')

# Define training data (character offsets are end-exclusive)
train_data = [
    ("Apple Inc. is planning to open a new store in New York City.",
     {"entities": [(0, 10, "ORG"), (46, 59, "GPE")]}),
    # Add more training examples here
]

# Get the existing NER pipeline component
ner = nlp.get_pipe("ner")

# Add labels to the NER pipeline
ner.add_label("ORG")
ner.add_label("GPE")

# Train the NER model, accumulating losses per component
optimizer = nlp.resume_training()
for epoch in range(10):
    random.shuffle(train_data)
    losses = {}
    for text, annotations in train_data:
        doc = nlp.make_doc(text)
        example = Example.from_dict(doc, annotations)
        nlp.update([example], sgd=optimizer, losses=losses)

# Save the trained model
nlp.to_disk("trained_model")

# Load the trained model
nlp_loaded = spacy.load("trained_model")

# Process a text with the trained model
text = "Apple Inc. is planning to open a new store in New York City."
doc = nlp_loaded(text)

# Perform named entity recognition with the trained model
entities = [(entity.text, entity.label_) for entity in doc.ents]
print("Named Entities:", entities)

spaCy is a powerful and user-friendly library for natural language processing tasks. In this blog post, we explored the practical applications of spaCy, including tokenization, part-of-speech tagging, and named entity recognition. We also demonstrated how to train a custom named entity recognition model using spaCy.

With spaCy, you can process and analyze text data with ease, perform advanced NLP tasks, and even train your own models for specific domains or applications.

Connect with author: https://linktr.ee/harshita_aswani


Read the original:

Practical Applications of spaCy in Data Science | by Harshita Aswani ... - Medium


Hidden Racial Variables? How AI Inferences of Race in Medical … – Stanford HAI

Among the hottest sectors for artificial intelligence (AI) adoption is health care. The Food and Drug Administration has now approved more than 500 AI devices, most within just the last couple of years, to assist doctors across a range of tasks, from gauging heart failure risk to diagnosing cancer.

Amid this surge, recent studies have revealed that AI models can predict the demographics of patients, including race, directly from medical images, even though no distinguishing anatomical or physiological features are evident to human clinicians. These findings have sparked concern that AI systems could discriminate against patients and exacerbate health care disparity. By the same token, though, the systems could enhance monitoring of patient outcomes connectable to race, as well as identify new risk factors for disease.

Calling attention to these implications, James Zou, an affiliate of the Stanford Institute for Human-Centered AI, and colleagues have penned a new Perspective article for the journal Science. In this interview, Zou, an assistant professor of biomedical data science and, by courtesy, of computer science and of electrical engineering at Stanford, discusses the promise and peril of AI in predicting patient demographics.

Why is race a potentially problematic concept in health care settings?

Race is a complex social construct with no biological basis. The concept of who is this or that "race" varies across time and different contexts and environments; it depends on what country you're in, what century you're in.

There are other human variables, such as genetic ancestry and genetic risks of different diseases, that are more nuanced and likely more relevant for health care.

Is AI plausibly discerning biological differences that human clinicians and anatomists have overlooked in medical imagery or is it more likely that AI is drawing inferences based on other features in available data?

Many AI models are still essentially uninterpretable "black boxes," in the sense that users do not really know what features and information the AI algorithms are using to arrive at particular predictions. That said, we don't have evidence that there are biological differences in these images across different groups that the AI is picking up on.

We do think that the quality and features of the data and of the training sets used to develop the AI play a major role. For instance, patients in one hospital area or clinic location may be more likely to have certain co-morbidities, or other medical conditions, which actually have various manifestations in the images. And the algorithm might pick those manifestations up as artifacts and make spurious correlations to race, because patients of a particular racial category happen to live near and thus go to that hospital or clinic.

Another possibility is systemic technical artifacts, for instance from the types of machines and the methods used to collect medical images. Even in the same hospital, there can be two different imaging centers, maybe even using the same equipment, but it could just be that staff are trained differently in one imaging room compared with the next, such as on how long to image a patient or from what angle. Those variances could lead to different outputs that can show up as systemic patterns in the images that the AI correlates with racial or ethnic demographics, "rightly" or "wrongly," of course keeping in mind that these demographic categories can be crude and arbitrary.

How could AI's discernment of hidden race variables exacerbate health care inequalities?

If the algorithm uses race or some race proxy to make its diagnostic predictions, and doctors are not aware that race is being used, that could lead to dangerous under- or over-diagnosing of certain conditions. Looking deeper at the imaging machines and training sets I mentioned before, suppose patients of a certain race were likelier to have scans done on Type A X-ray machines because those machines are deployed where those people live.

Now suppose that positive cases of, say, lung diseases in the training set for the AI algorithms were collected mostly from Type B X-ray machines. If the AI learns to factor in machine type when predicting race variables and whether patients have lung disease, the AI may be less likely to predict lung disease for people tested on Type A machines.

In practice, that would mean the people getting scanned by Type A machines could be under-diagnosed for lung diseases, leading to health care disparity.
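The machine-type scenario Zou describes is easy to reproduce with toy numbers. In the sketch below (all quantities are invented purely for illustration), two groups have identical disease rates, but group membership determines which machine type a patient is usually scanned on; a model that learned the "scanned on Type B" shortcut then misses every sick patient whose scan came from a Type A machine:

```python
import random

random.seed(0)

# Two groups with identical 20% disease risk; the only real difference
# is which X-ray machine they are usually scanned on.
def make_patient(group):
    if group == "g1":
        machine = "A" if random.random() < 0.9 else "B"
    else:
        machine = "B" if random.random() < 0.9 else "A"
    return {"group": group, "machine": machine, "sick": random.random() < 0.2}

patients = [make_patient(g) for g in ("g1", "g2") for _ in range(5000)]

# A model that learned the shortcut: predict "sick" whenever the scan
# came from a Type B machine (where most positive training labels were).
def shortcut_model(patient):
    return patient["machine"] == "B"

# Fraction of genuinely sick patients the shortcut model misses, by machine
def miss_rate(machine):
    sick = [p for p in patients if p["sick"] and p["machine"] == machine]
    return sum(not shortcut_model(p) for p in sick) / len(sick)

print("missed sick patients, Type A scans:", miss_rate("A"))  # → 1.0
print("missed sick patients, Type B scans:", miss_rate("B"))  # → 0.0
```

Because group g1 is almost always scanned on Type A machines, the shortcut's misses fall almost entirely on one demographic group, which is exactly the under-diagnosis mechanism described above.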

On the flip side, how can AI's ability to infer racial variables advance goals of health care equity?

AI could be used to monitor, assess, and reduce health care disparity in instances where medical records or research studies do not capture patient demographic data. Without these data, it's very hard to know whether a certain group of patients is actually getting similar care or having similar outcomes compared with other groups of patients. In this way, if AI can accurately infer race variables for patients, we could use the imputed race variables as a proxy to assess health care efficacy across populations and reduce disparities in care. These assessments would also feed back in helping us audit the AI's performance in distinguishing patient demographics and making sure the inferences the AI is making are not themselves perpetuating or introducing health care disparity.

Another important benefit is that AI can potentially provide us with far more granular classifications of demographic groups than the standard, discrete categories of race we often encounter. For example, anyone of Asiatic descent, whether from South Asia or East Asia, or whether Chinese or Japanese, is usually grouped under one crude umbrella, "Asian," on standard survey forms or medical records.

In contrast, AI algorithms often represent individual patients on a continuous spectrum of variation in ancestry. So there's interesting potential there with AI to learn about more granular patient subgroups and evaluate medical services provided to them and their outcomes.

Stanford HAI's mission is to advance AI research, education, policy, and practice to improve the human condition.

See the original post here:

Hidden Racial Variables? How AI Inferences of Race in Medical ... - Stanford HAI
