
Could Data Science Diversify the STEM Field? Why Courses Designed This Century Feel so Relevant to All Students – MindShift – KQED

"It's kind of a unique opportunity because there wasn't a high school data science course before," said Suyen Machado, director of the Introduction to Data Science (IDS) program, which was started as a partnership between UCLA and the Los Angeles Unified School District nearly ten years ago. The program was funded with a National Science Foundation grant to increase the number of students going into STEM careers and to bring computational and statistical thinking to underrepresented high school students, according to Machado.

"Engaging lessons that are inquiry-driven, student-driven and collaborative are really well suited for underrepresented groups, and you will find all of that in our curriculum. And they're good for students in general," Machado said.

The data science curriculum gives students opportunities to look at real data instead of abstract formulas.

"It's just so much fun," said James Molyneux, a professor at Oregon State University who was involved in the development of IDS. For example, students can collect their data and compare themselves to larger government data sets, like the American Time Use Survey from the Bureau of Labor Statistics. Students can measure how much time they spend grooming, eating, being with family and consuming social media, according to Molyneux.

Among students, there's a growing interest in data sets on topics such as pollution in school communities and which gender of character is most likely to survive a horror film. For IDS participants, the most popular data project involves snacks.

"It honestly made me more aware of what I was taking in and putting in my body," student Linda Solares of Leuzinger High School said of the snack project. Not to worry, the unit is not about encouraging weight loss or anything. Students used the IDS app to track information like salt and sugar content, cost, number of ingredients and their reasons for eating. "We're in quarantine, we're eating a lot more out of boredom and stuff. So honestly, it really helped me," said Solares. "After I finished the survey, I was like, whoa," she said. "I was really eating not so healthy."

Surveys of IDS students in LAUSD found that coding was the most challenging part of the course, but also the most important skill students learned. Using programming tools like RStudio, they persisted, trying over and over again to get their code right, and that helped boost confidence in their ability to problem-solve.

"The lab is a lesson for us to learn about the codes and how we can implement them in certain situations," said Leuzinger student Peter Tran, who would test different variables against one another, like finding the most common time of day students ate unhealthy snacks.
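The course itself runs in RStudio, but as a rough, hypothetical sketch of the kind of question Tran describes, the same "most common time of day for unhealthy snacks" analysis might look like this in Python with pandas (the data and the sugar cutoff are invented for illustration, not taken from the IDS app):

```python
# Hypothetical sketch: the IDS course works in R/RStudio, but the same
# grouping question translates directly to Python. The data is invented.
import pandas as pd

snacks = pd.DataFrame({
    "time_of_day": ["morning", "afternoon", "evening", "evening", "afternoon", "evening"],
    "sugar_grams": [4, 22, 35, 18, 30, 27],
})

# Flag a snack as "unhealthy" with an arbitrary sugar threshold (illustration only).
snacks["unhealthy"] = snacks["sugar_grams"] > 20

# Count unhealthy snacks per time of day and pick the most common one.
counts = snacks[snacks["unhealthy"]].groupby("time_of_day").size().sort_values(ascending=False)
print(counts)
print("Most common time for unhealthy snacks:", counts.idxmax())
```

The point of the exercise is less the syntax than the habit of testing one variable against another and reading the result back against the students' own experience.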

An important part of the data science curriculum is understanding privacy: knowing how data about people is collected and how it can be used against them. This knowledge can help develop a person's media literacy.

"There's a lot of misinformation out there," said Boaler. "Having students develop a critical perspective that's one of the things we can teach in data science. Be skeptical of data that's put in front of you, ask questions of it, think about who put that data together, what purpose did they have for it."

The messiness of the data sets is part of the appeal for students; it's what engages them in learning and in not shying away from unknown outcomes, according to the Concord Consortium's Chad Dorsey.

"It's almost sort of pre-chewed and preordained," Dorsey said of traditional curriculum that doesn't engage students. "And when we do that, we take a lot of the discovery away. We're finding the value in putting students into the place of needing to ask and answer questions with data that might be ambiguous or that might have a missing value." As part of an NSF grant, the group developed the free CODAP tool so teachers can integrate data skills into their classes, such as science. The group also provides teachers with professional development.

"We're finding the value of putting students in the driver's seat to do the exploration themselves, to uncover new things in the data that maybe the teachers didn't understand was there in the first place and where students are finding something different than their neighbors," said Dorsey.

For Leuzinger High School IDS teacher Ding-ay Tadena, that has meant giving students agency over the topics they want to investigate, such as sports. "They learn how to think deeper and then use these math skills and eventually they love it," says Tadena, who has seen students of all math levels succeed in data science. She says that in data science class, students see themselves as more than the math track they're in.

"It taught them how to dream bigger rather than just being profiled as lower performing in terms of math," she said. "And that is the beauty of it because you teach them how to code, how to do this data, how to scrape data from the internet and push it in R in the field that interests them." Tadena, who has been teaching math for about two decades, says data science is in many ways a respite for math teachers like herself who are looking for ways to engage their students.
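Tadena's students do that scraping in R, but a minimal Python sketch conveys the same idea: pull a table off a web page about whatever interests you and start exploring it. The URL below is a placeholder, not a site used in the course:

```python
# Hypothetical sketch of "scrape data from the internet" (the IDS class does this in R).
# The URL is a placeholder standing in for any page with an HTML table of interest.
import pandas as pd

url = "https://example.com/sports-stats.html"

# pandas.read_html returns one DataFrame per <table> found on the page.
tables = pd.read_html(url)
stats = tables[0]

print(stats.head())      # peek at the first few scraped rows
print(stats.describe())  # quick summary statistics for the numeric columns
```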

"The students are so interested," Tadena said. "They're so into it."

For science teacher Emerlyn Gatchalian, having the Concord Consortium's CODAP tool makes understanding the periodic table easier for some of her students. "They're looking at the different properties of elements in the periodic table using data like the atomic size, ionic size," she said. "Because they're using data using CODAP, it's so easy for them to look for patterns and trends and make them feel that they can actually understand and interpret data instead of using all the equations that they're learning in math."

For high school special education teacher Michelle Murtha, students' ability to graph their data using digital tools helped them understand it. "Sometimes, graphing itself is so hard for the students. But because the program helps them through it," she said, "they're able to actually see the graph. And for us, that's more important, so they can actually analyze the data versus, can you plot this point?"

REDEFINING HIGH SCHOOL

When Emilio Jaime was a student at Phineas Banning High School, he was on track to take AP Calculus his senior year. He had been confident about math throughout school, but decided to take IDS based on a teacher's suggestion. Plus, one less AP class would help ease his senior year course load.

"I decided to let go of calculus and took on IDS, which I'm so glad I did, because I guess I was just scared because it wasn't the norm that students were doing," said Jaime, who graduated from UC Berkeley last spring.

What he liked about data science was the ability to play with formulas and not feel limited by right and wrong answers that were the hallmark of his math education. "This is how the formula is, and this is the answer, and there is a wrong answer," he said of his earlier relationship to math. But data science was more fluid. "On our projects, I tried so many different graphs and so many different solutions to try to create so many different conclusions."

"I think IDS and data science really allows students to try different things without being scared to fail," he said.

IDS trains teachers across the country and abroad on how to teach data science as a course. It's one of several such programs, including ones operated by the Concord Consortium and Boaler's YouCubed. Whether they will ultimately bring more underrepresented students into STEM fields has yet to be seen. But for now, these educators are shifting students' experiences with STEM to increase the odds that they'll stay.

All of these skills will hopefully help students become better informed members of society.

"I think that's the biggest gift that we can give students right now, no matter how we're doing it, is to help them understand that there are data all around them, that those data have answers, that they come from people, and that the things that they are doing are generating data all over, and to give them the ability to start to feel empowered to work with this data themselves," said Dorsey.

Visit link:

Could Data Science Diversify the STEM Field? Why Courses Designed This Century Feel so Relevant to All Students - MindShift - KQED

Read More..

Perfect data science team: The right blend of roles, responsibilities and skills – ETCIO.com

Organizations today have a stronger impetus than ever to embrace the ethos of digitalization, and the current era of ambiguity has necessitated embedding data science at the hub of decision-making processes. Data science provides a competitive advantage to companies that embed intelligence in their omnichannel interactions and processes.

But combining the right roles, responsibilities and skills to make up a perfect data science team is not easy. Businesses might not see the benefits of data science if the team is not structured correctly.

"An archetypal data science project team comprises Business Analysts, aka Data & Analytics Translators, Data Architects (Data Management), Data Scientists (Statistical Knowledge), ML Engineers (productionizing solutions), Platform Engineers (Foundation Setup) and Visualization Specialists (Reporting). Though diverse, data science is a human pursuit best accomplished in a team setting - the roles played can be slightly fluid around the individual strengths and weaknesses of team members," said Prashanth Kaddi, Partner, Deloitte India.

"These experiments heavily rely on availability of big data around transactional and behavioural aspects of users. The high scale of our business also mandates us to automate a large part of our business processes. These automations rely on machine learning models that are built on rich data training by using various types of user data. The key roles in our team become data engineers, business intelligence engineers, risk analysts, data scientists and machine learning engineers," said an Uber spokesperson.

Even a large spreadsheet worth of data is meaningless unless properly contextualised and presented. Teams of business analysts ensure that data from across the company is surfaced to the right teams so that they can make data-driven business decisions. Hence, business intelligence engineers are important for Uber.

At times, humans take longer to complete tasks compared with the capabilities of machines and algorithms at making sense of a particular dataset. Uber's team of data scientists and machine learning engineers has fine-tuned the art of developing the best models of reality to help generate decisions in such cases.

While these roles fit best for Uber's business model, they would be different for a manufacturing company.

Let's assume that the scenario is to implement a machine condition monitoring system in a car manufacturing factory. Countless reports can be generated from the data recorded by the sensors, but a very small subset would actually be of interest to the senior management, and an extremely small subset of the analysis would be so disproportionately useful to them that it makes the entire project worthwhile.

"Hence, before spending any resources on a data science operation, there has to be one person who understands the business and technology and would be able to do a thorough analysis of all the processes in the business, discard everything that doesn't need to be analysed by sanely calculating the benefit-cost ratio, and then be an interface between the management and the data science team. Let's call this person a Business Analyst or a Data Leader," said Shishir Thakur, Co-founder, Cranberry Analytics.

The next person we need is someone who can design a data pipeline from the sensors (or any other data source) to the cloud/on-prem server (the data sink). This profile is a dedicated Data Architect working in close collaboration with Data Engineers.

"In cases where the data volume and throughput are very large, we might also need a DevOps Engineer, a DataOps Engineer, and a dedicated Database Administrator. DataOps Engineers help the rest of the team with operational aspects of data flow and make the process more streamlined for continuous delivery and integration of data," Thakur further explained.
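The article names no particular stack for that sensor-to-sink pipeline, but a stripped-down, hypothetical sketch of the ingestion step it describes might look like the following; the sensor reader and the sink writer are stand-ins for whatever a Data Architect would actually choose (for example, an MQTT broker feeding a cloud warehouse):

```python
# Minimal, hypothetical sketch of one sensor-to-data-sink pipeline step for a
# machine condition monitoring scenario. Both functions are stand-ins: a real
# pipeline would read from actual sensor hardware and publish to a real sink.
import json
import random
import time
from datetime import datetime, timezone


def read_sensor(machine_id: str) -> dict:
    """Stand-in for reading one vibration sample from a machine-mounted sensor."""
    return {
        "machine_id": machine_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "vibration_mm_s": round(random.uniform(0.5, 9.0), 2),
    }


def send_to_sink(record: dict) -> None:
    """Stand-in for writing to the cloud/on-prem data sink."""
    print(json.dumps(record))  # a real pipeline would POST or publish this record


if __name__ == "__main__":
    for _ in range(5):  # a real collector would run continuously
        send_to_sink(read_sensor("press-line-3"))
        time.sleep(1)
```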

When everything has been done with the data, be it cleaning, processing or integration, the team needs a Data Visualizer at the last stage.

Data visualization is not just part science; it is also an art. That's why data storytelling is sometimes a separate profile. "No matter how complex your analysis was, how much fidelity your sensors had and what amazing tools you invented in the process, if at the end it's not helping the top management with the bottom line and is not telling a clear, concise, and actionable story, it's all in vain," Thakur explained.

The job of a data visualization team is to present the final analysis in a way that is easily understandable by both technical users from other departments and a business team with zero technical know-how.

Having talked about the roles and responsibilities, it is also important to look at the soft skills of your data science team. Industry leaders believe the following are the top three skills to work on and master:

1. Communication: This is a major challenge, and one that can add a lot of value to a profile, especially for roles such as Data Scientist, Architect or Visualization Specialist. Everyone who needs to convey and consume cross-domain information needs to be good at communication, both business communication and in person, to better understand the requirements and better explain the challenges when they arise.

2. Ability to summarize findings for leadership consumption (business acumen): Data scientists work with humongous amounts of data and complex algorithms, which appear more like a black box to business people and the leadership team. It is really important that data scientists learn how to simplify the key messaging and translate technical findings and insights into easy-to-understand business messaging and actionable insights.

3. Intellectual curiosity and diving deep: It is very easy to be deceived by data and numbers if they are not interpreted and questioned in the right way. A data scientist must ask several questions about any data analysis, model performance and experiment results to ensure that what the data is conveying is indeed the right message and that there is no error, mistake or misrepresentation. This, in fact, is one of the most important skills that can distinguish an average data scientist from a great one.

(With inputs from Dhrumil Dhakan)

Originally posted here:

Perfect data science team: The right blend of roles, responsibilities and skills - ETCIO.com

Read More..

Data science and digital coding could soon be the new English – Mint

In a news cycle dominated by covid and the Afghanistan tragedy, most of us would have missed a report by the World Economic Forum (WEF), The Future of Jobs (bit.ly/2UsgjvU). There are profound implications to consider. In my work in technology in general and artificial intelligence (AI) in particular, the two questions asked most often are: Will AI and robots become our masters and will AI and technology take over our jobs? Let us leave the first question for another day. The WEF report addresses the second one in detail.

One obvious difference in this report, as opposed to earlier ones, is that the impact of technology on jobs is now layered with the impact of another big phenomenon: the covid pandemic. Many of us dismiss the pandemic as a one-off occurrence that will not have a lasting impact. I disagree. Covid is perhaps the beginning of a series of mega-disruptions. Many of us think of the pandemic as a Black Swan event, what Nassim Nicholas Taleb described as both disruptive and unpredictable. However, Taleb himself refuses to characterize this pandemic as a Black Swan; according to him, though very disruptive, it was highly predictable. While the pandemic might not be a Black Swan event, the ensuing lockdowns certainly were. They disrupted everything, including work. This pandemic may go away in a few years, an eye-blink in the history of the planet, but there will be other cataclysmic events that will cause unanticipated disruptions: global warming-related natural calamities, for example, or wars.

Thus, says the WEF, "automation, in tandem with the covid-19 recession, is creating a double-disruption scenario for workers. 43% of businesses surveyed indicate that they are set to reduce their workforce due to technology integration, 41% plan to expand their use of contractors for task-specialized work, and 34% plan to expand their workforce due to technology integration. By 2025, the time spent on current tasks at work by humans and machines will be equal." The last sentence is a shocker. In 2020, humans did two-thirds of work and machines the rest; just five years hence, the human-machine split will be nearly equal. There are some alarming predictions: job creation is slowing while job destruction accelerates. A sigh of relief might escape you when you read that "85 million jobs may be displaced by a shift in the division of labour between humans and machines, while 97 million new roles may emerge that are more adapted to the new division of labour between humans, machines and algorithms." So, humans still have an edge, albeit a narrow one. But the pace of job destruction is greater than that of creation. So we will see more jobs lost before new jobs appear.

The other key phrase here is "the division of labour between humans, machines and algorithms". While covid and other factors are impacting jobs, the greatest impact is that of technology. "The pace of technology adoption is expected to remain unabated and may accelerate in some areas," the report declares. "The adoption of cloud computing, big data and e-commerce remain high priorities for business leaders..."

The covid paradox, where the pandemic "slowed down the world, but accelerated change", has sprayed jet fuel on this, and the future of work is no longer imminent but has arrived. If this is so, then what are the jobs of the future, or, rather, of the present? Let me invoke Kai-Fu Lee, the acclaimed AI practitioner. His famous matrix has optimization-to-strategy on one axis, and no-compassion-to-full-compassion on another. High-optimization, low-compassion jobs (like telesales, customer support and dish-washing) will be the first to go. High-compassion roles like those of CEOs, M&A experts and teachers will be last. The jobs which will always be there for humans will require communication skills, empathy, compassion, trust, creativity and reasoning. Lee has created a cheat sheet of 10 jobs in his book, including AI-related research and engineering, psychiatry and medical care, and teaching. The WEF has also created a version of this cheat sheet, detailed for most major economies. In the global list, technology jobs dominate: data scientists, AI and machine learning specialists and digital transformation experts. Data entry, administrative, door-to-door sales and many other jobs vanish.

As technology muscles into our lives, jobs that enable humans to manage it will gain dominance. In India, for generations, knowledge of English was the passport for the best jobs. Soon, data science and digital coding could become the new English.

Jaspreet Bindra is the author of The Tech Whisperer, and founder of Digital Matters

Read more:

Data science and digital coding could soon be the new English - Mint

Read More..

Top Data Science Quizzes That You Must Give a Try in 2021 – Analytics Insight

Analytics Insight has churned out this list of top data science quizzes to help you become a pro.

Data in this tech-driven world keeps increasing. With the increase in data, the need to analyze, interpret and segregate it also increases, which brings the demand for data scientists into the picture. But to become proficient as a data scientist, it is extremely important to understand where you stand so you can excel professionally. For this, we have brought you the top data science quizzes to attempt in 2021 so you can assess your level of knowledge and understanding of data science, get a clear idea of the areas that need improvement, and become a pro.

So here goes the list:

Gramener Inc is a data analytics and storytelling company that extracts insights from big data using state-of-the-art technology and shares them as stories for easy consumption. Gramener helps business users accelerate decision-making. This quiz has a total of 10 questions to complete within the time assigned.

Click here to attempt the quiz.

ProProfs Quiz is a platform with quizzes on various topics. This particular quiz has 25 questions on data science and has had 4,990 attempts.

Click here to attempt the quiz.

The Data Science online practice test enables you to look deeper and examine your data analytics and logistics skill set. The short quiz will also help you determine whether you are a good fit to become a data scientist, or whether there are specific subject areas you need to practice. The test has ten multiple-choice questions to be solved in 2 minutes 30 seconds.

Click here to attempt the quiz.

If you're moving down the path to becoming a data scientist, you must be prepared to impress prospective employers with your knowledge. This quiz follows more of an interview-question format that helps you prepare fully for a data scientist interview.

Click here to attempt the quiz.

This data science test assesses candidates' skills in core data science topics such as statistics, machine learning, neural networks, and deep learning. The test is designed to help you identify entry- and mid-level data scientists.

Click here to attempt the quiz.

This again is more of an interview-style quiz to help you crack the best data scientist jobs, sharpen your skills and check your level of knowledge in data science.

Click here to attempt the quiz.

This quiz is designed for professional development, serves as an introduction to a complex topic like data science, and sits at a basic level of difficulty.

Click here to attempt the quiz.

Read the original:

Top Data Science Quizzes That You Must Give a Try in 2021 - Analytics Insight

Read More..

Governor Pritzker announced U of I will receive over $140 million funds – wcia.com

CHAMPAIGN, Ill. (WCIA) - The University of Illinois at Urbana-Champaign will get $140 million in state funds and $52 million in non-state funds through the governor's bipartisan Rebuild Illinois capital plan.

Governor Pritzker joined University of Illinois System President Tim Killeen, University of Illinois Urbana-Champaign Chancellor Robert Jones, and other stakeholders for a back-to-school event on Monday. During the event, Gov. Pritzker announced his plans to invest in new facilities and renovations for the campus community of University of Illinois at Urbana-Champaign. University officials said the investments will go towards restoring Altgeld Hall and replacing Illini Hall with a new facility for the Department of Mathematics and Statistics. The project starts in July and is expected to be done by June 2026.

"The Rebuild Illinois plan recognizes the importance of the University of Illinois system as a foundation for innovation and a core building block for our state's workforce of the future," said Gov. Pritzker. "What's happening with these investments in higher education is a microcosm of what we're doing across the state. We're fixing decades-old problems, creating and supporting good jobs, vaulting our state into the future, and invigorating opportunities for the next generation, including the thousands of young people who call U of I home."

"This investment in the University of Illinois Urbana-Champaign and the Illinois Innovation Network is an investment in the people of Illinois and a better future for us all," said University of Illinois System President Tim Killeen. "The funding, made possible by the foresight of Gov. Pritzker and supported by other elected leaders, will provide a key, new home for the kind of cutting-edge data science that will power scientific advancement and economic improvement, and make certain that Altgeld Hall remains a vital part of the heart of the Urbana-Champaign campus, as well."

According to university officials, the $140 million in state funds is part of a capital plan that aims to invest a total of $686.3 million into the University of Illinois system over the coming years. Of that, $586.3 million will go towards new construction, large-scale renovations, and deferred maintenance to repair infrastructure.

Officials also said the university system will receive $100 million in capital funds to support quantum science infrastructure.

"This investment is critically important, as it will firmly establish the Champaign-Urbana hub of the Illinois Innovation Network. We are proud it recognizes the unmatched expertise in data sciences and advanced analytics we have at the University of Illinois Urbana-Champaign," said Urbana Chancellor Robert Jones. "The data science field is fast-growing, and the Altgeld-Illini Hall projects will be incubators for collaborative research and education in data science, as well as other mathematical sciences, for generations to come."

"We are very fortunate to have this large investment in the University of Illinois. The U of I is a powerful economic driver for Champaign County, and these types of investments make not only the University stronger, but the entire county. These projects will create both academic opportunities as well as good-paying jobs that are needed for these important capital investments. We in Champaign County are grateful for Governor Pritzker's leadership on this issue and his focus on investing in Champaign County," said Champaign County Board Chair Kyle Patterson.

See the original post:

Governor Pritzker announced U of I will receive over $140 million funds - wcia.com

Read More..

Lacklustre success of analytics in the public cloud – ComputerWeekly.com

Most IT decision-makers who took part in a recent survey (79%) said their organisations use data analytics in the cloud to support business outcomes.

Based on a survey of 272 IT decision-makers for global transformation consultancy Contino's Data maturity in the public cloud research report 2021, the study found that 63% of organisations consider their cloud data programmes to be mature. However, the research found that only 16% of the IT decision-makers surveyed said they have fully realised the business benefits from moving data to the public cloud.

The research also found that security is the biggest technical challenge for organisations bringing data to the cloud, with organising and sharing data coming in second (33%).

The survey found that most organisations (90%) are engaging in data science.

The report's authors noted that the rise in popularity of data science is an ongoing trend as businesses seek to drive value from their data through the use of machine learning (ML), artificial intelligence (AI) and automated solutions driven by insights. "We know that it's possible to accelerate product cycles, reduce customer churn and build predictive models with data science capabilities. But almost 60% of organisations have only one data science team, or one data scientist," the authors warned.

The study found data science is still exploratory for many organisations, with many, according to Contino, taking a test and learn approach.

In the report, Contino recommended that to reap the benefits of data science, IT decision-makers need to have a defined and supported data strategy, organisational data maturity, a well-defined business use case and the right data.

"The most successful data science programmes have teams that are closely aligned with the business, and understand the nature of the products and services on offer, before undergoing the rigors of data analysis, hypothesis testing, feature selection, algorithm selection and validation," the report said.

Cost savings came out as the most frequent business benefit of having an AI/ML cloud strategy. But when asked to provide a figure for the percentage of projects with customer satisfaction or acquisition as an outcome, Contino reported that almost half (48%) of the answers it received had customer outcome metrics. It said this points towards the trend of using AI/ML to drive stickiness, reduce churn and generate revenue.

When asked to what degree their organisations have realised the business benefit they were trying to achieve with their public cloud data strategy, almost a quarter (23%) of IT decision-makers admitted they had not at all realised the benefits they had planned for, and only 16% said they had fully realised the business benefits of their public cloud data strategy.

According to Contino, the survey illustrates that after shifting to the cloud, many organisations are being challenged by a lack of in-house capabilities to build better cloud experiences and products for consumers.

Visit link:

Lacklustre success of analytics in the public cloud - ComputerWeekly.com

Read More..

Excelra launches a re-envisioned version of GOSTAR, its structure-activity relationship application, with an innovative set of new features to…

HYDERABAD, India, Aug. 23, 2021 /PRNewswire/ -- Today, Excelra, a global data science and data analytics company, unveiled an improved version of GOSTAR, its flagship chemistry insights engine. The enhancements offer advanced search, analysis, and visualization capabilities enabling drug discovery scientists to accelerate the hit identification and lead optimization process.

"Drug discovery researchers need to ensure that they are getting the broadest possible view of structure-activity data while being able to readily drill down to the most meaningful data points. The new GOSTAR version has been re-envisioned based on extensive user feedback to provide researchers with an intuitive interface that reduces time to market for new drugs. It incorporates analytical capabilities, eliminating the need to use multiple tools to assess compounds while also providing insights directly within the application. The result is a rapid and more confident decision-making process," said Norman Azoulay, Director, Scientific Products.

GOSTAR's industry-leading capabilities include:

All data in GOSTAR, which now includes over 8 million compounds, along with bioactivity data, is subjected to a three-tiered, QMS-ISO certified quality control process. The standardization and normalization of all data provide additional clarity while ensuring it is directly and readily comparable.

"Excelra aims to transform life science data into actionable insights for our R&D clients that help them to innovate and accelerate their drug discovery. The enhancements made to GOSTAR will improve the ability to make meaningful predictions and lead to more promising pipelines," said Anandbir Singh Brar, CEO, Excelra.

GOSTAR is available as an application for users to seek, find, and discover compounds. In addition, it is offered via APIs and as a downloadable dataset to power in-house libraries and machine learning models.

For more information about GOSTAR, visit http://www.gostardb.com

About Excelra

Excelra's data and analytics solutions empower innovation in life sciences from molecule to market. The Excelra Edge comes from a seamless amalgamation of proprietary data assets, domain expertise and, data science to accelerate drug discovery and development. For more information, visit http://www.excelra.com.

For media inquiries, contact: Jigesh Shah, Jigesh.shah@excelra.com, +91-9820444994

View original content to download multimedia: https://www.prnewswire.com/news-releases/excelra-launches-a-re-envisioned-version-of-gostar-its-structure-activity-relationship-application-with-an-innovative-set-of-new-features-to-accelerate-the-drug-discovery-cycle-301360550.html

SOURCE Excelra Knowledge Solutions Pvt Ltd

Excerpt from:

Excelra launches a re-envisioned version of GOSTAR, its structure-activity relationship application, with an innovative set of new features to...

Read More..

The Best Udacity Nanodegrees for Data Analytics and Visualization – Solutions Review

A directory of the best Udacity Nanodegree programs for data analytics and visualization, compiled by the editors at Solutions Review.

Data analytics is a discipline within data science. The purpose of data analytics is to generate insights from data by connecting patterns and trends with organizational goals. Comparing data assets against organizational hypotheses is a common use case of data analytics, and the practice tends to be focused on business and strategy. Data analytics deals less in AI, machine learning, and predictive modeling, and more with viewing historical data in context.

With this in mind, the editors at Solutions Review have compiled this list of the best Udacity Nanodegrees for data analytics and visualization. Udacity is perfect for those looking to take multiple courses or acquire skills in multiple different areas, or for those who want the most in-depth experience possible through access to entire course libraries or learning paths. In sum, Udacity is home to more than 160,000 students in more than 190 countries.

Description: Advance your programming skills and refine your ability to work with messy, complex datasets. You'll learn to manipulate and prepare data for analysis, and create visualizations for data exploration. Finally, you'll learn to use your data skills to tell a story with data. You should have experience working with Python (specifically NumPy and Pandas) and SQL.

Description: In this program, you'll learn foundational data skills that apply across functions and industries. You'll learn to analyze data and build models with Excel, query databases using SQL, and create informative data visualizations with Tableau. This is an introductory program and has no prerequisites. In order to succeed, Udacity recommends having experience using a computer and being able to download and install applications.

Description: Learn to apply predictive analytics and business intelligence to solve real-world business problems. Students who enroll should be familiar with algebra and descriptive statistics and have experience working with data in Excel. Working knowledge of SQL and Tableau is a plus, but not required.

Description: Communicating effectively is one of the most important skills needed today, and every business is collecting data to make informed decisions. Build on your data or business background to drive data-driven recommendations. Whether you are a data analyst looking to communicate more effectively, or a business leader looking to build data literacy, you will finish this program able to use data effectively in visual stories and presentations.

Description: You'll master the in-demand skills necessary to become a successful data analyst, like data pre-processing, visualization and analysis, using Power BI as the primary tool. Udacity recommends that learners have prerequisite knowledge of Microsoft Excel.

Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.

See the original post:

The Best Udacity Nanodegrees for Data Analytics and Visualization - Solutions Review

Read More..

Bitcoin tops $50,000, hitting a more than 3-month high – CNBC

Bitcoin hit $50,000 on Sunday to reach a more than 3-month high, as the cryptocurrency continues to rebound.

The digital coin rose above that level around 10:40 p.m. ET on Sunday, according to data from Coin Metrics. It last traded at $49,471.77 at 4:55 p.m. ET on Monday.

Bitcoin hit an all-time high over $64,000 in April but sold off heavily in June and July, even dipping below $30,000. One of the major reasons was renewed regulatory scrutiny from Chinese authorities, which forced bitcoin mining operations to shut down and move elsewhere.

But since mid-July, bitcoin has been on a steady rise.

In the last few days, two key announcements have been positive for the cryptocurrency space. Last week, Coinbase said it would buy $500 million in crypto on its balance sheet and allocate 10% of profits into a crypto assets portfolio.

Vijay Ayyar, head of business development at cryptocurrency exchange Luno, said there was a lot of buying around the $29,000 to $30,000 level when bitcoin was roughly at a 50% discount to April's all-time high.

"Lots of large players took advantage of those prices," Ayyar said, adding that bitcoin could move "to test all-time highs again."

The value of the entire cryptocurrency market stood above $2.16 trillion on Sunday, according to data from Coinmarketcap. It crossed the $2 trillion mark for the first time since May earlier this month.

Read the rest here:
Bitcoin tops $50,000, hitting a more than 3-month high - CNBC

Read More..

Market Wrap: Bitcoin Stalls Near $50K Ahead of Options Expiry – CoinDesk

Bitcoin stalled after approaching the $50,000 resistance level on Monday. The cryptocurrency was trading at about $49,500 at press time and is up about 8% over the past week. Analysts expect a period of consolidation ahead of Friday's option expiration date and news from the Federal Reserve's annual economic policy symposium in Jackson Hole, Wyo.

The trend is bullish; however, caution is to be exercised at these levels due to the decline in volume as well as resistance from April and May, Marcus Sotiriou, a trader at GlobalBlock, wrote in an email to CoinDesk.

$51K would be a natural place for a short-term pause in the rally, Katie Stockton, managing director of Fairlead Strategies, wrote in a Monday newsletter.

Long-term momentum behind bitcoin has strengthened and the 200-day (40-week) moving average is rising again, supporting a bullish long-term outlook, she wrote.

Latest prices

Several analysts noted that extreme overbought conditions have unwound since April, which is providing support for the crypto rally.

"Right now, bitcoin and other cryptos have enjoyed technical support (as they were becoming mildly oversold)," Santiago Espinosa, a strategist at MRB Partners, wrote in an email to CoinDesk. "At this juncture, some cryptos can continue to do well if policymakers neglect inflationary pressures and regulatory issues don't become a mainstream problem."

Bitcoin options expiry

Roughly 25% of bitcoin options open interest is set to expire on Friday. The largest concentration of open interest is seen at the $50,000 strike price, which is also a key technical resistance level.

"Despite implied volatility softening over the past few weeks, $50K is a large psychological barrier and the open interest concentration could prove choppy going into expiration," Gregoire Magadini, co-founder and CEO of Genesis Volatility, wrote in a Telegram chat.

The bitcoin options market is placing a 45% chance on BTC trading above $50,000 by the end of September, according to options data provider Skew.

Chart shows open interest at various strike prices.

Bloomberg's McGlone still bullish on bitcoin

Bloomberg Intelligence's Mike McGlone, who won plaudits last year for being among the most prominent analysts predicting that bitcoin would go to $50,000, sees further upside now that the largest cryptocurrency has returned to the mark following a steep market correction.

"Bitcoin, gold and long bonds are top assets set to outperform in the second half of 2021," McGlone wrote Monday in a report. "The firstborn crypto may have solved the age-old problem of a global reserve asset that's easily transportable and transactionable, has 24/7 price discovery, is relatively scarce and is nobody's liability or project."

BTC holdings rise

The percentage of profitable bitcoin addresses (those holding BTC valued above its cost basis) reached a three-month high, according to Glassnode data.

"The decline in realized losses of late could indicate that investors have found renewed conviction to hold on, or are potentially taking exits that are closer to their original cost basis, as price recovers towards the $50K range," Glassnode wrote in a Monday blog post.

Chart shows amount of held "lost" coins with price overlay.

Crypto fund inflows

Crypto funds saw $21 million of net inflows last week as digital-asset markets rallied, pushing the total assets under management (AUM) to $57.3 billion, the highest level since May, a new report shows.

The latest data reflected a reversal after six consecutive weeks of outflows, according to the report Monday by digital-asset manager CoinShares.

Funds focused on Solana's SOL token saw the largest inflow among all digital assets, at $7.1 million last week, the report shows. The token hit an all-time high of $82 on Saturday, according to Messari.

Investors redeemed $2.8 million from bitcoin-focused funds last week, the seventh consecutive week of outflows, despite the largest cryptocurrency's price upturn. The run matched the streak of outflows recorded in early 2018, the report noted. That was just before the crypto winter, when cryptocurrency prices tanked and failed to return to all-time highs for more than two years.

Crypto funds netted $21 million of inflows last week, snapping a six-week streak of outflows.

Altcoin roundup

Relevant news:

Other markets

Most digital assets on CoinDesk 20 ended up higher on Monday.

Notable winners of 21:00 UTC (4:00 p.m. ET):

Follow this link:
Market Wrap: Bitcoin Stalls Near $50K Ahead of Options Expiry - CoinDesk

Read More..