
Shiba Inu Team Says Altcoin Season Is Coming – The Crypto Basic

Shiba Inu's marketing lead, Lucie, hints at an imminent altcoin season, a period characterized by increased activity and rising prices for alternative cryptocurrencies like SHIB.

The marketing lead made the assertion while reacting to BlackRock's recent promotional video for its iShares Ethereum Trust ETF (ETHA).

In the video, BlackRock's U.S. Head of Thematic and Active ETFs, Jay Jacobs, highlighted Ethereum's unique features. According to Jacobs, Ethereum's appeal lies in its utility.

He said Ethereum can be likened to a global platform for decentralized applications, noting the growing demand for ETH. As interest in Ethereum increased, BlackRock introduced the iShares Ethereum Trust ETF to give investors exposure to the second-biggest cryptocurrency.

BlackRock was among the eight issuers that received the SEC's approval to launch a spot-based ETF for Ethereum.

The company's ETH ETF commenced trading yesterday at 9:30 AM (EST), shortly after the promotional video was published, and hit $22.5 million in volume in the first 15 minutes of trading.

Reacting to BlackRock's ETHA promotional video, Lucie noted that despite the growing interest in crypto, critics still spread FUD (fear, uncertainty, and doubt), causing concern among investors.

She suggested that BlackRock released an Ethereum ETF video for potential investors amid this panic. As a result, the marketing lead urged crypto investors to think big, adding that the altcoin season is on the way.

Most crypto investors are looking forward to this cycle's altcoin season, which could potentially lead to price spikes in digital assets other than Bitcoin, such as Shiba Inu.

Recall that during the 2020/2021 altcoin season, the price of Shiba Inu skyrocketed, rallying by millions of percent to an all-time high of $0.00008845 on October 28, 2021.

At the time, the market did not have many positive events compared to now. Consequently, people speculate that the recently introduced ETFs could usher in the biggest altcoin season yet. They project that funds will flow from the traditional financial sector into the altcoin market via the Ethereum spot ETFs.

In the meantime, experts have released bullish predictions for altcoins, including ETH and SHIB. As for SHIB, there are projections that the dog-themed token could surpass its ATH and set new records ranging between $0.0001 and $0.001 during the imminent altcoin season.

Whether Shiba Inu will hit these targets before the end of the 2024/2025 altcoin season remains to be seen.

Disclaimer: This content is informational and should not be considered financial advice. The views expressed in this article may include the author's personal opinions and do not reflect The Crypto Basic's opinion. Readers are encouraged to do thorough research before making any investment decisions. The Crypto Basic is not responsible for any financial losses.


Epic Altcoin Rally Predicted for August and September – Crypto News Flash

Risk warning and disclaimer: The contents of this website are intended solely for the entertainment and information of readers and do not constitute investment advice or a recommendation within the context of the Securities Trading Act. The content of this website solely reflects the subjective and personal opinion of the authors. Readers are requested to form their own opinions on the contents of this website and to seek professional and independent advice before making concrete investment decisions. The information found on this site is intended solely for information and personal use. None of the information shown constitutes an offer to buy or sell futures contracts, securities, options, CFDs, other derivatives or cryptocurrencies. Any opinions provided, including via e-mail, live chat, SMS or other forms of communication across social media networks, do not constitute a suitable basis for an investment decision. You alone bear the risk for your investment decisions.


Bitcoin Price Hovers Under $67,000 on Global Exchanges, Altcoin Values Rise After Price Correction Period – Gadgets 360

Several cryptocurrencies saw their prices rise on the crypto price chart on Friday, including the world's most widely used digital assets, Bitcoin and Ether. Bitcoin's price rose by 4.33 percent over the last 24 hours to trade at $66,945 (roughly Rs. 56 lakh) on international exchanges, as per CoinMarketCap. The value of BTC on Indian exchanges is still fluctuating in the aftermath of the WazirX hack. At the time of writing, Bitcoin was priced in the range of $53,785 (roughly Rs. 45 lakh) to $71,800 (roughly Rs. 60 lakh) on Indian exchanges.

"BTC has rebounded strongly from its weekly losses, driven by positive momentum and the kick-off of a Bitcoin conference yesterday. The asset is now consolidating within an ascending channel. This suggests that its upward trend remains intact, and investor profitability is currently positive," Vikram Subburaj, CEO of Giottus, told Gadgets360.

Ether also rose by 2.08 percent during the same period. At the time of writing, ETH was trading at $3,247 (roughly Rs. 2.71 lakh) on international exchanges. In India, Ether's price varies within the range of $2,554 (roughly Rs. 2.12 lakh) and $3,540 (roughly Rs. 2.96 lakh).

"Substantial outflows from the Grayscale Ethereum Trust (ETHE) have weighed on ETH, which is trading low despite small gains. Investors are increasingly worried about regulatory scrutiny and unfavourable market conditions, leading to a noticeable shift in sentiment," the CoinSwitch Markets Desk told Gadgets360.

Binance Coin, Solana, Ripple, Dogecoin, Cardano, and Avalanche increased in value.

Shiba Inu, Polkadot, Chainlink, and Litecoin prices rose alongside Bitcoin and Ether, but Qtum, Status, Circuits of Value, and Near Protocol prices fell on Friday.

"Bitcoin has traded sideways, while other altcoins are down significantly. Solana and its ecosystem tokens remain strong. Volatility is expected to remain high with today's PCE Price Index data announcement," the CoinDCX markets team told Gadgets360. China's surprise rate cut and the steepening of the US Treasury yield curve signalled panic, Avinash Shekhar, Co-founder and CEO of Pi42, told Gadgets360.

The overall crypto market cap rose by 3.27 percent over the last 24 hours. With this, the crypto market cap has come to $2.39 trillion (roughly Rs. 2,00,099 crore).

Cryptocurrency is an unregulated digital currency, not a legal tender and subject to market risks. The information provided in the article is not intended to be and does not constitute financial advice, trading advice or any other advice or recommendation of any sort offered or endorsed by NDTV. NDTV shall not be responsible for any loss arising from any investment based on any perceived recommendation, forecast or any other information contained in the article.


Leaders in the evolution of the liberal arts and sciences: SCHEV approves new W&M school – William & Mary

The evolution of the liberal arts and sciences took a significant step forward Tuesday.

The State Council of Higher Education for Virginia (SCHEV) approved William & Mary's School of Computing, Data Sciences, and Physics. The school aligns with W&M's academic mission and expands the university's ability to prepare students to thrive in a data-rich world.

The school brings together four of the university's high-performing units: applied science, computer science, data science and physics. These will move into the new school in the fall of 2025. The school will be the sixth at W&M since its founding and the first new one in over 50 years. A national search for the dean of computing, data sciences, and physics is underway.

"I appreciate SCHEV's shared commitment to preparing broadly educated, forward-thinking citizens and professionals," said President Katherine A. Rowe. "The jobs of tomorrow belong to those prepared to solve tomorrow's problems. Machine learning, AI, computational modeling: these are essential modes of critical thinking and core to a liberal arts education in the 21st century."

While the school and its new administrative structure were officially approved Tuesday, its foundations are already in place. The school, brought to life by an extensive feedback and consultation process, will coalesce four programs currently operating within the Faculty of Arts & Sciences.

William & Mary's Board of Visitors unanimously approved the new administrative structure in November 2023. To be housed in the heart of campus with the completion of phase four of the Integrated Science Center in fall 2025, the school will be a space where graduate and undergraduate students excel in a combination of disciplines and where research opportunities will be expanded, continuing to attract world-class faculty and external investments.

"Innovation has been part of William & Mary since its inception, and this school will serve as the catalyst for countless new discoveries, partnerships and synergies," said Provost Peggy Agouris. "The School of Computing, Data Sciences, and Physics is launching at a pivotal time within these dynamic fields, and I'm incredibly proud to continue our journey of interdisciplinary growth and excellence across our undergraduate and graduate program offerings. I am grateful to SCHEV Council members for their belief in our vision and to all involved who made this a reality."

The university submitted the formal application to SCHEV, the state agency that governs new schools and new programs, earlier this spring.

In establishing a standalone school, William & Mary will grant more visibility and autonomy to these high-performing academic areas; it will also provide a single point of contact for external collaboration. The school will strengthen existing partnerships, for example with the Thomas Jefferson National Accelerator Facility in Newport News, while facilitating cooperation with external parties promoting scientific and technological advancement.

The four academic areas in the new school are already experiencing strong growth in external investment (over $9 million in 2023) and student numbers. Master's students from the new school's constituent areas represented one-third of all Arts & Sciences master's students, with this proportion rising to almost two-thirds when considering doctoral programs.

In the new structure, high-impact research in data-intensive fields will further converge with academic and professional career preparedness, meeting increased student and employer demand while achieving goals from the university's Vision 2026 strategic plan.

Undergraduate candidates will not apply to the school directly. W&M second-year students in good standing will be able to enter the school as long as they meet criteria established by the school and the major, and will continue to have the opportunity to double major or minor in areas offered by other W&M programs. Interdisciplinary collaborations between the school and the rest of the university will be expanded, combining cutting-edge innovation with William & Marys distinctive strengths in the liberal arts and sciences.

"We do our best work when we do it together," Agouris said. "Aligning our computer science, data science, applied science and physics programs under one school will deepen the university's impact on fields that are rapidly changing and increasingly important. Our students come here wanting to understand and change the world. Now more than ever, they will leave better equipped to do just that."

Antonella Di Marzio, Senior Research Writer


I Used to Hate Overfitting, But Now I'm Grokking It | by Laurin Heilmeyer | Jul, 2024 – Towards Data Science

The surprising generalisation beyond overfitting 8 min read

As someone who spent considerable time with various computer science topics, where mathematical abstractions can sometimes be very dry and abstract, I find the practical, hands-on nature of data science to be a breath of fresh air. It never fails to amaze me how even the simplest ideas can lead to fascinating results.

This article faces one of these surprising revelations I just stumbled upon.

I'll never forget how the implementation of my Bachelor's thesis went. While it was not about machine learning, it had a formative effect on me, and I constantly remind myself of it when tinkering with neural networks. It was an intense time: the thesis was about an analytical model of sound propagation that aimed to run in the browser, and the very limited performance there led to long-running simulations. The model constantly failed to complete after running for many hours. But the worst experience was interpreting wrongly configured simulations with confusing results, which often made me think the whole model was nonsense.

The same happens from time to time when I actually train neural networks myself. It can be exhausting to


WOC and BEYA STEM community mourn the loss of a pioneering data scientist and cybersecurity leader – BlackEngineer.com

Dr. Nefertiti Jackson, who was a prominent figure in data science and cybersecurity, has died.

Her obituary states that there will be a memorial service on Wednesday, July 24 at 11:00 AM at the Allen Temple Baptist Church, located at 8501 International Blvd, Oakland, CA. Additionally, there will be a viewing on Tuesday, July 23, at C. P. Bannon Mortuary, located at 6800 International Blvd, Oakland, CA.

Dr. Jackson was a distinguished data scientist and technical leader who made significant contributions to the field of cybersecurity.

Her career included roles at the National Security Agency (NSA), where she played a key role in identifying and addressing system inefficiencies.

She held degrees in mechanical engineering, biomedical engineering, and a Ph.D. in Applied Physics from the University of Michigan in conjunction with Howard University.

During her career, Dr. Jackson worked on various groundbreaking systems, including anomaly detection in network traffic and alert mechanisms for continuous health and systems monitoring.

She was also passionate about sharing her knowledge and expertise, regularly participating in STEM conferences and educational outreach programs.

In addition to her professional accomplishments, Dr. Jackson was deeply involved in promoting STEM education.

She served as a board member for a local private school and was instrumental in creating opportunities for young leaders to engage with cybersecurity and data science.

Dr. Jackson's impact extended to her alma mater, Tuskegee University, where she worked to establish a pipeline for future cybersecurity leaders focused on artificial intelligence (AI) and machine learning (ML).

Dr. Nefertiti Jackson's contributions to the fields of data science and cybersecurity, as well as her dedication to education and outreach, leave a lasting legacy.


UC San Diego Launches New School of Computing, Information and Data Sciences – HPCwire

July 22, 2024 — The University of California Board of Regents has approved the creation of the new School of Computing, Information and Data Sciences (SCIDS) at UC San Diego, a critical advance in UC San Diego's long history of leading innovation and education in artificial intelligence, computing and data science, disciplines that are rapidly reshaping modern life.

One of 12 schools at UC San Diego, and just the fourth to be added in the 21st century, SCIDS will bring together faculty across disciplines to improve the human condition by better understanding how data shapes society, and to prepare the next generation of highly skilled workers driving artificial intelligence advancements.

"The School of Computing, Information and Data Sciences exemplifies UC San Diego's commitment to addressing one of the most compelling needs of modern times: transforming data into actionable knowledge," said Chancellor Pradeep K. Khosla. "Computing and data literacy are key to meeting the needs of students and the state of California, advancing critical research areas like the future of artificial intelligence, and bolstering the university's mission of public service."

SCIDS will be enlivened by an anticipated 8,000 students, including many who will come through a robust community college pipeline, and more than 50 faculty across 16 academic disciplines.

The school will play a key role in advancing data science across all disciplines, as well as advancing state-of-the-art computing applications. Additionally, it will serve as a catalyst for increasing collaborations across existing schools, academic departments and disciplines to establish new fields of inquiry.

By translating data science from the classroom to research and the broader workplace, the school will prepare students for their careers by providing opportunities for them to engage directly with industry and government partners, including emergency responders; municipal, state and national resource management organizations; and nonprofits. Students will learn first-hand how data science can allow organizations to better address societal problems ranging from climate change mitigation and social justice issues, to technical challenges and healthcare.

"Pursuing cross-collaborative research opportunities and creating interdisciplinary educational programs are integral parts of the UC San Diego community," said Executive Vice Chancellor Elizabeth H. Simmons. "The School of Computing, Information and Data Sciences is just the latest example of our commitment to working across disciplines to expand knowledge in a burgeoning field and improve our community and our world. It fits perfectly within our educational structure."

Foundational Pillars of the New School

The new school combines the strengths of the San Diego Supercomputer Center (SDSC), a national leader in high-performance and data-intensive computing, and the Halıcıoğlu Data Science Institute (HDSI), a pioneering interdisciplinary institute that advances data science and AI education and research.

Together these resources form the foundational pillars of SCIDS and will position UC San Diego to support the growing demand for data science and computing expertise across the research and educational mission of the university.

As part of a national effort to address a shortage of advanced computing resources, the U.S. National Science Foundation (NSF) established the General Atomics-affiliated SDSC in the 1980s, which transformed academic scientific communities like UC San Diego.

In its 40-year history, SDSC has provided computing resources to a range of domestic stakeholders and federal agencies, state agencies tackling crises like extreme weather and wildfire, and UC San Diego and the UC system, providing researchers with in-house computational and data resources to accelerate scientific discovery.

"The new school at UC San Diego will grow our impact on society via translational computing, information and data sciences, and bring AI education to community and teaching colleges across California and the nation via our AI infrastructure," said SDSC Director Frank Würthwein, a founding faculty member of HDSI. "Combining our strengths with those of HDSI optimizes our leadership in innovation for science, technology, education and society."

UC San Diego's depth in technology-related domains, anchored by engineering and mathematics, deepened further when the university established HDSI in 2018 with philanthropic support from computer science and engineering alumnus Taner Halıcıoğlu.

Educating the next generation of machine learning engineers and data analysts, HDSI brings together an interdisciplinary team of faculty and researchers from areas ranging from computer science to communications to medicine. Working together, these researchers explore new computational methods, new mathematical models and guide societal and ethical impacts of data science.

HDSI, for example, is home to the NSF-funded AI Institute for Learning-Enabled Optimization at Scale (TILOS), which explores AI optimization and advances chip design, networks and contextual robotics.

"HDSI and SDSC share the unique challenge of building transdisciplinary academics and research. Their coming together under SCIDS will create new synergies and realize tremendous new possibilities in creating talent in emerging areas, including artificial intelligence," said HDSI Director Rajesh Gupta.

UC San Diego's undergraduate major in data science was first developed and shepherded in 2016 by the Department of Computer Science and Engineering before the degree program's transfer to HDSI. HDSI graduated its first class of bachelor's students in 2020 and initiated master's and doctoral programs in 2022. HDSI also offers a minor in data science with a growing population of students, and it is in the process of launching a joint M.S.-M.D. program with health sciences. Currently, there are 51 faculty appointments in HDSI. Student graduates include 814 bachelor's students between 2020 and 2024, and 22 master's students between 2022 and 2024. One Ph.D. student graduated in 2024.

Source: UC San Diego


Evolution of Data Science: New Age Skills for the Modern End-to-End Data Scientist | by Col Jung | Jul, 2024 – Towards Data Science

21 min read

In the 1980s, Wall Street discovered that physicists were great at solving complex financial problems that made their firms a bucket load of money. Becoming a quant meant joining the hottest profession of the time.

Twenty years later, in the late 2000s, as the world was on the cusp of a big data revolution, a similar trend emerged as businesses sought a new breed of professionals capable of sifting through all that data for insights.

This emerging field became known as data science.

In 2018, while completing my PhD modelling frontier cancer treatments, I transitioned from academia to industry, joining one of the largest banks in Australia.

I was joined by seven other PhD candidates from top universities across Australia, all specialising in different areas, from diabetes research and machine learning to neuroscience and rocket engineering.

Fascinatingly, all of us eventually found ourselves working in the bank's big data division, something we still joke about to this day.


Most Data Quality Initiatives Fail Before They Start. Here's Why. | by Barr Moses | Jul, 2024 – Towards Data Science

Show me your data quality scorecard and I'll tell you whether you will be successful a year from now. 7 min read

Every day I talk to organizations ready to dedicate a tremendous amount of time and resources towards data quality initiatives doomed to fail.

It's no revelation that incentives and KPIs drive good behavior. Sales compensation plans are scrutinized so closely that they often become a topic at board meetings. What if we gave the same attention to data quality scorecards?

Even in their heyday, traditional data quality scorecards from the Hadoop era were rarely wildly successful. I know this because prior to starting Monte Carlo, I spent years as an operations VP trying to create data quality standards that drove trust and adoption.

Over the past few years, advances in the cloud and metadata management have made organizing silly amounts of data possible.

Data engineering processes are starting to trend towards the level of maturity and rigor of more longstanding engineering disciplines. And of course, AI has the potential to streamline everything.

While this problem isn't, and probably never will be, completely solved, I have seen organizations adopt best practices that are the difference between initiative success and having another kick-off meeting 12 months later.

Here are 4 key lessons for building data quality scorecards:

The surest way to fail any data-related initiative is to assume all data is of equal value. And the only way to determine what matters is to talk to the business.

Brandon Beidel at Red Ventures articulates a good place to start:

I'd ask:

Now, this may be easier said than done if you work for a sprawling organization with tens of thousands of employees distributed across the globe.

In these cases, my recommendation is to start with your most business-critical business units (if you don't know which those are, I can't help you!). Start a discussion on requirements and priorities.

Just remember: prove the concept first, scale second. You'd be shocked how many people do it the other way around.

One of the enduring challenges to this type of endeavor, in a nutshell, is that data quality resists standardization. Quality is, and should be, in the eye of the use case.

The six dimensions of data quality are a vital part of any data quality scorecard and an important starting point, but for many teams that's just the beginning, and every data product is different.

For instance, a financial report may need to be highly accurate with some margin for timeliness, whereas a machine learning model may be the exact opposite.
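This use-case dependence can be sketched as a small scoring function. The dimension names, weights, and measurements below are illustrative assumptions for demonstration, not an industry standard:

```python
# Illustrative sketch: score the same data product against
# use-case-specific weights over common data quality dimensions.

DIMENSIONS = ["accuracy", "completeness", "consistency",
              "timeliness", "validity", "uniqueness"]

def quality_score(measurements: dict, weights: dict) -> float:
    """Weighted average of per-dimension scores, each in [0, 1]."""
    total_weight = sum(weights.values())
    return sum(measurements[d] * weights.get(d, 0) for d in DIMENSIONS) / total_weight

# A financial report prioritizes accuracy; an ML feature table, timeliness.
report_weights = {"accuracy": 5, "completeness": 2, "consistency": 2,
                  "timeliness": 1, "validity": 2, "uniqueness": 1}
ml_weights     = {"accuracy": 1, "completeness": 2, "consistency": 1,
                  "timeliness": 5, "validity": 2, "uniqueness": 1}

# Hypothetical measurements: accurate but stale data.
measured = {"accuracy": 0.99, "completeness": 0.95, "consistency": 0.97,
            "timeliness": 0.60, "validity": 0.98, "uniqueness": 1.0}

print(round(quality_score(measured, report_weights), 3))  # healthy for a report
print(round(quality_score(measured, ml_weights), 3))      # poor for the ML use case
```

The same table scores well for one consumer and badly for another, which is exactly why a single global score loses meaning.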

From an implementation perspective, this means measuring data quality has typically been radically federated. Data quality is measured on a table-by-table basis by different analysts or stewards, with wildly different data quality rules given wildly different weights.

This makes sense to a degree, but so much gets lost in translation.

Data is multi-use and shared across use cases. Not only is one person's yellow quality score another person's green, but it's often incredibly difficult for data consumers to even understand what a yellow score means or how it's been graded. They also frequently miss the implications of a green table being fed data by a red one (you know, garbage in, garbage out).
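One hedged way to surface that "green fed by red" problem is to propagate scores along table lineage, capping each table's effective score at its worst upstream dependency. The lineage graph and scores below are a made-up example, not any particular platform's API:

```python
# Sketch: cap each table's quality score by the worst score among its
# upstream dependencies ("garbage in, garbage out").

def effective_scores(own_scores: dict, upstream: dict) -> dict:
    """own_scores: table -> local quality score in [0, 1].
    upstream: table -> list of tables it reads from (assumed to be a DAG)."""
    memo = {}

    def resolve(table):
        if table not in memo:
            deps = upstream.get(table, [])
            memo[table] = min([own_scores[table]] + [resolve(d) for d in deps])
        return memo[table]

    return {t: resolve(t) for t in own_scores}

own = {"raw_events": 0.55, "cleaned_events": 0.95, "revenue_report": 0.99}
deps = {"cleaned_events": ["raw_events"], "revenue_report": ["cleaned_events"]}

print(effective_scores(own, deps))
# revenue_report's "green" 0.99 is capped at 0.55 by its red upstream source.
```

A real system would weight rather than hard-cap, but even this crude minimum makes the downstream implications of a red table visible.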

Surfacing the number of breached rules is important, of course, but you also need to:

So then what else do you need? You need to measure the machine.

In other words, the components in the production and delivery of data that generally result in high quality. This is much easier to standardize. Its also easier to understand across business units and teams.

Airbnb's Midas is one of the better-known internal data quality score and certification programs, and rightfully so. They lean heavily into this concept. They measure data accuracy, but reliability, stewardship, and usability actually comprise 60% of the total score.

Many data teams are still in the process of formalizing their own standards, but the components we have found to highly correlate to data health include:

"Yay, another set of processes we're required to follow!" said no one ever.

Remember, the purpose of measuring data health isn't to measure data health. The point, as Clark at Airbnb put it, is to drive a preference for producing and using high-quality data.

The best practices I've seen here are to have a minimum set of requirements for data to be on-boarded onto the platform (the stick) and a much more stringent set of requirements to be certified at each level (the carrot).

Certification works as a carrot because producers actually want consumers to use their data, and consumers will quickly discern and develop a taste for highly reliable data.
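As a rough illustration of this stick-and-carrot split, one could gate onboarding on a minimum rule set and award certification tiers on stricter ones. The requirement names and tier definitions here are invented for the sketch, not Airbnb's or anyone else's actual criteria:

```python
# Hypothetical sketch of onboarding (stick) vs. certification tiers (carrot).

MINIMUM = {"has_owner", "has_description", "freshness_monitored"}
TIERS = [  # (tier name, extra requirements beyond the minimum), highest first
    ("gold",   {"sla_defined", "lineage_tracked", "quality_tests"}),
    ("silver", {"sla_defined", "quality_tests"}),
    ("bronze", {"quality_tests"}),
]

def onboarding_allowed(met: set) -> bool:
    # The stick: every minimum requirement must be met to get on the platform.
    return MINIMUM <= met

def certification(met: set):
    # The carrot: the highest tier whose extra requirements are all met.
    if not onboarding_allowed(met):
        return None  # not even on the platform
    for tier, extras in TIERS:
        if extras <= met:
            return tier
    return "uncertified"

met = {"has_owner", "has_description", "freshness_monitored",
       "quality_tests", "sla_defined"}
print(certification(met))  # silver: SLA and tests, but no lineage tracking
```

Keeping the tier definitions as plain data makes them easy to surface to consumers, which matters because the certification label only works as a carrot if people can see and understand it.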

Almost nothing in data management is successful without some degree of automation and the ability to self-serve. Airbnb discarded any scoring criteria that 1) wasn't immediately understandable and 2) couldn't be measured automatically.

Your organization must do the same. Even if it's the best scoring criteria that has ever been conceived, if you do not have a set of solutions that will automatically collect and surface it, into the trash bin it must go.

The most common ways I've seen this done are with data observability and quality solutions, and with data catalogs. Roche, for example, does this and layers on access management as part of creating, surfacing and governing trusted data products.

Of course this can also be done by manually stitching together the metadata from multiple data systems into a homegrown discoverability portal, but just be mindful of the maintenance overhead.

Data teams have made big investments into their modern data and AI platforms. But to maximize this investment, the organization, both data producers and consumers, must fully adopt and trust the data being provided.

At the end of the day, what's measured is managed. And isn't that what matters?


From Ephemeral to Persistence with LangChain: Building Long-Term Memory in Chatbots – Towards Data Science

8 min read

In a previous article, I wrote about how I created a conversational chatbot with OpenAI. However, if you have used any chatbot interfaces like ChatGPT or Claude, you will have noticed that when a session is closed and reopened, the memory is retained and you can continue the conversation from where you left off. That is exactly the experience I want to create in this article.

I will use LangChain as my foundation, which provides amazing tools for managing conversation history and is also great if you want to move to more complex applications by building chains.

Code for recreating everything in this article can be found at https://github.com/deepshamenghani/langchain_openai_persistence.

I will start by creating a loop for the user to input questions for the chatbot. I will assign this to the variable humaninput. For now, instead of an LLM output
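The excerpt stops before the code, but the core idea, persisting message history so a new session continues the old one, can be sketched with nothing but the standard library. The file name, message format, and echo stand-in below are my assumptions for illustration, not the article's actual LangChain code:

```python
import json
from pathlib import Path

HISTORY_FILE = Path("chat_history.json")  # hypothetical file name

def load_history() -> list:
    """Reload prior messages so a new session continues where the last ended."""
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []

def save_history(history: list) -> None:
    HISTORY_FILE.write_text(json.dumps(history, indent=2))

def chat_turn(history: list, human_input: str) -> str:
    # Stand-in for an LLM call; a real app would send `history` to the model.
    reply = f"Echo: {human_input}"
    history.append({"role": "human", "content": human_input})
    history.append({"role": "ai", "content": reply})
    save_history(history)  # persist after every turn, so a crash loses nothing
    return reply

def main():
    # Simple REPL: quit, rerun the script, and the memory persists on disk.
    history = load_history()
    while True:
        human_input = input("You: ")
        if human_input.lower() in {"quit", "exit"}:
            break
        print("Bot:", chat_turn(history, human_input))
```

With LangChain, the same role would be played by one of its chat-message-history classes wired into the chain, but the persistence pattern, load on startup and save after each turn, is the same.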
