
Crypto Price Today: Bitcoin holds above 28k, Ethereum and other tokens gain – CNBCTV18

SUMMARY

Cryptocurrencies gained on Wednesday ahead of the FOMC rate decision. Bitcoin traded above $28,000-mark. The global crypto market cap stood at $1.18 trillion, with a volume of $67.2 billion in the past 24 hours.

1 / 7

Bitcoin | The world's largest and most popular virtual currency, Bitcoin, rose 1.4 percent to $28,176.4. Its market value stood at $546.6 billion. The trade volume was at nearly $35.6 billion.

2 / 7

"Bitcoin surged to the $28,000 threshold, ahead of the US FOMC meeting, which would determine whether to raise interest rates again. Many investors and traders expect interest rates to rise at a slower pace of 25 basis points. This follows the announcement by the U.S. that it will consider ways to secure client deposits at struggling banks in the wake of a recent crisis. The FED's decision is expected to result in some market volatility," says Edul Patel of Mudrex.

3 / 7

Ethereum | The second largest virtual currency, Ethereum, or Ether, jumped 5.7 percent to $1,800.8 at the last count, with a market capitalisation of nearly $221.3 billion. The trade volume of Ethereum was almost $10.8 billion in the last 24 hours.

4 / 7

Dogecoin | Meme-based virtual currency, Dogecoin, jumped 5.8 percent to $0.1. Its market value stood at $10.1 billion. The trade volume was at nearly $664.4 million.

5 / 7

Solana | Solana fell 0.6 percent to $22.3 with a market capitalisation of $8.6 billion. The trade volume of Solana was $697.6 million in the last 24 hours.

6 / 7

Shiba Inu | Shiba Inu rose 3.5 percent with a market capitalisation of almost $6.5 billion. The trade volume was almost $334.3 million in the last 24 hours.

7 / 7

Polygon | Polygon gained 3.2 percent with a market capitalisation of $10.1 billion. The trade volume was $543 million in the last 24 hours.

Visit link:

Crypto Price Today: Bitcoin holds above 28k, Ethereum and other tokens gain - CNBCTV18

Read More..

NYAG calls Ethereum a security and destroys decentralization myth – CoinGeek

Ethereum's developers have locked in a date for the network's next big upgrade, dubbed Shanghai: April 12. While ETH fans will laud the update for introducing the ability to unstake ETH, it couldn't be happening at a worse time.

SEC chair Gary Gensler indicated again this week that proof-of-stake models, like those used by Ethereum, are securities. Meanwhile, the NYAG filed charges against the KuCoin exchange for illegally selling securities and specifically named ETH as a security, in large part due to the fact that Ethereum is not decentralized after all.

Add to that the fact that the past 12-24 months have seen legislators, regulators, and law enforcement all clamp down on illegalities within the industry in different ways. For instance, in January, the SEC took action against lend-to-earn programs jointly operated by Genesis and Gemini on the basis that they amounted to illegal securities offerings. Nexo Capital paid the SEC a $45 million settlement over its lend-to-earn program around the same time.

Most notable about these actions is that they confirm that the SEC is beginning to deconstruct the myth of decentralization and the role it plays in determining whether a given asset offering amounts to a security. Look at the language used by Gensler in an interview with The Block in 2021, right as the SEC began to look at lend-to-earn products:

I would note to your readers that if you're investing on a centralized exchange or a centralized lending platform, you no longer own your token. You've transferred ownership to the platform. All you have is a counterparty risk. And that platform might be saying, as many of them do, we'll give you a four percent or seven percent return if you stake your coins with us or you actually transfer ownership and we the platform will stake your tokens. That takes on all the indicia of what Congress is trying to protect under the securities laws.

This should represent an enormous red flag for Ethereum and BTC, two projects that the SEC has long considered the rare exceptions to its general position that most digital assets are, in fact, securities. That's because these exceptions arise from the SEC's assumption that both projects are decentralized: without a centralized body governing projects like these, according to the SEC's logic, it can't be said that investors had a reasonable expectation of profits relying on the efforts of others, as is required by the Howey test for securities (their belief that BTC is not a security is also based on a misguided interpretation of that project's history).

These exceptions were always highly tenuous and at odds with reality. As time has gone on, they have become increasingly difficult to justify. Now, it seems, regulators and law enforcement have finally seen the lie for what it is.

Ethereum was always a security, but now it's obvious

The idea that Ethereum is anything but a security has always been a sham. This was obvious from Buterin's own marketing at the time of the ETH ICO right through to the language used on Ethereum websites today. The point is made repeatedly: ETH production will slow dramatically over time, increasing scarcity and, as a consequence, value.

The Ethereum Foundation website also refers to the fact that ETH is viewed as an investment.

But the biggest lie of all was that Ethereum is or has ever been decentralized. This was obvious back in 2016 when a decentralized autonomous organization (DAO) holding almost 15% of all ETH in circulation was hacked. Ethereum's developers, led by founder Vitalik Buterin, proposed, approved, and implemented the solution to fork the network and recover the stolen assets. There was the charade of a vote on the proposal, but just 6% of all ether holders participated, and 25% of the votes came from a single address, while the proposals at the heart of the vote were still authored by Ethereum's core team. This can be seen again as recently as this week with the Shanghai update announcement: if Ethereum is decentralized, then who is designing, proposing, and implementing these radical upgrades? That would be the very same people that intervened after the DAO hack, of course. This is not decentralization by any realistic definition.

The notorious Ethereum 2.0 upgrade, which moved the network's consensus mechanism from Proof-of-Work to Proof-of-Stake, did nothing to change this reality. If anything, the migration to Proof-of-Stake moved Ethereum even deeper into security territory. In contrast to Proof-of-Work, where miners are at least undertaking work of their own in solving hash puzzles, those staking under a Proof-of-Stake system are relying entirely on the work of others to deliver profits on whatever coins have been staked. In fact, this week, Gensler again indicated that Proof-of-Stake networks are likely to trigger U.S. securities laws, saying:

Whatever they're promoting and putting into a protocol, and locking up their tokens in a protocol, a protocol that's often a small group of entrepreneurs and developers are developing, I would just suggest that each of these token operators seek to come into compliance, and the same with the intermediaries, he said, as reported by The Block.

Authorities are already taking action on the basis that ETH is a security

This sea change by authorities is not just speculation about some hypothetical future enforcement drive. Instead, the enforcement drive is already here.

Last week, the New York Attorney General filed charges against the exchange KuCoin over failing to register as a securities and commodities broker-dealer: ETH, which is listed by the exchange, was expressly labeled a security in the charges. The language used by the NYAG couldn't be clearer:

ETH's development and management is largely driven by a small number of developers who hold positions in ETH and stand to profit from the growth of the network and the related appreciation of ETH.

Further down:

Buterin and the Ethereum Foundation retain significant influence over Ethereum and are often a driving force behind major initiatives on the Ethereum blockchain that impact the functionality and price of ETH.

The language could have been taken straight from Howey: ETH is not decentralized, and the continued development of the network is closely governed by a core group of developers who promote the asset on the basis that its value will increase over time. It is a classic security.

The NYAG's action is against KuCoin rather than anyone directly connected to Ethereum, but between Ethereum's recent upgrades, Gensler's public statements on centralization and the SEC's recent enforcement actions, it seems that SEC charges against Ethereum and its developers directly can't be far away. The ramifications of an SEC case against Ethereum could be much wider reaching: the SEC is empowered to levy fines and penalties against those violating the U.S. securities regime, including disgorgement of all profits made in connection with an unregistered securities offering and injunctions against carrying on further business. In short, it could spell the end of Ethereum altogether.

And if the myth of decentralization with respect to ETH has finally been pierced, then you can bet that BTC is up for the same treatment.

Follow CoinGeek's Crypto Crime Cartel series, which delves into the stream of groups, from BitMEX to Binance, Bitcoin.com, Blockstream, ShapeShift, Coinbase, Ripple, Ethereum, FTX and Tether, who have co-opted the digital asset revolution and turned the industry into a minefield for naive (and even experienced) players in the market.

New to Bitcoin? Check out CoinGeek's Bitcoin for Beginners section, the ultimate resource guide to learn more about Bitcoin, as originally envisioned by Satoshi Nakamoto, and blockchain.

Original post:

NYAG calls Ethereum a security and destroys decentralization myth - CoinGeek

Read More..

636,000 Ethereum (ETH) Worth $1.1B Lost Forever – The Crypto Basic

Coinbase Director Makes Shocking Revelation On $1.1B+ Ethereum (ETH) Lost Forever.

Coinbase Director of Project Strategy & Business Operations Conor Grogan has taken to his official Twitter account to shed light on the vast amount of ETH coins lost forever. Conor revealed that up to 636,000 ETH, worth $1.15B+ and equal to 0.5% of all circulating supply, will never be recovered.
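As a quick, illustrative consistency check (not part of Grogan's thread), those figures line up with an ETH price of roughly $1,810 and a circulating supply of roughly 120 million ETH, both approximate assumptions for March 2023:

```python
# Rough consistency check of the reported figures; the price and circulating-supply
# values are approximate assumptions, not data from the Twitter thread.
lost_eth = 636_000
eth_price_usd = 1_810               # approximate ETH price at the time
circulating_supply = 120_500_000    # approximate ETH circulating supply

print(f"Value of lost ETH: ${lost_eth * eth_price_usd / 1e9:.2f}B")   # ~ $1.15B
print(f"Share of supply:   {lost_eth / circulating_supply:.2%}")      # ~ 0.5%
```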

The Coinbase director categorized these losses as Ethereum typos, user errors, and buggy contracts. Sharing his thoughts via a Twitter thread, Conor said the number is substantial because the coins can't be sold, while admitting that crypto trading can be difficult at times.

Conor Grogan also listed some of the biggest losses he found. Conor started with the Web3 Foundation's loss of 306K ETH, worth $538M, due to the Parity multisig bug, and then Quadriga's 60K ETH, worth $108M, lost to a faulty contract.


Also on the list is Akutar's loss of 11.5k ETH because of a flawed NFT mint, plus the 24k ETH collectively sent by people to a burn address for unknown reasons.

He further said the $1.1B+ number significantly undershoots the actual lost/inaccessible ETH amount; it just covers instances where Ethereum is locked forever. For example, it doesn't cover lost private keys or things like Genesis wallets that have presumably been forgotten.

According to IntoTheBlock stats, 66,000 Ethereum have been burned from the supply.

ETH currently trades at $1,810.33 with a 24-hour trading volume of $10,612,154,207 (10.6B).


Read more from the original source:

636,000 Ethereum (ETH) Worth $1.1B Lost Forever - The Crypto Basic

Read More..

Institutional Investors Shift Focus On These Altcoins, Sell off Bitcoin and Ethereum – Coinpedia Fintech News

Bitcoin price has reached a crucial resistance level of around $28k as traders await tomorrow's Federal Reserve interest rate decision. With over $126 million liquidated in the past 24 hours, Bitcoin's volatility is expected to increase before and after the FOMC statement. The Bitcoin market continues to enjoy a bullish sentiment fueled by the increased fear of a global banking crisis.

The pressure on Jerome Powell to save the banking industry and reduce dollar inflation to 2 percent has fueled the recent Bitcoin pump. Moreover, Bitcoin has annual supply inflation of less than 2 percent, and next year's halving event will reduce that figure much further. As such, institutional investors and retail traders have gained confidence in stashing more Bitcoin to flee the inflationary fiat market.
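The sub-2-percent figure follows from Bitcoin's issuance schedule; a back-of-the-envelope calculation (approximate values, assuming roughly one block every ten minutes and the March 2023 supply) looks like this:

```python
# Back-of-the-envelope Bitcoin issuance rate; block time, reward and supply figures
# are approximate assumptions, not exact chain data.
blocks_per_year = 6 * 24 * 365           # ~52,560 blocks at one block per ten minutes
reward_now = 6.25                        # BTC per block before the 2024 halving
reward_after_halving = 3.125             # BTC per block after the 2024 halving
circulating_supply = 19_300_000          # approximate BTC supply in March 2023

for label, reward in [("current", reward_now), ("post-halving", reward_after_halving)]:
    annual_issuance = reward * blocks_per_year
    print(f"{label}: ~{annual_issuance / circulating_supply:.2%} annual supply growth")
# current: ~1.70%, post-halving: ~0.85% -- both below the 2 percent figure cited above
```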

Furthermore, Bitcoin price has rallied over 70 percent YTD while the United States Dollar DXY index and the DOW Jones are down 0.41 and 2.69 percent, respectively, in the same timeline.

A recent report by CoinShares has, however, indicated a rather intriguing phenomenon. Reportedly, Bitcoin recorded a total of $113 million in outflows despite a 17 percent spike during the week. Notably, CoinShares reported that the overall outflow in the Bitcoin market was due to the need for liquidity rather than negative sentiment.

In stark contrast to the broader market, Bitcoin remained the focus of negative sentiment, seeing outflows in investment products totaling $113 million last week, with the last six weeks' outflows totaling $424 million, CoinShares noted.

Notably, the altcoin market, except Ethereum, which registered an outflow of $13 million last week, generally posted cash inflows of approximately $1.3 million. This is despite the fact that the number of Non-Zero Ethereum addresses just reached an ATH.

Read more:

Institutional Investors Shift Focus On These Altcoins, Sell off Bitcoin and Ethereum - Coinpedia Fintech News

Read More..

Bitcoin, Ethereum, Dogecoin Slide On Credit Suisse Worries – Benzinga

Major coins traded in the red on Wednesday evening as investors reacted to the possibility of a European banking crisis arising from worries surrounding Credit Suisse Group (NYSE: CS).


What Happened: Apex cryptocurrency Bitcoin (CRYPTO: BTC) experienced sharp volatility, dropping to $23,946 before quickly regaining ground to trade at $24,346. That was still well off its highs of the previous day, when it surged past $26,000 in response to mildly upbeat consumer price index inflation data for February.


Ethereum (CRYPTO: ETH) was down about 3.97% at below $1,700. Dogecoin (CRYPTO: DOGE) was trading at $0.069, down 7.78% in the last 24 hours.

At the time of writing, the global cryptocurrency market capitalization stood at $1.06 trillion, a decrease of 3.74% over the last day.

U.S. stocks traded mixed on Wednesday. The S&P 500 dropped 0.7%, while the Nasdaq Composite eked out a small gain, rising 0.05%.

See More: Best Crypto Day Trading Strategies

News Highlights: Coinbase Global (NASDAQ: COIN) on Wednesday said it was preparing to end support for six altcoins built on Ethereum: Rally (CRYPTO: RLY), DFI Money (CRYPTO: YFII), Mirror (CRYPTO: MIR), OMG Network (CRYPTO: OMG), Loom Network (CRYPTO: LOOM), and Augur (CRYPTO: REP).

Euler Finance sent a message to the hackers who caused the platform to lose nearly $200 million in assets: "If 90% of the funds are not returned within 24 hours, tomorrow we will launch a $1M reward for information that leads to your arrest and the return of all funds."

Analyst Notes: Bitcoin weakened as chaos across Wall Street saw another banking crisis trigger another wave of panic-selling of risky assets. Credit Suisse is a bigger story than SVB, and this has Wall Street extremely nervous. Bitcoin's decline isn't that bad when you consider how much pressure is hitting stocks, oil prices, and the euro. Banking turmoil could ultimately prove to be rather bullish for Bitcoin, but for now crypto weakness is justified, said Edward Moya, senior market analyst at OANDA.

Michael van de Poppe, founder and CEO of Eight Trading, urged people to buy Bitcoin and take a long-term horizon. Banks are down 10% on the day, as the dominoes start to fall apart. Saving a few banks doesn't say the system is safe, he said.

Charles Edwards, founder and CEO of investment firm Capriole, is confident that Bitcoin is finally seeing a "textbook perfect Bump & Run Reversal" bottom, and he believes the target for Bitcoin is over $100,000.

Data analytics platform Santiment said volatility was "up in a big way" due to bank collapses, the shifting news on interest rates, and mounting fears over USDC.

Read Next: Web3 Token Surpasses Bitcoin, Ethereum In Weekly Gains With 33% Surge On Microsoft Tie-Up

2023 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.

Visit link:

Bitcoin, Ethereum, Dogecoin Slide On Credit Suisse Worries - Benzinga

Read More..

3 Key Ingredients for Making the Most of State Data Work – Government Technology

Despite the boom in data science, government projects that involve large data sets and powerful data tools still have a surprisingly high failure rate. Government agencies are increasingly seeking ways to use the data they already collect to achieve their measurement, evaluation and learning goals, but they often do not have the capacity or the right mix of staff to carry out data projects effectively.

The Center for Data Insights at MDRC, a nonprofit, nonpartisan research organization, recently partnered with state agencies to develop and execute a variety of data projects. We learned that barriers to success are not primarily about technical issues or analytic methods. Rather, data projects need three essential ingredients to be successful: people, perseverance and project scoping.

For example, MDRC worked with the New York State Office of Temporary and Disability Assistance to explore factors associated with long-term cash assistance receipt. The agency gathered an 11-person cross-functional team that included researchers, programmers, employment experts and operational staff members who worked regularly with local offices. Team members who did not have technical expertise provided content expertise and contextual information that were instrumental for both data quality assurance and interpretation of the analysis. The collaborative process prompted the technical staff to ask questions such as "How can different local offices use this analytical information in a practical way?" as they conducted their data analysis.

Perseverance is essential to success in any data project. Teams using new data techniques often go through a hype cycle in which high expectations for exciting results from a planned data analysis are frustrated by an analytic challenge. Successful teams persevere and adjust their original plans as needed.

The Colorado Department of Human Services was exploring the use of supportive payments, which are additional cash payments that can be used for the basic needs of the most vulnerable families who participate in the Temporary Assistance for Needy Families (TANF) program. They first wanted to know how the timeliness of certain types of supportive payments were related to employment outcomes, but the way the data had been recorded and tracked did not allow them to analyze data by payment type. Once they adjusted their research question to investigate the relationship between payment receipt and employment, they found selection bias issues that led to misleading findings about supportive payments. The team then tried several different ways to reduce the bias before identifying the approach that more accurately estimated the positive contribution of supportive payments to employment outcomes.
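The article does not name the bias-reduction approach the Colorado team eventually settled on. For readers curious what such an adjustment can look like, here is a minimal sketch of one standard technique for observational comparisons of this kind, inverse propensity weighting, using invented column names and simulated data rather than the agency's actual method or records:

```python
# A minimal, illustrative sketch of inverse propensity weighting, one standard way to
# reduce selection bias in an observational comparison like the supportive-payments
# analysis described above. Column names and data are invented assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000
df = pd.DataFrame({
    "prior_earnings": rng.gamma(2.0, 500, n),
    "months_on_tanf": rng.integers(1, 48, n),
    "received_payment": rng.integers(0, 2, n),        # treatment indicator (illustrative)
    "employed_next_quarter": rng.integers(0, 2, n),   # outcome (illustrative)
})

# 1. Model the probability of receiving a supportive payment from observed covariates.
X = df[["prior_earnings", "months_on_tanf"]]
p = LogisticRegression(max_iter=1000).fit(X, df["received_payment"]).predict_proba(X)[:, 1]

# 2. Weight each person by the inverse probability of the treatment they actually got,
#    so the weighted treated and untreated groups look alike on those covariates.
treated = df["received_payment"].to_numpy() == 1
weights = np.where(treated, 1 / p, 1 / (1 - p))

# 3. Compare weighted employment rates: a rough, bias-adjusted estimate of the
#    association between payment receipt and later employment.
outcome = df["employed_next_quarter"].to_numpy()
effect = (np.average(outcome[treated], weights=weights[treated])
          - np.average(outcome[~treated], weights=weights[~treated]))
print(f"Weighted difference in employment rates: {effect:.3f}")
```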

Project scoping is a way to set boundaries on your project by defining specific goals, deliverables and timelines. Designers should make room to be agile as they determine the scope of their data projects. The idea is to start small and then use what you learn to build more complex and nuanced analyses.

For example, the Washington Student Achievement Council (WSAC), the agency that oversees higher education in the state of Washington, wanted to learn whether the United Way of King County's Bridge to Finish campaign, which provides services to college students who may be at risk of insecurity in food, housing or other basic needs, could help students persist and earn a degree. The project scope began with a simple task: specify demographic and service use characteristics of students that may be associated with academic persistence and determine if these characteristics are measurable with the available data. This allowed the team to focus on the questions that were answerable based on data quality and completeness: Did the program recruit and serve students from historically marginalized groups? Was the program model flexible enough to address students' most pressing needs?

If instead the project had been scoped to begin with more complexity, like building a predictive risk model to identify students who might not persist or complete college, the project would have been stymied because of insufficient data and an incomplete analytical tool. For the Bridge to Finish campaign, the simpler approach at the outset, with a project scope that was flexible enough to change as data challenges emerged, ended up leading to findings that were much more useful and actionable.

Setting up data projects for success is not primarily about data itself. Instead, it is about people who are planning, designing, and pushing through challenges together. Projects that are scoped effectively and that encourage project teams to persevere through challenges yield better results and richer findings, and ultimately help government agencies fulfill their missions.

Edith Yang is a senior associate with the Center for Data Insights at MDRC, a nonprofit, nonpartisan research organization.

More here:

3 Key Ingredients for Making the Most of State Data Work - Government Technology

Read More..

How has national wellbeing evolved over time? – Economics Observatory

National wellbeing is normally measured by surveying individuals and collating their responses. There are many well-known national and international sources, including the World Values Survey, the World Happiness Report and Eurobarometer.

Unfortunately, these measures only give us between ten and 50 years of data, which is not ideal for forming a long-run understanding of how national wellbeing has changed over time. We can supplement these measures with modern techniques from data science including text analysis, which make it possible to infer mood from the language that people use.

This technique allows us to roll back measures of national wellbeing to around 1800 and gives us considerable insight into how national wellbeing has evolved over time. In particular, we can see that income matters for wellbeing but perhaps not by as much as we might have thought. Aspirations matter too.

Health correlates well with wellbeing as we might expect, but perhaps the most important factor in keeping wellbeing levels high has been avoiding major conflicts. This analysis provides us with some understanding of the most striking peaks and troughs of human happiness over time.

National wellbeing is far from a new concept, but it has become increasingly normalised as a potential policy objective for governments as data have become more readily available.

Watershed moments include when the United Nations (UN) asked member countries to measure happiness and use the data to guide policy in 2011, publication of the first World Happiness Report in 2012 and the UN International Day of Happiness. This annual occasion was first celebrated in 2013 and has since become a global focus for all things related to happiness.

The UN's World Values Survey has contained a question on happiness since 1981. This initially covered 11 countries but the number had risen to 100 in the 2017-22 wave. Other regional or national surveys provide slightly longer duration data.

Eurobarometer, a public opinion survey in the European Union, is probably the most well-known of these surveys. It has data on life satisfaction going back to 1972 for a selection of European countries. The World Happiness Report also includes global data on wellbeing that amounts to around ten years' worth of data.

What this means is that we have a maximum of around 50 years of data for a small number of countries, and perhaps ten years for most others. This is not enough to enable us to understand fully how an important socio-economic variable changes over time. Neither does it allow us to analyse how wellbeing responds to major social or economic shifts, wars, famines, pandemics and many other big events that tend to occur relatively rarely.

To go back further, we have to move beyond traditional methods of data collection and rely on non-survey methods.

Our work explores how we can measure national wellbeing before the 1970s using text data from newspapers and books. The principle is that people's mood can be extracted from the words that they use (Hills et al, 2019). This allows us to supplement traditional methods by constructing a long-run measure of national wellbeing going back 200 years.

Many national and international surveys measure reported wellbeing. For example, the World Values Survey includes a typical question: Taking all things together would you say you were very happy/quite happy/not very happy/not at all happy?

This use of a short ordered set of answers, or a Likert scale, is also used in the World Happiness Report though with an expanded range of zero to ten rather than just four possible responses. Respondents are asked to place their current level of wellbeing somewhere in this scale. This has led to the idea of the Cantril ladder since respondents are asked to think of each number as a rung on a ladder.

Other national surveys also follow the Likert approach. For example, Eurobarometer's life satisfaction measure asks respondents: On the whole, how satisfied are you with the life you lead: very unsatisfied/not very satisfied/fairly satisfied/very satisfied?

There is debate about how concepts such as happiness and life satisfaction differ. But most accept that life satisfaction is a longer-term measure, while current happiness is more vulnerable to short-term fluctuations.

Nevertheless, averaged across large numbers of respondents and across long periods, most measures that use words like wellbeing, satisfaction or happiness tend to be correlated. What all of these surveys have in common is the need to interview a large number of people, which is costly in terms of time and organisation. That explains why data tend to be annual. Unfortunately, this provides a limit on the speed with which we can build up a good supply.

To generate more data, especially from the past, we need to use non-survey methods. One approach is to make use of well-known results from psychology indicating that mood can be inferred from language. These insights have been used successfully at the individual level to pick up sentiment from social media posts and other sources (Dodds and Danforth, 2009).

To scale this to the national level, we need two things: a large body of text data (a corpus) and a way to translate text into numerical data (a norm). We use several examples of each, but to give a feel for how this works, Google have digitised millions of books published between 1500 and the present, allowing us access to billions of words. This is one of the core sources for our work.

The main norm used is Affective Norms for English Words, known as ANEW (Bradley and Lang, 1999). This converts words into numbers that measure happiness (text valence) on a scale of one to nine. For example, the word joy scores 8.21, while stress scores only 1.79.

We then shrink the set of words down to a common 1,000 that appear widely across time and different languages. Finally, we construct a weighted average of implied happiness in text for a number of different languages and periods. For example, we take the weighted average text valence for each year in books and newspapers published in the UK from 1800 to 2009, and we call this the National Valence Index (NVI).

To see how this works, imagine two years in which the number of words is the same but there is a shift from words like joy to words like stress. In this case, the weighted average text valence score would fall significantly.
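A minimal sketch of that calculation, assuming per-year word counts and an ANEW-style valence lexicon are already in hand (the tiny lexicon below reuses the joy and stress scores quoted above and invents the remaining values for illustration):

```python
# A minimal sketch of the weighted-average text valence (NVI-style) calculation
# described above. The lexicon and counts are illustrative, not the actual ANEW data.
valence = {"joy": 8.21, "stress": 1.79, "home": 7.0, "war": 2.0}  # word -> 1..9 score

def national_valence_index(word_counts: dict[str, int]) -> float:
    """Frequency-weighted average valence over words present in the norm."""
    total = sum(count for word, count in word_counts.items() if word in valence)
    return sum(valence[word] * count
               for word, count in word_counts.items() if word in valence) / total

# Two hypothetical years with the same number of words: shifting usage from
# "joy" towards "stress" pulls the index down, as in the example above.
year_a = {"joy": 120, "stress": 30, "home": 80, "war": 10}
year_b = {"joy": 30, "stress": 120, "home": 80, "war": 10}
print(national_valence_index(year_a), national_valence_index(year_b))
```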

Validation is crucial: we need to be sure that our measure corresponds with survey measures. It is also necessary to recognise and control for changes in language over time and, of course, variations in literacy and the purpose of literature.

First, this measure is highly correlated with survey results. Further, the correlation is positive: when the nation is happy (according to survey data), the text we read and write tends to be happy (high valence). The reverse is true when the nation is sad.

Second, the measure needs to control for language evolving over time. We do this by looking at the neighbourhood around words. Specifically, if we see that a word is surrounded by different words over time, this tends to mean that the word has changed meaning. In this case, it is removed from the 1,000, and we go down to the 500 most stable words those that have the same words in a neighbourhood around them. This study also includes controls for literacy. It is limited to the period post-1800 when literacy levels were high in the UK and when text data are mainly coming from novels (as opposed to a large share being religious texts or legal documents as in the 1600s).
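As a rough illustration of the neighbourhood idea (a simplified sketch, not the study's actual pipeline), one could compare a word's most frequent context words across two periods and drop words whose neighbourhoods overlap too little:

```python
# Illustrative sketch: a word whose surrounding words change a lot between periods has
# probably shifted in meaning and should be dropped from the stable word list.
from collections import Counter

def neighbours(tokens: list[str], target: str, window: int = 3, top_k: int = 20) -> set[str]:
    """Most frequent words appearing within `window` tokens of `target`."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            context = tokens[max(0, i - window): i] + tokens[i + 1: i + 1 + window]
            counts.update(context)
    return {w for w, _ in counts.most_common(top_k)}

def neighbourhood_stability(tokens_early: list[str], tokens_late: list[str], word: str) -> float:
    """Jaccard overlap of a word's neighbourhoods in two periods (1 = identical, 0 = disjoint)."""
    a, b = neighbours(tokens_early, word), neighbours(tokens_late, word)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Words scoring below some threshold (a modelling choice) would be removed
# before computing the valence index, e.g.:
# stable_words = [w for w in candidate_words if neighbourhood_stability(early, late, w) > 0.3]
```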

Using this text measure, we can document longitudinal shifts in happiness over time. But we need to be careful when interpreting graphical data. First, comparisons are best made over short durations. In other words, rates of change are always more valid than looking at long-run levels. Second, the quantity of data has risen over time, which makes more distant history more prone to error.

Figure 1 shows a book-based NVI measure for the UK. It highlights huge falls and rises surrounding the two world wars in the 20th century.

This provides a clue as to the major force that has driven wellbeing in the past: avoiding major conflicts. Analysis that looks at how our measure changes alongside variations in other major socio-economic variables also sheds light on other key drivers.

National income does correlate with national wellbeing, but the effect sizes are small. In other words, it takes a very large rise in national income to produce a small increase in wellbeing. National health, traced using proxy measures such as life expectancy or child mortality, unsurprisingly correlates with national wellbeing.

The data also show how powerful aspirations seem to be. To highlight this, we can look at the later 20th century. We see a sharp rise from 1945 up to 1957 (when Harold Macmillan famously said that the country had never had it so good), but then there is a slow decline through to 1978-79 (the aptly named Winter of Discontent).

In line with current thinking on what influences wellbeing, this seems to reflect expectations. In the period following the Second World War, hopes were high. But it seems that they were not fully realised, pushing wellbeing down. This occurred even though there were significant increases in productivity and national income, and improvements in technology between the 1950s and the 1970s.

Crucially, people seem to be largely thinking about their wellbeing relative to where they thought they might be. As a result, the 1950s seemed good relative to the 1940s, but the 1970s did not satisfy hopes relative to the 1960s.

Previous research has also argued that aspirations play a role in determining reported wellbeing (Blanchflower and Oswald, 2004). It has even been stated that more realistic aspirations are part of the reason why happiness rises after middle age for many people (Blanchflower and Oswald, 2008).

To use language to measure happiness, we need books and newspapers to have been digitised and norms to be available, which restricts the number of countries we can analyse. One way around this is to use audio data.

Music is sometimes called a universal language and a language of the emotions. It can be sad, happy, exciting, dull, terrifying or calming and these emotions can span different cultures and time periods.

Working with a group of computer scientists, we have developed a machine-learning algorithm that can recognise 190 different characteristics of sound and use these to estimate the happiness embodied in music (Benetos et al, 2022).

The algorithm first needed to be trained on sound samples where we already know the embodied happiness this is the equivalent of using a norm for text. The equivalent of the corpus of text is the music itself, and to maximise the chances of measuring national mood, we focus on top-selling music.
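As a loose sketch of how such a system might be assembled (this is not the 190-feature algorithm from Benetos et al, 2022; the library, features, file names and ratings are assumptions for illustration), one could extract summary audio features and fit a regressor on tracks whose embodied happiness has already been rated:

```python
# A hypothetical sketch of music-based mood estimation: summarise each track as a
# feature vector, then fit a regressor on tracks with known happiness ratings (the
# audio "norm"). Library choice, features, file names and scores are illustrative.
import numpy as np
import librosa
from sklearn.linear_model import Ridge

def song_features(path: str) -> np.ndarray:
    """Summarise a track as a small fixed-length feature vector."""
    y, sr = librosa.load(path, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)      # timbre
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)        # harmonic content
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)          # rough tempo estimate
    return np.concatenate([mfcc.mean(axis=1), chroma.mean(axis=1), np.atleast_1d(tempo)])

# Training data: top-selling tracks whose embodied happiness has already been rated.
rated_paths = ["rated_song_01.wav", "rated_song_02.wav", "rated_song_03.wav"]  # illustrative
rated_scores = [7.2, 3.1, 5.5]                                                 # illustrative 1-9 ratings

X = np.vstack([song_features(p) for p in rated_paths])
model = Ridge(alpha=1.0).fit(X, rated_scores)

# Estimate the mood embodied in a new chart hit.
print(model.predict(song_features("new_chart_hit.wav").reshape(1, -1)))
```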

This study finds that the mood embodied in a single popular song seems to be better at predicting survey-based mood than the vast amount of text data that we use. This seems remarkable until you remember that language contains a mixture of emotional content and information. This might explain why using music, which has a greater emotional content, could be a better way to capture wellbeing, especially for nations where text data are sparse.

Putting this all together, our hope is that as data science, computational power and behavioural science advance, our understanding of national wellbeing will continue to improve. This can only help policy-makers to develop a better understanding of how government policy or major shocks are likely to affect the wellbeing of the nation.

Visit link:

How has national wellbeing evolved over time? - Economics Observatory

Read More..

LabWare Announces Foundational Integration to Software and … – PR Newswire

LabWare shares how they will use data science and machine learning to allow their customers to save time, money and resources, giving them a competitive advantage.

PHILADELPHIA, March 21, 2023 /PRNewswire/ -- LabWare announced today at Pittcon 2023, the premier annual conference on laboratory science, that it is making data science and machine learning foundational to its software. This concept is unique to the industry, and will enable labs of the future.

"This integration will revolutionize the way that laboratories handle data, enabling them to uncover insights that were previously hidden," said Patrick Callahan, Director of Advanced Analytics, LabWare. "As a global leader of laboratory information management systems, we need to stay one step ahead in the industry, and we have a responsibility to our clients."

As the pandemic has changed the way of the world, LabWare has played a key role in making sure the labs around the world operated in making life saving discoveries and producing results.

LabWare actively works with public and private sector organizations worldwide to apply their considerable know-how and advanced technology to enhance workflow and operational efficiency in the lab. These efforts to increase laboratory testing capacity have met the unprecedented public health testing demands. Data serves as the fabric inside LabWare's application and maximizing its potential is foundational to their platform development.

"In today's day and age, there's a huge need to not only acquire data, but also understand it and apply it to scientists' and lab manager's tasks without taking them outside their normal work streams," Callahan said. "That's where LabWare analytics comes in, to help our customers explore and leverage the data they've acquired. This will be critical as we move into new methods of Automation and discovery."

Through client conversations, LabWare has found that making data science and machine learning foundational to what it does enables its clients to succeed in the lab and beyond.

"We intend to ensure our customers have the competitive advantage they need by leveraging our solutions," Callahan said.

To learn more about LabWare, visit www.labware.com. Visit LabWare at Pittcon, March 19-22, at Booth #2442.

LabWare is recognized as the global leader of Laboratory Information Management Systems (LIMS) and instrument integration software products. The company's Enterprise Laboratory Platform combines the award-winning LabWare LIMS and LabWare ELN, which enables its clients to optimize compliance, improve quality, increase productivity, and reduce costs. LabWare is a full-service informatics provider offering software, professional implementation and validation services, training, and world-class technical support to ensure customers get the maximum value from their LabWare products.

Founded in 1978, the company is headquartered in Wilmington, Delaware with offices throughout the world to support customer installations in over 125 countries.

Contact: Katie Zamarra, 917.379.5422, [emailprotected]

SOURCE LabWare

Read the rest here:

LabWare Announces Foundational Integration to Software and ... - PR Newswire

Read More..

Scientists bid to unlock the darkest secret of whisky's unique taste – Yahoo News UK

Scientists bid to unlock the darkest secret of whisky's unique taste (Image: Diageo)

FOR hundreds of years, it has been the process which gives whiskies their unique aroma, body and taste, although nobody is exactly sure how.

But now scientists from Heriot-Watt University in Edinburgh are working with whisky giant Diageo to quantify exactly how whisky gets its flavour from the cask during maturation.

Heriot-Watt University has assembled a team of experts for the project, which includes researchers from the International Centre for Brewing and Distilling (ICBD), but also scientists who specialise in chemistry, physics, machine learning and data science.


The three-year Knowledge Transfer Partnership (KTP) will investigate various new analytical methods to get to the bottom of what goes on in a whisky barrel.

The scientists will use spectroscopic methods to identify the natural compounds and understand the chemistry of the process that imparts such distinctive and complex flavours to the maturing spirit.

This information will be coupled with sensory input such as visual inspection of newly-manufactured barrels and the evolving flavour and nose of the whisky.

Together it will be used to develop a data science platform that will demystify the process of maturation.

Read More: Scapa Distillery launches Orkney whisky tourism experience

Professor Annie Hill from Heriot-Watt's ICBD said: Producing a quality Scotch whisky is an art.

The new KTP with Diageo is particularly exciting because it combines traditional and novel methods to generate big data that may be used to further understand whisky maturation.

The ability to more accurately predict the outcome of maturation based on the characteristics of the cask and new make spirit will enable significant improvements in inventory management and reduction of losses, leading to overall efficiencies in Scotch whisky production.


Until now, no-one has been able to pin down scientifically about what exactly happens inside the cask.

Read More: The Macallan single malt Scotch whisky in deal for Jerez sherry casks

The maturation of a whisky takes place in three essential steps, through complex chemical reactions described as additive, subtractive and interactive maturation.

During additive maturation, the distillate begins to draw flavours and colours from the wood which are then distributed throughout the liquid, and a large number of chemical compounds are formed.

In addition, during additive maturation, the whisky absorbs compounds from the liquid that was previously stored in the barrel, such as sherry.

After the whisky has built up a broad spectrum of aromas, subtractive maturation is about breaking down unwanted flavours.

The third area of maturation is the least fully understood and is called interactive maturation. This is where the peculiarity of the oak wood and the influence of the climate come into play.

During interactive maturation, gases enter and escape the cask and there is an exchange between the contents of the barrel and the environment through the barrel wall.

What enters or leaves is determined by temperature fluctuations and humidity, as well as the peculiarities of the warehouses and their location.

Matthew Crow, research partnerships manager with Diageo's Global Technical team, said: Scotch is matured for at least three years and often much longer, a process that enriches and refines its flavour.

Read More: Amber nectar goes green as whisky has a low carbon makeover

However, a barrel's potential for imparting flavour, and how the whisky will mature in that barrel, involves many complex factors.

The industry, and Diageo in particular, have a long history of research across whisky production, and Heriot-Watt's scientists will help us to take our understanding to a new scientific level.

Professor Martin McCoustra is an expert on the interaction of chemical substances with complex surfaces and will be coordinating the cross-disciplinary team from Heriot-Watt University. He said: We'll start with simple optical and ultraviolet imaging of freshly-prepared barrels and then use infrared and optical spectroscopies to give us their chemical fingerprints.

We'll also use nuclear magnetic resonance, which is the laboratory equivalent of the MRI scan you would get in hospital, and mass spectrometry to trace how the chemical fingerprint of the spirit changes.

These chemical fingerprints will include information on the natural compounds that contribute to the flavour of the maturing spirit. The skills of the coopers, distillers and blenders will give us a sensory evaluation of the barrel and evolving spirit.

All this data will be used to train a machine learning system that will predict what the flavour quality of the whisky could be. This could significantly enhance whisky production, giving better data upon which to base fundamental decisions, such as how long a whisky should stay in a barrel.
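To make that idea concrete, here is a minimal, hypothetical sketch of such a model, assuming binned spectral readings and simple cask metadata as inputs and blenders' sensory scores as the target; the feature names, data and model choice are illustrative, not the project's actual system:

```python
# A minimal, hypothetical sketch: predicting a flavour-quality score from a barrel's
# spectroscopic "chemical fingerprint" plus cask metadata. Features, data and model
# choice are illustrative assumptions, not Heriot-Watt's or Diageo's actual system.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_casks = 200

# Illustrative inputs: a compressed spectral fingerprint (e.g. binned IR/UV absorbances)
# plus a couple of cask variables, one row per barrel.
spectra = rng.normal(size=(n_casks, 50))                       # 50 spectral bins per cask
cask_age_years = rng.uniform(3, 18, size=(n_casks, 1))
previous_fill_sherry = rng.integers(0, 2, size=(n_casks, 1))   # 1 if ex-sherry cask
X = np.hstack([spectra, cask_age_years, previous_fill_sherry])

# Illustrative target: blenders' sensory flavour-quality score for each matured spirit.
y = rng.uniform(1, 10, size=n_casks)

model = RandomForestRegressor(n_estimators=200, random_state=0)
print("CV R^2:", cross_val_score(model, X, y, cv=5).mean())    # random data, so near zero

# In practice a trained model of this kind could inform decisions such as how long a
# spirit should stay in a given barrel before it reaches the desired flavour profile.
model.fit(X, y)
```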

Originally posted here:

Scientists bid to unlock the darkest secret of whisky's unique taste - Yahoo News UK

Read More..

Ottawa to host world-leading event on statistics and data science – Canada NewsWire

OTTAWA, ON, March 20, 2023 /CNW/ - Statistics Canada is pleased to announce that the 64th World Statistics Congress (WSC) will be held in Ottawa, Canada, from July 16 to 20, 2023, at the Shaw Centre.

The WSC is the leading international statistics and data science event, held every two years by the International Statistical Institute (ISI) since 1887. This marks the second time Canada has welcomed the WSC, having first hosted in 1963.

Statistics Canada is proud to support and participate in this year's event in the National Capital Region, which presents an opportunity to discuss the concrete statistical and data issues of our time, network and collaborate with experts, showcase data and statistical practices in Canada, and learn from practices and insights from other countries.

The four-day congress will provide a scientific program featuring hundreds of experts, including distinguished international statisticians, data scientists, industry leaders and renowned speakers from over 120 countries.

"The WSC is an incredible opportunity to discuss statistics that are crucial to decision making, share insights and learn from many other countries," says Anil Arora, Chief Statistician of Canada. "Statisticians have never been more relevant to helping solve global challenges than they are today."

"The congress encourages collaboration, growth, discovery and advancement in the field of data science," says ISI President Stephen Penneck. "I am excited to have the 64th World Statistics Congress visit Canada and look forward to the impact it will have on the industry. We are delighted to announce that one of the most influential statisticians, the former Director of the United States Census Bureau, Professor Robert M. Groves, will be joining as a keynote speaker."

Statistics Canada looks forward to welcoming experts in this field from around the world and taking part in presentations, panel discussions and more.

Associated links:

Registration information World Statistics Congress 2023 website

SOURCE Statistics Canada

For further information: Media Relations, Statistics Canada, [emailprotected]

Read the original here:

Ottawa to host world-leading event on statistics and data science - Canada NewsWire

Read More..