
Briefing On Vitalik Buterin's Long-Term Vision for The Ethereum Blockchain – Cryptopolitan

If you spend any time online, you've undoubtedly heard about Ethereum. As a reminder, Ethereum (ETH) is a decentralized blockchain platform that anyone can use to create digital technology. Software developers can build applications in finance, advertising, identity management, gaming, and web browsing, to name a few. Ether is the cryptocurrency that fuels the network and allows it to operate. Like Bitcoin, Ether can be used for payments, and with a growing number of companies accepting ETH, it can be spent in many places. Ethereum brings real value.

Ethereum helps create a decentralized computer, which makes smart contracts and DApps possible. Smart contracts are special kinds of programs that run when predetermined conditions are met; they operate on "if this, then that" logic. The apps run according to the given instructions, so there is no latency, restriction, deceit, or third-party interference. DApps, also called decentralized applications, exist on a peer-to-peer network of computers. More often than not, they're accessible via traditional web browsers such as Google Chrome and Firefox.
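The "if this, then that" behaviour can be sketched in plain Python. This is a hypothetical escrow example, not a real contract (production smart contracts would be written in a language like Solidity and executed by the EVM), and all names and amounts are invented for illustration:

```python
# Toy sketch of smart-contract "if condition, then action" logic in plain
# Python. A real Ethereum contract would enforce these rules on-chain.
class EscrowContract:
    def __init__(self, buyer: str, seller: str, amount_wei: int):
        self.buyer = buyer
        self.seller = seller
        self.amount_wei = amount_wei
        self.delivered = False
        self.paid = False

    def confirm_delivery(self, caller: str) -> None:
        # Only the buyer may confirm; the contract enforces this itself,
        # with no third party involved.
        if caller != self.buyer:
            raise PermissionError("only the buyer can confirm delivery")
        self.delivered = True
        self.release_payment()

    def release_payment(self) -> None:
        # If the predetermined condition is met, then funds move.
        if self.delivered and not self.paid:
            self.paid = True
            print(f"released {self.amount_wei} wei to {self.seller}")

contract = EscrowContract("alice", "bob", 10**18)
contract.confirm_delivery("alice")  # condition met -> payment released
```

The point of the sketch is that the rules live in the program itself: no intermediary decides whether payment is released.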

As opposed to Bitcoin, which has a limited scripting language, Ethereum contracts are typically written in Solidity, which compiles to bytecode executed by the Ethereum Virtual Machine (EVM). Solidity is similar in nature to C++ and fairly simple to learn. The Ethereum blockchain is capable of executing code of unmatched complexity. Ethereum 1.0 was an attempt to build a world computer; Ethereum 2.0, its proponents argue, will be that world computer, and it may subsume functions of the Internet as we know it. What is certain is that the Merge will make a difference to the Ethereum ecosystem (and beyond).

At the time of its launch, Ethereum was one of the most ambitious projects in the crypto space: Vitalik Buterin and his supporters wanted to change how the Internet works, and many argue that Ethereum is the Internet's next step. Ethereum 2.0, the upgrade to the blockchain network, will improve the speed, efficiency, and scalability of the network, and ETH will be used by more and more people. The transition to the Ethereum 2.0 world has been slow, though. In spite of this, adoption is still growing, and compared to other cryptocurrencies, transaction volume is higher.

For the time being, the priority is to address the restrictions of proof-of-work. The platform is moving to the proof-of-stake consensus mechanism, which promises to use about 99% less energy and to help reach 100,000 transactions per second. Validators are chosen based on the number of tokens they stake; those who put money into coins are, in effect, investing in the network's continued success. Proof-of-stake also has checks and balances in place that make it very difficult for validators to corrupt the system.
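The stake-weighted selection idea can be sketched in a few lines of Python. This is an illustrative toy, not Ethereum's actual proposer-selection protocol (which involves the beacon chain's RANDAO and fixed 32 ETH validator deposits); the validator names and stakes are invented:

```python
import random

# Hypothetical validators and their stakes (in ETH), for illustration only.
validators = {"alice": 32, "bob": 64, "carol": 160}

def pick_validator(seed: int) -> str:
    """Choose the next block proposer with probability proportional to stake."""
    rng = random.Random(seed)
    names = list(validators)
    weights = [validators[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

# Over many slots, the largest staker proposes most often -- the economic
# incentive the article describes: more stake, more influence, more to lose.
counts = {n: 0 for n in validators}
for slot in range(10_000):
    counts[pick_validator(slot)] += 1
print(counts)
```

Because a validator's chance of selection scales with what it has locked up, attacking the network means putting its own capital at risk.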

There's so much traffic on the Ethereum blockchain that the overload can result in high transaction fees. One solution to this problem is layer-2 chains. Chains like Polygon complete more transactions per second with lower gas fees; in fact, Polygon is the most widely adopted layer-2 solution for Ethereum. It processes transactions outside the mainnet, which explains the increased throughput. In case you didn't know, Ethereum's layer-2 solutions fall under several categories, namely channels, plasma, sidechains, rollups, and validium. Many of them are still undergoing research, testing, and implementation.

As highlighted by Vitalik Buterin, it will soon be possible to run an Ethereum full node on lighter hardware, with a single piece of client software. When a transaction is added to the blockchain, the full node validates it and ascertains that it complies with the Ethereum specification. A full node prunes its copy of the blockchain to save disk space, so full nodes don't store data all the way back to genesis. Most laptops are capable of running a full node, and the more nodes in existence, the less likely it is for a cyberattack to succeed.

Cryptocurrencies achieve decentralized security and trust through cryptography, which is the pillar of cryptocurrency processing. A sufficiently powerful quantum computer could break the algorithms used to encrypt transmitted information, allowing threat actors to intercept it. Quantum-resistant cryptography can protect data from such threats down the road, and Vitalik Buterin is looking ahead and plans to upgrade the Ethereum platform for quantum resistance. It is believed to be several years before Ethereum's current cryptographic signatures face a practical threat, but it's better to be safe than sorry.

Ethereum won't hide from quantum computers, so don't rush to sell your Ether just yet; try to imagine what it will become in the future. Wise investors don't sell their coins. Quite the opposite, actually: they withdraw liquidity from exchanges like Binance, which has a positive effect on the Ethereum price. Getting back on topic, competitions are regularly organized for researchers to standardize new cryptographic protocols that protect against quantum attackers. Better protocols are also needed to improve zero-knowledge proofs, which can be used to guarantee the correctness of programs executed on the Ethereum Virtual Machine, and ZK-proof systems can be made post-quantum secure.

Given the current and planned improvements to the Ethereum blockchain, Ethereum might well become the dominant chain as far as transaction volumes are concerned. As the platform increases its efficiency, it might work alongside multichain technologies; the emergence of a highly competitive ecosystem will enable ETH to expand its capacity and allow for remarkable results. One won't simply replace the other, that is for sure. Down the line, there will be more vertically focused blockchains for specific use cases, including healthcare and gaming.

All in all, Ethereum is working towards solving its problems. As businesses get funded via Ether, an ever-increasing number of people will become familiar with the digital asset. Startups that raised money through an ICO and survive in the long term will help promote the mainstream adoption of cryptocurrencies. We'll just have to wait and see what the future holds for ETH. Good things don't come easy.

Read the original post:
Briefing On Vitalik Buterins Long-Term Vision for The Ethereum Blockchain - Cryptopolitan

Read More..

Ray, the machine learning tech behind OpenAI, levels up to Ray 2.0 – VentureBeat


Over the last two years, one of the most common ways for organizations to scale and run increasingly large and complex artificial intelligence (AI) workloads has been with the open-source Ray framework, used by companies from OpenAI to Shopify and Instacart.

Ray enables machine learning (ML) models to scale across hardware resources and can also be used to support MLops workflows across different ML tools. Ray 1.0 came out in September 2020 and has had a series of iterations over the last two years.

Today, the next major milestone was released, with the general availability of Ray 2.0 at the Ray Summit in San Francisco. Ray 2.0 extends the technology with the new Ray AI Runtime (AIR) that is intended to work as a runtime layer for executing ML services. Ray 2.0 also includes capabilities designed to help simplify building and managing AI workloads.

Alongside the new release, Anyscale, which is the lead commercial backer of Ray, announced a new enterprise platform for running Ray. Anyscale also announced a new $99 million round of funding co-led by existing investors Addition and Intel Capital with participation from Foundation Capital.


"Ray started as a small project at UC Berkeley, and it has grown far beyond what we imagined at the outset," said Robert Nishihara, cofounder and CEO at Anyscale, during his keynote at the Ray Summit.

"It's hard to overstate the foundational importance and reach of Ray in the AI space today."

Nishihara went through a laundry list of big names in the IT industry that are using Ray during his keynote. Among the companies he mentioned is ecommerce platform vendor Shopify, which uses Ray to help scale its ML platform that makes use of TensorFlow and PyTorch. Grocery delivery service Instacart is another Ray user, benefitting from the technology to help train thousands of ML models. Nishihara noted that Amazon is also a Ray user across multiple types of workloads.

Ray is also a foundational element for OpenAI, which is one of the leading AI innovators, and is the group behind the GPT-3 Large Language Model and DALL-E image generation technology.

"We're using Ray to train our largest models," Greg Brockman, CTO and cofounder of OpenAI, said at the Ray Summit. "So, it has been very helpful for us in terms of just being able to scale up to a pretty unprecedented scale."

Brockman commented that he sees Ray as a developer-friendly tool, and the fact that it is a third-party tool that OpenAI doesn't have to maintain is helpful, too.

"When something goes wrong, we can complain on GitHub and get an engineer to go work on it, so it reduces some of the burden of building and maintaining infrastructure," Brockman said.

For Ray 2.0, a primary goal for Nishihara was to make it simpler for more users to be able to benefit from the technology, while providing performance optimizations that benefit users big and small.

Nishihara commented that a common pain point in AI is that organizations can get tied into a particular framework for a certain workload, but realize over time they also want to use other frameworks. For example, an organization might start out just using TensorFlow, but realize they also want to use PyTorch and HuggingFace in the same ML workload. With the Ray AI Runtime (AIR) in Ray 2.0, it will now be easier for users to unify ML workloads across multiple tools.

Model deployment is another common pain point that Ray 2.0 is looking to help solve, with the Ray Serve deployment graph capability.

"It's one thing to deploy a handful of machine learning models. It's another thing entirely to deploy several hundred machine learning models, especially when those models may depend on each other and have different dependencies," Nishihara said. "As part of Ray 2.0, we're announcing Ray Serve deployment graphs, which solve this problem and provide a simple Python interface for scalable model composition."

Looking forward, Nishihara's goal with Ray is to help enable a broader use of AI by making it easier to develop and manage ML workloads.

"We'd like to get to the point where any developer or any organization can succeed with AI and get value from AI," Nishihara said.


Read the original here:
Ray, the machine learning tech behind OpenAI, levels up to Ray 2.0 - VentureBeat


‘Machine Learning’ Predicts The Future With More Reliable Diagnostics – Nation World News

Headquarters of the Spanish National Research Council (CSIC).

A mammogram every two years for all women aged 50 to 69: since 1990, that has been the biggest screening challenge for Spain's national health system, and it aims to prevent one of the most common cancers in the country, breast cancer. The method uses X-rays to detect potentially cancerous areas; if something suspicious is found, the test is followed by further tests, often with a high probability of false positives, which are harmful and costly.

Those drawbacks are the main reason why screening is limited to the highest-risk groups. By adding predictive algorithms to mammograms, the risk areas of a patient's breasts could be narrowed down and the reliability of diagnosis increased to 90 percent. Screenings could therefore be done more often, and the age range of the women they target expanded.

Such a process already exists. It uses artificial intelligence and is being developed by a team at the Spanish National Research Council (CSIC), specifically at its Institute of Corpuscular Physics (IFIC). It is part of the field of machine learning applied to precision medicine, and of a research network that seeks to increase the efficiency with which each patient is treated and to optimize healthcare resources.

To understand how, you must first understand the concepts that come into play. The first is artificial intelligence: "the ability of a computer or robot to perform tasks normally associated with intelligent beings," as defined by Sara Degli-Esposti and Carlos Sierra, authors of the CSIC white paper on the subject. That is, processes used to replace human work with machines, with the aim of accomplishing it with greater accuracy and efficiency.

And where can artificial intelligence work in medicine today? On many fronts, replies Dolores del Castillo, a researcher at CSIC's Centre for Automation and Robotics: from administration to the management of clinical documentation and, more specifically, in image analysis and in the monitoring and follow-up of patients. And where are the biggest remaining limits? Above all, in the legal and ethical aspects of healthcare, when dealing with sensitive matters. "What's more, there's still a long way to go," explains Del Castillo, who works, among other things, on projects about neurological movement disorders and on training for a large section of healthcare workers.

The second concept, with its own advantages and disadvantages, is a subfield of artificial intelligence: machine learning. That is, artificial intelligence running on computers that find patterns in population groups; with these patterns, predictions are made about what is most likely to happen. Machine learning translates data into algorithms.

Precision medicine to predict disease

After artificial intelligence and machine learning there is a third concept: precision medicine. Medicine tailored to the person: their genes, their background, their lifestyle, their social environment. Such a model must first be able to predict disease. Second, continues Francisco Albiol of IFIC, it must assess each patient, apply the best treatment based on clinical evidence, identify the most complex cases, and assess their inclusion in management programs.

This makes sense for high-impact diseases, and does not make sense for minor ones; for example, distinguishing flu from a cold in primary care, as the benefits would not compensate for the effort required.

The key to the use of artificial intelligence in medicine is also cost optimization, which is very important for public health. Spain's population increased from 42 to 47 million people between 2003 and 2022, that is, by more than 10 percent. And from 2005 to 2022, the average age of the population rose from 40 to 44. We are getting older and older.

Therefore, Dolores del Castillo says, the best-valued projects, and therefore those most likely to be funded, are the ones that incorporate artificial intelligence techniques to address the prevention, diagnosis and treatment of cardiovascular diseases, neurodegenerative diseases, cancer and obesity. There is also a special focus on personalized and home medicine, elderly care, and new drug development. "The need for healthcare has been heightened by our demographics, and the aim should be to reduce and simplify the challenges with technology. That is what we attempt with machine learning," summarizes Albiol.

Albiol is one of the scientists who led a program to improve breast cancer detection through algorithms. Like other researchers, he argues that if we combine machine learning with precision medicine, we should be talking about "4P medicine," which brings together four features: predictive, personalized, preventive and participatory.

That is because most purists confine precision medicine to the field of patient genetics and would not include approaches that take more characteristics into account. Those who do say we are talking about something much broader: applied to precision medicine, machine learning allows the analysis of large amounts of very different types of data (genomic, biochemical, social, medical imaging) and models them in order to offer an individual diagnosis and more precise, and thus more effective, treatment, summarizes researcher Lara Lloret Iglesias of the Institute of Physics of Cantabria.

Lloret is part of a network of scientists who, like Albiol and Del Castillo, are dedicated to projects on machine learning and precision medicine. One of them, developed by the team she leads together with fellow physicist Miriam Cobo Cano, is called Branyas, in honor of Spain's oldest woman, Maria Branyas, who managed to overcome Covid-19 at the age of 113. The project brings together data on more than 3,000 elderly people, going well beyond genetics alone: machine learning establishes risk profiles for falling ill or dying as a result of the coronavirus. The data feed the analysis of three risk profiles: a sociodemographic one, a biological one and an extended biological one, the last adding information on issues such as the intestinal microbiota, vaccination and immunity.

Precision Medicine, Cancer and Alzheimer's

Josep Lluís Arcos, of the Artificial Intelligence Research Institute, also explains that the diseases most commonly linked to precision medicine are cancer and Alzheimer's, but his team has stood out with the Ictus project. Launched in the middle of the pandemic (which made things difficult, he admits), it has treated patients at Barcelona's Bellvitge Hospital who suffered strokes and, after the severe, acute phase, have become long-term patients.

In particular, those with movement difficulty in one or both hands. More than 700 sessions have been held in which patients were asked to play the keyboard of an electronic piano. The researchers then analyzed the finger movements against the results to see what the patterns of difficulty and improvement were. Feedback among users has been particularly positive, because it is not just an exercise; it touches a very emotional part. The goal now is to expand the project to hospitals in the United Kingdom.

And the future? "I believe that the challenge of artificial intelligence in medicine is to incorporate research results into daily practice in a generalized way, but always without forgetting that it is the experts who have the last word," replies Dolores del Castillo. To do that, doctors need to be able to rely on these systems and interact with them in the most natural and simple way, even helping with their design.

Lara Lloret believes that we have to be able to build generalizable prediction systems; that is, the efficiency of a model should not depend on incidental factors such as which machine the data were acquired on or how it was calibrated. Francisco Albiol focuses on a problem that must be solved in the long run: at present, these technologies favor larger hospitals over those in smaller cities and towns. Convenience and cost reduction also mean reaching everyone.


Continue reading here:
'Machine Learning' Predicts The Future With More Reliable Diagnostics - Nation World News


Wind – Machine learning and AI specialist Cognitive Business collaborates with Weatherquest on weather forecasts for offshore wind platform -…

A data-driven tool that predicts with 99.9 percent accuracy the safest and most successful windows for crew transfers to offshore wind platforms, WAVES is the first technology of its kind and is already being used by RWE across its Robin Rigg and Rampion windfarms.

The collaboration has now seen RWE integrate Weatherquest's API into the already operational WAVES platform on Robin Rigg, to work alongside other forecast data and enable in-day and week-ahead O&M decision support for turbine-specific accessibility.

"The integration of WAVES with Weatherquest's API allows us to develop our unique technology yet further, to make it an even more trusted tool for windfarm owners and operators to plan and schedule their O&M programmes," said MD at Cognitive Business, Ty Burridge Oakland, speaking about the upgrades to its WAVES technology. "WAVES is already a hugely accurate and relied-upon technology in the industry for effectively, efficiently and safely deploying crews onto windfarms to conduct repairs and maintenance, and by integrating weather forecast data, we can confidently say we have made an already highly valued technology an even more robust tool for managing and planning offshore wind repair and maintenance programmes."

Developed in 2020 by Nottingham- and London-based Cognitive Business, WAVES was funded in the same year by the Offshore Wind Growth Partnership to better predict safer and more successful windows for crew transfers to offshore wind platforms.

Steve Dorling, chief executive at Weatherquest, added that WAVES has developed a reputation within the offshore wind industry, over a number of years, for enabling owners and operators to deploy their crews with real accuracy, and has been working to great effect on some of the UK's largest windfarms.

"It therefore made absolute sense for us both, as data analysis businesses focused on supporting safety and productivity, to combine our expertise in this innovative way," said Mr Dorling. "It's great that we can further enhance the WAVES technology together in a market where it is already a trusted technology for identifying optimal windows for offshore wind crew transfers."

Cognitive Business is an industry leader in machine learning and applied AI, developing a wide range of decision support, performance monitoring, and predictive maintenance solutions for offshore wind operations and maintenance applications.

Weatherquest is a privately owned weather forecasting and weather analysis company headquartered at the University of East Anglia providing weather forecasting support services across sectors in the UK and Northern Europe including onshore and offshore wind energy and ports.

For additional information:

Cognitive Business

Weatherquest

Read more:
Wind - Machine learning and AI specialist Cognitive Business collaborates with Weatherquest on weather forecasts for offshore wind platform -...


Uber and AMC bring machine learning company Rokt onboard to drive revenue – Mugglehead

Rokt has partnered with Uber Technologies (NYSE:UBER) and AMC Theatres (NYSE:AMC) to help both companies make more money on their websites and mobile apps.

Rokt is an ecommerce tech company that uses machine learning to tailor transactions to each shopper. The idea behind the technology is to give companies the chance to earn additional revenue, find customers at scale, and give extra options to existing customers by using machine learning to present offers to each shopper as they enter the final stages of a transaction. The analog here would be the impulse-buying section before a checkout line, except specifically tailored to each consumer based on collected data.

"Uber and AMC Theatres are two of the most recognized brands in the world and we're extremely pleased to partner with both of them as we accelerate our growth globally. Our global partnership with Uber will support the Uber Eats internal ad network and unlock additional profitability for the company. Our partnership with AMC has already begun generating outstanding results for the company. We look forward to expanding our relationships with both of these companies in the future," said Elizabeth Buchanan, chief commercial officer of Rokt.

Rokt's product is ecommerce technology that helps businesses unlock the full potential of every transaction to grow revenue. Existing customers include Live Nation, Groupon, Staples, Lands' End, Fanatics, GoDaddy, Vistaprint and HelloFresh, and extend to 2,500 other global businesses and advertisers. The company is originally from Australia, but it has moved its headquarters to New York City in the United States and has expanded to 19 countries across three continents.

Rokt's partnership with Uber will initially launch with Uber Eats in the US, Canada, Australia and Japan, with Rokt's machine learning technology driving additional revenue for Uber during the checkout experience. AMC has partnered with Rokt to drive revenue and customer lifetime value across the company's online and mobile channels.

"As millions of moviegoers come to AMC each week to enjoy the unmatched entertainment of the big screen, it's important that we are offering a guest experience that's personally relevant across the entire moviegoing journey. Our partnership with Rokt enables us to better personally engage our consumers and drive higher value per transaction by optimizing each online touchpoint without adding additional cost to the moviegoer," said Mark Pearson, chief strategy officer for AMC Theatres.

Rokt uses intelligence drawn from five billion transactions across hundreds of ecommerce businesses to let brands create a tailored customer experience in which they control the types of offers displayed to their customers. Businesses that partner with Rokt can unlock upwards of $0.30 of additional profit per transaction through high-performance techniques relevant to each individual, from the moment the customer puts an item in their digital cart to the time their payment goes through.

Read the original here:
Uber and AMC bring machine learning company Rokt onboard to drive revenue - Mugglehead


Bfloat16: What it is and how it impacts storage – ComputerWeekly.com

Analysis and prediction are core to today's IT as organisations embark on digital transformation, with use cases that range from speech recognition and pattern analysis in science, to fraud detection and security, to AIOps in IT and storage.

As artificial intelligence and predictive methods (machine learning, deep learning, neural processing, and so on) become more prevalent, ways to streamline these operations have developed.

Key among these is the emergence of new ways of dealing with large numbers, and bfloat16, originally developed by Google, is the most prominent among them.

In this article, we look at what bfloat16 is, the impact it will have on memory and back-end storage, and which hardware makers support it.

Bfloat16, short for Brain Floating Point 16, is a way of representing floating point numbers in computing operations. Floating point numbers are how computers handle very large numbers (think millions, billions, trillions, and so on) or very small ones (think lots of zeros after the decimal point) within the same schema.

In floating point schemes, there is a set number of binary bits. One bit indicates whether the number is positive or negative (the sign), some of the bits encode the significant digits of the number itself (the mantissa, or significand), and the floating point element, the exponent, is a group of bits that says where the point should sit.

Bfloat16, as the name suggests, uses a 16-bit format to do all this. In doing so, it halves the size in bits of the most prevalent existing format, IEEE 754 single precision, which is 32-bit.

But bfloat16 uses an exponent that is the same size as that in IEEE 754 single precision (8 bits), which allows it to represent the same range of number magnitudes, but with less precision.
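A rough way to see the trade-off is to cut an IEEE 754 32-bit value down to bfloat16, which keeps the sign and 8-bit exponent but only the top 7 mantissa bits. This sketch uses Python's standard struct module and simple truncation (real hardware typically rounds rather than truncates):

```python
import struct

def float_to_bits(x: float) -> int:
    """Bit pattern of x as an IEEE 754 single-precision (32-bit) float."""
    return struct.unpack(">I", struct.pack(">f", x))[0]

def bits_to_float(b: int) -> float:
    return struct.unpack(">f", struct.pack(">I", b))[0]

def to_bfloat16(x: float) -> float:
    """Keep the top 16 bits: sign, 8-bit exponent, 7-bit mantissa."""
    return bits_to_float(float_to_bits(x) & 0xFFFF0000)

print(to_bfloat16(3.141592653589793))  # 3.140625: same magnitude, fewer digits
print(to_bfloat16(1e38))               # huge values survive: FP32's exponent range
```

Because the exponent field is untouched, the representable range matches FP32; only the number of significant digits shrinks.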

Bfloat16 was developed by Google (the "B" represents the company's Brain project) specifically for its tensor processing units (TPUs), used for machine learning. The key thing here is that machine learning operations don't need the levels of precision, in terms of binary powers, that other calculations might require. But you do want speed of operation, and that's what bfloat16 is aimed at.

The key benefits of bfloat16 are that it reduces storage requirements during processing and speeds up individual calculations during machine learning operations.

Bfloat16 takes half the memory of equivalent operations that use IEEE 754 32-bit numbers, meaning more values can be held in memory and swapping them in and out takes less time. That means larger models and datasets can be used. Also, bfloat16 data takes less time to load into memory from bulk storage.

Hardware support for bfloat16 is something that gets built into processors and processing units, so it will be tailored to the standard.

Back-end storage volumes are likely to be positively impacted. In other words, you'll need less storage if you do a lot of machine learning operations with bfloat16. But it is more likely, at least for a while, that IEEE 754 32-bit data will predominate, with bfloat16 converted from that existing format.

Bfloat16 was first deployed on Google's hardware TPUs, supported by Intel, which can be used via the provider's cloud services (Cloud TPU) or bought as a product for customer on-premises use.

At the time of writing, it is also supported by Intel's third-generation Xeon Scalable CPUs, IBM's Power10 and ARM's Neoverse processors.

Bfloat16 is also supported in a number of NPUs (neural processing units, of which the TPU is one), including ARM's Trillium, Centaur, Flex, Habana, Intel's Nervana, and Wave.

See the original post:
Bfloat16: What it is and how it impacts storage - ComputerWeekly.com


Here's why holding $20.8K will be critical in this week's $1B Bitcoin options expiry – Cointelegraph

Bitcoin (BTC) experienced a 16.5% correction between Aug. 15 and Aug. 19 as it tested the $20,800 support. While the drop is startling, in reality, a $4,050 price difference is relatively insignificant, especially when one accounts for Bitcoin's 72% annualized volatility.

Currently, the S&P 500's volatility stands at 31%, which is significantly lower, yet the index traded down 9.1% between June 8 and June 13. So, comparatively speaking, the index of major U.S.-listed companies faced a more abrupt movement when adjusted for the historical risk metric.

At the start of this week, crypto investors' sentiment worsened after weaker conditions in Chinese real estate markets forced the central bank to reduce its five-year loan prime rate on Aug. 21. Moreover, a Goldman Sachs investment bank strategist stated that inflationary pressure would force the U.S. Federal Reserve to further tighten the economy, which negatively impacts the S&P 500.

Regardless of the correlation between stocks and Bitcoin, which is currently running at 80/100, investors tend to seek shelter in the U.S. dollar and inflation-protected bonds when they fear a crisis or market crash. This movement is known as a flight to quality and tends to add selling pressure on all risk markets, including cryptocurrencies.

Despite the bears' best efforts, Bitcoin has not been able to break below the $20,800 support. This movement explains why the $1 billion Bitcoin monthly options expiry on Aug. 26 could benefit bulls despite the recent 16.5% loss in 5 days.

Bitcoin's steep correction after failing to break the $25,000 resistance on Aug. 15 surprised bulls because only 12% of the call (buy) options for the monthly expiry have been placed above $22,000. Thus, Bitcoin bears are better positioned even though they placed fewer bets.

A broader view using the 1.25 call-to-put ratio shows more bullish bets because the call (buy) open interest stands at $560 million against the $450 million put (sell) options. Nevertheless, as Bitcoin currently stands below $22,000, most bullish bets will likely become worthless.

For instance, if Bitcoin's price remains below $22,000 at 8:00 am UTC on Aug. 26, only $34 million worth of these call (buy) options will be available. This difference happens because there is no use in the right to buy Bitcoin at $22,000 if it trades below that level on expiry.

Below are the four most likely scenarios based on the current price action. The number of options contracts available on Aug. 26 for call (bull) and put (bear) instruments varies, depending on the expiry price. The imbalance favoring each side constitutes the theoretical profit:

This crude estimate considers the call options used in bullish bets and the put options exclusively in neutral-to-bearish trades. Even so, this oversimplification disregards more complex investment strategies.
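The scenario estimate described above boils down to summing the open interest that finishes in the money at each expiry price: calls pay off above their strike, puts below it. A minimal sketch with a hypothetical strike ladder (the strikes and open-interest figures below are made up for illustration, not the actual Deribit book):

```python
def in_the_money_oi(options, expiry_price):
    """Sum open interest (in $M) of options that finish in the money:
    calls pay off above their strike, puts below theirs."""
    calls = sum(oi for kind, strike, oi in options
                if kind == "call" and expiry_price > strike)
    puts = sum(oi for kind, strike, oi in options
               if kind == "put" and expiry_price < strike)
    return calls, puts

# Hypothetical strike ladder: (kind, strike in $, open interest in $M)
book = [
    ("call", 20000, 30), ("call", 22000, 40), ("call", 24000, 60),
    ("put", 20000, 25), ("put", 22000, 50), ("put", 24000, 35),
]

for price in (21000, 23000, 25000):
    calls, puts = in_the_money_oi(book, price)
    side = "bull" if calls > puts else "bear"
    print(f"expiry at ${price}: calls ${calls}M vs puts ${puts}M "
          f"-> ${abs(calls - puts)}M {side} advantage")
```

The imbalance between the two sums at a given expiry price is the "theoretical profit" the article refers to, which is why a few thousand dollars of price movement around a heavily populated strike can swing the outcome.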

Bitcoin bulls need to push the price above $22,000 on Aug. 26 to balance the scales and avoid a potential $140 million loss. However, Bitcoin bulls had $210 million worth of leveraged long futures positions liquidated on Aug. 18, so they are less inclined to push the price higher in the short term.

With that said, the most probable scenario for Aug. 26 is the $22,000-to-$24,000 range providing a balanced outcome between bulls and bears.

If bears show some strength and BTC loses the critical $20,800 support, the $140 million loss in the monthly expiry will be the least of the bulls' problems. In addition, the move would invalidate the previous $20,800 low on July 26, effectively breaking a seven-week-long ascending trend.

The views and opinions expressed here are solely those of the author and do not necessarily reflect the views of Cointelegraph. Every investment and trading move involves risk. You should conduct your own research when making a decision.

Follow this link:
Here's why holding $20.8K will be critical in this week's $1B Bitcoin options expiry - Cointelegraph

Read More..

Sudden crypto market drop sends bitcoin below $22,000 – CNBC

Bitcoin on Friday fell to its lowest level in more than three weeks, dipping below $22,000 amid a sudden crypto sell-off in early European trading.

Bitcoin plunged from $22,738 to below $21,12.34 at 2:30 a.m. ET, according to CoinDesk data. Earlier in the morning, the cryptocurrency fluctuated between $21,500 and $22,000.

It comes shortly after the world's largest digital coin surpassed the $25,000 level for the first time since June following a rise in U.S. stocks.

Ether fell from $1,808 to $1,728 at the same time before staging a muted rebound. It had slipped again, falling further to $1,683.90 by 4:00 p.m. ET.

A specific cause for a drop at that time, which also sent Binance Coin, Cardano and Solana falling, was not immediately clear.

"It's not showing the pattern of a flash crash, as the assets didn't immediately rebound sharply but sank even lower in the hours that followed," said Susannah Streeter, senior investment and markets analyst at Hargreaves Lansdown. "It seems likely that it was the result of a large sale transaction, in the absence of other more external factors."


Streeter said it appeared Cardano made the first plunge downwards, followed by Bitcoin and Ether and then smaller coins like Dogecoin.

"This fresh chill has descended amid fears that the market is heading for a crypto winter," she added. "Although at $21,800 Bitcoin is still some way off its June lows of under $19,000, volatility is once again wracking the market."

The digital coins may also be following equities lower.

"US equity markets have pulled back since Wednesday's release of the July Fed meeting minutes, the key takeaway being that the Fed likely won't be finished with rate hikes until inflation is tamed across the board, with no guidance offered on future rate increases either," Simon Peters, crypto market analyst at eToro, told CNBC.

"With the tight correlation between US equities and crypto in recent months I suspect this has filtered through to crypto markets and it's why we are seeing the sell-off. The trend has also perhaps been exacerbated by liquidation of long positions on bitcoin perpetual futures markets."

Citing Coinglass data, Peters said Friday had been the biggest liquidation of long positions on futures since June 18, also the date bitcoin reached its lowest price of the year around $17,500.

Bitcoin and ether ended Thursday in the red, but ether has surged more than 100% since mid-June as investors prepare for a massive upgrade to the ethereum network.

The rest is here:
Sudden crypto market drop sends bitcoin below $22,000 - CNBC

Read More..

Crypto Whale Transfers 4,000 Bitcoin to Gemini – Finance Magnates

After a recovery of more than 30% from its recent lows, Bitcoin (BTC) again tanked by almost 11% in the last week. As a result, the network activity across the BTC network has decreased sharply in the past few days. However, whales are still moving the world's most valuable digital asset in large amounts.

Yesterday, Whale Alert, a leading on-chain analytics and tracking platform, highlighted the movement of 4,000 Bitcoin worth more than $86 million from an unknown crypto wallet to Gemini. The transfer was executed at 22:06 UTC.

Dormant Bitcoin supply is surging. According to Glassnode, the percentage of BTC supply that was last active more than five years ago touched an all-time high of 24.4% on Monday.

"The recent price uptrend also failed to attract a significant wave of new active users, which is particularly noticeable amongst retail investors and speculators. The monthly momentum of exchange flows is also not suggesting a new wave of investors entering the market, implying a relatively lackluster influx of capital," Glassnode highlighted in its recent report.


Bitcoin is currently going through one of its worst market corrections. While large crypto transfers are still happening, the overall count of whale transactions is lower compared to 2020 and 2021.

"The current market structure is certainly comparable with the late-2018 bear market, however, does not yet have the macro trend reversal in profitability and demand inflow required for a sustainable uptrend. Therefore, the ongoing cycle bottom consolidation phase is most likely, as Bitcoin investors attempt to lay a firmer foundation, subject of course to the persistent uncertainty and unfavorable events of the macroeconomic backdrop," the company added in the report.

Last week, $15 million worth of investment left BTC products. Almost 150,000 Bitcoin addresses are currently holding at least 10 BTC.


View post:
Crypto Whale Transfers 4,000 Bitcoin to Gemini - Finance Magnates

Read More..

Bitcoin's Next Move: 5 Things to Watch – Barron's

Bitcoin climbed almost 25% in July, but investors are unlikely to see a repeat of those gains in August. Digital assets continue to trade sideways, and Bitcoin, the largest crypto, can't seem to break out of the $20,000 to $24,000 range.

With Bitcoin still trading at less than one-third of its all-time high near $69,000, reached in November 2021, optimistic cryptocurrency holders are likely to continue hoping for something that will drive token prices higher. There are at least five trends that investors should be watching, according to Sheena Shah and Kinji Steimetz, analysts at Morgan Stanley.

One trend, the crypto equivalent of quantitative tightening, is the falling availability of stablecoins like Tether's USDT and Circle's USDC, the analysts wrote in a note Friday. Stablecoins stand at the heart of the crypto world, forming the foundations of trading and lending activities, and their availability is a key sign of both liquidity in crypto and demand for leverage, or money borrowed to trade.

Changes in the market capitalization of stablecoins (a measure of the amount in circulation, since each coin is meant to be worth a dollar) could be a leading indicator of Bitcoin prices, according to Shah and Steimetz. In June, Tether's market cap fell 20% in about a month, while Bitcoin fell 45% over the same period to below $30,000.

"This week marked the first time since April that stablecoin market capitalization has stopped falling on a monthly basis," the Morgan Stanley analysts said. "This may be a sign that the extreme institutional deleveraging appears to have paused for now."

A widespread halt to deleveraging in crypto could signal that the worst of the recent market turmoil is over, paving the way for institutions and other influential traders to turn bullish on Bitcoin again.

That is why changes in demand for leverage in crypto, similarly indicated by the market cap of stablecoins, are the second trend to keep an eye on. If demand for leverage in crypto rises, prompting people to move dollars into stablecoins, the market caps of stablecoins would likely rise. That could mark a bullish turn because leverage exacerbates price swings and raises the prospect that gains from solid rallies would be juiced up.

However, "there doesn't seem to be huge demand to re-leverage in the crypto world at this moment: decentralised finance (DeFi) platform lending is still down 70% this year," wrote Shah and Steimetz. "In our opinion, it will be hard for this crypto cycle to bottom without fiat leverage growing or crypto leverage growing."

The third trend to watch is stablecoins' market caps relative to one another, specifically swings in the relationship between the amount of issued USDT and USDC, the most influential stablecoins pegged to the U.S. dollar and the third- and fourth-largest digital tokens.

Typically, the market caps of USDT and USDC move in opposite directions (i.e., traders seem to generally rotate out of one and into another), and Morgan Stanley sees a link between periods when USDC total value is growing and later gains in Bitcoin prices.

"The general trends in USDC market capitalization growth appear to lead growth of Bitcoin's price about two months later," the Morgan Stanley team said, noting that "of course we cannot use this as a trading signal as the historical relationship is volatile, has outliers (Bitcoin rally in June) and not a long history."
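The lead-lag relationship the analysts describe is just a lagged correlation: how well does USDC market-cap growth in month t predict Bitcoin's return in month t+2? A minimal Pearson-correlation sketch, using made-up monthly growth figures (the numbers below are illustrative, not real market data):

```python
def lagged_corr(x, y, lag):
    """Pearson correlation of x[t] with y[t+lag], i.e. x leads y by `lag`."""
    xs, ys = x[:len(x) - lag] if lag else x, y[lag:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

# Illustrative monthly growth rates (%), constructed so BTC roughly
# follows USDC with a two-month delay
usdc_growth = [2.0, -1.0, 3.0, 0.5, -2.0, 4.0]
btc_growth = [-0.5, 0.8, 1.9, -1.1, 2.8, 0.4]

print(f"lag 0: {lagged_corr(usdc_growth, btc_growth, 0):+.2f}")
print(f"lag 2: {lagged_corr(usdc_growth, btc_growth, 2):+.2f}")
```

On this toy series the same-month correlation is weak while the two-month-lagged correlation is strong, which is the shape of the pattern Morgan Stanley flags; with only a handful of real observations and known outliers, such a statistic is suggestive rather than tradable, as the analysts themselves caution.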

Nevertheless, it is a compelling flag. Tether's market cap fell by $9 billion over the course of a week in early May, to $74 billion from $83 billion, while USDC's market cap jumped to $52 billion from $48 billion over the same period. Two months later, in July, Bitcoin notched its best month all year.

Now, this trend looks to be reversing course, with USDC's circulation now down almost $4 billion from its July peak while issuance of USDT has been growing. If the pattern holds, that could be negative for Bitcoin.

Ultimately, however, the macroeconomic picture is what matters, according to Morgan Stanley. While Bitcoin and its peers should, in theory, be uncorrelated to mainstream finance, they have proven to be largely linked to other risk-sensitive bets, like tech stocks. A lot of the gains for tech stocks, and crypto, in recent years can be put at the feet of loose central-bank policy that has injected liquidity into global markets.

Since 2013, Bitcoin's market capitalization growth has generally tracked the growth of global fiat M2 money supply. "When central banks eased and injected liquidity, that liquidity ended up in risk assets, including crypto," the analysts said. "We expect [Bitcoin's] correlation with the equity markets to remain high."

Expectations of future money supply growth, which is a function of the size of the Federal Reserve's balance sheet and interest rates, are likely to be the most dominant force on Bitcoin prices, according to Morgan Stanley.

"Near term crypto markets therefore will place most trading focus on inflation expectations and market pricing for rate hikes," said Shah and Steimetz.

The central bank has tightened monetary policy aggressively and raised interest rates as it battles the highest inflation in four decades, a pathway it isn't expected to veer from until 2023 at the earliest. That is why inflation and the Fed's monetary policy plans are the fourth and fifth factors investors should watch for Bitcoin's next move.

The coming days could bring more clarity for the market.

The Fed's preferred measure of inflation is due Friday in the form of July's core personal-consumption expenditures index. Also on Friday, Fed Chair Jerome Powell is due to speak at the Jackson Hole economic conference, which is likely to be key for clarifying investors' expectations around Fed policy.

These events will no doubt be one of the most important short-term catalysts for crypto in the week ahead, to say nothing of expectations for inflation and rates in the months to come. Just as in stocks, crypto investors can't fight the Fed.

Write to Jack Denton at jack.denton@dowjones.com

See more here:
Bitcoin's Next Move: 5 Things to Watch - Barron's

Read More..