
This bootstrapped startup built an eBay for Bitcoins and is heating up India's crypto market – YourStory

India is headed towards one of its worst economic crises in 30 years, with all projections pointing towards negative GDP growth in 2020. However, over the last two months, local bitcoin trading, especially on peer-to-peer crypto exchanges, has reached record-breaking volumes.

This is believed to be a direct impact of the Supreme Court lifting RBI's unconstitutional two-year ban on cryptocurrencies in March, a few weeks before India went into a lockdown.

As a result, bitcoin (BTC) volumes traded in India have surpassed the spike of December 2017, when the digital currency was enjoying a bull run globally.

One of the biggest gainers of this recent spike is Paxful, a US-based P2P bitcoin exchange marketplace and cryptocurrency wallet.

Paxful co-founders Ray Youssef (right) and Artur Schaback

It is one of the newest entrants in India's crypto economy, but has already surpassed other BTC exchanges in the market.

Bitcoin volumes on Paxful surged to a single-day high of $1.48 million on May 10, according to UsefulTulips, a crypto analytics firm. This is the highest for any platform in the country, ahead of closest peer LocalBitcoins.


Paxful, which calls itself the eBay for Bitcoins, was founded in 2015 by Ray Youssef and Artur Schaback. It is what the founders call a people-powered marketplace, where users can buy or sell bitcoins, or transact with over 7,300 vendors anywhere in the world.


On Paxful, users can create their own cryptocurrency wallets to send, receive or store bitcoins. These transactions may happen between friends and family or traders and vendors. Money transfers can be done between two parties (P2P) by simply scanning a QR code or through a virtual payment address.

The startup's goal is to democratise access to bitcoin and drive financial inclusion in emerging economies such as India and across Africa.

"Paxful is a universal translator of money, which means that any form of money can be turned into any other form of money instantly," says the co-founder.

The Paxful app, which launched in 2019, allows users to stay abreast of their transaction history and global exchange rates in real time. The Android-only app has already crossed 100,000 downloads on Google Play Store, and is said to have ushered in an all-new ease of use for customers.

Coinciding with the rise in bitcoin trading in India since March-April, Paxful has witnessed record volumes on its platform.

In May, it recorded BTC trading volumes of $6.2 million, a 41 percent increase. It also grew new user sign-ups from India by 12 percent for the month. For 2020, Paxful's average monthly user sign-ups have grown by 28 percent, while average trading volumes are up $4.4 million.


Paxful claims to have over three million users across the globe. The user base is growing at 40 percent year on year, while revenues are growing by 24 percent. Globally, it reported a 10x increase in bitcoin trading volume (from $2.2 million to $22.1 million) in H1 2020 over the same period in 2019.

Paxful Co-founder and COO Artur Schaback at the Bitcoin 2019 conference in San Francisco.

The platform charges an escrow fee to sellers. The percentage is undisclosed, but CoinDesk estimates it to be 1 percent of every transaction.

Interestingly, a large section of Paxful users are women. This is because this segment of the population, especially in developing economies, is largely unbanked or underbanked.

"The more you make bitcoin visible, the more normal people will be able to use it. We are educating our traders that you don't need a lot of money to start," says the co-founder. However, some operational challenges remain.

Paxful is riding the popularity of bitcoin, the most well-known cryptocurrency in the world, according to surveys. (The next popular crypto is Ethereum.)

The 300-person startup is also looking to open its India office in Hyderabad by 2021. "We were planning to launch this year, but it has been delayed due to COVID-19," the co-founder shares.

Team Paxful | Photo: Instagram

Paxful also wants to expand its B2B network in the country, and make bitcoins more widely accepted for merchant payments, as these provide the highest margins.

The startup operates in a market that is witnessing phenomenal growth. In the first quarter of 2020, $8.8 trillion was traded in cryptocurrencies globally, according to TokenInsight (a blockchain and crypto research firm).

Ray signs off by saying, "The Indian market holds great potential and importance for the future of the crypto-economy. We are actively focusing our efforts on bringing cryptocurrency to the masses to aid in the eradication of poverty, boost economies, and create jobs, especially post-COVID-19."

(Edited by Teja Lele Desai)


The technical realities of functional quantum computers – is Google's ten-year plan for Quantum Computing viable? – Diginomica

In March, I explored the enterprise readiness of quantum computing in Quantum computing is right around the corner, but cooling is a problem. What are the options? I also detailed potential industry use cases, from supply chain to banking and finance. But what are the industry giants pursuing?

Recently, I listened to two somewhat different perspectives on quantum computing. One is Google's (public) ten-year plan.

Google plans to search for commercially viable applications in the short term, but they don't think there will be many for another ten years - a time frame I've heard referred to as "bound but loose." What that meant was: no more than ten years, maybe sooner. In the industry, the term for the current state of the art is NISQ - Noisy Intermediate-Scale Quantum computing.

The largest quantum computers are in the 50-70 qubit range, and Google feels NISQ has a ceiling of maybe two hundred. The "noisy" part of NISQ is because the qubits need to interact and be nearby. That generates noise. The more qubits, the more noise, and the more challenging it is to control the noise.

But Google suggests the real unsolved problems in fields like optimization, materials science, chemistry, drug discovery, finance, and electronics will take machines with thousands of qubits - Google even envisions one million on a planar array etched in aluminum. Major problems still need solving, such as noise elimination, coherence, and lifetime (a qubit holds its state for only a tiny slice of time).

In the meantime, Google is seeking customers who will work with its researchers to find applications. Quantum computing needs algorithms as much as it needs qubits. It requires customers with a strong in-house science team and a commitment of three years. Whatever is discovered will be published as open source.

In summary, Google does not see commercial value in NISQ. They are using NISQ to discover what quantum computing can do that has any commercial capability.

First of all, if you have a picture in your mind of a quantum computer, chances are you are not including an essential element: a conventional computer. According to Quantum Computing: Progress and Prospects:

Although reports in the popular press tend to focus on the development of qubits and the number of qubits in the current prototypical quantum computing chip, any quantum computer requires an integrated hardware approach using significant conventional hardware to enable qubits to be controlled, programmed, and read out.

The author is undoubtedly correct. Most material about quantum computers never mentions this, and it raises quite a few issues that can potentially dilute the gee-whiz aspect. I'd heard this first from Itamar Sivan, Ph.D., CEO of Quantum Machines. He followed with the quip that, technically, quantum computers aren't computers. It's that simple. They are not Turing Machines. File this under the category of "You're Not Too Old to Learn Something New."

From (Hindi) Theory of Computation - Turing Machine:

A Turing machine is a mathematical model of computation that defines an abstract machine, which manipulates symbols on a strip of tape according to a table of rules. Despite the model's simplicity, given any computer algorithm, a Turing machine capable of simulating that algorithm's logic can be constructed.
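
To make that definition concrete, here is a minimal sketch of a Turing machine simulator; the rule-table format and the bit-inverting example machine are my own illustrations, not taken from the article or the cited source.

```python
# A minimal Turing machine simulator: a finite table of rules reads and writes
# symbols on a tape and moves a head left or right, exactly as in the
# definition above. The example machine simply inverts a binary string.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Rule table: (state, symbol) -> (symbol to write, head move, next state)
invert_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(invert_bits, "10110"))  # -> 01001
```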

Dr. Sivan clarified this as follows:

Any computer to ever be used, from the early-days computers, to massive HPCs, are all Turing-machines, and are therefore equivalent to one another. All computers developed and manufactured in the last decades, are all merely bigger and more compact variations of one another. A quantum computer however is not MERELY a more advanced Turing machine, it is a different type of machine, and classical Turing machines are not equivalent to quantum computers as they are equivalent to one another.

Therefore, the complexity of running particular algorithms on quantum computers is different from the complexity of running them on classical machines. Just to make it clear, a quantum computer can be degenerated to behave like a classical computer, but NOT vice-versa.

There is a lot more to this concept, but most computers you've ever seen or heard of are Turing Machines, except Quantum computers. This should come as no surprise because anything about quantum mechanics is weird and counter-intuitive, so why would a quantum computer be any different?

According to Sivan, a quantum computer needs three elements to perform: the quantum processor itself plus an orchestration platform of (conventional) hardware and software. There is no software in a quantum computer. The platform manages the progress of the algorithm through control pulses, mostly laser pulses. The logic needed to operate the quantum computer resides with, and is controlled by, the orchestration platform.

The crucial difference between Google's and Quantum Machines' strategies is that Google views the current NISQ state of affairs as a testbed for finding algorithms and applications for future development, while Sivan and his company have produced an orchestration platform to put the current technology into play. Their platform is quantum-computer agnostic: it can operate with any of them. Sivan feels that focusing solely on the number of qubits addresses just part of the equation. According to Dr. Sivan:

While today's most advanced quantum computers only have a relatively small number of available qubits (53 for IBM's latest generation and 54 for Google's Sycamore processor), we cannot maximize the potential of even this relatively small count. We are leaving a lot on the table with regards to what we can already accomplish with the computing power we already have. While we should continue to scale up the number of qubits, we also need to focus on maximizing what we already have.

I've asked a few quantum computer scientists if quantum computers can solve the Halting Problem. In Wikipedia:

The halting problem is determining, from a description of an arbitrary computer program and an input, whether the program will finish running, or continue to run forever. Alan Turing proved in 1936 that a general algorithm to solve the halting problem for all possible program-input pairs could not exist.

That puts it in a class of problems that are undecidable. Oddly, opinion was split on the question, despite Turing's proof. Like Simplicio said to Galileo in Dialogues Concerning Two New Sciences, "If Aristotle had not said otherwise I would have believed it."

There are so many undecidable problems in math that I wondered if some of these might fall out. For example, straight from current AI problems, planning in a partially observable Markov decision process is considered undecidable. A million qubits? Maybe not. After all, Dr. Sivan pointed out that to replicate in a classical processor the information in just a 300-qubit quantum processor would require more transistors than there are atoms in the universe.
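
The arithmetic behind that claim is easy to check; the sketch below assumes the commonly cited rough figure of 10^80 atoms in the observable universe.

```python
# Back-of-the-envelope check of Dr. Sivan's point: the state vector of an
# n-qubit processor has 2**n complex amplitudes, so even storing one number
# per amplitude classically becomes impossible long before n = 300.

amplitudes_300_qubits = 2 ** 300     # ~2.0 x 10**90 amplitudes
atoms_in_universe = 10 ** 80         # commonly cited order-of-magnitude estimate

print(f"2^300 = {amplitudes_300_qubits:.2e}")
print(f"ratio to atoms in the universe = {amplitudes_300_qubits / atoms_in_universe:.1e}")  # ~2 x 10**10
```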

I've always believed that action speaks louder than words. While Google is taking the long view, Quantum Machines provides the platform to see how far we can go with current technology. Google's tactics are familiar. Every time you use TensorFlow, it gets better. Every time you play with their autonomous car, it gets better. Their collaboration with a dozen or so technically advanced companies makes their quantum technology better.


Quantum Computing And The End Of Encryption – Hackaday

Quantum computers stand a good chance of changing the face of computing, and that goes double for encryption. For encryption methods that rely on the fact that brute-forcing the key takes too long with classical computers, quantum computing seems like its logical nemesis.

For instance, the mathematical problem that lies at the heart of RSA and other public-key encryption schemes is factoring a product of two prime numbers. Searching for the right pair using classical methods takes approximately forever, but Shor's algorithm can be used on a suitable quantum computer to do the required factorization of integers in almost no time.

When quantum computers become capable enough, the threat to a lot of our encrypted communication is a real one. If one can no longer rely on simply making the brute-forcing of a decryption computationally heavy, all of today's public-key encryption algorithms are essentially useless. This is the doomsday scenario, but how close are we to this actually happening, and what can be done?

To ascertain the real threat, one has to look at the classical encryption algorithms in use today to see which parts of them would be susceptible to being solved by a quantum algorithm in significantly less time than it would take for a classical computer. In particular, we should make the distinction between symmetric and asymmetric encryption.

With symmetric algorithms, messages are encrypted and decrypted with the same secret key, which has to be shared between communication partners through a secure channel. Asymmetric encryption uses two keys: a private key and a public key. A message encrypted with the public key can only be decrypted with the private key. This enables public-key cryptography: the public key can be shared freely without fear of impersonation because it can only be used to encrypt and not decrypt.

As mentioned earlier, RSA is one cryptosystem which is vulnerable to quantum algorithms, on account of its reliance on integer factorization. RSA is an asymmetric encryption algorithm, involving a public and a private key, which creates the so-called RSA problem. This occurs when one tries to perform a private-key operation when only the public key is known, which requires finding the eth root of an arbitrary number modulo N. Currently this is unrealistic to solve classically for RSA key sizes of 1024 bits and above.
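
To illustrate why factoring is the whole game, here is a toy RSA round-trip with deliberately tiny primes; the numbers and variable names are my own, and real keys use primes hundreds of digits long, which is exactly what Shor's algorithm threatens.

```python
# Toy RSA: key generation, encryption, decryption, and a brute-force "attack"
# that recovers the private key as soon as the modulus n is factored.
from math import gcd

p, q = 61, 53                 # the two secret primes
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)       # Euler's totient, needed to derive the private key
e = 17                        # public exponent, must be coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent: modular inverse of e (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)             # encrypt with the public key (e, n)
assert pow(cipher, d, n) == msg     # decrypt with the private key d

# An attacker who can factor n recovers the private key immediately:
for candidate in range(2, n):
    if n % candidate == 0:
        p2, q2 = candidate, n // candidate
        break
d_recovered = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d_recovered, n) == msg
print("factored n, recovered private key:", d_recovered)
```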

Here we see again the thing that makes quantum computing so fascinating: the ability to quickly solve certain problems that are intractable for classical computers. Many hard problems in the class NP can only be tackled classically by approximating a solution; for NP-complete problems, the hardest problems in NP, no efficient exact classical algorithm is known. An example of this is the Travelling Salesman Problem (TSP), which asks to determine the shortest possible route between a list of cities, while visiting each city once and returning to the origin city.

Even though TSP can be solved exactly with classical computing for a smaller number of cities (tens of thousands), larger instances require approximation to get within 1%, as solving them exactly would require excessively long running times.
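
For a feel of why exact solutions stop scaling, a brute-force TSP search over a made-up four-city distance table might look like the sketch below; the distances are invented for illustration, and it is the factorial growth of the permutation count that forces approximation for large n.

```python
# Brute-force TSP: try every ordering of the cities and keep the shortest
# round trip. The search space grows roughly as (n-1)!/2, so this approach
# collapses quickly as the number of cities increases.
from itertools import permutations

dist = {                      # symmetric distance table for four cities
    ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
    ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30,
}
def d(a, b):
    return dist.get((a, b)) or dist.get((b, a))

cities = ["A", "B", "C", "D"]
best = min(
    permutations(cities[1:]),     # fix city A as the starting point
    key=lambda order: sum(d(a, b) for a, b in zip(("A",) + order, order + ("A",))),
)
tour = ("A",) + best + ("A",)
print(tour, sum(d(a, b) for a, b in zip(tour, tour[1:])))   # shortest tour costs 80 here
```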

Symmetric encryption algorithms are commonly used for live traffic, with only the handshake and the initial establishment of a connection done using (slower) asymmetric encryption as a secure channel for exchanging the symmetric keys. Although symmetric encryption tends to be faster than asymmetric encryption, it relies on both parties having access to the shared secret, instead of being able to use a public key.

Symmetric encryption is used with forward secrecy (also known as perfect forward secrecy). The idea behind FS is that instead of only relying on the security provided by the initial encrypted channel, one also encrypts the messages before they are sent. This way, even if the keys for the encryption channel get compromised, all an attacker would end up with are more encrypted messages, each encrypted using a different ephemeral key.

FS tends to use Diffie-Hellman key exchange or similar, resulting in a system that is comparable to a One-Time Pad (OTP) type of encryption, that only uses the encryption key once. Using traditional methods, this means that even after obtaining the private key and cracking a single message, one has to spend the same effort on every other message as on that first one in order to read the entire conversation. This is the reason why many secure chat programs like Signal as well as increasingly more HTTPS-enabled servers use FS.
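
A rough sketch of the ephemeral Diffie-Hellman exchange underneath forward secrecy is shown below; the toy prime is far too small for real use and stands in for the 2048-bit groups used in practice.

```python
# Ephemeral Diffie-Hellman: both sides derive the same shared secret from
# private values that are thrown away after the session, so a later key
# compromise does not expose old traffic.
import secrets

p = 0xFFFFFFFB          # toy prime modulus, far too small for real use
g = 5                   # generator

a = secrets.randbelow(p - 2) + 2      # Alice's ephemeral private value
b = secrets.randbelow(p - 2) + 2      # Bob's ephemeral private value

A = pow(g, a, p)        # Alice sends A in the clear
B = pow(g, b, p)        # Bob sends B in the clear

alice_secret = pow(B, a, p)   # (g^b)^a mod p
bob_secret   = pow(A, b, p)   # (g^a)^b mod p
assert alice_secret == bob_secret     # both sides now share a session key seed
print(hex(alice_secret))
```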

It was already back in 1996 that Lov Grover came up with Grover's algorithm, which allows for a roughly quadratic speed-up as a black box search algorithm. Specifically, it finds with high probability the likely input to a black box (like an encryption algorithm) which produced the known output (the encrypted message).

As noted by Daniel J. Bernstein, the creation of quantum computers that can effectively execute Grover's algorithm would necessitate at least the doubling of today's symmetric key lengths. This is in addition to breaking RSA, DSA, ECDSA and many other cryptographic systems.
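
The arithmetic behind that doubling advice is straightforward: Grover's quadratic speed-up turns a brute-force search over 2^n keys into roughly 2^(n/2) quantum steps, which the following few lines spell out.

```python
# Grover's algorithm searches an unstructured space of 2**n keys in about
# sqrt(2**n) = 2**(n/2) evaluations, so a 128-bit key offers only ~64-bit
# security against such an attacker, while a 256-bit key still leaves ~128 bits.
for key_bits in (128, 256):
    quantum_bits = key_bits // 2
    print(f"{key_bits}-bit key -> ~2^{quantum_bits} quantum search steps")
```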

The observant among us may have noticed that despite some spurious marketing claims over the past years, we are rather short on actual quantum computers today. When it comes to quantum computers that have actually made it out of the laboratory and into a commercial setting, we have quantum annealing systems, with D-Wave being a well-known manufacturer of such systems.

Quantum annealing systems can only solve a subset of NP-complete problems, such as the travelling salesman problem, which have a discrete search space. It would, for example, not be possible to run Shor's algorithm on a quantum annealing system. Adiabatic quantum computation is closely related to quantum annealing and therefore equally unsuitable for a general-purpose quantum computing system.

This leaves todays quantum computing research thus mostly in the realm of simulations, and classical encryption mostly secure (for now).

When can we expect to see quantum computers that can decrypt every single one of our communications with hardly any effort? This is a tricky question. Much of it relies on when we can get a significant number of quantum bits, or qubits, together into something like a quantum circuit model with sufficient error correction to make the results anywhere near as reliable as those of classical computers.

At this point in time one could say that we are still trying to figure out what the basic elements of a quantum computer will look like. This has led to the following quantum computing models:

Of these four models, quantum annealing has been implemented and commercialized. The others have seen many physical realizations in laboratory settings, but aren't up to scale yet. In many ways it isn't dissimilar to the situation that classical computers found themselves in throughout the 19th and early 20th centuries, when successive computers moved from mechanical systems to relays and valves, followed by discrete transistors and ultimately (for now) countless transistors integrated into singular chips.

It was the discovery of semiconducting materials and new production processes that allowed classical computers to flourish. For quantum computing the question appears to be mostly a matter of when we'll manage to do the same there.

Even if, a decade or more from now, the quantum computing revolution suddenly makes our triple-strength, military-grade encryption look as robust as DES does today, we can always comfort ourselves with the knowledge that along with quantum computing we are also increasingly learning more about quantum cryptography.

In many ways quantum cryptography is even more exciting than classical cryptography, as it can exploit quantum mechanical properties. Best known is quantum key distribution (QKD), which uses the process of quantum communication to establish a shared key between two parties. The fascinating property of QKD is that the mere act of listening in on this communication will cause measurable changes. Essentially this provides unconditional security in distributing symmetric key material, and symmetric encryption is significantly more quantum-resistant.
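
A plain classical simulation of BB84's statistics shows why listening in is measurable; the sketch below illustrates the protocol's basis sifting and error-rate behaviour only, and is not a model of real quantum hardware.

```python
# Toy BB84 run: Eve must measure each photon in a randomly guessed basis,
# which disturbs roughly 25% of the bits Alice and Bob later compare.
import random

def bb84(n_bits, eavesdrop):
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("+x") for _ in range(n_bits)]
    bob_bases   = [random.choice("+x") for _ in range(n_bits)]

    received = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            e_basis = random.choice("+x")
            # Eve's measurement randomizes the bit whenever her basis is wrong,
            # and the photon travels on in Eve's basis.
            bit = bit if e_basis == a_basis else random.randint(0, 1)
            a_basis = e_basis
        # Bob reads the true bit only if his basis matches the incoming one.
        received.append(bit if b_basis == a_basis else random.randint(0, 1))

    # Sifting: keep only positions where Alice's and Bob's bases agreed.
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, received, alice_bases, bob_bases) if ab == bb]
    errors = sum(a != b for a, b in sifted)
    return errors / len(sifted)

print("error rate without Eve:", bb84(20000, eavesdrop=False))   # ~0.0
print("error rate with Eve:   ", bb84(20000, eavesdrop=True))    # ~0.25
```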

All of this means that even if the coming decades are likely to bring some form of upheaval that may or may not mean the end of classical computing and cryptography with it, not all is lost. As usual, science and technology with it will progress, and future generations will look back on todays primitive technology with some level of puzzlement.

For now, using TLS 1.3 and any other protocols that support forward secrecy, and symmetric encryption in general, is your best bet.


First master’s thesis in Quantum Computing defended at the University of Tartu – Baltic Times

On Tuesday, 2 June, University of Tartu Institute of Computer Science student Mykhailo Nitsenko defended his thesis "Quantum Circuit Fusion in the Presence of Quantum Noise on NISQ Devices", the first master's thesis defended in the field of quantum computing at the University of Tartu.

In his thesis, supervised by Dirk Oliver Theis and Dominique Unruh, Mykhailo Nitsenko studied a concept called circuit fusion, which proposes to reduce stochastic noise in estimating the expectation values of measurements at the end of quantum computations. But near-term quantum computing devices are also subject to quantum noise (such as decoherence), and circuit fusion aggravates that problem.
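
To see what that stochastic noise looks like in isolation, the sketch below estimates an expectation value from a finite number of simulated measurement shots and shows the statistical error shrinking as one over the square root of the shot count; this is a generic illustration of shot noise, not the circuit-fusion technique studied in the thesis.

```python
# Shot noise: an expectation value is estimated by averaging many single-shot
# measurements, and the spread of that estimate falls off as 1/sqrt(shots).
import random
import statistics

def estimate_expectation(p_one, shots):
    """Estimate <Z> = 1 - 2*p_one from `shots` simulated 0/1 measurements."""
    ones = sum(random.random() < p_one for _ in range(shots))
    return 1 - 2 * ones / shots

p_one = 0.3
print("true <Z> =", 1 - 2 * p_one)            # exact value: 0.4
for shots in (100, 10_000):
    estimates = [estimate_expectation(p_one, shots) for _ in range(200)]
    print(f"{shots:>6} shots: spread of estimates ~ {statistics.stdev(estimates):.4f}")
```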

Mykhailo Nitsenko ran thousands of experiments on IBM's cloud quantum computers and used Fourier analysis techniques to quantify and visualise noise and the resulting information loss.

According to Mykhailo Nitsenko, before he enrolled at the University of Tartu he had a strong opinion that quantum computing is an abstract idea that we will never be able to use or even implement. "I just could not imagine how it is even possible to do computations on things without directly observing them. The quantum computing class showed me how it is done, and it became apparent to me that it is something I want to dedicate my academic efforts to," said Nitsenko.

"If you don't want to wait for fault-tolerant quantum computers, you may endeavour to use the noisy quantum computing devices that can be built already now. In that case, researching the effects of quantum noise on computations becomes important: these effects must be mitigated," said Dirk Oliver Theis, Associate Professor of Theoretical Computer Science at the University of Tartu Institute of Computer Science. Theis added that he had expected that the mathematics which Mykhailo Nitsenko implemented in his thesis would help us understand some aspects of quantum noise, which can be devastating to quantum computations, rendering the result pure gibberish.

In near-term quantum computing, one tries to run quantum circuits which are just short enough so that the correct output can be somehow reconstructed from the distorted measurement results. But quantum noise affects the results of computations on near-term quantum computers in complicated ways. In the mathematical approach based on Fourier analysis that Nitsenko implemented, some effects were predictable, such as a decrease in the amplitudes due to decoherence. What was surprising was that the low frequencies of the quantum noise showed distinct patterns. In future research, this might be exploited to mitigate the effect of quantum noise on the computation, said Theis.

This year, the Information Technology Foundation for Education (HITSA) granted funding to the University of Tartu Institute of Physics to continue and increase the training and research in the field of quantum computing at the university. With the support of this funding, new interdisciplinary courses focusing on quantum programming will be created.


What’s New in HPC Research: Hermione, Thermal Neutrons, Certifications & More – HPCwire

In this bimonthly feature, HPCwire highlights newly published research in the high-performance computing community and related domains. From parallel programming to exascale to quantum computing, the details are here.

Developing a performance model-based predictor for parallel applications on the cloud

As cloud computing becomes an increasingly viable alternative to on-premises HPC, researchers are turning their eyes to addressing latency and unreliability issues in cloud HPC environments. These researchers, a duo from the Egypt-Japan University of Science and Technology and Benha University, propose a predictor for the execution time of MPI-based cloud HPC applications, finding 88% accuracy on ten benchmarks.

Authors: Abdallah Saad and Ahmed El-Mahdy.

Investigating portability, performance and maintenance tradeoffs in exascale systems

As the exascale era swiftly approaches, researchers are increasingly grappling with the difficult tradeoffs between major system priorities that will be demanded by such massive systems. These researchers, a team from the University of Macedonia, explore these tradeoffs through a case study measuring the effect of runtime optimizations on code maintainability.

Authors: Elvira-Maria Arvanitou, Apostolos Ampatzoglou, Nikolaos Nikolaidis, Aggeliki-Agathi Tzintzira, Areti Ampatzoglou and Alexander Chatzigeorgiou.

Moving toward a globally acknowledged HPC certification

Skillsets are incredibly important in the HPC world, but certification is far from uniform. This paper, written by a team from four universities in the UK and Germany, describes the HPC Certification Forum: an effort to categorize, define and examine competencies expected from proficient HPC practitioners. The authors describe the first two years of the community-led forum and outline plans for the first officially supported certificate in the second half of 2020.

Authors: Julian Kunkel, Weronika Filinger, Christian Meesters and Anja Gerbes.

Uncovering the hidden cityscape of ancient Hermione with HPC

In this paper, a team of researchers from the Digital Archaeology Laboratory at Lund University describe how they used a combination of HPC and integrated digital methods to uncover the ancient cityscape of Hermione, Greece. Using drones, laser scanning and modeling techniques, they fed their inputs into an HPC system, where they rendered a fully 3D representation of the city's landscape.

Authors: Giacomo Landeschi, Stefan Lindgren, Henrik Gerding, Alcestis Papadimitriou and Jenny Wallensten.

Examining thermal neutrons threat to supercomputers

Off-the-shelf devices are performant, efficient and cheap, making them popular choices for HPC and other compute-intensive fields. However, the cheap boron used in these devices makes them susceptible to thermal neutrons, which these authors (a team from Brazil, the UK and Los Alamos National Laboratory) contend pose a serious threat to those devices' reliability. The authors examine RAM, GPUs, accelerators, an FPGA and more, tinkering with variables that affect the thermal neutron flux and measuring the threat posed by the neutrons under various conditions.

Authors: Daniel Oliveira, Sean Blanchard, Nathan DeBardeleben, Fernando Fernandes dos Santos, Gabriel Piscoya Dvila, Philippe Navaux, Andrea Favalli, Opale Schappert, Stephen Wender, Carlo Cazzaniga, Christopher Frost and Paolo Rech.

Deploying scientific AI networks at petaflop scale on HPC systems with containers

The computational demands of AI and ML systems are rapidly increasing in the scientific research sphere. These authors, a duo from LRZ and CERN, discuss the complications surrounding the deployment of ML frameworks on large-scale, secure HPC systems. They highlight a case study deployment of a convolutional neural network with petaflop performance on an HPC system.

Authors: David Brayford and Sofia Vallecorsa.

Running a high-performance simulation of a spiking neural network on GPUs

Spiking neural networks (SNNs) are the most commonly used computational model for neuroscience and neuromorphic computing, but simulations of SNNs on GPUs have imperfectly represented the networks, leading to performance and behavior shortfalls. These authors from Tsinghua University propose a series of technical approaches to more accurately representing SNNs on GPUs, including a code generation framework for high-performance simulations.

Authors: Peng Qu, Youhui Zhang, Xiang Fei and Weimin Zheng.

Do you know about research that should be included in next month's list? If so, send us an email at [emailprotected]. We look forward to hearing from you.


Liquidations May Loom as Altcoin Bulls Hold Despite Shorts Spiking – Cointelegraph

Despite yesterday's 8% crash in the price of Bitcoin (BTC) driving many BTC bulls from the markets, altcoin longs have only seen slight declines, with Ether (ETH) longs defying the trend with a slight rally to test recent all-time highs on Bitfinex.

However, altcoin bears are quickly emerging from the woods, with shorts against many top altcoins piling up quickly amid Bitcoins recent drop.

With many altcoin bulls holding on despite the increase in shorts, numerous leading crypto assets could see a surge in liquidations regardless of what direction the markets ultimately take.

The BTC shake-out saw long positions on Bitfinex plummet by over 10%, dropping from 28,800 to just under 26,000. BTC/USD short positions also saw an increase of nearly 16%.

BTC/USD longs on Bitfinex, 4hr: Tradingview

The crash also drove wholesale liquidations across crypto derivatives exchanges, with more than $71 million in margin calls on BitMEX and $66 million on Binance Futures taking place in the last 24 hours, the third-largest daily liquidations of the past month, according to Cryptometer.

While altcoins did not see heavy liquidations across futures platforms, a sharp spike in short positions coupled with stubborn bulls may create the perfect storm for an aggressive round of margin calls.

While many Bitcoin bulls quickly turned bearish, ETH longs saw a slight increase of 2.5% and are currently testing recent record highs of 1.77 million on Bitfinex.

The persistent buying activity suggests that some market participants are predicting that Ether can continue to make gains over the dollar despite bearish signals in the Bitcoin markets.

ETH/USD longs on Bitfinex, 4hr: Tradingview

However, the Bitcoin crash also drove an aggressive increase in ETH/USD short positions, which spiked by more than 18% in less than 60 minutes.

ETH/USD shorts on Bitfinex, 1hr: Tradingview

Similar trends can be observed across other leading altcoin markets, with Litecoin (LTC) longs dropping just 2.5% as shorts increased by over 30%, and XRP bulls retreating by only 3.5% as shorts jumped 15% over a few hours.


Tether.io becomes second largest altcoin behind Ethereum tokens – HedgeWeek

Tether.io, a blockchain-enabled platform that powers the largest stablecoin by market capitalisation, has eclipsed Ripple's XRP to become the second largest altcoin behind Ethereum tokens (ETH).

USD Tether (USDt) has made a rapid ascent in 2020, amid challenging market conditions and, at times, extreme levels of market volatility. USDt is also playing an increasingly important role as a valuable source of liquidity in the nascent DeFi space, which has spawned innovative financial products such as flash loans that form part of an alternative financial system.

USD Tether (USDt) has grown to a market capitalisation of USD9.6 billion, dwarfing the size of rival stablecoins by market capitalisation, trading volume and number of users. USDt's market capitalisation has eclipsed that of XRP, which currently stands at US$8.5 billion, according to data from CoinMarketCap, a provider of cryptocurrency market data. "Tether is manifestly growing in popularity as the most liquid, stable and trusted stablecoin," says Paolo Ardoino, CTO at Tether. "Tether's ascent to become the third biggest cryptocurrency underlines the pivotal role USDt plays in the cryptocurrency ecosystem. The march of USDt is gathering momentum amid growing recognition that stablecoins will play an important role in the future of finance as a trusted and robust form of digital money."

As of 12 May 2020, USDt had a market share of 77.84 per cent among Ethereum-based stablecoins, according to recent research by The Block. USDt's outstanding Ethereum-based supply has grown by about 113 per cent year-to-date, from USD2.3 billion to USD4.9 billion. The aggregate Ethereum-based stablecoin market capitalisation has increased 95.38 per cent year-to-date to USD6.25 billion, research from The Block found.

"Tether functions as the reserve currency for the crypto market," says Ardoino. "The recent market instability has demonstrated that there is a huge need for this asset. Investors want a safe haven to reduce the risk in their portfolios."

In addition to its Ethereum-based version of USDt, there are versions of USDt that work on Algorand, EOS, Liquid Network, Omni and Tron. Tether is driven to support and empower growing ventures and innovation in the blockchain space.


ETH Miners Will Have Little Choice Once Ethereum 2.0 Launches With PoS – Cointelegraph

As Ethereum is finally set to launch its Ethereum 2.0 upgrade later this year, putting an end to a long streak of delays, the network will start moving toward a proof-of-stake model.

Consequently, the network will abandon the proof-of-work consensus algorithm, leaving Ether (ETH) miners with very few options. Since their equipment will become obsolete for Ethereum, they will be forced to start mining altcoins or switch to staking ETH. So, what is the current state of ETH mining, and what exactly will happen to the industry as a result of the upcoming transition?

The Ethereum consensus is currently based on the PoW system, which is similar to that of Bitcoin (BTC). Therefore, the mining process is nearly identical for Ethereum, as miners use their computation resources to earn rewards for each block they manage to complete.
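
For readers unfamiliar with the mechanics, the following is a stripped-down sketch of the proof-of-work idea: hash the block contents with successive nonces until the result falls under a difficulty target. Real Ethereum mining uses the memory-hard Ethash algorithm rather than plain SHA-256, so treat this purely as an illustration of the general principle.

```python
# Toy proof-of-work: keep hashing the block data with different nonces until
# the hash, read as a number, is smaller than the difficulty target.
import hashlib

def mine(block_data: str, difficulty_bits: int):
    target = 2 ** (256 - difficulty_bits)      # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest
        nonce += 1

nonce, digest = mine("block #1: alice->bob 1 ETH", difficulty_bits=20)
print(f"found nonce {nonce}, hash {digest[:16]}...")   # ~2^20 attempts on average
```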

However, there is still a major difference between these processes. While Bitcoin mining has become almost entirely reliant on ASICs (large, loud machines designed specifically for cryptocurrency mining that are mostly clustered in regions with cheap electricity), Ethereum's PoW hashing algorithm, called Ethash, has been designed to favor GPU units issued by global chipmakers like Nvidia and AMD. GPUs are much cheaper and more accessible than ASICs, as Thomas Heller, the global business director of cryptocurrency mining pool F2Pool, explained in a conversation with Cointelegraph:

Because ASICs are very specialized machines, when a new generation is released, its often a huge technology jump. So, their hash rate is much higher, and energy efficiency is better than the previous generation. That means that those manufacturers have spent a lot of money to research and develop it. Their machines are often quite expensive, while GPUs are a lot more affordable.

Heller added that those using GPU miners have "much more flexibility in what you can mine." For instance, an Nvidia GeForce GTX 1080 Ti card, a popular choice, can mine more than 15 different currencies, while ASIC units normally support just one currency.

Nevertheless, the Ethereum network is not entirely immune to ASIC miners at least, in its current state. In April 2018, Bitmain released the Antminer E3, an ASIC produced specifically for mining Ethereum. Despite being a widely successful model that boasts a hash rate of 180 megahashes per second and power consumption of 800 Watt, it has received mixed reactions from the Ethereum community. A substantial part of GPU rig owners seemed to have suffered from loss of profits once ASICs were plugged in, while some were even forced to switch over to different networks.

"It's in the Whitepaper that ETH shall be ASIC resistant. I hope said whitepaper stands for something," was one of the top comments in a r/EtherMining thread discussing the Antminer E3 around the time it was announced. "800 usd only for 180mh," a different Reddit user argued. "Hardfork or die eth."

Some Ethereum users went on to suggest that Bitmain's mining device could lead to greater centralization and thereby increase the chance of a 51% attack. Soon, a group of developers proposed programmatic proof-of-work, or ProgPoW, an extension of the current Ethereum algorithm, Ethash, designed to make GPUs more competitive, thereby promoting decentralization.

According to a March paper co-authored by Kristy-Leigh Minehan, a co-creator of ProgPoW, around 40% of Ethereum's hash rate is generated by Bitmain ASICs. Alejandro De La Torre, the vice president of Poolin, the sixth-largest pool for ETH, confirmed to Cointelegraph that GPU mining is still dominant for the Ethereum network, adding:

At present, the profit of ETH mining is not high, and the management threshold and cost of GPU devices are higher than that of Asic devices. Compared with Asic devices, however, GPU devices are more flexible as in, you can switch to other coins with different algos.

ProgPoW has not been integrated into Ethereum yet, and it is unclear when that will eventually happen; in March, core Ethereum developers debated for almost two hours whether ProgPoW would actually benefit the network and failed to reach a consensus. Notably, a Bitmain representative previously told Cointelegraph that the mining hardware giant doesn't plan to extend the Antminer E3's lifespan to operate after October 2020: "As far as we know, mining will approximately end during October or sometime after this."

Indeed, Ethereum will move away from mining in the future. Scheduled to launch later in 2020, Ethereum 2.0 is a major network upgrade that is designed to shift the current PoW consensus algorithm to PoS, where miners are virtual and referred to as block validators.

More specifically, validators are randomly selected, with the selection weighted by each user's wealth in the network, or stake. In other words, the more coins PoS validators choose to stake, the more coins they accumulate as a reward.
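
A toy sketch of stake-weighted selection is shown below; the actual Ethereum 2.0 beacon chain uses committees and a RANDAO-based randomness beacon, so this only illustrates the basic idea that selection probability scales with stake.

```python
# Stake-weighted selection: the chance of being picked to propose a block is
# proportional to the validator's stake.
import random
from collections import Counter

stakes = {"alice": 32, "bob": 64, "carol": 320}      # ETH staked per validator

picks = Counter(
    random.choices(list(stakes), weights=list(stakes.values()), k=10_000)
)
for validator, count in picks.most_common():
    share = stakes[validator] / sum(stakes.values())
    print(f"{validator}: selected {count} times (~{share:.0%} of total stake)")
```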

According to Ethereum co-founder Vitalik Buterin, the network will become more secure and costly to attack than Bitcoin's as a result of the transition, although the debate over which consensus algorithm is better has been around for years in the crypto community. However, it's still unclear when the launch of Ethereum 2.0 will take place, as numerous bugs and management problems are reportedly delaying the process.

Related: Ethereum 2.0 Release Date Set for the Eleventh Hour as Issues Persist

Another supposed benefit of a PoS system is that it's much more energy-efficient than PoW blockchains. According to data from Digiconomist, Bitcoin's annualized energy footprint is 59.31 terawatt-hours, which is comparable to the power consumption of the entire country of Greece. However, Bitcoin might not be as bad for the environment as it seems, thanks to a July 2019 report that estimated 74% of Bitcoin mining is done using renewable sources of energy.

What will happen to actual Ethereum miners? According to the documentation of the Casper upgrade that is part of the Ethereum 2.0 roadmap, the network will initially support a hybrid model involving both PoW and PoS, therefore leaving some space for both block validators and GPU/ASIC miners. "There will certainly be a transition period where both networks are running," Jack O'Holleran, the CEO of the Skale Network, a blockchain platform based on Ethereum, told Cointelegraph, elaborating that this process will take some time:

It will certainly take time for the majority of ETH1 to transition into ETH2, potentially years, not months. The good news about the slowness of this transition is that DApps and DeFi platforms will be able to move over at their leisure based on real-world evidence of viability, security and adoption. This is a net positive for the Ethereum ecosystem.

Once Ethereum runs fully on the PoS rails, miners will have two options. One is to sell the equipment and use that money to accumulate more ETH and start staking, while the other option, which is available exclusively for GPU miners, is to simply switch over to other Ethash networks and mine altcoins. Nick Foster, a representative for United States-based mining equipment dealer Kaboomracks, told Cointelegraph that most ETH miners will pick the latter option:

"I would say most miners are not really into mining to get ETH or a specific coin. Yes, a certain number mine and hold, but I would argue against the notion that a large population of altcoin miners hold their coins for any amount of time."

Foster went on to describe how he switched to mining Ravencoin (RVN), a GPU-minable peer-to-peer blockchain asset, with his 3GB GPU unit once it became unprofitable to mine ETH: "It's mining raven, and I sell to BTC instantly for stability's sake and sell to USD to pay my power right after. I would say lots of people are employing a strategy like this."

As Foster summarized, he expects ETH miners to hop off the network, while new players, those who didn't invest in the power infrastructure or the rigs, will be staking ETH. He described the following scenario:

I can't imagine how much of a dork I would be if I found a five-year lease with $0.04 power, and I was mining ETH and I decided to sell everything and just keep paying my lease so I could stake ETH as a replacement.

Marc Fresa, the founder of mining firmware company Asic.to, agreed with that sentiment in a conversation with Cointelegraph: "If you're invested into mining, you don't want staking since you have the buildout for it."

One of the major altcoins that might benefit from PoW miners leaving Ethereum is Ethereum Classic (ETC), a more conservative version of the blockchain that reportedly has no PoS-related plans. Since it also runs on the Ethash algorithm, its hash rate might experience a significant spike as a result of the potential miner migration caused by the Ethereum 2.0 launch.

Related: Ethereum 2.0 Staking, Explained

Larger mining pools for ETH are left with similar options. When asked about his company's post-PoW plans for Ethereum, Heller told Cointelegraph that F2Pool launched a sister company called stake.fish earlier in 2018, following the Ethereum PoS upgrade announcement. Because the switch has been delayed numerous times, stake.fish has started offering staking services for other PoS and delegated PoS projects like Tezos (XTZ), Cosmos (ATOM) and Cardano (ADA). As for Poolin, it may temporarily give up supporting ETH mining as a result of the transition to PoS, De La Torre told Cointelegraph.

Other top ETH mining pools, namely Nanopool, Ethermine, Mining Pool Hub, SparkPool and SpiderPool, have not responded to Cointelegraph's requests for comment.

As for Ethereum's ecosystem at large, experts reassure that the transition to PoS will be conducted in an uncomplicated fashion, and network participants (casual users and decentralized applications built on top of Ethereum) will hardly notice the change. Viktor Bunin, a protocol specialist at blockchain infrastructure firm and Libra Association member Bison Trails, echoed that sentiment in a conversation with Cointelegraph, adding:

The Ethereum mainnet we know today is expected to be added as a shard on ETH2 in Phase 1.5. All that will change is the consensus mechanism, so DApps and users shouldn't notice any change.

Bunin went on, stating: "Any concerns that the network will split, with some folks remaining on the PoW chain, or that DApps will experience disruption, are overblown." Furthermore, O'Holleran told Cointelegraph that ETH 2 is a new network that will run on a new token and a new inflation model, elaborating:

The connection is that it will all be composable and compatible with the Ethereum ecosystem and that tokens from the first network can be burned and replaced with tokens from the second network. What this means is that DApps and users will not be directly impacted until they manually switch networks. The indirect and immediate impact will be in relation to how the supply and perceived value impact the price of tokens on both networks.

As for now, it is clear that there shouldn't be a shortage of Ethereum block validators. According to a recent report by cryptocurrency analytics firm Arcane Research, the number of Ethereum wallet addresses that hold at least 32 ETH, the minimum amount required for staking, is approaching 120,000.


Justice Gets 15 Guilty Pleas for International Crime Ring that Laundered Money Through Cryptocurrency Exchanges – Nextgov

Fifteen people entered guilty pleas for involvement in an international scam that posted fraudulent auctions online and laundered money through cryptocurrency exchanges, according to the Justice Department.

One expert says the case could serve as a template for nation-state actors using cryptocurrency exchanges to cover their tracks more in the future.

"Today's modern cybercriminals rely on increasingly sophisticated techniques to defraud victims, often masquerading as legitimate businesses," said Assistant Attorney General Brian A. Benczkowski of the Justice Department's Criminal Division in a press release Thursday. "These guilty pleas demonstrate that the United States will hold accountable foreign and domestic criminal enterprises and their enablers, including crooked bitcoin exchanges that swindle the American public."

Law enforcement officials have noted that increased use of cryptocurrencies like Bitcoin has made it especially challenging to nail down cybercriminals. But while cryptocurrency transactions hide the real-world identities of the parties behind addresses, those transactions are permanently and publicly recorded across a large number of computers. It's possible to study them and identify patterns in transactions that could eventually lead to the identity of a suspect, and law enforcement might be getting better at this.

"Compared to 2016, in 2019 and 2020, we're seeing more cases where it's clear that law enforcement is following this stuff and is not totally dumbfounded because something involves a Bitcoin address," Yaya Fanusie, a former CIA analyst and current senior fellow at the Center for a New American Security, told Nextgov. "It gives us the assurance that at least U.S. federal law enforcement is capable of doing an investigation that involves cryptocurrency."

In the case Justice highlighted Thursday, the defendants, individuals in their 30s based in both Romania and the United States, admitted their involvement in a scheme where they established their own cryptocurrency exchange and used it as a passthrough for traditional payments they got from advertising non-existent high-value items such as cars on auction sites such as eBay and Craigslist.

According to court documents, members of the conspiracy created fictitious online accounts to post these advertisements and communicate with victims, often using the stolen identities of Americans to do so, the press release noted.

The defendants also used IP-address anonymizing services, according to court documents. Fanusie said cryptocurrency exchanges are part of an ecosystem primed for cyber malfeasance.

"Cyber services like domain names, [virtual private networks], servers, that infrastructure is ready-made to be leveraged through cryptocurrencies," he said. "It's already an environment that invites anonymous use. Cryptocurrencies are the native money of the internet. So if we know that we're going to have more cyber threats, it makes sense that cryptocurrencies are going to play a part."

So it's good that law enforcement doesn't seem daunted by the changing landscape of criminal activity, he says, because "this time [a cryptocurrency exchange] was being used by criminal fraudsters, but there are definitely parallels in what we've already seen from nation-state actors."

Fanusie has written about how cryptocurrency exchanges were used to launder and steal money and delay law enforcement's identification of suspected hackers in high-profile cases involving persons affiliated with China, North Korea and Russia.

He said the 2018 indictment leading up to Thursday's guilty pleas is almost a blueprint for how nation-state actors could be thinking about running their operations, adding that the case confirms that the main thing to look out for is not so much fundraising; the biggest thing is that cryptocurrency is one part of a laundering process. It's to move the money to somewhere else so you lose the trace.

For now, law enforcement officials seem empowered by their success after an investigation that involved the U.S. Secret Service, Kentucky State Police, Lexington Police Department, IRS Criminal Investigation, and U.S. Postal Inspection Service, the Justice Departments Organized Crime Drug Enforcement Task Forces and International Organized Crime Intelligence and Operations Center, as well as the Romanian National Police Service for Combating Cybercrime and the Romanian Directorate for Investigating Organized Crime and Terrorism.

"Through the use of digital currencies and trans-border organizational strategies, this criminal syndicate believed they were beyond the reach of law enforcement," said Assistant Director Michael D'Ambrosio, U.S. Secret Service, Office of Investigations. "However, as this successful investigation clearly illustrates, with sustained, international cooperation, we can effectively hold cybercriminals accountable for their actions, no matter where they reside. I commend the hard work and perseverance of all those who joined together in this investigation and prosecution. This includes our partners in Europe, as well as those closer to home."

It was also helpful that Romanian officials secured and coordinated the arrests and extraditions from that country of more than a dozen defendants, the release said.

The 15 defendants who have pleaded guilty in this case have yet to be sentenced, the release notes. Two other defendants in the case are scheduled for trial starting on Sept. 14, 2020, before the Honorable Robert E. Wier of the U.S. District Court for the Eastern District of Kentucky. Three others are fugitives. This case is being prosecuted by Senior Trial Attorney Timothy C. Flowers and Senior Counsel Frank H. Lin of the Criminal Division's Computer Crime and Intellectual Property Section and Assistant U.S. Attorneys Kathryn M. Anderson and Kenneth R. Taylor of the U.S. Attorney's Office for the Eastern District of Kentucky.


Why Bitcoin Suddenly Dropped 6% on Thursday – CoinDesk

The week-long calm in the bitcoin market ended with a sudden $800 price drop on Thursday.

The over-6% drop saw the top cryptocurrency by market value register its biggest single-day decline in two weeks, according to CoinDesk's Bitcoin Price Index. Prices briefly hit lows near $9,100, a level last seen on May 27.

There are three likely factors as to why this happened:

Stock market sell-off

Global equities cratered and traditional safe havens like U.S. government bonds and the Japanese yen gained value as comments by the U.S. Federal Reserve that the economy may take years to recover gave a reality check to investors hoping for a V-shaped recovery.

Bitcoin initially showed resilience by holding above $9,700 during the Asian and European trading hours. However, the sell-off in U.S. equities was too big to ignore for the crypto market traders some of whom likely offered bitcoin on the fear that financial markets could be about to witness another round of panic like that seen in March.

The Dow Jones Industrial Average (DJIA) fell by 1,800 points on Thursday, reviving memories of multiple 1000 point drops seen during the first half of March.

A few observers had warned of an impending price drop in conversation with CoinDesk during Thursday's European trading hours. At that time, bitcoin was trading near $9,800.

"A switch to risk-off in global markets could lead to further downside pressure for major cryptocurrencies," Matthew Dibb, co-founder of Stack, a provider of cryptocurrency trackers and index funds, told CoinDesk.

Dump fears

Big on-chain transactions, especially ones related to controversial wallets and addresses, can create panic in the cryptocurrency markets. That's because, in the past, malicious entities have liquidated stolen coins in the market, causing sudden price declines.

On Thursday, hackers moved over 400 BTC (or $4.1 million worth of cryptocurrency) stolen from the cryptocurrency exchange Bitfinex to unknown wallets, according to Twitter bot Whale Alert.

These transfers happened in 20 transactions during the Asian hours and were noted by the crypto market community. A few investors then began speculating about a price dump. At that time, bitcoin was hovering around $9,900.

Another big transaction worth $1.3 billion executed by an unknown wallet also elicited a similar response from the investor community.

Fears that so-called whales are preparing to dump large numbers of coins may have caused some bulls to exit the market. Further, savvy traders may have taken short positions in anticipation of the big dump, likely accentuating bearish pressures.

Charts leaned bearish

Technical traders had a strong reason to sell bitcoins, as the charts were reporting uptrend exhaustion.

The cryptocurrency has failed multiple times to establish a lasting foothold above $10,000 since the May 11 mining reward halving. Markets often test dip demand following multiple rejections at key resistance.

A bearish divergence of a key three-day chart indicator was also suggesting scope for a price pullback.

Thursday's price decline has only strengthened the case for a deeper pullback. The slide to $9,100 marked a downside break of the eight-day restricted trading range of $9,350 to $10,000.

Additionally, the daily chart's relative strength index has dropped into bearish territory below 50. Analysts see strong support around $9,100, which, if breached, would invite stronger selling pressure.

"First support comes from the weekly downtrend resistance line which bitcoin broke and has been sitting above the last few weeks," said Chris Thomas, head of digital assets at Swissquote Bank. "This week the level is around $9,000-$9,100, hence [we're] likely to see good buying here, then $8,700 & $8,200; otherwise, the next downside zone is $6,500-$7,000."

At press time, bitcoin is changing hands near $9,440. The price bounce from Thursdays low may be associated with the 1% gain in the S&P 500 futures.

Disclosure:The author holds no cryptocurrency at the time of writing.

The leader in blockchain news, CoinDesk is a media outlet that strives for the highest journalistic standards and abides by a strict set of editorial policies. CoinDesk is an independent operating subsidiary of Digital Currency Group, which invests in cryptocurrencies and blockchain startups.
