WekaIO Recognized as One of CRN’s Top 100 Storage Vendors for 2020 – AiThority

WekaIO, the innovation leader in high-performance and scalable file storage for data-intensive applications, announced that it is being recognized by CRN, a brand of The Channel Company, in its first-ever 2020 Storage 100 list. This new list, carefully chosen by a panel of respected CRN editors, acknowledges leading storage vendors that offer transformative, cutting-edge solutions.

According to CRN, not only do these Storage 100 companies push the boundaries of innovation, but the list itself is also a valuable tool for solution providers looking to find vendors who can guide them through the intricate storage technology market. The Storage 100 list will become an annual reference for solution providers who are seeking out vendors offering superior storage solutions in areas such as software-defined storage, data protection, data management, and storage components.

Recommended AI News:Alcott Enterprises Announces the Formation of Its Leadership and Advisory Board

"CRN's Storage 100 list is our newest recognition of the best of the best in storage innovation," said Bob Skelley, CEO of The Channel Company. "These companies are at the forefront of storage technology advancements, delivering state-of-the-art solutions built for the future. We acknowledge and congratulate them for their investment in R&D, engineering, and innovation. Their efforts enable solution providers to offer the best technology for their customers."

"Our flagship solution, the Weka File System, is revolutionizing the storage world by breaking through the limitations of previous-generation products," said Liran Zvibel, CEO and co-founder at WekaIO. "WekaFS was uniquely built for organizations that solve big problems in their industry and demand datacenter agility. We deliver that by running on-premises, in the cloud, or with a hybrid approach; and our customers get unprecedented throughput and low-latency performance with any InfiniBand- or Ethernet-enabled CPU- or GPU-based cluster. Furthermore, we provide high security with state-of-the-art encryption, enterprise features, and the ease of use of shared NAS, including multiprotocol support for NFS and SMB."

Recommended AI News:Datapred Raises Series A From JOIN Capital

"Today's data-intensive applications, stemming from artificial intelligence (AI), machine learning (ML), analytics, and genomics workloads, have placed extraordinary pressure on IT infrastructure, demanding highly scalable storage that delivers extreme performance," added Barbara Murphy, vice president of marketing at WekaIO. "Weka delivers the industry's best performance at any scale, with 10x the performance of legacy network-attached storage (NAS) systems and 3x the performance of local server storage. The current release introduces additional security and management features: encryption that ensures that data is kept safe both in-flight and at-rest, and snapshot-to-object that facilitates workload migration, disaster recovery, and archiving."

"WekaFS was purpose-built for high-performance technical computing and data-intensive applications. Our clients across industries see immediate business value in how WekaFS can get them to the next level in gleaning value from their data," said Frederic Van Haren, CTO of HighFens, a Weka Innovation Network (WIN) Leader partner.

Recommended AI News:Veteran Growth Executive John Connolly Joins SmartDrive Board of Directors

Originally posted here:
WekaIO Recognized as One of CRN's Top 100 Storage Vendors for 2020 - AiThority

Read More..

Global Machine Learning Market expected to grow to USD XX.X million by 2025, at a CAGR of XX% during forecast period: Microsoft, IBM, SAP, SAS, Google,…

This detailed research report on the global Machine Learning market offers a concrete and thorough compilation of analysis, synthesis, and interpretation of data gathered from a diverse range of reliable sources and data-gathering points. The report provides a broad segmentation of the market by application, type, and geographical region.

In addition, the information has been analysed with the help of primary as well as secondary research methodologies to offer a holistic view of the target market. Likewise, the report offers an in-house analysis of global economic conditions and related economic factors and indicators to evaluate their historical impact on the Machine Learning market.

This study covers following key players:

Microsoft, IBM, SAP, SAS, Google, Amazon Web Services, Baidu, BigML, Fair Isaac Corporation (FICO), HPE, Intel, KNIME, RapidMiner, Angoss, H2O.ai, Oracle, Domino Data Lab, Dataiku, Luminoso, TrademarkVision, Fractal Analytics, TIBCO, Teradata, Dell

Request a sample of this report @ https://www.orbismarketreports.com/sample-request/61812?utm_source=Puja

The report assembles the vital factors that shape market size and growth trends, and offers in-depth sections on opportunity mapping and barrier analysis, encouraging readers to pursue growth in the global Machine Learning market. It focuses largely on prominent facets such as product portfolio, payment channels, service offerings, applications, and technological sophistication. All the notable market-specific dimensions are studied at length to arrive at conclusive insights, and the report also includes critical coverage of notable developments and growth estimates across regions.

Beyond these factors and attributes, the report decodes notable findings on the many factors and growth-stimulating decisions that make the Machine Learning market highly profitable. Essential elements such as drivers, threats, challenges, and opportunities are thoroughly assessed to arrive at logical conclusions, and a dedicated regional overview identifies lucrative growth hubs. The leading players are analysed at length, complete with their product portfolios and company profiles, to decipher crucial market findings.

Access Complete Report @ https://www.orbismarketreports.com/global-machine-learning-market-size-status-and-forecast-2019-2025-2?utm_source=Puja

Market segment by Type, the product can be split into

Professional Services, Managed Services

Market segment by Application, split into

BFSI, Healthcare and Life Sciences, Retail, Telecommunication, Government and Defense, Manufacturing, Energy and Utilities

The report also documents significant analytical practices and industry-specific frameworks, such as SWOT and PESTEL analysis, to guide profitable decision-making in the Machine Learning market. In addition, it sheds light on the segmentation by which the market has been systematically split into prominent segments covering type, application, technology, and region.

Some Major TOC Points:

1 Report Overview

2 Global Growth Trends

3 Market Share by Key Players

4 Breakdown Data by Type and Application

Continued...

For Enquiry before buying report @ https://www.orbismarketreports.com/enquiry-before-buying/61812?utm_source=Puja

About Us :

With unfailing market-gauging skills, we have been excelling in curating tailored business intelligence data across industry verticals. Constantly striving to expand our skill development, our strength lies in dedicated intellectuals with dynamic problem-solving intent, ever willing to mold boundaries to scale heights in market interpretation.

Contact Us :

Hector Costello

Senior Manager, Client Engagements

4144 N Central Expressway, Suite 600, Dallas, Texas 75204, U.S.A.

Phone No.: USA: +1 (972)-362-8199 | IND: +91 895 659 5155

See the original post here:
Global Machine Learning Market expected to grow USD XX.X million by 2025 , at a CAGR of XX% during forecast period: Microsoft, IBM, SAP, SAS, Google,...

Read More..

Artificial Intelligence Is Going to Revolutionize the Executive Search World – BRINK

Today's machine learning and predictive analytics technologies are about to bring revolutionary changes to the executive search industry.

From the 1950s to the mid-1990s, executive recruiters sourced candidates by leveraging their Rolodexes; they made a lot of phone calls, starting with people they knew and requesting possible candidates and referrals. Their success as recruiters was largely governed by their personal network.

Internet job boards and resume databases began to change this paradigm.

For the first time, information about the workforce became freely available to diligent researchers. LinkedIn, for example, with its hundreds of millions of active profiles, allows recruiters to consider sources and candidates outside their phone networks. But combing through LinkedIn is an eye-wateringly laborious process; for every person of interest there are tens of thousands who sound similar but are not, and the information is not always accurate or up to date.

This is one of the reasons why recruiters are generally supported by large research teams and why the average search still takes three to five months from inception to completion.

Today's machine learning and predictive analytics technologies, however, with their ability to sift through huge volumes of data with previously unimaginable speed and precision, are about to bring revolutionary changes to the search world.

For the executive search industry, AI's most imminent and revolutionary application will be its ability to compile large, constantly evolving data sets and draw inferential deductions from that data.

Though it would have seemed impossible just a few years ago, AI algorithms can now aggregate personal and organizational profiles from billions of social, public and enterprise sources and use them to build a continuously updated portrait of the labor market.

Odgers Berndtson's proprietary database, for example, updates every 30 to 45 days, adding 600,000 new executive profiles a month.

This data portrait, valuable in its own right, is then subjected to a highly nuanced machine learning engine, which can contextualize company and candidate profiles across a wide variety of key metrics.

Whereas a keyword-matching system measures a candidate against a few pre-programmed words deemed necessary for a role, proper machine learning tools can understand candidates and companies in the context of their ecosystem and make inferential deductions about their qualities, relationships and likely behavior.

In practice, this means two things.

First: AI-enabled search consultants have on-demand access to millions of corporate and candidate profiles.

Second: They have on-demand access to nuanced and customizable evaluations of those profiles and the relationships between them.

AI algorithms are capable of completing millions of pattern-matching comparisons per second and in some cases have seen and compared as many as two billion career progressions. They make complex and qualitative inferences about individual and corporate profiles and can do so on an incredible scale.

What this means, in practice, is that AI can evaluate candidates and companies with incredible precision.

Rather than simply filtering candidates by static traditional metrics (job experience, education, diversity) and leaving humans to make qualitative inferences, AI can identify candidates who've demonstrated patterns of excellence over the course of their careers.

It can sort relevant candidates by their likelihood to be interested in a new position.

And it can provide a quantitative and contextually comprehensive understanding of the moves of successful candidates (going from one company to another over the last fifteen years, for example).

AI will be for executive search firms what the first tractors were to farmers: It wont change the substance of what search firms do, but it will allow them to do a better job faster.

Rather than spending weeks building a comprehensive, three-dimensional, long list of candidates, todays AI-enabled recruiters can compile nuanced long lists of candidates simply by feeding AI with a perfect profile and having it sweep through the database, identifying profiles that have similar skills, career trajectories and job titles.
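A toy sketch of that database sweep, assuming profiles are reduced to numeric feature vectors and similarity is plain cosine similarity; the features and values below are invented purely for illustration:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical features: [years_experience, team_size_led, num_promotions]
ideal = [15, 200, 4]  # the "perfect profile" fed to the system
candidates = {
    "candidate_a": [14, 180, 4],
    "candidate_b": [3, 10, 1],
    "candidate_c": [16, 250, 5],
}

# Sweep the database, ranking every profile by similarity to the ideal.
ranked = sorted(candidates, key=lambda c: cosine(candidates[c], ideal), reverse=True)
print(ranked)
```

A production system would of course use learned embeddings rather than hand-picked features, but the ranking step works the same way.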

This added efficiency will noticeably shorten the time and resources firms put into the front end of each search, freeing recruiters to focus on value-adding aspects of the job like candidate development, contract negotiation and onboarding.

In the long term, as AI becomes more ubiquitous, these efficiencies may shift industry expectations about search durations, decreasing the average project length from months to weeks.

These efficiency gains have structural implications for the recruiting landscape, particularly at the middle and lower ends of the hiring pyramid where commonalities across searches lend themselves to comprehensive automation.

Because machine learning algorithms learn from the tasks they accomplish, by the time an algorithm has finished 100 comptroller searches for 100 industrial companies, it will be pretty good at distinguishing between long-list and finalist-quality candidates.

At the executive level, however, each search is unique and even the minor differences between finalist candidates will have major implications for a clients future. AI will play a major role in the early phase of these searches, but its influence will fade in later stages.

AI has the ability to hugely reduce human bias in all levels of the talent acquisition landscape.

A search firm can now, for example, conduct the whole research phase of a project without knowing the candidates' names, ethnicities, genders, sexual orientations, or places of origin. Candidate masking of this sort helps to reduce unconscious human biases and makes it far easier to embed diversity into the search process, allowing for real and numbers-based accountability in diversity efforts.
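A minimal sketch of what such candidate masking might look like in code; the field names are hypothetical, and a real pipeline would also have to handle indirect identifiers:

```python
import itertools

# Fields that could reveal protected or bias-prone attributes (hypothetical list).
MASKED_FIELDS = {"name", "ethnicity", "gender", "sexual_orientation", "birthplace"}

_anon_ids = itertools.count(1)

def mask_profile(profile):
    # Return a copy of the profile with identifying fields stripped and an
    # anonymous id added so the candidate can be de-masked at a later stage.
    masked = {k: v for k, v in profile.items() if k not in MASKED_FIELDS}
    masked["candidate_id"] = f"anon-{next(_anon_ids):04d}"
    return masked

profile = {
    "name": "Jane Doe",
    "gender": "F",
    "birthplace": "Dublin",
    "years_experience": 18,
    "industry": "banking",
}
print(mask_profile(profile))  # identifying fields gone, candidate_id added
```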

AI's far-reaching intelligence and numerical rationality can also help to combat other human biases, like those that favor some collegiate institutions over others.

An AI algorithm can be taught to draw its own conclusions about performance and quality; it makes judgments without relying on limited polls, human opinions or historic reputations. It can, for example, weigh an Ivy League university against a small, little-known college on an unbiased scale.

Because AI is working with real data, however, and because that data is generated by and reflective of a society in which bias has played a structurally organizing role, AI can accidentally perpetuate, rather than surpass, human prejudice.

To circumvent this and ensure that AI is not perpetuating the prejudices implicit in human society, AI algorithms can be trained to develop strategies to identify, quantify and work around the biases they find.

Rather than simply evaluating individual performance in a diversity-blind way, for example, AI can measure the overall historic relationship between employees of diverse backgrounds and the companies they've worked for, analyzing (a) how bias interacts with their career progressions, and (b) how each candidate ranks relative to the others in that same context.

In other words, it can look at whether a company seems to exhibit bias against certain employees, then judge those employees in ways that take these biases against them into account. This gives promising candidates of diverse backgrounds a way of being found by the algorithm, even when systemic bias would otherwise negatively impact their visibility.
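Read literally, that is a two-step procedure: estimate a company-level bias offset from historical rating gaps, then compensate affected candidates by that offset when ranking. A sketch, with an entirely hypothetical scoring scheme invented for illustration:

```python
def company_bias_offset(ratings):
    # ratings: list of (score, in_affected_group) pairs from one company's history.
    # The offset is the average score gap between the unaffected and affected groups.
    affected = [s for s, g in ratings if g]
    others = [s for s, g in ratings if not g]
    if not affected or not others:
        return 0.0
    return sum(others) / len(others) - sum(affected) / len(affected)

# Hypothetical review history for one company.
history = [(8.0, False), (7.8, False), (6.0, True), (6.2, True)]
offset = company_bias_offset(history)  # affected group rated ~1.8 points lower

def adjusted_score(raw_score, in_affected_group):
    # Compensate affected candidates by the estimated company-level offset.
    return raw_score + offset if in_affected_group else raw_score

print(round(offset, 1))
print(round(adjusted_score(6.1, True), 1))
```

Real systems would need far more careful statistics (sample sizes, confounders), but the shape of the correction is the same.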

The fact that these algorithms can be used to produce shortlists, pipelines and talent market maps of distinct subsets of the labor market is revolutionary.

For example, it will soon be feasible to identify roughly how many Native Americans have worked in New York's investment banks over the last decade: what roles they've had, how they performed and who the top performers were. That is valuable data. And as search firms get better at building and maintaining their AI databases, they will begin selling market insights like this as a commodity.

Though AI will streamline the search business, and though it may eventually be technically capable of removing humans from the equation, it is unlikely to fully obviate the need for human interaction.

Executive recruiters are valued not simply for their ability to find candidates, but for their ability to negotiate the details of recruiting packages for candidates and clients. They are, in a sense, allies to both sides.

To the candidate, a recruiter serves as a coach, career adviser and advocate; to the client, they are a market expert, deal negotiator and strategy consultant. Most importantly, executive search professionals are good at finding the best candidate for the client, then persuading this candidate that the role is important, that they are uniquely able to fill it and that this is an opportunity they should consider. And they do this by contextualizing data with narrative.

AI does not compare to humans in this sphere; it cannot take information about a candidate, a client or a strategy and turn it into the kind of compelling, fact-supported story with which humans make important decisions. But this is exactly what executive search consultants have done for clients and candidates since the industry's inception: They tell stories.

They tell stories about the candidate's career and how this job is its logical next chapter; they tell stories about the role itself, how it interacts with the company's goals and how the candidate is acutely qualified for it; and they tell stories about the company, what it stands for, where it's going and how being a member of that team will inform the candidate's own career.

What AI can do is enrich the details in the storytelling.

See more here:
Artificial Intelligence Is Going to Revolutionize the Executive Search World - BRINK

Read More..

RAND report finds that, like fusion power and Half Life 3, quantum computing is still 15 years away – The Register

Quantum computers pose an "urgent but manageable" threat to the security of modern communications systems, according to a report published Thursday by the influential US RAND Corporation.

The non-profit think tank's report, "Securing Communications in the Quantum Computing Age: Managing the Risks to Encryption," urges the US government to act quickly because quantum code-breaking could be a thing in, say, 12-15 years.

"If adequate implementation of new security measures has not taken place by the time capable quantum computers are developed, it may become impossible to ensure secure authentication and communication privacy without major, disruptive changes," said Michael Vermeer, a RAND scientist and lead author of the report, in a statement.

Experts in the field of quantum computing like University of Texas at Austin computer scientist Scott Aaronson have proposed an even hazier timeline.

Noting that the quantum computers built by Google and IBM have been in the neighborhood of 50 to 100 quantum bits (qubits), and that running Shor's algorithm to break public-key RSA cryptosystems would probably take several thousand logical qubits (meaning millions of physical qubits, due to error correction), Aaronson recently opined, "I don't think anyone is close to that, and we have no idea how long it will take."
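The logical-to-physical gap in Aaronson's estimate comes from error-correction overhead, and the back-of-the-envelope arithmetic is simple. A sketch, where the specific overhead figure is an illustrative assumption rather than a hardware spec:

```python
# Rough sketch: error correction multiplies each logical qubit into many
# physical qubits. All figures below are illustrative order-of-magnitude values.
logical_qubits = 4000        # "several thousand" logical qubits for Shor's on RSA
physical_per_logical = 1000  # assumed error-correction overhead per logical qubit

physical_qubits = logical_qubits * physical_per_logical
todays_machines = 100        # qubit counts of current Google/IBM devices, per the article

print(f"{physical_qubits:,} physical qubits needed")          # 4,000,000 physical qubits needed
print(f"~{physical_qubits // todays_machines:,}x beyond today's machines")  # ~40,000x
```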

But other boffins, like University of Chicago computer science professor Diana Franklin, have suggested Shor's algorithm might be a possibility in a decade and a half.

So even though quantum computing poses a theoretical threat to most current public-key cryptography (and less risk to lattice-based, symmetric, private-key, post-quantum, and quantum cryptography), there's not much consensus about how and when this threat might manifest itself.

Nonetheless, the National Institute of Standards and Technology, the US government agency overseeing tech standards, has been pushing the development of quantum-resistant cryptography since at least 2016. Last year it winnowed a list of proposed post-quantum crypto (PQC) algorithms down to a field of 26 contenders.

The RAND report anticipates quantum computers capable of crypto-cracking will be functional by 2033, with the caveat that experts propose dates both before and after that. PQC algorithm standards should gel within the next five years, with adoption not expected until the mid-to-late 2030s, or later.

But the amount of time required for the US and the rest of the world to fully implement those protocols to mitigate the risk of quantum crypto cracking may take longer still. Note that the US government is still running COBOL applications on ancient mainframes.

"If adequate implementation of PQC has not taken place by the time capable quantum computers are developed, it may become impossible to ensure secure authentication and communication privacy without major, disruptive changes to our infrastructure," the report says.

RAND's report further notes that consumer lack of awareness and indifference to the issue means there will be no civic demand for change.

Hence, the report urges federal leadership to protect consumers, perhaps unaware that Congress is considering the EARN-IT Act, which critics characterize as an "all-out assault on encryption."

"If we act in time with appropriate policies, risk reduction measures, and a collective urgency to prepare for the threat, then we have an opportunity for a future communications infrastructure that is as safe as or more safe than the current status quo, despite overlapping cyber threats from conventional and quantum computers," the report concludes.

It's worth recalling that a 2017 National Academies of Sciences, Engineering, and Medicine report, "Global Health and the Future Role of the United States," urged the US to maintain its focus on global health security and to prepare for infectious disease threats.

That was the same year nonprofit PATH issued a pandemic prevention report urging the US government to "maintain its leadership position backed up by the necessary resources to ensure continued vigilance against emerging pandemic threats, both at home and abroad."

The federal government's reaction to COVID-19 is a testament to the impact of reports from external organizations. We can only hope that the threat of crypto-cracking quantum computers elicits a response that's at least as vigorous.

See the original post:
RAND report finds that, like fusion power and Half Life 3, quantum computing is still 15 years away - The Register

Read More..

Quantum computing: When to expect the next major leap – TechRepublic

What's up next for quantum computing? Possibly weather forecasting and online dating.

Dan Patterson, a Senior Producer for CBS News and CNET, interviewed futurist Isaac Arthur about what's next for quantum computing. The following is an edited transcript of the interview.

Isaac Arthur: It's always hard to guess with computers, and we're a little bit spoiled by Moore's Law from the fifties and sixties just taking us from these really simple devices to what we have nowadays.

We do not want to make the same mistake we made with, for instance, nuclear fission and fusion, where we got the first development in 20 years and just assumed the next one would come in another 20.

Quantum computing might be many decades before we see any real major progress, but at the moment, we have made quite a few major leaps and actually are doing some real calculations with this.

SEE:Managing AI and ML in the enterprise 2020 (free PDF) (TechRepublic)

We have a whole bunch of problems in terms of making it better, though. The biggest one is actually getting the right answer out of it. As an example, if we were using the random source before--let's say I locked somebody inside a quantum box with a phone book, and I told them, 'I want you to find a phone number, and if you call this correct phone number and here's the phone number in this book, someone's going to come by and let you out of this box.'

That person is then given a random number generator, and we shut the box, and they search. A whole bunch of different quantum ghosts of them appear, searching various pages, but the one who finds the right one calls it, and the person comes and opens the door. That's one example of data extraction, though it would never work in actual reality, because quantum behavior doesn't hold at the macroscopic scale, but you can get errors from things like that.

First, imagine one of these quantum people searching that page didn't call the right number, but instead accidentally called a pizza delivery place that showed up and opened the door to deliver a pizza. Now, we have a wrong answer. We have things like this happen with quantum computing where we have an error, in terms of the data. We used to have this with normal computing too, but we solved it fairly early on.

This is probably going to be a lot harder to do, and in many ways, it's the hardest part other than actually keeping all of these particles entangled. It's not just a matter of keeping one particle in that state; we have to keep several thousand potentially--or millions--all entangled with each other simultaneously. This also requires keeping them at just a hair above absolute zero, temperature-wise. And then, of course, we have our third problem that has to be overcome, which is the software.
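The way classical computing "solved it fairly early on" was redundancy; the simplest version is a repetition code with majority-vote decoding. Quantum error correction is far more involved (qubits cannot simply be copied), but this classical sketch shows why redundancy multiplies the hardware cost:

```python
from collections import Counter

def encode(bit, copies=5):
    # Repetition code: store the same bit several times.
    return [bit] * copies

def decode(noisy):
    # Majority vote recovers the original bit as long as
    # fewer than half the copies were flipped.
    return Counter(noisy).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1  # one copy gets corrupted by noise
codeword[3] ^= 1  # and another
print(decode(codeword))  # still decodes to 1
```

Five physical bits buy tolerance of two flips for one logical bit; quantum codes pay a far steeper ratio, which is where the "millions of physical qubits" figure comes from.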

SEE: Augmented reality for business: Cheat sheet (free PDF) (TechRepublic)

All this runs on algorithms written on classical computers and fed into these things, and those algorithms are something we still have to do a lot of work on to improve, because we're not quite using the original pure algorithms like Shor's [algorithm], but ones we've had to adapt along the way. Those are the three areas--the software and the hardware areas are the ones that are really going to control the limitations on advancing.

How much bigger can we make the entangled system? How well can we actually pull the right answer, and how do we actually get the right algorithms to ask the right question, as well?

What we tend to think--you know, with the modern phone and the laptop--is that this would be something you have at your home, that you'd have a quantum computer, but in fact you'll probably never have a quantum computer in someone's house. They have to be run at such very low temperatures. Even though they are very small devices in terms of the entanglement, there's so much associated equipment that isn't likely to get miniaturized. Most likely, you would always have classical computers, and people would access a quantum computer through the cloud--you'd just buy time, or get time, on a quantum computer that you link up to.

The thing an individual person would be most likely to use it for would probably be something like encryption, but the stuff that we would actually get to see on our computers would probably be things like weather forecasting, for instance. It has a lot of potential to allow us to do way better weather forecasting than we do now.

There are a lot of other examples in terms of the science; there are great things. It might finally let us model something like abiogenesis in the deep oceans, which is one of those examples where our current models can't really keep up. We have approximation algorithms that we use to cover these really huge numbers, but they don't really seem to be up to snuff for covering things like the chemical interactions in the early deep oceans. And then those same algorithms, ironically enough, would be the kind of thing we'd use for dating services, in terms of finding the most optimal match for a person based on more than a simplified number of traits.

We have to simplify traits, normally. Here, we could actually have a thousand different traits with a thousand different subtypes, and a quantum computer could actually match up and optimize all of those. And then of course, there's the possibility of using it for election modeling.

Read the original:
Quantum computing: When to expect the next major leap - TechRepublic

Read More..

Cambridge Quantum Computing Performs the World’s First Quantum Natural Language Processing Experiment – Quantaneo, the Quantum Computing Source

This is the first time that natural language processing has been executed on a quantum computer. Furthermore, by achieving the results without relying on quantum RAM, CQC scientists have created a path to truly applicable quantum advantage within the Noisy Intermediate-Scale Quantum (NISQ) era.

By using CQC's class-leading and platform-agnostic retargetable compiler t|ket™, these programs were successfully executed on an IBM quantum computer, achieving meaning-aware and grammatically informed natural language processing, a dream of computer scientists since the earliest days of the computer age. CQC looks forward to providing further details in the near future, including ways to scale the programs so that meaningfully large numbers of sentences can be used on NISQ machines as they themselves scale in quantum volume, and to using other types of quantum computers.

The full article with details and links to the appropriate GitHub repository is noted here.

See the article here:
Cambridge Quantum Computing Performs the World's First Quantum Natural Language Processing Experiment - Quantaneo, the Quantum Computing Source

Read More..

Is 1 Bitcoin Enough for You to Retire On? This Analyst Thinks Yes – Bitcoinist

More analysts than ever are encouraging young people to take advantage of the current market dip and begin investing in Bitcoin for retirement. Whereas this idea is nothing new, current forces in the legacy financial space are making it more appealing. At least one analyst asserts that a single Bitcoin will provide a vastly better long-term return than traditional savings.

Over the course of the past forty years, retirement plans in developed countries have gradually shifted from defined-benefit programs, such as standard pension plans, to defined-contribution programs, such as 401(k)s. Whereas the wisdom of this transition is subject to debate, there is no question that millions now rely on some form of personal savings for most, if not all, of their retirement income.

For those with ample nest eggs, this arrangement has been fine. However, decades of low inflation and brief recessions have played a role in this success. Should the current global financial crisis result in a surge of inflation, retirees could find themselves in serious trouble.

For those still in the workforce, long-term devaluation of fiat currencies such as the dollar and the euro could be devastating. Years of prudent investment could disappear as the earning power of retirement savings evaporates. Analyst Davincij15 pointed this out in a recent tweet:

Simply put, he acknowledges the wisdom of beginning to save while young, yet notes that all may be for naught if inflation becomes a problem. Not surprisingly, he advocates Bitcoin as a possible hedge.
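The erosion he is warning about is just compound arithmetic. A quick sketch, where the balance and inflation rates are illustrative numbers rather than projections:

```python
def real_value(savings, annual_inflation, years):
    # Purchasing power of a nominal balance after sustained inflation.
    return savings / (1 + annual_inflation) ** years

nest_egg = 500_000  # hypothetical nominal savings at retirement

modest = real_value(nest_egg, 0.02, 20)  # ~$336k of today's purchasing power left
severe = real_value(nest_egg, 0.08, 20)  # ~$107k -- most of the value gone
print(round(modest), round(severe))
```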

Much has been said of Bitcoin as a potential safe haven during the current economic meltdown. However, the long-term consideration of this idea is far more notable. The fact that crypto ownership skews toward the young is well-known, and more than ever workers under 35 are choosing to add blockchain assets to their retirement portfolios.

Part of this trend is, of course, related to the belief that crypto will continue to vastly outperform traditional investments. However, these young investors may now be making this choice to protect their retirement savings from inflation and other economic downturns. In other words, crypto is likely to join assets such as gold and Treasury bonds as a component of a properly managed portfolio.

There is little doubt that Bitcoin and other cryptocurrencies are a permanent element of the global financial landscape. Now, more than ever, current events are giving legitimacy to this new asset class.

Do you think Bitcoin is the best retirement investment option available to us? Share what you think in the comments below.

Images via Aaron Burden from Unsplash, Twitter: @Davincij15

Read more here:
Is 1 Bitcoin Enough for You to Retire On? This Analyst Thinks Yes - Bitcoinist


Peter Schiff: In The Next Years Gold Will Rise More Than Bitcoin Because Bitcoin Price Will Crash – CryptoPotato

Peter Schiff, a well-known economist and renowned author, has taken another stab at Bitcoin. As he commonly compares the digital asset with gold, outlining its lack of intrinsic value, the expert now says that the precious metal will outperform Bitcoin as the latter will simply crash.

Bitcoin is commonly compared to gold in its property to serve as a store of value. In fact, the Chairman of the US Federal Reserve, Jerome Powell, referred to it as a speculative store of value, just like gold.

This doesn't seem to be the opinion of Peter Schiff, though, as he's been known for refuting Bitcoin's value.

He reiterated his views today, once again saying that Bitcoin will ultimately crash back to earth.

Asked to short Bitcoin as a means of putting his money where his mouth is, Schiff explained that he's already betting against the popular cryptocurrency.

"I own no Bitcoin and am long lots of gold and silver, and even larger positions in precious metals mining stocks. That's effectively a big bet against Bitcoin becoming the new gold, or taking safe haven/store of value market share away from gold," he said.

Sharing many qualities with gold, Bitcoin is also commonly referred to as digital gold. Indeed, the cryptocurrency is mined in the sense that it takes work to produce more of it, hence the Proof-of-Work (PoW) consensus algorithm.

Furthermore, Bitcoin is scarce, just like gold. Both assets have a predetermined amount: there is only so much gold in the earth and only so many bitcoins that will ever be minted. The difference is that we know how many bitcoins there will be (21 million), but not how much gold is left unmined.
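The "work" behind Proof-of-Work is easy to sketch: miners search for a nonce that makes the hash of the block data fall below a target. A toy illustration of the idea (real Bitcoin mining hashes an 80-byte block header with double SHA-256 against a vastly harder target; the leading-zero count here is just a stand-in for difficulty):

```python
import hashlib

def mine(data: str, difficulty: int) -> int:
    """Find a nonce so that sha256(data + nonce) starts with `difficulty` hex zeros."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra zero of difficulty multiplies the expected search effort by 16,
# which is why producing new coins costs real computation.
winning_nonce = mine("block payload", 4)
```

Verifying the result takes a single hash, while finding it takes many - that asymmetry is what makes the supply schedule costly to accelerate.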

Yet an interesting point was brought up in Schiff's Twitter thread: that gold is archaic, suited to non-digital societies, which ours is not. To that, Schiff argued that gold "only became money about 700 BC. So it's actually pretty modern. Plus, it's a metal, not a rock. Bitcoin will never be money. Not even cavemen would be dumb enough to accept it."


Continued here:
Peter Schiff: In The Next Years Gold Will Rise More Than Bitcoin Because Bitcoin Price Will Crash - CryptoPotato


Top 3 Price Prediction Bitcoin, Ethereum, Ripple: A consolidative phase before the bears return – FXStreet

The world's no. 1 digital coin, Bitcoin, continues to trade range-bound around $6,850 heading into the weekly close. Ethereum and Ripple also hold their recent trading ranges amid quiet Easter trading. Ripple, however, is outperforming the other top-three most heavily traded digital assets. The total market capitalization of the top 20 cryptocurrencies now stands at $198.85 billion, as cited by CoinMarketCap.

The top three coins could well resume Friday's corrective slide, with FXStreet's Confluence Detector tool suggesting the key technical levels to watch in the week ahead.

Amid a tug-of-war between the bulls and the bears so far this Easter, Bitcoin is likely to face immediate resistance at 6883, the confluence of the upper Bollinger Band on the 15-minute chart, the SMA 10 4H, and the previous high 1H. Further up, the next minor hurdle awaits around 6950, where the Fib 38.2% 1D and the upper Bollinger Band 1H coincide.

The buying interest will intensify above the latter, with the strong resistance at 7026 back in play. The barrier is the confluence of the Pivot Point 1D R1 and Fib 61.8% 1W.

Having said that, the downside appears more compelling amid a lack of substantial support levels. The immediate support is aligned at 6741, the previous week's low and the Pivot Point 1D S2.

A failure to hold above the 6740 area will expose the next support at 6527, the Pivot Point 1W S1.
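The pivot and Fibonacci levels cited here come from standard formulas: the daily pivot averages the prior session's high, low, and close, and retracement levels divide the prior range by the Fibonacci ratios. A sketch of both calculations (the sample high/low/close values are hypothetical, not FXStreet's actual inputs):

```python
def pivot_levels(high: float, low: float, close: float) -> dict:
    """Classic floor-trader pivot point with first/second support and resistance."""
    p = (high + low + close) / 3
    return {
        "P": p,
        "R1": 2 * p - low,
        "S1": 2 * p - high,
        "R2": p + (high - low),
        "S2": p - (high - low),
    }

def fib_retracements(high: float, low: float) -> dict:
    """Common retracement levels measured down from the high of a move."""
    rng = high - low
    return {f"{r:.1%}": high - rng * r for r in (0.382, 0.5, 0.618)}

# Hypothetical prior BTC session
levels = pivot_levels(high=6950.0, low=6740.0, close=6850.0)
fibs = fib_retracements(high=6950.0, low=6740.0)
```

A "confluence" in the article's sense is simply two or more such levels, computed over different timeframes, landing near the same price.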

At the current level of 157.80, any further upside attempts in Ethereum are likely to face stiff resistance at 158.58, a cluster of the Fib 38.2% 1D, SMA50 4H, and SMA50 1H.

Only a sustained move above that level would revive the recovery momentum from Fridays sell-off.

To the downside, the next support is the Fib 61.8% 1W at 153.88, below which a test of the Fib 38.2% 1M at 152.24 is on the cards.

Ripple is on track to conquer the symmetrical triangle pattern target near 0.1960, which also marks the key hurdle for the bulls. That level represents the Fib 61.8% 1M.

On its way to that target, a minor resistance at 0.1943, the intersection of the Fib 38.2% 1W and SMA50 1D, needs to be taken out.

Any pullbacks will likely remain shallow, as a number of support levels are stacked up, with the immediate one seen at 0.1900, the intersection of the Fib 38.2% 1D and SMA50 4H. A break below that would call for a test of 0.1883, where the Fib 61.8% 1W and 1D meet.

If the sellers regain complete control below the latter, a test of the strong support at the previous year's low of 0.1754 will be inevitable.

See all the cryptocurrency technical levels.

Follow this link:
Top 3 Price Prediction Bitcoin, Ethereum, Ripple: A consolidative phase before the bears return - FXStreet


$1,006,057,985 Bitcoin (BTC) Transfer Triggers Whale Watchers, Crypto Giant Reveals Motive Behind Transaction – The Daily Hodl

A huge Bitcoin (BTC) transaction is catching the eye of crypto whale watchers.

A pseudonymous crypto trader and analyst who goes by the name Krisma spotted the transfer of 146,500 Bitcoin, which is worth $1,006,057,985 at time of publishing.

As traders tried to determine the reason for the transfer, Bitfinex chief technology officer Paolo Ardoino revealed that the leading Hong Kong-based exchange is behind the transaction.

According to Ardoino, Bitfinex was shifting funds between its hot and cold wallets. The exchange transferred 15,000 BTC to its hot wallet, and the rest was routed back to Bitfinex's original cold wallet. The total cost of the billion-dollar transaction was just 69 cents.
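The quoted figures allow a quick sanity check: dividing the dollar value by the coin count recovers the BTC price at time of publishing, and comparing the 69-cent fee with the value moved shows just how cheap on-chain settlement at this scale was (the inputs are taken straight from the article; the calculation itself is only illustrative):

```python
total_usd = 1_006_057_985   # value of the transfer at time of publishing
btc_moved = 146_500         # coins moved in the transaction
fee_usd = 0.69              # total network fee paid

implied_price = total_usd / btc_moved   # works out to $6,867.29 per BTC
fee_fraction = fee_usd / total_usd      # fee as a share of value transferred

print(f"${implied_price:,.2f} per BTC")
print(f"fee = {fee_fraction:.2e} of the transfer")
```

The fee comes to well under a billionth of the amount moved, which is why the cost figure drew as much comment as the transfer itself.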

Ardoino says he's thinking of announcing all big transfers in the future to avoid causing confusion.

This isn't the first time a crypto exchange has caused a stir due to large, unannounced movements of crypto. Last year, the Seattle-based crypto exchange Bittrex was linked to one of the biggest movements of BTC on record.

The analytics company Glassnode says Bittrex moved about $9 billion in Bitcoin between its own wallets in the span of an hour, across 21 separate transactions.

Featured Image: Shutterstock/agsandrew

Read more:
$1,006,057,985 Bitcoin (BTC) Transfer Triggers Whale Watchers, Crypto Giant Reveals Motive Behind Transaction - The Daily Hodl
