
Cardano Overtakes Ripple To Become The Sixth Biggest Cryptocurrency By Market Cap – Ethereum World News

Cardano is rapidly evolving into one of the leading cryptocurrencies, having recently overtaken Ripple to become the sixth biggest cryptocurrency by market capitalization. According to crypto price tracker CoinMarketCap, Cardano's market cap currently sits at $22.57 billion, compared to Ripple's $20.19 billion.

Cardano Soars Ahead In Anticipation Of Its Upcoming Vasil Hard Fork Upgrade

Cardano's latest price surge can be attributed to the fact that the Cardano developers will soon launch the much-anticipated Vasil hard fork upgrade. As per CoinMarketCap, Cardano has been rising consistently over the past seven days, reaching a high of $0.06692 on May 31, followed by a slight, gradual slowdown. The cryptocurrency currently sits at a price of $0.06412.

The blockchain is scheduled to release its first testnet on June 2, followed by a full mainnet release slated for June 29, 2022. With the implementation of the Vasil hard fork, the network will undergo significant changes, including improvements to Cardano's overall performance and scalability.

What Is the Vasil Hard Fork Upgrade All About?

The upcoming Vasil hard fork is the second major upgrade Cardano will undergo to improve its overall performance and credibility. According to Cardano developer and digital writer Sooraj, the hard fork will introduce CIP 33, which lowers transaction sizes so users no longer need to pay hefty fees each time they initiate a blockchain transaction.

Furthermore, the upgrade will also introduce CIP 31 and CIP 32, which let decentralized apps access transaction outputs without recreating them as before. In addition, CIP 32 will assist developers in storing data on-chain, moving Cardano further toward a more decentralized infrastructure.

Founded in 2015, Cardano is an open-source public blockchain built on a proof-of-stake consensus mechanism. The network's first major upgrade, the Alonzo hard fork of September 2021, enabled it to deploy smart contract capabilities and further expand its use cases. Smart contract capability gave the network a significant boost, allowing developers to build DApps and NFT projects on the blockchain.


Four cryptocurrency cybersecurity risks and how to avoid them – Retail Technology Innovation Hub

Since exchange rates are highly volatile, cryptocurrency has the potential to yield significant returns for investors. However, as a technology-based digital asset, cryptocurrency is exposed to hacking like any other digital asset.

Moreover, as more people invest in cryptocurrency, it becomes easier for hackers to use various methods to steal sensitive data and crypto assets.

Given that investing comes with a few risks, below are the common cryptocurrency cybersecurity risks and the preventive measures you can take to avoid them.

While cryptocurrency is widely known for its transparency, it's also well known for being vulnerable to exchange hacks. Cybercriminals tend to target crypto exchanges because a single data breach could allow them to steal thousands of users' assets.

So, when hackers compromise a crypto trading platform, users could lose their funds to cyber theft.

Take AscendEX as a notable example of a trading platform that suffered a security breach: hackers compromised one of its hot wallets and stole over US$80 million worth of cryptocurrencies.

With that in mind, security must be your primary consideration when choosing a crypto exchange to minimise the risk of losing your crypto assets.

Choose an industry-leading crypto trading platform that utilises advanced security features to protect you from fund and data theft. It'd also be better to choose a platform that lets you quickly and easily download a full tax report for a period you select.

Alternatively, you can also consider spreading your cryptocurrency purchases across multiple exchanges instead of sticking to one platform, so you don't lose all your crypto assets at once.

Phishing is a social engineering attack that cybercriminals use to steal funds and sensitive information like credit card numbers or login credentials of targeted individuals.

This type of cyber attack happens when a cybercriminal, posing as a reliable and reputable entity, tricks a victim into opening an attachment, filling out an online form, or clicking a link. In crypto phishing attacks in particular, hackers target crypto wallet private keys.

They send emails baiting their targets into clicking a malicious link that leads to an online form asking them to enter their private key information. Once hackers get the information they need, they can drain the cryptocurrency from those wallets.

Accordingly, keeping your wallet keys private is the best way to protect yourself from phishing attacks. If it's your first time using a crypto wallet app, send only a small amount first to confirm the app's authenticity.

You should also do your research before investing, especially if you're uncertain about a particular cryptocurrency.

Crypto-malware is malicious software that cybercriminals install on victims' devices. Once installed, it allows them to secretly mine cryptocurrency using the victim's computing power.

This type of cybercrime is also known as cryptojacking. As with other malware, cybercriminals usually deliver crypto-malware as an email attachment, often executable software disguised as a document.

They may even use social engineering tactics to trick their victims into downloading and executing malicious files, similar to phishing attacks. But apart from sending it as an email attachment, hackers may also deploy crypto malware through malvertising or malicious landing pages.

Although it's challenging to detect crypto malware in your computer system, there are several preventive measures you can take to protect your crypto assets.

Notably, you should install anti-crypto-mining extensions and ad blockers, use antivirus software that protects your device from malware, and disable JavaScript to shield your computer against cryptojacking.

Crypto third-party apps are applications created by somebody who doesn't manage the trading platform you use.

Although third-party apps enable you to monitor crypto prices and calculate potential profits, giving them access to your information can pose a potential risk to your security and privacy.

In particular, one risk these third-party apps pose is a data breach, in which your sensitive or protected information is exposed to others without your permission.

What's even worse is that when a third-party app causes a data breach, the damage to your finances can be lasting.

So, to ensure your security and privacy, refrain from downloading applications that your trading platform doesn't control. Using a VPN can also help mask your location and protect your personal information online.

Overall, even though cryptocurrency comes with risks, you can still protect your assets from cyber threats by implementing proper preventive measures and being careful with the websites and applications you use.

It'd also be wise to use multi-factor authentication for additional security against potential cyber attacks.


The State of Web3 – Cryptocurrency Exchange Growth and Trading Volume in 2022 – hackernoon.com

The State of Web3 report assesses the growth, impact, and reach of the world's best cryptocurrency exchanges. We've pulled stats from 58 of the most popular cryptocurrency exchanges to get visibility into the top performers and understand how general trends in usage and reach are changing. The statistics cover March 2020 through April 2022 to identify yearly and monthly changes across these major platforms.

Founder of Growth Models

Web3, crypto, NFTs, and DAOs are divisive issues.

Half the world seems to believe this is the future. The other half believes it's little more than a scam.

I, obviously, fall into the former camp.

But we'll never be able to convince the detractors that Web3 is a legitimate development of the web and of business unless we make it safer.

Especially because the vast majority of media coverage focuses on the negative news of scams, collapses, and hacks.

Our mission here at DeRev is to help people act with more confidence and with more safety in Web3.

To that end, we've put together this report assessing the growth, impact, and reach of the world's best cryptocurrency exchanges.

Why? First, to identify the top performers in the space, so you know which platforms you're better off using.

And second, to understand how cryptocurrency exchanges are changing on a monthly and yearly basis in terms of growth, adoption, and usage. We're zooming out to get a macro image of the industry to help you make more informed decisions and understand whether the industry is growing or shrinking as a whole.

This is the state of Web3 report, cryptocurrency exchange edition.

We've pulled the stats from 58 of the most popular cryptocurrency exchanges to get visibility into the top performers and understand how the general trends are changing in usage and reach.

The statistics were pulled from March 2020 through to April 2022 to identify yearly and monthly changes to these major platforms.

If you don't have the time to read the full report, here's a quick overview of the key learnings.

The 12-month period ending April 2022, compared to the prior 12-month period, saw an average 363.47% increase in overall traffic, a 479.70% increase in unique visitors, and a 424.65% increase in trading volume.

Now let's get into the full breakdown of cryptocurrency exchanges.

Unsurprisingly, the total reach of platforms has increased drastically between 2020-2021 and 2021-2022.

In fact, overall, cryptocurrency exchanges saw an average 363.47% increase in traffic.

We see a similar pattern when looking at how many of the visitors were unique: unique visitors increased by 479.70% on average.


However, what's interesting to note is that unique visitors outpaced the growth of total visitors in this time period.

Not by much, but it could be the start of a trend.

Unique visitors increased more than total visitors, which shows that more people are visiting these sites but not coming back for more than a single visit.

This is most likely thanks to the massive increase in publicity and interest crypto received during 2021.

The higher occurrence of single visits is likely thanks to the steep learning curve with cryptocurrency exchanges and huge amounts of negative press that deter people from taking action.

If we look at the average overall percentage of unique visitors, it's pretty interesting. If you're growing a similar platform, this is the rate I would aim for to ensure you're hitting the industry benchmark.

In the 2020-2021 period, the average unique visitor rate was 24.85%. In the 2021-2022 period it was higher, at 32.8%. That makes the new rate 131.98% of the old one, an increase of roughly 32%.
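To make the percentage arithmetic concrete: 32.8% is 131.98% *of* 24.85%, which corresponds to a relative increase of roughly 32%. A quick sketch (the function name is illustrative, not from the report):

```python
def pct_change(old: float, new: float) -> float:
    """Relative change from old to new, expressed as a percentage."""
    return (new - old) / old * 100.0

ratio_as_pct = 32.8 / 24.85 * 100   # the new rate as a % of the old rate
increase = pct_change(24.85, 32.8)  # the actual relative increase
print(f"{ratio_as_pct:.2f}% of the old rate")  # 131.99% of the old rate
print(f"{increase:.2f}% increase")             # 31.99% increase
```

The distinction matters when reading growth stats: "X% of the old value" and "an X% increase" differ by exactly 100 percentage points.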

Let's finally look at overall engagement on the site.

To assess engagement, we're going to look at overall time on site and the number of pages a user visits when using the exchange.

By the nature of an exchange, these will be high. People don't land on one page and achieve everything they want.

They'll have to find their wallet, check exchange rates, ensure they've input the right chain to transfer through, and more.

So the visits per session should be quite high.

The average pages per session is consistent with what we've seen so far: slightly higher in the 2020-2021 period.

Likely because this period had more serious and experienced crypto people who were more interested in using these platforms.

The latter period will have had a lot of people heading to these platforms to check things out thanks to the increased press coverage.

However, many of these wont have much experience or knowledge on what to do or how to use the platforms effectively.

Pages per visit dropped 32.14%. The average time on site (ToS) for these visitors follows the same trend: once again, ToS is lower in the 2021-2022 period.

The 2021-2022 period achieved only 78% of the ToS of the previous period.

Visit duration dropped 22%. The overall visit duration on cryptocurrency exchanges is still very high despite the drop.

One theory we have is that with newer traders coming in, they may not spend as long comparing various assets and prices. In addition, they may not be using a good hardware wallet to store their assets.

Anyone who has used a cold wallet will tell you that it slows down any transaction.

Basically, there are more new people who aren't as knowledgeable or as security conscious as longer-term crypto investors.

All in all, we see an obvious trend in cryptocurrency exchanges' reach and engagement.

There's been a huge influx of new users in the crypto world, likely thanks to the huge press the industry received throughout 2021.

However, the engagement rate of these new users isn't as high as it once was.

This is likely because a good portion of this new traffic is less educated on crypto and is either trying to get into digital currencies or simply checking out what's available.

Do we think this trend will continue? Yes.

Do we think this is cause for concern? No.

On average, engagement has dropped by ~27%. However, traffic overall has increased by an astounding 363% on average.

There's still a huge number of highly engaged people coming to cryptocurrency exchanges, and the trends all seem set to continue growing.

Let's quickly look at the top performers among the reviewed crypto exchanges.

Reach is only half the equation when ranking cryptocurrency exchanges.

We also need to look at the trading volume of these exchanges to identify the best performing platforms and understand what changes to reach and engagement do to trading volume.

There has, obviously, been a huge increase in trading volume over the same time period.

What's really interesting is how closely this increase tracks the growth in the unique visitor rate.

Across all exchanges surveyed, there's been an average 424.65% increase in trading volume, very similar to the 479.70% increase in unique visitors.

The traffic data seems to show an increase in overall traffic, but a decrease in return visitors and overall engagement.

This basically indicates more newbie traders who aren't as serious about crypto as long-time traders.

As such, I would have expected the trading volume to have experienced less growth than traffic and unique visitors.

However, the increase in unique visitors and trading volume really is quite close.

Trading volume and unique visitors share a close correlation in growth. If we then compare the trading volume against unique users, the average trading value per unique visit is huge.

I say huge because I come from a software marketing background where the average order value is much, much lower.

If I were to pull the stats of registered Binance users vs. trading volume, it's in the tens of thousands per user.

This metric, because unique visitors won't necessarily sign up for the service, is more conservative.

But on average, the trading volume per unique visitor to cryptocurrency exchanges fell between the two time periods.

It was $9,125.08 in the 2020-2021 period.

That fell to $8,077.92 in the following year, a drop of around 11.48%.

The average trading volume per unique visitor fell 11.48%. However, because the number of unique visitors increased so dramatically, the overall impact is still a net gain for the exchanges.

When comparing trading volume with other metrics like pages per visit and time on site, there's no obvious correlation, at least none as strong as unique visitors to trading volume.

I am surprised by this as I thought the exchanges with higher return traffic would be the ones to see higher trading volume.

Now let's get into specific exchanges and how they perform.

We now have an overview of how overall cryptocurrency exchange reach has grown. Let's dig in to find the top performers.

There are a few surprises here. If you're already thinking one or two players are miles ahead of the pack, you're right.

The three biggest exchanges in the 2021-2022 period by visitor count are Binance, Coinbase, and eToro.

And when we say they're ahead, they're far ahead of the pack.

Binance is by far the leader here.

What I find most interesting here is that the big winners all seem to have very similar stats on one or two key metrics linked to engagement.

Repeat visitors are a key element in assessing the overall user experience and the value a service offers its users.

If a site has a very high unique visitor rate, it often means that people check out the site once and never return.

This indicates the site isn't solving users' needs, since people never come back.

When assessing cryptocurrency exchanges, I noticed a key difference between the major players and smaller brands.

In particular, the big players see the same percentage of unique visitors regardless of traffic, month over month.

For example, if Binance gets 100,000,000 visits this month with a ~30% unique visitor rate, it means that 70% of its traffic is visitors coming back more than once.

Next month, it might generate 200,000,000 visits yet still maintain a similar ~30% unique visitor rate. This means that 70% of the extra 100,000,000 visits also come from users returning more than once.
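The round-number example above can be sketched as a quick calculation (the figures are the article's illustrative ones, not real Binance data, and the function name is hypothetical):

```python
def repeat_visits(total_visits: int, unique_rate: float) -> int:
    """Visits attributable to returning visitors, given a unique-visit rate."""
    return round(total_visits * (1.0 - unique_rate))

# A ~30% unique rate means ~70% of traffic is returning visitors,
# and that share holds even as total traffic doubles.
print(repeat_visits(100_000_000, 0.30))  # 70000000
print(repeat_visits(200_000_000, 0.30))  # 140000000
```

The point of the comparison is that the *rate* stays fixed while the absolute numbers scale with traffic.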

This is similar to all of the big exchanges.

You'll see that traffic fluctuates wildly, but the percentage of unique visitors stays pretty consistent.

Even with wild traffic swings, the unique visitor percentage stays relatively constant. Basically, these sites have systems in place to ensure that when someone visits, there's something there to bring them back multiple times within the same month.

This tells us that they're very good at solving their users' problems and ensuring they get the service they're looking for.

Often you'll see traffic spike with a lot of new users who never return. Not with these big brands.

Binance maintains a relatively steady ~30% share of unique visitors and is the largest by overall traffic.

Whereas Crypto.com has around 55% uniques, meaning only 45% of its traffic is return visitors.

In short, this suggests Binance is offering a better service than Crypto.com, since the majority of its traffic comes from returning visitors.

Pages per visit are also a good indicator of user experience, especially for cryptocurrency exchanges.

You're very unlikely to find the currency you want to invest in and action the trade by visiting a single page.

You're going to have to view multiple pages to find the right currency, fund your wallet, and make the exchange.

The average pages per visit across all exchanges analyzed for the year is 3.22.

That is what you'd expect for the majority of people looking to buy a single currency like Bitcoin.

Getting that transaction done on 3 pages is normal.

Exchanges with higher than average pages per visit often have better navigation, leading to people clicking through to more pages.


Cryptocurrency NEAR Protocol's Price Increased More Than 8% Within 24 Hours – Benzinga

Over the past 24 hours, NEAR Protocol's NEAR/USD price rose 8.92% to $5.96. This continues its positive trend over the past week where it has experienced a 7.0% gain, moving from $5.65 to its current price. As it stands right now, the coin's all-time high is $20.44.

The chart below compares the price movement and volatility for NEAR Protocol over the past 24 hours (left) to its price movement over the past week (right). The gray bands are Bollinger Bands, measuring the volatility for both the daily and weekly price movements. The wider the bands are, or the larger the gray area is at any given moment, the larger the volatility.
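For readers unfamiliar with the indicator: Bollinger Bands are a rolling mean with upper and lower bands placed a fixed number of standard deviations away, so a wider spread means higher volatility. A minimal sketch (the 20-period window and 2-sigma width are conventional defaults, not parameters taken from Benzinga's charts):

```python
from statistics import mean, stdev

def bollinger_bands(prices, window=20, k=2.0):
    """Return (middle, lower, upper) tuples for each full rolling window."""
    bands = []
    for i in range(window, len(prices) + 1):
        w = prices[i - window:i]          # the most recent `window` prices
        m, s = mean(w), stdev(w)          # rolling mean and std deviation
        bands.append((m, m - k * s, m + k * s))
    return bands

# Toy price series (illustrative values, not NEAR market data).
sample = [5.65, 5.70, 5.62, 5.80, 5.75] * 5
mid, lo, hi = bollinger_bands(sample)[-1]
print(lo < mid < hi)  # True
```

The "gray area" the article describes is simply the region between the lower and upper bands at each point in time.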

The trading volume for the coin has climbed 50.0% over the past week, moving opposite, directionally, with the overall circulating supply of the coin, which has decreased 0.19%. This brings the circulating supply to 701.68 million, which makes up an estimated 70.17% of its max supply of 1.00 billion. According to our data, the current market cap ranking for NEAR is #22 at $4.18 billion.
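The supply and market-cap figures quoted above are related by simple arithmetic; here is a sketch using the article's numbers (the helper name is illustrative):

```python
def supply_stats(circulating: float, max_supply: float, price: float):
    """Share of max supply in circulation (%) and implied market cap (USD)."""
    share = circulating / max_supply * 100.0
    market_cap = circulating * price
    return share, market_cap

# NEAR figures from the article: 701.68M circulating, 1.00B max, $5.96 price.
share, cap = supply_stats(701_680_000, 1_000_000_000, 5.96)
print(f"{share:.2f}% of max supply")      # 70.17% of max supply
print(f"market cap = ${cap / 1e9:.2f}B")  # market cap = $4.18B
```

Both quoted figures (70.17% of max supply, $4.18 billion market cap) fall straight out of the circulating supply and price.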

Powered by CoinGecko API

This article was generated by Benzinga's automated content engine and reviewed by an editor.


Cryptocurrency Bitcoin Cash's Price Increased More Than 3% Within 24 Hours – Benzinga

Bitcoin Cash's BCH/USD price has increased 3.44% over the past 24 hours to $196.52. Over the past week, BCH has experienced an uptick of over 3.0%, moving from $191.4 to its current price. As it stands right now, the coin's all-time high is $3,785.82.

The chart below compares the price movement and volatility for Bitcoin Cash over the past 24 hours (left) to its price movement over the past week (right). The gray bands are Bollinger Bands, measuring the volatility for both the daily and weekly price movements. The wider the bands are, or the larger the gray area is at any given moment, the larger the volatility.

The trading volume for the coin has increased 10.0% over the past week while the overall circulating supply of the coin has increased 0.59% to over 19.08 million which makes up an estimated 90.85% of its max supply, which is 21.00 million. The current market cap ranking for BCH is #24 at $3.75 billion.

Powered by CoinGecko API

This article was generated by Benzinga's automated content engine and reviewed by an editor.


A deep sleep to awaken your body – The New Indian Express

Express News Service

CHENNAI: Yoga is an ancient science that acts as restorative therapy for overall relaxation of the mind and body. Yoga Nidra is a unique recuperative technique wherein sleep is used as a meditation process for healing. Also known as yogic sleep, it's a guided process wherein experts direct practitioners into a deep state of relaxation on the edge of waking and sleeping, which calms the nervous system.

Yoga Nidra does not involve performing asanas; instead it's about relaxing, getting into a meditative state of mind and entering conscious sleep. While in meditation one is awake, in yogic sleep it's possible to enter a deeply healing state of bliss; the mind and body are relaxed while the consciousness stays awake.

Best practice for working professionals
Stress and anxiety have become a normal part of working professionals' lives, putting untold strain on the autonomic nervous system. They not only affect bodily functions such as breathing, blood flow, heartbeat and digestion but also disturb sleep and decrease focus, creativity, clarity and concentration. All these problems have a direct and indirect effect on productivity. With Yoga Nidra, the body releases melatonin, a powerful antioxidant, into the bloodstream, which helps manage blood pressure, digestion, stress levels and immune function, and also induces restful sleep. The relaxation achieved through Yoga Nidra helps improve concentration, clarity, focus and productivity.

Benefits of Yoga Nidra
Improves cognitive abilities: Yoga calms the mind and body and slows nervous system activity. Thus stress is reduced, and its physical and mental symptoms, such as muscle tension and headaches, are relieved. Yoga Nidra also enhances one's cognitive abilities, since the mind stops stressing and instead focuses on thinking clearly, be it for problem-solving, creative thinking, or anything else. Not being overwhelmed by stress, the mind is free to function at its full capability. Hence, yogic sleep also stalls cognitive aging of the brain, resulting in an improved attention span and memory, which are important for everyday activities.

Improves focus and clarity of mind and encourages a state of mindfulness: When one is not stressed, it becomes easy to have a clear mind, focus on relevant matters, and be mindful. Mindfulness is born of accepting the present moment without judgment or worry and living it fully. This quality is another positive effect of Yoga Nidra practice. Integrating mindfulness into everyday life allows one to live with a clear, calm purpose that fosters a good quality of life.

Grows self-confidence and esteem: It has been noted that with regular practice of Yoga Nidra, a person's self-esteem and confidence can improve markedly. An essential step in guided meditation and Yoga Nidra is setting intentions, or sankalpas, for oneself: essentially goals one desires to fulfil. Achieving a goal is exhilarating and does wonders for one's confidence and esteem, which is what Yoga Nidra propels one to do.

Improves the quality of sleep and overall health: Yogic sleep is immensely effective in improving sleep quality and regularising sleep patterns. Since one is less stressed, once one makes a sankalpa to sleep, one does so effortlessly, faster and with regularity. A good night's rest signals the absence of the sleep disorders that are the cause and symptom of many diseases. Through frequent practice of Yoga Nidra, one enhances one's sleep cycles, and it stands to reason that one's health improves as well: blood pressure and cholesterol levels are lowered, immune and nervous system functioning improves, and there's plenty of energy at one's disposal.

Diminishes symptoms of stress, anxiety, depression, chronic pain and PTSD: Psychological disorders are primarily born of an unquiet mind and heightened negative emotions. Since Yoga Nidra calms the mind and releases pent-up emotions, it mitigates stress, freeing the person to think clearer and work better. Hence, Yoga Nidra has been adopted the world over to treat anxiety, depression, chronic pain and post-traumatic stress disorder (PTSD), among others.

These disorders can be alleviated through the teacher's ability to direct practitioners into a deep state of relaxation, giving the mind and body the opportunity to rest, recover and recuperate. Since Yoga Nidra also reduces inflammation by improving the immune system, most aches and pains are dealt with effectively as well.

Practitioners' mental health has long seemed to benefit from this ancient therapy. A study on the impact of Yoga Nidra on the mental health of college professors saw the intervention group show better results than the control group, which had no exposure to yogic sleep. Some research describes the practice as a simple, effective treatment for insomnia and sleep disorders.

How to practise Yoga Nidra?
Practising Yoga Nidra requires a bit of patience initially. Wear comfortable clothes and lie down on a yoga mat in the Savasana pose, with eyes closed. It's best to choose a dark corner with no distractions in order to induce the required peace of mind.

Many people practise it right before turning in for the night, believing it improves the quality of their sleep. Yoga Nidra can completely change one's life with regular practice. So if a relaxed, stress-free life is what you desire, consider adding this practice to your everyday routine. It can be the secret of a successful and balanced professional life.

(The writer is founder of Divine Soul Yoga)


‘I don’t really trust papers out of top AI labs anymore’ – Analytics India Magazine

The role of scientific research in pushing the frontiers of artificial intelligence cannot be overstated. Researchers working at MIT's Computer Science and Artificial Intelligence Laboratory, the Stanford Artificial Intelligence Laboratory, Oxford University and many other top labs are shaping the future of humanity. In addition, most top AI labs, even private players such as DeepMind and OpenAI, publish on preprint servers to democratise and share knowledge.

But, how useful are these papers for the community at large?

Recently, a Reddit user published a post titled "I don't really trust papers out of Top Labs anymore". In the post, the user asked: "Why should the AI community trust these papers published by a handful of corporations and the occasional universities? Why should I trust that your ideas are even any good? I can't check them; I can't apply them to my own projects."

Citing the research paper titled "An Evolutionary Approach to Dynamic Introduction of Tasks in Large-scale Multitask Learning Systems", the Reddit user said: "It's 18 pages of talking through this pretty convoluted evolutionary and multitask learning algorithm; it's pretty interesting, solves a bunch of problems. But two notes. One, the big number they cite as the success metric is 99.43 on CIFAR-10, against a SotA of 99.40."

The Reddit user also referred to a chart towards the end of the paper that details how many TPU core-hours were used for just the training regimens that resulted in the final results.

"The total is 17,810 core-hours. Let's assume that for someone who doesn't work at Google, you'd have to use on-demand pricing of USD 3.22 per hour. This means that these trained models cost USD 57,348."

Strictly speaking, throwing enough compute at a general enough genetic algorithm will eventually produce arbitrarily good performance, so while you can read this paper and collect interesting ideas about how to use genetic algorithms to accomplish multitask learning by having each new task leverage learned weights from previous tasks by defining modifications to a subset of components of a pre-existing model, he said.
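The post's cost estimate is straightforward to reproduce (the figures, including the assumed on-demand TPU price, are the Reddit user's, not independently verified):

```python
core_hours = 17_810          # TPU core-hours reported in the paper's chart
usd_per_core_hour = 3.22     # assumed on-demand price, per the Reddit post
cost = core_hours * usd_per_core_hour
print(f"USD {cost:,.0f}")    # USD 57,348
```

At that price, reproducing the training runs alone is out of reach for most independent researchers, which is the core of the complaint.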

Jathan Sadowski, a senior fellow at Emerging Tech Lab, responded: "AI/ML research at places like Google and OpenAI is based on spending absurd amounts of money, compute, and electricity to brute-force arbitrary improvements. The inequality, the trade-offs, the waste, all for incremental progress toward a bad future."

The Reddit post has been a source of much debate on social media. Many suggested there should be a new journal for papers whose results can be replicated in under eight hours on a single GPU.

"Findings that can't be replicated are intrinsically less reliable. And the fact that the ML community is maturing towards decent scientific practices instead of anecdotes is a positive sign," said Leon Derczynski, assistant professor at the IT University of Copenhagen.

The replication crisis, where scientific studies prove difficult or impossible to reproduce, has been gripping the scientific community for ages. The AI domain is also grappling with it, mostly because researchers often don't share their source code.

According to a 2016 Nature survey, more than 70 percent of researchers have tried and failed to reproduce another scientists experiments. Further, more than 50 percent of them have failed to reproduce their own experiments.

Reproducibility is the basis of quality assurance in science as it enables past findings to be independently verified.

The scientific and research community strongly believes that withholding important aspects of studies, especially in domains where larger public good and societal well-being are concerned, does a great disservice.

According to the 2020 State of AI report, only 15 percent of AI studies share their code, and industry researchers are often the culprits. The report criticises OpenAI and DeepMind, two of the world's best AI research labs, for not open-sourcing their code.

In 2020, Google Health published a paper in Nature that described how AI was leveraged to look for signs of breast cancer in medical images. But Google drew flak as it provided little information about its code and how it was tested. Many questioned the viability of the paper, and a group of 31 researchers published another paper in Nature titled "Transparency and reproducibility in artificial intelligence". Benjamin Haibe-Kains, one of the paper's authors, called Google's paper "an advertisement for cool technology" with no practical use.

However, things are changing. NeurIPS now asks authors/researchers to produce a reproducibility checklist along with their submissions. This checklist consists of information such as the number of models trained, computing power used, and links to code and datasets. Another initiative called the Papers with Code project was started with a mission to create free and open-source ML papers, code and evaluation tables.

Visit link:
'I don't really trust papers out of top AI labs anymore' - Analytics India Magazine

Read More..

A Deep Dive Into The World's Most Popular Personality Test: The MBTI - mindbodygreen.com

Being what's probably the most popular personality assessment in the world today, the MBTI has, of course, come up against some criticism. In addition to the research's mixed results when it comes to the assessment's reliability, Hallett explains some of the main criticisms also include that people's results can change, or that people can feel "boxed-in" by the results.

In a 1993 paper titled "Measuring the MBTI and Coming Up Short," David Pittenger, Ph.D., a professor of psychology at Marshall University, reviews the research on the Myers-Briggs test and raises questions about its underlying concepts. "The MBTI reminds us of the obvious truth that all people are not alike, but then claims that every person can be fit neatly into one of 16 boxes," he writes. "I believe that MBTI attempts to force the complexities of human personality into an artificial and limiting classification scheme. The focus on the 'typing' of people reduces the attention paid to the unique qualities and potential of each individual."

To that, Hackston and Nardi explain that these types are about preferences, and your type doesn't suggest you can't move outside your own preferences. Nardi says you can think of it like whether you're left or right-handed. "If I'm right-handed, that doesn't mean I don't use my left hand, or I don't use my hands together," he explains.

Hackston notes the results are meant to be more of a "springboard" for understanding your preferences so you can recognize your own patterns and actively choose to "go against your type" when situations call for it.

Some experts also do not respect the work of Carl Jung, Katharine Cook Briggs, or Isabel Briggs Myers. Jung, for one thing, has received plenty of criticism, given how much of his theories were based on his own dreams and ideas as opposed to scientific fact. Cook Briggs and Briggs Myers were also not trained psychologists or mental health professionals, though Nardi points out that this particular criticism is "actually incredibly sexist because, at the time, it was very difficult for women to become psychologists or even get into college."

Another criticism of the MBTI is using it to assess or predict performance in the workplace, which Hackston, Hallett, and Nardi all agree is not what this assessment is intended for. "It's not about performance; it's about preference. No personality assessment should be used for hiring, and in some states, it's actually illegal to use it that way," Nardi notes.

See the rest here:
A Deep Dive Into The World's Most Popular Personality Test: The MBTI - mindbodygreen.com


Edge coming to the rescue of cloud – ITWeb

Pramod Venkatesh, Group Chief Technology Officer, Inq.

Cloud computing is at risk of being throttled by its own success, overwhelmed by the rising tsunami of data. However, edge computing is riding to the rescue, particularly in regions like Africa, which are most vulnerable to the cloud's three major weaknesses: bandwidth limitations, excess latency and network congestion.

That's the view of Pramod Venkatesh, Group CTO at Inq. While acknowledging that the concept of edge computing isn't new (its roots go back to the earliest days of remote or distributed computing), he maintains that edge computing is the next evolution of cloud computing.

Cloud computing itself is an evolution of traditional enterprise client-server computing, where data is moved from a user's computer across a WAN or the internet to a centralised computer, where it is either stored or worked on, and the results sent back to the user.

"With the rise of 5G networks, more companies than ever can harness comprehensive data analysis without the IT infrastructure needed in previous generations. That's the power of the cloud," Venkatesh says.

But the quantity of data moving across the internet is enormous and getting more so by the minute. The World Economic Forum estimated that at the start of 2020 there were about 44 zettabytes of data in the world, or 40 times more data in the digital realm than observable stars in the universe. By 2025, an ever-increasing number of connected devices would add another 463 exabytes of newly generated data to that unfathomable number every single day.

Gartner has predicted that by 2025, three-quarters of all enterprise-generated data will be created outside centralised data centres, on or by these devices. The internet would buckle under the load.

Traditional cloud platforms, including those set up and operated by the world's largest providers, which are struggling to cope now, could be overwhelmed. The impact on time- and disruption-sensitive data could be catastrophic. It doesn't take much imagination to appreciate the chaos that would result if data sent from a self-driving car for analysis at some distant data centre were delayed, disrupted or distorted: by the time the confirmation came back that the car was approaching a hazard, lives could be lost.

The further the data centre where the analysis is to happen is from the endpoint where the analysis is needed, the greater the risk of delay. And in Africa, those distances are not only large, they're also, with the exception of South Africa, across borders.

From a regulatory perspective, cross-border data transfer can be problematic in some African countries, requiring certain types of data to be processed in-country, Venkatesh says.

Another major issue with traditional cloud computing in Africa is the cost of connectivity, which is still significantly higher than in the rest of the world. Add to that the fact that the locations from which data is being generated may be in hostile environments (down a mine, for example) with limited or intermittent connectivity.

According to Venkatesh, edge computing effectively addresses all these issues as the data is processed as close as possible to where it is generated, even possibly on the device that collects or generates the data in the first place.

"The beauty of edge computing is that it has endless potential applications, particularly when those applications require some form of AI. This can range from security and medical monitoring to self-driving vehicles, video conferencing and enhanced customer experiences," he says.

Many users today are not even aware that they are using some form of edge computing. "For example, it's already widely used in entertainment and gaming: streaming music and video platforms often cache information to reduce latency, thus offering more network flexibility when it comes to user traffic demands," he says.

An ever-increasing number of devices, such as voice assistants, need to communicate and process data in a localised environment. Without the help of decentralised processing power, devices like Amazon Alexa and Google Assistant would take far more time to find requested answers for users.

Manufacturers use edge computing to keep a closer eye on their operations. Edge computing enables companies to monitor equipment and production lines for efficiency and even detect failures before they happen, helping to avoid costly downtime delays.

Edge computing is even being used in a mine in Zambia to detect dangerous snakes and warn miners of their location in real-time. Edge computing is also being used to detect cars and goods arriving and leaving company premises, thus preventing unauthorised use as well as theft.

As all or most of the computing work is done at the edge, only data that requires deeper analysis, review or other human interaction need be sent back to the main data centre. The amount of data to be sent is thus vastly reduced, requiring less bandwidth or connectivity time than would otherwise be the case. Edge computing is thus reshaping IT and business computing, he adds.

However, Venkatesh warns that edge computing comes with challenges of its own, not least of which is security, both physical and cyber, as well as the management and control of edge devices.

Nevertheless, he believes we are only just beginning to scratch the surface of edge computings potential and predicts that the uptake of edge computing, particularly in Africa, will continue to accelerate over the next decade.

See more here:
Edge coming to the rescue of cloud - ITWeb


Going from COBOL to Cloud Native - The New Stack - thenewstack.io

Virtually every technology publication these days is full of cloud stories, often about the success that has been achieved by webscale companies doing amazing things exclusively in the cloud. Unlike Netflix, Twitter and Facebook, however, most companies have a heritage that predates the availability of cloud computing.

Mark Hinkle

Mark has a long history in emerging technologies and open source. Before co-founding TriggerMesh, he was the executive director of the Node.js Foundation and an executive at Citrix, Cloud.com and Zenoss, where he led its open source efforts.

Unlike these relatively young companies that have the benefit of starting more recently and growing to maturity in the cloud native era, there are myriad companies that may feel that they are held hostage by legacy infrastructure that can't be migrated to the cloud for reasons of risk, compliance or compatibility.

Just because you have a legacy investment that would be disruptive to move doesn't mean you can't adopt cloud or cloud native systems that enable new digital initiatives and still capitalize on those legacy investments. However, it does mean that you need to find ways to integrate in a nondisruptive way.

There are a few practices you can put in place to get to a cloud native environment while still using your existing legacy investment. I advocate working on adopting cloud native practices and architecture patterns that can ease your implementation of cloud computing incrementally, which involves adopting cloud computing architecture patterns on-premises.

In the early days of the internet, the idea of stacks was prevalent. When it came to delivering web-based services, Microsoft had WIMSA (Windows, IIS, SQL Server and ASP) and open source users had LAMP (Linux, Apache, MySQL, PHP). The LAMP stack was the most democratic, allowing you to choose the components of your stack, while Microsoft provided a single throat to choke should something go awry. The freedom to choose the layers of the stack is a benefit many users of legacy technology may not realize today.

When you look at today's applications, the gold standard for reliability is Java, though you need to manage the JVMs: tune the stack, rely on garbage collection to manage memory, and run an app server to serve the instances. Taking a container-based approach to running individual services, you can instead leverage Kubernetes and Knative (both housed in the CNCF), which can simplify things by scaling containers automatically, both up and down, as needed.

Kubernetes and containers make application environments portable from on premises to the cloud and back again. An example of how you could get the best of both worlds is Spring Boot, an open source framework for Java developers aimed at cloud native deployments; Spring Boot applications can be packaged in containers that run on premises with Kubernetes or in the cloud.

Using composable infrastructure is the best practice: taking the best technologies and solutions to build systems that are decoupled but well integrated. Gartner describes the Composable Enterprise as a composable business made from interchangeable building blocks that follows four principles: modularity, autonomy, orchestration and discovery. The idea that any system or application can benefit from composability is often overlooked. Anything can be part of composable infrastructure, not just cloud services.

We experience batch processing every day. Our banks typically process our deposits overnight, and we don't see that in our banking app until after the batch processes. The same thing applies to our utilities that process the usage on a monthly basis, and we only see the consumption once a month.

Batch processing was used because the load placed on the data warehouse could potentially interrupt or slow down business operations. So the goal would be to move to an architecture that increases the speed of delivery of data without interrupting current business operations. That's where extract, load, and transform (ELT) and event-driven architecture (EDA) can help.

Many times, we use the terms "replicating data" and "syncing data" interchangeably. Technically, there's an important difference. Replication implies a copy of the data (or some subset thereof) is maintained to keep the data closer to the user, often for performance or latency reasons. Synchronization implies that two or more copies of data are being kept up to date, but not necessarily that each copy contains all the data, though there is the idea that some consistency is kept between the data sources.

Using an event-streaming technology like Apache Kafka, you can replicate data from read-only data producers (databases, ERP systems), keeping your attack surface smaller since you aren't granting writes to the database. You can also choose to replicate only what's needed for other systems, like mobile apps, web portals and other customer-facing systems, without necessarily having them place load on the canonical database.
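A minimal sketch of this read-only replication idea, with a plain Python list standing in for the stream a Kafka topic would provide (the record shape and field names here are hypothetical):

```python
# Read-only replication sketch: downstream consumers receive a projection
# of each record, and there is no write path back to the source system.
# A plain list stands in for the event stream a Kafka topic would provide.

source_records = [
    {"id": 1, "name": "Alice", "balance": 120.0, "internal_notes": "vip"},
    {"id": 2, "name": "Bob", "balance": 75.5, "internal_notes": "none"},
]

# Only the fields customer-facing systems actually need are replicated.
PUBLIC_FIELDS = ("id", "name", "balance")

def replicate(records, fields=PUBLIC_FIELDS):
    """Project each record down to the replicated subset of fields."""
    return [{key: record[key] for key in fields} for record in records]

replica = replicate(source_records)
# replica can now feed mobile apps or web portals without exposing
# internal fields or adding write load to the canonical database.
```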

Figure 1.1: Extract, transform, and load versus extract, load, and transform

When you look at any major cloud provider, the pattern of event-driven architecture is prevalent. In AWS, for example, services are decoupled and run in response to events. They are made up of three types of infrastructure: event producers, event consumers and an event router.

While AWS deals exclusively in services, your enterprise likely has things like message buses and server software that logs activity on the server. These systems can be event producers: they can be streamed via Kafka or consumed from your log server directly by an event router. For this use, I suggest the project I work on, the open source TriggerMesh Cloud Native Integration platform, to connect, split, enrich and transform these event sources.
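As a rough sketch of the pattern (not TriggerMesh's actual API), an event router simply maps event types to the consumers subscribed to them:

```python
# Minimal event-driven architecture sketch: producers publish events to a
# router, which fans each event out to the consumers subscribed to its type.
from collections import defaultdict

class EventRouter:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        """Register an event consumer for a given event type."""
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        """Called by event producers; forwards the event to consumers."""
        for handler in self._handlers[event_type]:
            handler(payload)

router = EventRouter()
received = []
router.subscribe("server.log", received.append)  # an event consumer

# A log server acting as an event producer:
router.publish("server.log", {"line": "disk usage at 91%"})
router.publish("billing.event", {"id": 7})       # no subscriber: dropped
```

Because producers and consumers only know about event types, not each other, either side can be swapped without touching the other, which is the decoupling benefit described below.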

For example, you can forward messages from your mainframe over the IBM MQ message bus to integrate your legacy systems with cloud services like Snowflake. Using the event payloads, you can create data replication without additional load on the producer, and you can change an event to a format consumable by the event consumer by transforming or enriching that event on the fly.

By decoupling the event consumer and producer, you can change the destinations if you change vendors (move from AWS to Google, say) or add additional destinations to which you may want to replicate data. You also get the benefit of creating synchronizations in real time, in contrast to waiting on batched data to arrive.

EDA isn't a silver bullet. There are times when you may need to make synchronous API calls. Using APIs, you can make queries based on some set of conditions that can't be anticipated. In that case, I am a fan of using open source, cloud native technologies like Kong's API Gateway.

When you talk about code, you might have heard the term WET (Write Everything Twice) as opposed to DRY (Don't Repeat Yourself). In the world of development, WET refers to poor coding that needs to be rewritten, and DRY is writing more efficient code that doesn't need to be rewritten. In integration, it's not an exact correlation, but I believe synchronous API integration is often WET: you write to the API and then write the response that the API returns.

There are many good reasons to do this when you need to complete a complex integration that requires look-ups and a complex answer. However, it can be overkill.

Event-driven architecture provides a way for DRY integration by supplying an event stream that can be consumed passively. There are many advantages: if you are forwarding changes via the event streams, you can even do what's called change data capture (CDC).

Change data capture is a software process that identifies and tracks changes to data in a database. CDC provides real-time or near-real-time movement of data by moving and processing data continuously as new database events occur. Event-driven architectures can accomplish this by using events that are already being written but then can be streamed to multiple sources.
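In miniature, and assuming a toy in-memory table rather than a real database, CDC amounts to emitting an event for every write:

```python
# Toy change-data-capture sketch: every write to the table also appends a
# change event to a log that downstream consumers could stream from.
# (In production, this log would be something like a Kafka topic.)

change_log = []

class TrackedTable:
    def __init__(self):
        self._rows = {}

    def upsert(self, key, value):
        """Write a row and emit a matching change event."""
        op = "update" if key in self._rows else "insert"
        self._rows[key] = value
        change_log.append({"op": op, "key": key, "value": value})

accounts = TrackedTable()
accounts.upsert("acct-1", {"balance": 100})
accounts.upsert("acct-1", {"balance": 90})
# change_log now records an insert followed by an update for acct-1,
# in the order the database events occurred.
```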

Many corporations face one of the most entrenched pieces of legacy technology on their way to the cloud, although, until I went digging, I didn't realize the full extent of it: mainframes still run a large amount of COBOL. In fact, our whole financial system relies on technology that is unlikely to move to the cloud in the near future.

One of the most interesting and unforeseen integrations I have run into is the integration of mainframes with the cloud. While Amazon doesn't have an AWS Mainframe-as-a-Service, there is a benefit in integrating workflows between mainframes and the cloud. One global rental car company I work with has an extensive workflow that takes data stored in IBM mainframe copybooks and transforms it into events that are consumed to automate workflows in AWS SQS.

There are many reasons you might want to forward mainframe traffic, not just for workflows but for data replication, real-time dashboards, or to take advantage of cloud services that have no data center equivalent. Also, because you aren't logging in to the event-producing system, there can be a security benefit: a smaller attack surface, exposing only the event stream and not the host system.

I believe strongly that going forward there will be two main types of infrastructure: those served by cloud providers as services and open source software. Open source has eaten the world. Linux is the dominant operating system in the cloud and the data center. Kubernetes is becoming the open source cloud native fabric of the cloud. Then there is an abundance of free and open source data center software from multibillion-dollar corporations, consortia and innovative start-ups alike.

One incredibly interesting example of composable infrastructure is the ONUG Cloud Security Notification Framework. CSNF is an open source initiative led by FedEx, Raytheon and Cigna that tackles the difficulty of providing security assurance for multiple clouds at scale caused by the large volume of events and security state messaging. The problem is compounded when using multiple cloud service providers (CSPs) due to the lack of standardized events and alerts among CSPs.

Figure 1.2: Architecture diagram of composable infrastructure for ONUG Cloud Security Notification Framework

This gap translates into increased toil and decreased efficiency for the enterprise cloud consumer. The Cloud Security Notification Framework (CSNF), developed by the ONUG Collaborative's Automated Cloud Governance (ACG) Working Group, is working to create a standardization process without sacrificing innovation.

The interesting thing about CSNF is that it's a loosely coupled set of technologies that can incorporate both cloud services and on-premises technologies. While the initial goal is to normalize security events from cloud providers into a single format, it can also incorporate any number of other tools and data sources as appropriate.

While your existing infrastructure may not be completely modern, there's no reason you can't benefit from modern technologies and cloud services through integration. Integration is arguably the key to modernization without the dreaded lift and shift. If you look at your integration layer today, I'd consider a number of tactics:

For IT operations to thrive, they need to adopt agile practices like DevOps and technologies that are open source, event-driven and cloud native. Even if you have an IT heritage to consider, it doesn't mean you are stuck in the past. In the modern world of open source cloud native technologies, you can still reap benefits without a wholesale move to the cloud.

Featured image via Pixabay

View post:
Going from COBOL to Cloud Native - The New Stack - thenewstack.io
