
Artificial Intelligence, Machine Learning, and Biometric Security Technology will be Drivers of Digital Transformation in 2022 And Beyond: IEEE…

Published on November 25, 2021

Bengaluru: IEEE, the world's largest technical professional organization committed to advancing technology for humanity, today concluded its virtual roundtable focused on "The Next Big Thing in Technology," the top technologies that will have a massive impact in 2022 and beyond. With digitization and technology becoming increasingly powerful drivers of innovation during the ongoing COVID-19 pandemic, IEEE curated this roundtable to discuss how AI, ML, and advanced security mechanisms are fuelling industries to drastically increase productivity, automate systems to achieve better accuracy, and help workforces outperform while minimizing tedious, repetitive tasks. AI-driven learning systems are generating more opportunities for intertwining technology trends, a pattern that will only continue in 2022.

Speaking at the roundtable about "The Impact of Technology in 2022," Sukanya Mandal, IEEE Member, Founder, and Data Science Professional, explained, "AI and ML are creating strides in technological advancement and will be extremely vital for our future: they increase output, bring specialization into job roles, and raise the importance of human skills such as problem-solving, quantitative skills, and creativity. I strongly believe the future will consist of people and machines working together to improve and adapt to a modern way of working. AI will also play a critical role in all aspects of e-commerce, from customer experiences and marketing to fulfillment and distribution."

Recently published research on Artificial Intelligence and the Future of Work conducted by MIT Work of The Future, highlights that AI continues to push large-scale innovation, create more jobs, advance labor processes, and holds the immense potential to impact various sectors. Furthermore, a Gartner report predicts that half of data centers around the world will deploy advanced robotics with AI and ML capabilities by 2025, which is estimated to lead to 30% higher operating efficiencies.

"Industry 4.0 is all about interconnecting machines, processes, and systems for maximum process optimization. Along the same lines, Industry 5.0 will be focused on the interaction between humans and machines. It is all about recognizing human expertise and creatively interconnecting it with machine intelligence for process optimization. It is fair to say that we are not far away from the 5th industrial revolution. Over this decade and the next, we will witness applications of IoT and smart systems adhering to the principles of the 5th industrial revolution across various sectors," she further added.

The roundtable also focused on "Redefining the Future of Biometric Security Technology." AI- and machine-learning-based systems, in combination with the latest technologies such as IoT, cloud computing, and data science, have successfully advanced biometrics. Biometric systems generate huge volumes of data that can be managed with machine learning techniques for better handling and space management. Deep learning can also play a vital role in analyzing that data to build automated systems that achieve better accuracy. A report by the Carnegie Endowment for International Peace stated that 75 countries, representing 43 percent of a total of 176 countries surveyed, are actively leveraging AI capabilities for biometric purposes, including facial recognition systems, smart cities, and others.

Commenting on this, Sambit Bakshi, Senior IEEE Member, said, "During the pandemic, we all saw the increased use of technology in public places such as airports, train stations, etc., not only to monitor body temperatures but also to help maintain COVID protocols. Biometric technologies are rapidly becoming a part of the daily lives of people around the world."

Biometric authentication is likely to expand in the coming years. Multimodal authentication uses a combination of multiple biometric technologies to authenticate someone. Cues from different platforms, including gait features or anthropometric signatures, can be integrated through cloud computing and IoT-based architectures to verify someone's identity. The future of biometric security lies in simplicity: improving modern techniques is the simplest way to offer a high level of protection.


The Crucial Role Of Wild Horses In Bitcoin Mining – Bitcoin Magazine

Just over 13 years ago, a tsunami was silently and slowly building from the force of Satoshi Nakamoto's newly released paper, "Bitcoin: A Peer-to-Peer Electronic Cash System." At the time only a handful of cryptography enthusiasts were aware of Bitcoin, and even they were grappling with its viability. And since the source code was still being refined and the genesis block had yet to be mined, Satoshi was grinding away in obscurity, oblivious to the havoc Bitcoin was about to unleash on the world.

Any Bitcoin enthusiast knows that a fundamental element of Satoshi's Bitcoin architecture is the selection of Proof-of-Work (PoW) as the consensus mechanism. Today when most people think of PoW, they immediately think of Bitcoin mining, and when they think of Bitcoin mining they picture ASIC-based mining servers lining a warehouse. But while Satoshi made analogies to gold mining, he never publicly used the word "miner." The closest he (or she) probably came was the phrase "proof of worker." He also talked about things like "your computer's heat is offsetting your baseboard electric heating" regarding the cost of running a node, implying that he viewed the PoW function as something that would be performed largely by individuals in homes. The Satoshi of 2008 would likely have found the direction of the Bitcoin mining infrastructure baffling and, maybe, like me, a bit concerning. My concerns stem from two emerging trends: the Bitcoin mining network's lack of diversity in the scale of operations, and its growing dependence on third-party-controlled energy sources.

To illustrate this situation, let's look to the animal world. Consider three animals and their key characteristics: elephants, horses, and rabbits. Elephants are very large and mighty, slow to move over distance, hard to hide, reproduce slowly, and are somewhat rare. Horses are powerful and can move quickly even over long distances; they can be difficult to find, and they're plentiful, but building their population takes time. Rabbits are small, extremely quick over short distances, and fairly easy to spot individually; however, they are innumerable and can multiply at an astounding rate. Like all animals, some live in captivity and some in the wild. With this in mind, consider that the elephants are like the very large Bitcoin mining sites, the rabbits are like the home miners, and the horses are like the small to medium-sized mining operations. Regardless of the site size, those reliant on the grid for power are captive animals, and those which produce and control their own power are wild.

In the beginning, Bitcoin's mining infrastructure was all captive rabbits: just PCs running in offices, spare bedrooms, dorm rooms, and garages. Over the first few years it was only the captive rabbits forming the network, but slowly a few captive horses started to appear as larger commercial efforts took over back rooms, small warehouses, and old data centers. While there are plenty of captive rabbits and horses now, we are moving squarely into the era of the captive elephant. Bitcoin news feeds are regularly littered with announcements of new facilities which will house thousands of mining servers (ASICs), consume dozens or even hundreds of megawatts of electricity, and cost hundreds of millions of dollars to bring online.

Captive elephant sites are very important as they bring massive computing power and security to the Bitcoin network, and in many cases, they are designed in partnership with utility companies to provide the stability and economic incentive to expand grid capacity. But there is danger in the growing trend for the Bitcoin network to become more dependent on them.

If you think of the mining servers as the food for the mining community, then the elephants have a preferred seat at the dining table. Right now, server supply is extremely tight, and unfortunately the elephants' voracious appetites are leaving the horses and the rabbits to fend for scraps. It is certainly understandable that manufacturers like Bitmain, MicroBT, and Bitfury give preference to the elephants, given their limited production capacity and the fact that the elephants simplify business for them: elephants can commit to inventory months (or even years) in advance, they have the capital to make large deposits on their orders, and it is easier for the sales and support teams to deal with a few elephant-class clients than with a plethora of rabbits and horses. As a result, for the past few years rabbits and horses have largely been forced to buy products at a premium on the gray market, or use older technology, and in many cases, they've been left out entirely.

The elephants have been getting fatter and fatter by consuming such a massive portion of the food supply, and this has created a famine for the rabbits and horses. Since technology refreshes are ultimately mandatory for all miners, if the food supply for the rabbits and horses does not improve soon, it will mean death by starvation for many. Should that occur, the entire Bitcoin ecosystem takes on a large vulnerability. If a large supermajority of the global hash rate were owned by the elephants, then, given that elephants are easy to find (hunt) and slow to reproduce, a targeted effort to impair them and compromise the network's integrity becomes a realistic possibility. This could come from a coordinated effort involving legislation and regulation, directed attacks to physically destroy the sites, and/or attacks on the elephants' power sources. Such an attack against the Bitcoin network was initiated by China in the spring of 2021, so we must consider the possibility of another attack in the future. The next one might be larger and better funded, might involve arms, and could even involve the confiscation and directed usage of the mining sites against the network. The more likely such an attack is to succeed, the more likely it is to occur, especially as Bitcoin has become a growing threat to the global financial infrastructure and to government power.

Bitcoin defended itself marvelously in the China attack of '21, but it would be foolish to assume that future attacks could be so easily turned back. For instance, imagine the impact of a subtle change in China's strategy: instead of simply demanding that all mining stop, what if China had confiscated all the hash rate and directed it against the network? If so, we'd be having very different discussions today about the state of mining and might even find ourselves in the middle of the first world war in the digital space. To be certain that Bitcoin's defense systems remain at full readiness, it is crucial that a robust population of rabbits and horses, both in and out of captivity, exists to provide a reliable base layer of hashing power.

When the China attack of '21 occurred, the Bitcoin network withstood the resulting 50-60% decrease in hashing power quite well. Because we now know the network can maintain itself with this level of impairment, and because we want to remove the possibility of a hostile party confiscating enough hash power to wage a 51% attack, we have a good approximation of the maximum share of the ecosystem's hash power that should be in elephant sites, especially captive elephant sites. The Bitcoin community should monitor these levels and never allow them to drift too far from 51%. Of course, the hashing power of the elephants will be split between wild and captive sites, but because captive sites present a higher risk, setting an upper threshold on the amount of hashing power there is very important.

Before determining the appropriate split of captive and wild power sources for the elephants, let's first delve deeper into the definitions of wild and captive sites. A captive site is one in which the site's power is provided by an external, trusted third party, or "power master." This would typically be a utility company via a connection to a public power grid. Captive sites are usually in the vicinity of population centers, as the economics of grid power generally require a significant population and commercial activity to justify its existence.

A wild site is one in which a miner generates electricity on-site, and that electricity moves to the mining equipment without passing through an intermediary. This would typically be implemented using energy sources like flared gas, stranded gas, steam from geothermal sources, or small-scale hydroelectric solutions. Wind and solar power are possible as a portion of a mining site's power solution, but their intermittent nature means they are typically accompanied by a parallel, supporting grid solution. Wild sites require more technical ability to build and maintain, have higher capital costs, need more space, and often have special safety requirements. Wild sites are possible even in the most remote areas and under the harshest environmental conditions, and they do not require a surrounding population or economy to justify their existence. An extreme example of a wild site would be a satellite containing mining equipment launched into Earth orbit and powered by an on-board nuclear reactor. There are an infinite number of possible locations for wild sites, while the number of possible captive sites is finite. Finally, because generating extremely large amounts of consistent, reliable energy is hard in remote locations, wild elephant sites are, and will be, somewhat rare.

There are no metrics currently available showing the split of captive versus wild hash power; however, wild sites of any size are currently very unusual, and it can be said with certainty that wild hash rate makes up well under 10% of the global total; it could easily be as low as 1%. This means that at least 90% is captive and reliant on a power master, a dangerous spot to be in. Obviously, allowing anyone to gain control of 51% of the network is dangerous, and since wild hash power is very difficult to seize through legislation or force, ideally 50% or more of the global hash power should be wild. However, for the next few years we will continue to have a huge and widening gap, because massive increases in captive elephant hash power are already in motion. Our best hope in the near term (the current halving cycle) would be to simply maintain something close to the present level, then strive for a 20% wild hash rate in the following halving cycle, and 50% in the subsequent one. Hitting these targets on the nose isn't crucial; it is important only that we be in the rough vicinity.

If we were able to achieve a split like the one in the target chart shown above, then 50% of the hash would be overseen by elephants and 50% by rabbits and horses, while 55% of the hash would be captive and 45% would be wild. Assuming that this hash power were also spread out in a geographically balanced manner, it would be effectively impossible for any bad actor, or even a group of bad actors, to compromise the mining ecosystem.
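The arithmetic of such a target split is easy to sanity-check. The per-category shares below are purely illustrative (the article only fixes the aggregate 50/50 and 55/45 splits, not the breakdown by animal class):

```python
# Hypothetical hash-rate shares (fractions of global hash power) chosen so the
# aggregates match the target split described above; the per-category numbers
# are illustrative, not from the article.
shares = {
    # (animal_class, habitat): share of global hash rate
    ("elephant", "captive"): 0.40,
    ("elephant", "wild"):    0.10,
    ("horse",    "captive"): 0.10,
    ("horse",    "wild"):    0.25,
    ("rabbit",   "captive"): 0.05,
    ("rabbit",   "wild"):    0.10,
}

elephant = sum(v for (a, h), v in shares.items() if a == "elephant")
small    = 1.0 - elephant                      # rabbits + horses combined
captive  = sum(v for (a, h), v in shares.items() if h == "captive")
wild     = 1.0 - captive

print(f"elephant {elephant:.0%} / small {small:.0%}")  # 50% / 50%
print(f"captive  {captive:.0%} / wild  {wild:.0%}")    # 55% / 45%
```

Any breakdown whose elephant rows sum to 0.50 and captive rows sum to 0.55 satisfies the same targets.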

To set a course to achieve these targets, each animal type will need help. First, let's examine the rabbits. Foremost for the rabbits is access to mining servers. People buying individual machines have no clout or priority with the existing base of suppliers, and there are presently no retailers or even large distributors acting as a consolidation point. This forces rabbits to buy on the spot market, usually at a considerable premium to the prices paid by the elephants. Companies like Compass Mining do provide some means for individuals to get into mining, but those folks aren't really rabbits, as their units are hosted at horse and elephant sites. For a larger mix of mining to migrate toward rabbits, the supply base must allocate a higher percentage of its inventory to individual sales, or establish relationships with large retailers or distributors to support this market. It is encouraging that Blockstream and Square have both announced initiatives to develop ASICs, and that Jack Dorsey, Square's CEO, has specifically commented on wanting to support further decentralization of the network, implying support for the rabbits. As mentioned earlier, rabbits are likely to be predominantly captive because producing and maintaining power at a small scale is challenging; however, over time it is likely that areas where residential solar power is popular will see some proliferation of wild rabbits.

The course for elephants is somewhat the opposite of the rabbits'. There is such massive momentum in the development of captive elephant sites that they may be placing the integrity of the network at risk. For instance, Riot Blockchain, Inc. is in the process of expanding its site in Rockdale, Texas to 700MW. This is very impressive, and the accompanying leap in hashing power will initially help further secure the network; however, if most network expansion comes through similar captive elephant sites, then collectively these sites have the potential to become an Achilles' heel. This is exacerbated by the fact that captive elephant sites are being developed by the small number of organizations which can meet the enormous capital and resource requirements. Certainly, there is no implication that organizations like Riot should scale back their expansion efforts, but hopefully they will see that continuing a strategy of only developing captive elephant sites exposes both them and the network.

As crucial as the development of wild elephants is to the health of the Bitcoin mining ecosystem, over the next handful of years nothing is more important than the expansion of wild horse sites. There are already several companies, like Great American Mining, Upstream Data, Digital Shovel and my company, Barefoot Mining, which are building infrastructure equipment or doing development for wild horse sites. Interestingly, rapidly increasing wild horse sites is not dependent on finding energy; known stranded and flared gas sources alone have the potential to meet all wild horse needs, and adding in small-scale hydro and geothermal sources makes the energy supply essentially infinite. Development of these sites is mostly dependent on raising capital. For instance, a wild horse site of about 2MW requires capital of $5 million to $10 million, depending on the energy source and the mining equipment selected. To date, traditional commercial money-lending sources have been largely uninterested in supporting projects like this, especially for the small to medium-sized companies typically behind them. This usually forces these companies into fund-raising mode, a time-consuming and frustrating process: by the time a business plan is created and money has been raised, mining equipment costs and availability, and market conditions, have usually changed. In turn this means the capital needs and pro forma of the deal have changed, so a return trip to the investors is required. This can become a vicious cycle.

The good news is that over the past few years Barefoot Mining and others have brought wild horse sites into the network, proving their technical and economic viability. This is leading to more confidence from investors in wild horse sites and more flexibility in how deals are created. It gives me great optimism that we are on the cusp (or in the midst) of a boom in the development of wild horse sites. Interest in this segment should continue to skyrocket and attract the capital it needs to become a major segment of the mining community. The energy is just waiting to be put to use.

The Bitcoin mining ecosystem has proven itself to be incredibly strong. It has weathered an attack from one of the largest, most powerful nations on earth without missing a beat, and sometime early next year it will achieve a new all-time high in hash rate. There is a massive amount of money flowing into mining, and on the surface all is well. However, it would be foolish for the Bitcoin mining community to assume that it is infallible and growing ever stronger. There is clearly a possibility of the mining network growing too asymmetrically, too top-heavy, and too captive, resulting in an unbalanced and exposed ecosystem. Nature has already taught us a lot about balance and survival: when an apex predator becomes too dominant and the population below it dwindles too far, the entire ecosystem collapses upon itself. Let's encourage and support the rabbits, and especially the wild horses, so that the diversity of the Bitcoin mining ecosystem becomes its greatest strength instead of its greatest weakness.

This is a guest post by Bob Burnett. Opinions expressed are entirely their own and do not necessarily reflect those of BTC Inc. or Bitcoin Magazine.


ExoMiner Goes Planet Hunting! NASA’s Machine Learning Network Validates 301 New Exoplanets at One Go | The Weather Channel – Articles from The Weather…

This artist's illustration shows the planetary system K2-138, which was discovered by citizen scientists in 2017 using data from NASA's Kepler space telescope.

Since the first exoplanets were identified almost three decades ago, in 1992, humanity has come a long way in terms of exoplanet discovery. As of today, we have spotted over 4,000 validated exoplanets that revolve around their respective host stars.

Exoplanets are planets that exist outside our vast solar system. Equipped with cutting-edge technology, many research groups have been identifying these exoplanets left, right and centre.

However, for the first time ever, 301 validated planets were added to the ever-growing exoplanet tally all at once!

Wondering how? The US space agency NASA reported that a new deep neural network called 'ExoMiner' was responsible for this incredible scientific feat.

ExoMiner leverages NASA's Pleiades supercomputer and, like any deep neural network, can automatically learn a task when provided with enough data. ExoMiner's design draws on the various tests and properties human experts use to confirm new exoplanets, as well as on past confirmed exoplanets and false-positive cases. It can thus tell actual exoplanets apart from impostors, making this technology and its predictions highly reliable.

"Unlike other exoplanet-detecting machine learning programs, ExoMiner isn't a black boxthere is no mystery as to why it decides something is a planet or not," said Jon Jenkins, an exoplanet scientist at NASA's Ames Research Center in California's Silicon Valley. "We can easily explain which features in the data lead ExoMiner to reject or confirm a planet."

It is a highly tedious process to comb the vast datasets from missions like Kepler, which has hundreds of stars in its field of view, each with the potential to host numerous possible exoplanets. In such cases, ExoMiner is the perfect substitute, as it reduces the burden on astronomers of sifting through data and determining what is and isn't a planet.
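The article does not describe ExoMiner's actual architecture, but the core idea of learning to separate planets from false positives can be sketched with a toy stand-in: a tiny logistic-regression classifier trained on two invented vetting features. Everything here (the feature names, the data, the model) is illustrative, not NASA's code:

```python
import math

def sigmoid(z):
    """Map a raw score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=2000):
    """Tiny logistic-regression classifier trained by stochastic gradient descent.
    samples: list of feature vectors; labels: 1 = planet, 0 = false positive."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Hypothetical features per transit signal: (depth consistency, odd/even match),
# both scaled to [0, 1]; real vetting uses many more diagnostics.
X = [(0.9, 0.95), (0.85, 0.9), (0.2, 0.3), (0.1, 0.25)]
y = [1, 1, 0, 0]
w, b = train(X, y)
score = sigmoid(sum(wi * xi for wi, xi in zip(w, (0.88, 0.92))) + b)
print(f"planet probability: {score:.2f}")
```

A deep network like ExoMiner stacks many such learned transformations, but the training loop (predict, measure error, nudge weights) is the same in spirit.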

"When ExoMiner says something is a planet, you can be sure it's a planet," said Hamed Valizadegan, ExoMiner project lead and machine learning manager with the Universities Space Research Association at Ames. "ExoMiner is highly accurate and in some ways more reliable than both existing machine classifiers and the human experts it's meant to emulate because of the biases that come with human labelling."

NASA said that all 301 machine-validated planets were originally detected by the Kepler Science Operations Center and were promoted to planet candidate status by the Kepler Science Office. But until ExoMiner, no one was able to validate them as planets.

And while none of the newly discovered planets is thought to be Earth-like or in their parent stars' habitable zones, they share some traits with the rest of the verified exoplanet population in our galaxy.

According to Jon Jenkins, the 301 discoveries will help researchers better understand planets and solar systems beyond our own and what makes ours so unique.



BIS: What Does Machine Learning Say About The Drivers Of Inflation? – Exchange News Direct

SummaryFocus

What the key drivers of inflation are, and what role expectations play in the inflation process, have been long-standing questions in macroeconomics, particularly given their relevance to economic policymaking. This paper sheds some fresh light on these central questions using machine learning.

I examine inflation in 20 advanced economies since 2000 through the lens of a flexible data-driven method. Beyond comparing explanatory performance with more traditional econometric methods, as far as possible, I also interpret the predicted relations between explanatory variables and consumer price inflation.

The machine learning model predicts headline and core CPI inflation relatively well, even when only a small standard set of macroeconomic indicators is used. Inflation prediction errors are smaller than those of standard OLS models using the same set of explanatory variables, which are firmly grounded in economic theory. Expectations emerge as the most important predictor of CPI inflation. That said, the relative importance of expectations has declined during the last 10 years.

This paper examines the drivers of CPI inflation through the lens of a simple but computationally intensive machine learning technique. More specifically, it predicts inflation across 20 advanced economies between 2000 and 2021, relying on 1,000 regression trees constructed from six key macroeconomic variables. This agnostic, purely data-driven method delivers relatively good prediction performance: out-of-sample root mean square errors (RMSE) systematically beat even in-sample benchmark econometric models, with a 28% RMSE reduction relative to a naive AR(1) model and an 8% RMSE reduction relative to OLS. Overall, the results highlight the role of expectations for inflation outcomes in advanced economies, even though their importance appears to have declined somewhat during the last 10 years.
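The RMSE comparison at the heart of this benchmark exercise can be illustrated with a minimal sketch. The inflation series below is invented, and the 28% reduction is simply applied to the computed benchmark to show what the paper's headline number means in error units; this is not the BIS model:

```python
import math

def rmse(actual, predicted):
    """Root mean square forecast error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Hypothetical quarterly CPI inflation rates (%), for illustration only.
inflation = [2.0, 2.3, 1.8, 2.5, 3.1, 2.7, 2.2]

# Naive benchmark in the random-walk sense: forecast = last observed value.
naive_fc = inflation[:-1]
actual   = inflation[1:]

benchmark = rmse(actual, naive_fc)
# The paper reports a 28% RMSE reduction for the tree ensemble relative to
# the naive benchmark; here we just translate that into implied error units.
ml_rmse = benchmark * (1 - 0.28)
print(f"naive RMSE: {benchmark:.3f}, implied ML RMSE: {ml_rmse:.3f}")
```

The actual paper averages predictions over 1,000 regression trees (a random-forest-style ensemble), each fit on the six macroeconomic predictors, and evaluates out of sample.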

Keywords: expectations, forecast, inflation, machine learning, oil price, output gap, Phillips curve.

JEL classification: E27, E30, E31, E37, E52, F41.


Sprout Health Solutions Presents Data on Social Media Listening and the Patient Experience at ISPOR EU 2021 – PRNewswire

LONDON, Nov. 23, 2021 /PRNewswire/ -- Sprout Health Solutions presented new insights on the use of social media listening (SML) for understanding patient experience in chronic disease at Virtual ISPOR EU, the leading European conference for Health Economics and Outcomes Research (HEOR). Fernanda Trevisan, MSc, Scientist at Sprout, shared findings from a scoping review in the session, "Is Social Media Information Useful to Understand Patient Experiences and the Burden of Disease?" Her on-demand podium presentation is available starting today, in advance of the conference (November 30 to December 3).

"As social media is increasingly used in healthcare research, we wanted to assess the tools available, including their limitations, benefits and ethical questions," said Trevisan. "The evidence suggests SML provides an inclusive, unfiltered and less burdensome method of capturing patient experiences that is both time and cost efficient. However, because it is a young field, there's more work to do in organizing methods, frameworks and guidelines."

Evolution of Social Media Listening

Originally tapped for branded consumer market research, SML is now utilized across healthcare to map a more authentic picture of patient journeys, including the impact of illness, symptoms, treatment beliefs, side effects and unmet needs.

According to Trevisan, SML can be a key pillar of patient insights work to develop relevant educational resources, and to inform clinical trial development. It's especially beneficial in populations less represented in typical research or where evidence is scarce. "For example, individuals with rare disease often rely on social media to learn about new treatments or share personal stories with those who face similar challenges. Even though the data is anonymized, you get a very clear sense of personal experiences over time and conversations peer-to-peer."

Trevisan noted the studies identified ethical issues and limitations, including user selection bias. And, because data mining algorithms tend to prioritize the most frequent mentions, it can be harder to identify all emerging themes.

To view the presentation, visit Sprout Health Solutions' Industry Insights page.

About Sprout Health Solutions
Sprout is a specialist consultancy of experts in behavior science and health outcomes who design and deliver person-centered strategies and programs for improved health and regulatory success worldwide. Their two divisions, Sprout Health Outcomes and Sprout Behaviour Change, work synergistically to provide effective solutions for pharma, biotech, and digital health partners. http://www.sprout-hs.com.

SOURCE Sprout Health Solutions


Cryptocurrency prices today: Bitcoin, Ether weaken …

Cryptocurrency prices fell on Wednesday due to profit booking by investors. (Photo: Reuters)

Cryptocurrency prices fell slightly over the past 24 hours due to profit booking across the spectrum by investors.

The value of Bitcoin, the world's largest cryptocurrency, decreased slightly as investors rushed to book profits a day after the popular virtual coin hit a record high. Bitcoin was trading at $66,636, or 1.35 per cent lower than its price 24 hours earlier, at 5:30 pm.

The market capitalisation declined slightly to $1.26 trillion and the 24-hour trading volume stood at $1.35 billion.

Ether's valuation also dipped slightly, by 0.74 per cent, and it was trading at $4,742. Its market capitalisation stood at $557 billion and the 24-hour trading volume was $1.05 billion. Most other altcoins fell marginally while some remained flat.


There are chances that prices may fall over the next 24 hours, but the long-term momentum remains steady.

Commenting on the momentum, Edul Patel, CEO and co-founder of Mudrex, a global algorithm-based crypto investment platform, said, "Over the past 24 hours, the cryptocurrency market witnessed profit booking across the spectrum. As trade volumes go up, the market might end up a bit lower."

"Despite this profit booking, long-term investors need not worry as the momentum in the market is still pretty positive," he added.

Cryptocurrency   Price (US Dollar)   24-hour change   Market cap        Volume (24 Hours)
Bitcoin          66,826.40           -0.94%           $1.26 trillion    $1.35 billion
Ether            4,746.01            -0.64%           $557.44 billion   $1.05 billion
Dogecoin         0.272903            -0.72%           $36 billion       $1.96 billion
Litecoin         290.91              17.81%           $20.05 billion    $436.83 million
XRP              1.26                2.38%            $125.01 billion   $4.65 billion
Cardano          2.24                -1.44%           $73.33 billion    $671.42 million
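The 24-hour change column is just the signed percentage move against the price 24 hours earlier. A minimal sketch; the earlier Bitcoin price below is back-derived for illustration, not a figure reported in the article:

```python
def pct_change_24h(price_now: float, price_24h_ago: float) -> float:
    """Signed 24-hour move as a percentage of the earlier price."""
    return (price_now - price_24h_ago) / price_24h_ago * 100.0

# Illustrative figures only (the 24-hours-ago price is assumed):
print(round(pct_change_24h(66_826.40, 67_460.60), 2))  # about -0.94
```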

DISCLAIMER: The cryptocurrency prices have been updated as of 05:45 pm and will change as the day progresses. The list is intended to give a rough idea regarding popular cryptocurrency trends and will be updated daily.

Click here for IndiaToday.in's complete coverage of the coronavirus pandemic.

Go here to see the original:
Cryptocurrency prices today: Bitcoin, Ether weaken ...


Allorion Debuts with $40 Million to Enhance and Discover Precision Targets – BioSpace

Allorion Therapeutics is thankful for the $40 million in Series A financing it announced Wednesday morning, a day ahead of Thanksgiving.

With headquarters in both Natick, Massachusetts, and Guangzhou, China, Allorion's drug discovery engine combines advances in protein structure, big data, machine learning and gene editing to discover and develop highly selective small molecules.

The precision medicine company is developing mutant-selective and isoform-specific drugs in non-conventional ways for well-known targets in the oncology and autoimmune spaces, with the aim of improving efficacy and preventing resistance. Along with existing targets, Allorion's proprietary technology systematically screens for synthetic lethality targets and allosteric inhibitors.

The company believes that carefully cultivating the synergy of these two approaches will allow it to build a robust pipeline with the intention of transforming the paradigm for cancer and autoimmune disease.

Founder and CEO Peter Ding expressed optimism about where Allorion currently stands and where he sees it heading.

"Over the past year, Allorion has built up R&D capabilities in Boston and Guangzhou and formed a strong management and R&D team. Multiple projects achieved their milestones," Ding said in a statement. "We are grateful to all the investors for their trust and support. Allorion will leverage cutting-edge technologies and strive to make precision medicine more precise and accessible to more patients."

The funding was led by Chinese VC Qiming Venture Partners, with participation from IDG Capital, Octagon Capital, Firstred Capital and Elikon Venture. Original investors TF Capital and Med-Fine Capital continue to like what they see and returned for this round.

"There are huge unmet medical needs for autoimmune disease and cancer therapy globally. Based on data mining and an in-depth understanding of disease biology, Allorion focuses on the early discovery and development of precision medicines. We have confidence in the team's strong R&D capabilities. We hope to support Allorion to grow into a globally recognized company and improve patients' quality of life," said Qiming principal Chen Kan.

Allorion will apply the new funds to advance its preclinical projects, Investigational New Drug (IND)-enabling studies and to support the IND applications for two drug candidates. The company will also ramp up its investment in its novel screening technologies, and further build out both its clinical and business development teams.

"The completion of this financing round shows investors' recognition of the progress and their support for the company's long-term strategy on highly innovative platforms for best- or first-in-class drug discovery," Ding added.

The oncology space can be thankful this year for a number of other innovative new players. Elucida Oncology launched in January to develop drug conjugates with its C-Dots, which precisely target and penetrate tumors, and ArriVent Biopharma debuted in June with $150 million in Series A financing and an epidermal growth factor receptor tyrosine kinase inhibitor (EGFR TKI) candidate for lung cancer.

Read the original:

Allorion Debuts with $40 Million to Enhance and Discover Precision Targets - BioSpace


Machine learning can improve your public services. Are you ready to take the red pill? – The Register

Paid Post There's no doubt that machine learning has massive potential for improving the development, delivery, and operation of public services, whether that's delivering insights into disease proliferation, enabling predictive maintenance, or identifying fraud.

This can seem to be a complex technology. But if the principles behind machine learning can be intimidating, they are nowhere near as intimidating as the consequences of getting it wrong and generating questionable or even positively harmful outcomes.

So, whether your organisation is preparing for its first journey with machine learning, or has already implemented the technology, it pays to step back and take a broader look.

And we have something that can help you in this process, in the shape of Machine Learning Reloaded, an in-depth dive into the principles and applications of machine learning in public services.

This concise but info-packed report is part of the Perspectives series from our chums at global smart software specialists Civica, with their latest volume produced in association with the UKRI Centre for Doctoral Training in Accountable, Responsible and Transparent Artificial Intelligence (ART-AI) at the University of Bath.

Machine Learning Reloaded gives you a crash course in the principles behind the tech, helping you understand what it is, what it isn't, and what it can potentially do.

It also provides an in-depth examination of how machine learning is making a difference across the full range of public services, including local government, health and care, government and justice, housing, and education.

With use cases and explanations of key applications, it takes you directly to a range of other resources showing how public bodies have already put the technology to work.

As well as whetting your appetite, it provides you with a template for planning your own machine learning projects, from choosing your data and selecting your tools to finding the right personnel and partners, while guiding you on how to do all this ethically and responsibly.

Civica's NorthStar lab has already run the ruler over chatbots and immersive technologies. You can find out more and explore the rest of the Perspectives from Civica series at http://www.civica.com/perspectives. If you want to know where cutting-edge technology is taking public services, this is the place to start.

Read the original here:
Machine learning can improve your public services. Are you ready to take the red pill? - The Register


Cryptocurrency prices today: Meme coins take charge as …

Cryptocurrency prices gained on Thursday despite heavy profit booking. (Photo: Reuters)

Popular cryptocurrency prices weakened over the past 24 hours due to heavy profit booking by investors, but the major virtual coins bounced back slightly on Thursday while meme cryptocurrencies zoomed.

Bitcoin, the world's largest cryptocurrency, was trading barely above $61,000, or 3.69 per cent higher than its value 24 hours ago, at 3:30 pm. The market capitalisation of Bitcoin fell to $1.15 trillion and the 24-hour trading volume reduced marginally to $1.76 billion.

Ether was trading at over $4,150, or 3.90 per cent higher than its price 24 hours ago. The cryptocurrency's market capitalisation stood at $488 billion and the 24-hour trading volume was $1.45 billion.


While all other altcoins gained substantially during the day, meme coins Dogecoin and Shiba Inu were the biggest gainers. Dogecoin was trading over 27 per cent higher, while Shiba Inu was trading over 30 per cent higher at the time of publication.

Commenting on the cryptocurrency market momentum, Edul Patel, CEO and Co-founder of Mudrex, a global algorithm-based crypto investment platform, said, "The cryptocurrency market saw a wave of profit-booking over the past 24 hours."

"However, technical indicators suggest that this could likely be a blip in an otherwise positive momentum. Shiba Inu continued in the course of a massive rally, shooting up over 45 per cent," he added.

Cryptocurrency   Price (US Dollar)   24-hour change   Market cap        Volume (24 Hours)
Bitcoin          61,196.69           3.97%            $1.15 trillion    $1.76 billion
Ether            4,175.06            4.53%            $490.38 billion   $1.45 billion
Dogecoin         0.298597            26.63%           $39.33 billion    $11.93 billion
Litecoin         188.28              3.59%            $12.96 billion    $143.53 million
XRP              1.06                4.59%            $106.03 billion   $6.19 billion
Cardano          2.02                3.12%            $66.21 billion    $545.53 million

DISCLAIMER: The cryptocurrency prices have been updated as of 04:10 pm and will change as the day progresses. The list is intended to give a rough idea regarding popular cryptocurrency trends and will be updated daily.


Here is the original post:
Cryptocurrency prices today: Meme coins take charge as ...


Less energy, better quality PAM images with machine learning – The Source – Washington University in St. Louis

Photoacoustic microscopy (PAM) allows researchers to see the smallest vessels inside a body, but it can generate some unwanted signals or noise. A team of researchers at the McKelvey School of Engineering at Washington University in St. Louis found a way to significantly reduce the noise and maintain image quality while reducing the laser energy needed to generate images by 80%.

Song Hu, associate professor of biomedical engineering, and members of his lab devised this new method using a machine-learning-based image processing technique, called sparse coding, to remove the noise from PAM images of vessel structure, oxygen saturation and blood flow in a mouse brain. Results of the work were published online in IEEE Transactions on Medical Imaging on Nov. 1.

To acquire such images, the researchers need a dense sampling of data, which requires a high laser pulse repetition rate that may raise safety concerns. Reducing the laser pulse energy, however, leads to impaired image quality and inaccurate measurement of blood oxygenation and flow. That's where Zhuoying Wang, a doctoral student in Hu's lab and first author of the paper, brought in sparse coding, a type of machine learning often used in image processing that doesn't need a ground truth to train on, to improve image quality and quantitative accuracy while using low laser doses.

The team applied the technique to images of blood hemoglobin concentration, oxygenation and flow in a mouse brain at both normal and reduced energy levels. Their two-step approach performed very well, significantly reducing the noise and achieving image quality similar to what was previously possible only with five times higher laser energy.

"In the first step of our approach, sparse coding separated the vascular signals from noise in the cross-sectional scans acquired at different tissue locations, called B-scans, because the noise is less sparse than the signals," Wang said. "Then we applied the same sparse coding strategy on the projection image formed by denoised B-scans in the second step to further suppress the background noise."
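The principle behind sparse-coding denoising is that structured signals can be represented by a few dictionary atoms while dense noise cannot. A toy sketch of that idea using orthogonal matching pursuit on synthetic 1-D signals; the fixed bump dictionary and all data here are hypothetical stand-ins, whereas the paper learns its dictionary from real B-scans:

```python
import numpy as np

rng = np.random.default_rng(0)
length, n_signals = 64, 100

# Fixed dictionary of unit-norm Gaussian "bump" atoms, one per position --
# a stand-in for a learned dictionary (the paper learns one from the data).
t = np.arange(length)
D = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 2.0) ** 2)
D /= np.linalg.norm(D, axis=0)

def omp_denoise(y, k):
    """Greedy orthogonal matching pursuit: represent y with only k atoms.
    Dense noise has no k-sparse representation, so most of it is dropped."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    return D[:, support] @ coef

# Synthetic "A-lines": two vessel-like bumps each, plus dense Gaussian noise.
clean = np.zeros((n_signals, length))
for row in clean:
    row += D[:, rng.choice(length, 2, replace=False)] @ rng.uniform(2, 4, 2)
noisy = clean + rng.normal(scale=0.3, size=clean.shape)
denoised = np.array([omp_denoise(y, k=2) for y in noisy])

mse_before = float(np.mean((noisy - clean) ** 2))
mse_after = float(np.mean((denoised - clean) ** 2))
print(f"MSE vs. clean  before: {mse_before:.4f}  after: {mse_after:.4f}")
```

Because the sparse reconstruction keeps only the few atoms that match vessel-like structure, the dense noise floor largely disappears, which mirrors why the method tolerates much lower laser energy.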

Hu said while machine learning has been previously used to denoise photoacoustic images, their two-step method is a step ahead.

"Our approach allows us to remove the noise and leave the signal intact," Hu said. "It not only provides higher visibility of the microvessels but also preserves the signal presentation, giving us the opportunity to do quantitative imaging."

While this is the initial demonstration of what these machine learning tools can do, Hu said it shows the importance of advanced computational tools in imaging in general and in photoacoustic microscopy in particular.

"The five-times reduction in laser energy is promising, but we think we could do more with follow-up advances, not only to reduce the laser energy but also to improve the temporal resolution, or how fast we can take the image without losing resolution and spatial coverage," he said.

This research was supported by the National Institutes of Health (NS099261 and NS120481), the National Science Foundation (2023988), and the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundation (2020-226174). Z. Wang is supported by the Washington University Imaging Sciences Pathway Fellowship.

Wang Z, Zhou Y, Hu S. Sparse Coding-enabled Low-fluence Multi-parametric Photoacoustic Microscopy. IEEE Transactions on Medical Imaging, early access online Nov. 1, 2021. DOI: 10.1109/TMI.2021.3124124

The McKelvey School of Engineering at Washington University in St. Louis promotes independent inquiry and education with an emphasis on scientific excellence, innovation and collaboration without boundaries. McKelvey Engineering has top-ranked research and graduate programs across departments, particularly in biomedical engineering, environmental engineering and computing, and has one of the most selective undergraduate programs in the country. With 140 full-time faculty, 1,387 undergraduate students, 1,448 graduate students and 21,000 living alumni, we are working to solve some of society's greatest challenges; to prepare students to become leaders and innovate throughout their careers; and to be a catalyst of economic development for the St. Louis region and beyond.

See the article here:
Less energy, better quality PAM images with machine learning - The Source - Washington University in St. Louis - Washington University in St. Louis...
