What is aromaticity? | Opinion – Chemistry World

If you're inclined to resist the popular assertion that chemistry is just applied quantum physics, you need do no more than invoke the notion of aromaticity. Of all chemistry's messy, ill-defined concepts, none is more so than this. It shows that chemistry is in some ways closer to sociology or zoology: populated by individuals (molecules) that convenience impels us to classify and group without being quite sure if our categories are sound.

The very word betrays its shaky status, being one of the more obvious misnomers of the field. It stems, of course, from the central role of the benzene ring, the hexagonal core of a slew of organic compounds notable for their pungency by the time August Kekulé proposed the cyclic structure in 1865. Michael Faraday immediately noted the almond scent of benzene when he first isolated it in 1825. (He called it bicarburet of hydrogen, believing the ratio of carbon to hydrogen to be 2:1 because of the erroneous atomic weight then assigned to hydrogen.) Kekulé presented the structure in an odd sausage format in his 1865 paper; only in 1872 did he show the familiar rings, noting that there were two equivalent configurations of the alternating single and double carbon-carbon bonds.

It wasn't until the quantum theory of chemical bonding in the 1930s that benzene's electronic structure was clarified. Linus Pauling proposed that the two alternative Kekulé structures oscillate rapidly back and forth in a resonance that accounts for the unusual stability of benzene: it is more stable than would be expected for any one of the Kekulé structures. Erich Hückel, meanwhile, offered another description based on his notion of molecular orbitals, in which the hexagon of bonds between carbons is supplemented by the continuous rings of electrons from 2p orbitals overlapping above and below the plane of the atomic nuclei. These orbitals may sustain circulating electrical ring currents in an applied magnetic field, the effects of which account for the distinctive chemical shift of hydrogens attached to aromatic groups in NMR spectroscopy. That shift offers one way of assessing aromaticity.

The trouble is, it's not unique in doing so. In fact, there are many such measures based, for example, on bond lengths, electronic structure, energetics and chemical reactivity. And they aren't all consistent, so the extent to which a molecular fragment is aromatic depends on how you define it. Ostensibly tied to the quantum description of bonding, this putative feature of molecules is actually far more diffuse.

Arguments based on the symmetry properties of the electronic wavefunction that are needed to create a closed circuit of mobile electrons led to the initial view that aromaticity was confined to planar ring systems with 4n+2 π electrons, but that limitation has long since been abandoned. For example, ring-like molecules with conjugated bonds that have the single twist of a Möbius strip attain aromatic-like stability if they have 4n π electrons. It's now generally agreed that σ bonds as well as π bonds can partake in aromaticity. And it can appear in three-dimensional systems too, especially involving main-group elements such as clusters of boron atoms but also including the cage-like fullerenes. The latter illustrate the complications of an aromaticity criterion based on energetic stabilisation: how does one disentangle that from the strain energy of the curved framework?

With so little agreement about what is and is not aromatic, the concept is open to extensions and modifications and, some would say, abuses. Thus we see the appearance of doubly aromatic, hyperaromatic and superaromatic molecules. The latter, for example, was claimed for the macrocycle kekulene, made from 12 fused benzene rings in a hexagonal formation, until a recent single-molecule structural study using atomic force microscopy showed that the molecule does not after all contain two concentric, fully delocalised rings but is more like six independently aromatic units welded together.1

Still the putative variety of types of aromaticity proliferates: spherical and cubic aromaticity, transition-state aromaticity, homoaromaticity, and more. It has got to the point where some chemists disdain the whole concept as irredeemably vague and sloppy. Others instead cry "Enough already!" We should stop multiplying subclasses of aromaticity, says Miquel Solà of the University of Girona, and apply more self-discipline over the criteria used to identify it.2

Roald Hoffmann of Cornell University attributes aromaticity inflation to the natural human tendency to want our molecular children to be exceptional.3 Suppose we posit a molecule in which the occupied orbitals constitute a closed group: then let's call it -aromatic! And yet, Hoffmann says, for some of these hypothetical molecules the supposed stabilisation applies to a structure that wouldn't survive an instant in air at room temperature.

It's not so much the hype that bothers him, but the damage done to this beautiful, eminently chemical idea that I can trace back a century and a half. Messy or not, aromaticity surely stands for something, so we should be wary of debasing its currency.

Top Analyst Says One Altcoin Ready for 300% Rally, Predicts Spikes in Two More Crypto Assets – The Daily Hodl

Crypto analyst and trader Michaël van de Poppe says one under-the-radar altcoin is poised for a massive surge.

The closely followed analyst tells his 356,000 followers that he's convinced layer-2 scaling platform Celer Network (CELR) is ready for exponential gains in its Bitcoin pair (CELR/BTC).

I'm still convinced that CELR is ready for a big new move of 300-700%.

Weekly charts on altcoins are looking quite good for a new impulse wave, with CELR and FET as great examples.

The crypto strategist predicts CELR/BTC can surge to as high as 0.00000462 BTC ($0.18) from its current value of 0.00000077 BTC, worth about $0.02.

The next crypto that Van de Poppe is watching is the leading smart contract platform and second-largest digital asset by market cap Ethereum (ETH). Once Ethereum breaks the critical level of $2,300, says Van de Poppe, $3,000 will then be in play, setting up ETH for a 30% rally.

Support held for Ethereum once again, so it's approaching the other side of the range and crucial breaker, before a continuation to $3,000 could happen.

At time of writing, Ethereum is trading at $2,995, according to CoinMarketCap.

Van de Poppe also has his eye on cross-chain decentralized finance (DeFi) lending platform Kava.io (KAVA). According to the analyst, KAVA/BTC could be primed for an 86% run from 0.000125 BTC ($4.78) to 0.00024 BTC ($9.17) as it gears up to take out its immediate resistance.

KAVA looks ready for continuation to 24,000 sats (0.00024).

As for Bitcoin, Van de Poppe says he expects BTC to test crucial resistance at $40,000.

Bitcoin facing the next level of resistance (after this it's $36,000).

Reminder that we're dealing with a weekend pump. Wouldn't be surprised to see a retest happening in green to close the CME gap this coming week.

Overall, very good behavior and still expecting $40,000 overall.

What contributed to this altcoin outperforming Cardano, Dogecoin, and Binance Coin – AMBCrypto News

The crypto-market is like the wild west. Nothing is certain and sometimes, even unknown currencies are capable of massive gains, making the top ones seem less significant. One such recent success story is that of Axie Infinity's native token AXS after it more than doubled in just three days.

The AXS/USD exchange rate touched a record value of $32.69 on 23 July, up 31.28% intraday and up 131% from its 20 July low of $14. Owing to the same, the crypto-asset was placed in the list of best-performing digital assets on a year-to-date timeframe, with 2021 gains of around 5,000%.

Right before the token's price pump, there were speculations about Axie Infinity revolutionizing the blockchain-enabled gaming industry and growing faster than any company.

How much truth there is to the aforementioned claims can't be assessed simply by looking at the charts. One thing that stood out the most, however, was that AXS registered higher 24-hour trade volumes than most major alts. AXS's trade volumes over the period stood at $1.63 billion, way higher than Binance Coin (BNB), Litecoin (LTC), Dogecoin (DOGE), EOS, Tron (TRX), Stellar (XLM), and even Cardano (ADA).

This isn't the first time an altcoin has made its way past the top alts when it comes to trade volumes. However, metrics, on-chain analysis, and social sentiment are what paint a true picture of the alt's state.

The alt's MVRV ratio registered a clear uptick, fueled by AXS's bull run. The metric recently touched its highest level since March this year. This was a sign that its market value was greater than the realized value, highlighting a bullish market for the cryptocurrency.
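
As a rough illustration of what the metric captures (a minimal sketch with made-up numbers, not Santiment's actual methodology): MVRV divides market value by realized value, where realized value prices each coin at the point it last moved on-chain.

```python
# Minimal MVRV sketch with hypothetical UTXO-style data.
# Each entry: (amount of coins, price when those coins last moved on-chain).
holdings = [(1000, 10.0), (500, 25.0), (2000, 5.0)]
current_price = 30.0

supply = sum(amount for amount, _ in holdings)
market_value = supply * current_price                  # everything at today's price
realized_value = sum(a * p for a, p in holdings)       # everything at last-moved prices

mvrv = market_value / realized_value
print(round(mvrv, 2))  # 3.23 here: a ratio above 1 means market value exceeds realized value
```

A reading well above 1 is the "bullish" signal the article describes; a reading below 1 means the average holder is underwater.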

Furthermore, AXS's social dominance climbed to a yearly high, in correspondence with its price performance. However, development activity remained moderate and didn't show any signs of climbing.

In fact, on the downside, there was actually a minor downtick in development activity at the time of writing. Additionally, the supply on exchanges continued to oscillate at low levels, falling even further on 22 July.

Finally, Google Trends data for Axie Infinity also presented a massive spike as the term climbed to 100 over the last week. This was another sign of worldwide interest in the altcoin.

Based on the aforementioned metrics, it can be argued that the altcoin is here to stay, at least for the short term.

However, weak network development and exchange supply are causes for skepticism. Nonetheless, it will be interesting to see how its price action plays out over the next few weeks.

This is the main difference between Litecoin and other Altcoins – AMBCrypto News

Altcoins have started to move in the right direction as the market has built up on the recent bullish momentum. Litecoin has been one of the surprise inclusions, with its growth facilitated from an on-chain perspective. While it may look great on paper, there are a few more things that need to be incorporated and understood before estimating a Litecoin rally.

According to Santiment, most altcoins were in a state of decline when it came to their daily on-chain statistics. However, Litecoin was following the other route, indicative of significant activity in the market. Data suggested that Litecoin's 24-hour active addresses are currently witnessing all-time-high levels for 2021. The rate of active addresses is higher than in January-February 2021.

Similarly, addresses holding between 1k and 100k LTC also raised their holdings by 270,000 LTC, currently valued at ~$28.5 million. It is massive accumulation, suggesting that LTC holders are holding out for a bullish move from the altcoin.

However, it is important to understand that the Litecoin market may not react proactively to these bullish signs.

Source: glassnode

Litecoin's MSOL, or Median Spent Output Lifespan, indicates how the asset's movement might have reacted during the past few weeks. Bitcoin's MSOL chart reflected strong movement during the past few days, but for Litecoin it has been radically quiet. Now, MSOL is based on the movement of old and new coins in the industry, and a tepid MSOL means that only new coins are currently responsible for the traffic.
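
To make the metric concrete, here is a minimal sketch with hypothetical numbers (not Glassnode's implementation): MSOL is the median age of the coin outputs spent over a window, so a low value means mostly young coins are moving while older coins sit still.

```python
from statistics import median

# Hypothetical spent outputs: (creation_day, spend_day) for coins moved today.
spent_outputs = [(100, 400), (395, 400), (398, 400), (399, 400), (120, 400)]

lifespans = [spend - created for created, spend in spent_outputs]  # days each coin sat still
msol = median(lifespans)
print(msol)  # 5: a low median => mostly new coins moving, old hands are not selling
```

Even though two outputs here are hundreds of days old, the median stays low because most of the traffic is recent coins; that is the "tepid MSOL" pattern the article describes for Litecoin.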

Now a couple of inferences can be drawn based on the MSOL. Either the asset is surging due to simple re-balancing, where the buying pressure is driving the asset towards pre-set liquidity pools, or LTC hodlers never buckled in the first place. Hodlers keeping a strong position in the industry is positive, but it is nothing new for the LTC community.

While assets such as Bitcoin, Ethereum, Binance Coin, and Cardano reached new ATHs during the 2021 rally, Litecoin hardly closed a weekly candle above its 2017 high. Observing the weekly chart also revealed a reduction in trading volumes over the course of both cycles.

Litecoin wasn't able to consolidate above its immediate support of $145 either, which was the 2019 high. The valuation remains firmly under the 20-week moving average, but the main concern remains its sideways movement on the weekly chart.

In comparison, every other asset mentioned above has witnessed a steady rise in the chart.

Improving on-chain metrics are always a positive, but pinning hopes on them for a considerable recovery is a step too far. With Litecoin, there hasn't been anything drastic from an individualistic sense of growth, so only time will tell if that changes.

What is the Difference Between The Learning Curve of Machine Learning and Artificial Intelligence? – BBN Times

Machine Learning (ML) is about statistical patterns in artificial data sets, while artificial intelligence (AI) is about causal patterns in real-world data sets.

Source: Medium

The term artificial intelligence was coined in 1956, but AI has become more popular today thanks to increased data volumes, advanced algorithms, and improvements in computing power and storage.

Source: SAS

Artificial intelligence (AI) makes it possible for machines to learn from experience, adjust to new inputs and perform human-like tasks. Artificial intelligence is important because it automates repetitive learning and discovery through data. Instead of automating manual tasks, AI performs frequent, high-volume, computerized tasks. And it does so reliably and without fatigue. Of course, humans are still essential to set up the system and ask the right questions.

Machine learning is a subset of artificial intelligence that automates analytical model building, based on the idea that systems can learn from data, identify patterns and make decisions with minimal human intervention. Using statistical learning technologies, computers can be trained to accomplish specific tasks by processing large amounts of data and recognizing correlations and patterns in the data.

There are plenty of examples of how easy it is to break the leading pattern-recognition technology in ML/DL, known as deep neural networks (DNNs). These have proved incredibly successful at correctly classifying all kinds of input, including images, speech and data on consumer preferences. But DNNs are fundamentally brittle: taken into unfamiliar territory, they break in unpredictable ways. DNNs do not actually understand the world. Loosely modelled on the architecture of the brain, they are software structures made up of large numbers of digital neurons arranged in many layers. Each neuron is connected to others in layers above and below it.

That could lead to substantial problems. Deep-learning systems are increasingly moving out of the lab into the real world, from piloting self-driving cars to mapping crime and diagnosing disease.

An AI footballer in a simulated penalty-shootout is confused when the AI goalkeeper enacts an adversarial policy: falling to the floor (right) | Credit: Adam Gleave

It was possible to use adversarial examples not only to fool a DNN, but also to reprogram it entirely, effectively repurposing an AI trained on one task to do another.
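
The classic recipe behind such adversarial examples can be sketched on a toy linear classifier (a deliberately simplified stand-in for a DNN, with made-up weights and inputs): nudge every input feature a small step in the direction that most changes the score, in the spirit of the fast gradient sign method.

```python
import numpy as np

# Toy linear "classifier" standing in for a DNN: score = w . x, positive => class 1.
w = np.array([1.0, -2.0, 3.0, -0.5])
x = np.array([0.5, 0.1, 0.2, 0.1])     # correctly classified (score > 0)

score = float(w @ x)

# FGSM-style perturbation: move every feature a small step eps in the
# direction that lowers the score (for a linear model, the gradient is w).
eps = 0.2
x_adv = x - eps * np.sign(w)
adv_score = float(w @ x_adv)

print(score > 0, adv_score > 0)        # True False: the prediction flips
```

No single feature changed by more than 0.2, yet the classification flipped; with real DNNs the same trick works with perturbations too small for a human to notice.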

There are no fixes for the fundamental brittleness of noise- and pixel-fooled DNNs short of making real AIs that can model, explore and exploit the world for themselves, write their own code and retain memories.

Deep Learning is a specific class of machine learning algorithms that use complex neural networks. The building block of the brain is the neuron, while the basic building block of an artificial neural network is a perceptron that accomplishes signal processing. Perceptrons are then connected into a large mesh network. The neural network is taught how to perform a task by having it process and analyze examples, which have been previously labeled. For example, in an object recognition task, the neural network is presented with a large number of objects of a certain type (e.g. a dog, a car). The neural network learns to categorize new images by having been trained on recurring patterns. This approach combines advances in computing power and neural networks to learn complex patterns in large amounts of data.
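
The perceptron's weighted-sum-and-threshold signal processing, and the mistake-driven training on labeled examples described above, fit in a few lines. A minimal sketch (the AND function stands in for the labeled training set):

```python
import numpy as np

# A single perceptron trained on labeled examples (here, the AND function),
# mirroring the text: process examples, adjust weights when a prediction is wrong.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(20):                       # a few passes over the labeled data
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)        # signal processing: weighted sum + threshold
        w += lr * (target - pred) * xi    # nudge weights toward the correct output
        b += lr * (target - pred)

print([int(w @ xi + b > 0) for xi in X])  # learns AND: [0, 0, 0, 1]
```

A deep network stacks many layers of such units and replaces the hard threshold with differentiable activations so the whole mesh can be trained by gradient descent.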

Source: Forbes & IBM

Contrary to popular assumptions, the biggest challenge facing companies with artificial intelligence (AI) isn't a lack of data scientists but rather data itself. Companies need more of it in every form: structured, unstructured and otherwise.

Source: Nature

Artificial-intelligence researchers are trying to fix the flaws of neural networks.

These kinds of systems will form the story of the coming decade in AI research, emerging as a real, true or causal AI with a deep understanding of the structure of the world.

Real AI enables machines or software applications to effectively interact with any environment, while understanding the world and learning from experience, and performing any human-like tasks and beyond.

Of many known definitions, just a few are close to real AI systems:An AI system is a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. [OECD/LEGAL/0449]

All sectoral applications in the public sectors of various industries, from agriculture and forestry to manufacturing, healthcare, education and government imply the real world AI systems.

The most advanced use case of real AI in the agricultural sector is known as precision agriculture, where AI enabled processing of data allows farmers to make temporally and spatially tailored management decisions, leading to a more efficient use of agricultural inputs, such as fertilisers and pesticides. And the required data is generated through remote sensing technologies using satellites, planes and unmanned aerial vehicles (drones) and through on the ground sensors in combination with IoT technology.

But even a blind pattern recognition with predictive ML algorithms is so extremely powerful that it is good enough to have made companies such as Apple, Microsoft, Amazon, Facebook, Google, Alibaba and Tencent the most valuable in the world.

But there's a much bigger wave coming. And this will be about superintelligent machines that manipulate the world and create their own data through their own actions. [Jürgen Schmidhuber at the Dalle Molle Institute for Artificial Intelligence Research in Manno, Switzerland]

We talk about the next generation of MI, Real World AI and Machine Learning. Its universe of discourse is the whole world with all its sub-worlds.

Such a Universe is modeled as consisting of 4 major parts, the universe of Nature (World I), the domain of Mind (World II), the domain of Society and Human Culture (World III), and the realm of Technology and Engineering and Industry (World IV).

Science and technology, the arts and philosophy are unified as a web of intellectual learning, scientific knowledge, and engineering sciences. A union of human knowledge defined as the wisdom science (or scientific wisdom).

It is affording a framework for the most life-critical innovations and breakthroughs, from the Internet of Everything to Theory of Everything, Emerging Technologies to Intelligent Cities and Connected Smart World, all integrated by the Real World AI and ML.

Companies that don't adopt machine learning and AI technologies are destined to be left behind. Most industries are already being changed by the emergence of AI. 2021 has shown a growing confidence in artificial intelligence and its predictive technology. However, for it to achieve its full potential, AI needs to be trusted by companies.

Harvard researchers part of new NSF AI research institute – Harvard School of Engineering and Applied Sciences

Harvard University researchers will take leading roles in a new National Science Foundation (NSF) artificial intelligence research institute housed at the University of Washington (UW). The UW-led AI Institute for Dynamic Systems is among 11 new AI research institutes announced today by the NSF.

Na Li, the Gordon McKay Professor of Electrical Engineering and Applied Mathematics at the Harvard John A. Paulson School of Engineering and Applied Science (SEAS), is a co-principal investigator at the institute and will lead one of the main research thrusts. Michael Brenner, the Michael F. Cronin Professor of Applied Mathematics and Applied Physics and Professor of Physics at SEAS, and Lucas Janson, Assistant Professor of Statistics and Affiliate in Computer Science, will also be part of the institute's research team.

The AI Institute for Dynamic Systems will focus on fundamental AI and machine learning theory, algorithms and applications for real-time learning and control of complex dynamic systems, which describe chaotic situations where conditions are constantly shifting and hard to predict. In addition to research, the institute will be focused on training future researchers in this field throughout the education pipeline.

"The engineering sciences are undergoing a revolution that is aided by machine learning and AI algorithms," said institute director J. Nathan Kutz, a UW professor of applied mathematics. "This institute brings together a world-class team of engineers, scientists and mathematicians who aim to integrate fundamental developments in AI with applications in critical and emerging technological applications."

The overall goal of this institute is to integrate physics-based models with AI and machine learning approaches to develop data-enabled efficient and explainable solutions for challenges across science and engineering. The research will be divided into three main thrusts: modeling, control and optimization and sensors.

Li will lead the control research thrust. Li, along with Janson and the rest of the control research team, will leverage the successes of machine learning towards the control of modern complex dynamical systems. Specifically, they will be focused on several challenges pertaining to reinforcement learning (RL), a class of machine learning that addresses the problem of learning to control physical systems by explicitly considering their inherent dynamical structure and feedback loop.

The AI for control team will figure out how to develop scalable learning-based control methods for large-scale dynamical systems; maintain the performance of the learned policies even when there is a model class mismatch; and guarantee that the systems maintain stability and stay within a safety constraint while still learning efficiently.

"To date, the successes of RL have been limited to very structured or simulated environments," said Li. "Applying RL to real-world systems, like energy systems, advanced manufacturing, and robot autonomy, faces many critical challenges such as scalability, robustness, and safety, to name a few. We will develop critically enabling mathematical, computational, and engineering architectures to overcome these challenges to bring the success of AI/ML to our real-world systems."

"One particular focus of ours will be to quantify the statistical uncertainty of what AI learns, enabling us to develop algorithms with rigorous safeguards that prevent them from harming anyone or anything while they explore and learn from their environment," said Janson.
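
One generic way to keep a learning controller inside a safety constraint, shown here purely as an illustration and not as the institute's actual methods, is to mask out any action whose successor state is unsafe while a tabular Q-learner explores:

```python
import random

# Sketch of safe exploration: Q-learning on a 1-D corridor where state 0 is
# unsafe, so any action that would enter it is masked out during learning.
random.seed(0)
n_states, goal, unsafe = 6, 5, 0
Q = [[0.0, 0.0] for _ in range(n_states)]        # actions: 0 = left, 1 = right

def step(s, a):
    return max(0, min(n_states - 1, s + (1 if a == 1 else -1)))

def safe_actions(s):
    # The safety constraint: never take an action whose successor is unsafe.
    return [a for a in (0, 1) if step(s, a) != unsafe]

for _ in range(500):                              # training episodes
    s = 2
    while s != goal:
        acts = safe_actions(s)
        a = random.choice(acts) if random.random() < 0.2 else max(acts, key=lambda a: Q[s][a])
        s2 = step(s, a)
        reward = 1.0 if s2 == goal else 0.0
        Q[s][a] += 0.5 * (reward + 0.9 * max(Q[s2]) - Q[s][a])
        s = s2

greedy, s = [], 2                                 # greedy rollout after training
for _ in range(10):
    if s == goal:
        break
    a = max(safe_actions(s), key=lambda a: Q[s][a])
    greedy.append(s)
    s = step(s, a)
print(greedy)                                     # the learned path never enters state 0
```

Real safe-RL methods replace the hand-written mask with learned safety certificates or constrained optimization, but the structure is the same: exploration is restricted so the safety guarantee holds even before the policy is any good.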

Brenner, a leader in the field of physics-informed machine-learning methods for complex systems, will be part of the AI for modeling research team. That team will be focused on learning physically interpretable models of dynamical systems from off-line and/or on-line streaming data.

"Our research will explore how we can develop better machine-learning technologies by baking in and enforcing known physics, such as conservation laws, symmetries, etc.," said institute associate director Steve Brunton, a UW associate professor of mechanical engineering. "Similarly, in complex systems where we only have partially known or unknown physics, such as neuroscience or epidemiology, can we use machine learning to learn the 'physics' of these systems?"

Harvard is among several partner institutions, including the University of Hawaii at Mānoa, Montana State University, the University of Nevada, Reno, Boise State University, the University of Alaska Anchorage, Portland State University and Columbia University. The institute will also partner with high school programs that focus on AI-related projects and create a post-baccalaureate program that will actively recruit and support recent college graduates from underrepresented groups, United States veterans and first-generation college students, with the goal of helping them attend graduate school. The institute will receive about $20 million over five years.

Machine Learning May Solve the Cube Conundrum – Journal of Petroleum Technology

Optimal well spacing is the question. Well interactions are the problem. And cube drilling was supposed to be the answer. But it didn't turn out that way.

"There was this idea that operators could avoid parent/child interactions by codeveloping their wells," said Ted Cross, a technical adviser with Novi Labs, during a recent presentation. "They could develop many, many zones and maximize the recovery from a three-dimensional volume of rock."

This was cube drilling.

"They could get a lot of operational efficiencies by having multiple frac crews on site," Cross said, "building these megapads and saving on pad-construction costs."

The practice was tried, and, when the results were released, production was underwhelming. Stocks fell. Clearly, the cube was not the answer.

Nonetheless, much was learned from the venture into this dense drilling, which saw 50, 60, maybe 70 wells per section, within a given square mile, which is incredibly dense, Cross said. Just because the idea of a 70-well superdevelopment is dead doesn't mean that the concept can't still be useful.

While the concept of megapads has faded, it is not gone. Cross presented development maps and analysis that show people are still going to town on dense development, even if they're not 60 wells per section. The industry has taken a little bit of time to figure out what geology supports these.

Consequently, well spacing remains important. "It's still the key to driving net asset value and cash flow," said Novi's president and cofounder, Jon Ludwig. "If you go too aggressive, too many wells per section, obviously you lose cash flow, subtract net asset value, and, if you're public, you can subtract a good amount of company value as well. But, if you're not aggressive enough, you leave value on the table. So, it's still critical to get this right."

Getting it right takes data, something the oil and gas industry has never lacked and something that cube drilling has produced in great quantities. "Courtesy of all this cube development that has occurred, there's a lot of data," Ludwig said. "That's a huge advantage. We know now what good and bad look like. Every single cube that's been developed has left a signature."

Of course, the data doesn't help if it isn't used properly. "We can all benefit from that if we know how to use the data well," Ludwig said. This is where machine learning comes in.

"Machine learning models can tease out these subtle warnings from the past," Ludwig said.

One technique that benefits from the lessons of cube drilling is what Ludwig calls the surgical strike.

"Getting cubes right is not all about a codeveloped cube in greenfield acreage," Ludwig said. "A surgical strike, as we've defined it, is: What if I put a lease-line well between these existing developments? Or, what if I've just acquired acreage in a very developed play like Eagle Ford or Bakken? How do I improve asset value? How do I bring learnings, completions designs, etc., how do I bring that in and actually improve net asset value by figuring out where you could still develop?"

The machine-learning models help, Ludwig said, but the data must be dynamic. "If you've built any kind of data-driven model, you want to use that model then to actually make forecasts and run scenarios for various ways you might develop your acreage. In order to do that, you need to have dynamic parent/child calculations for these hypothetical developments. If you're going to plan a cube where you're going to come in under an existing development, you need data that gets generated on the fly that describes distances, timing, etc., and allows whatever method you're using for modeling to change the forecast based on those factors."

This, Ludwig added, must be presented as a time series. "We learned early on that making a point prediction is valuable and useful, but it's not nearly as useful as showing the shape of the curve and how the production rates change over time."

A cube, however, will not thrive in a black box. "You really need to have the model not only output a forecast but also output something that explains why that forecast was made, what variables are driving that forecast," Ludwig said. He said that the models, if applied correctly, can explain their work.

"What I mean by explain their work is: If a model forecasts X or Y, two different forms of a particular cube design, can it tell me also why? Because answering why is important when you're making the kinds of investment decisions that the industry is being asked to make. The sophistication of the models is not just the ability to make accurate forecasts, it is also the ability to explain their work. These two things together are critical for the financial case to continue to develop cubes."
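
The simplest model that both forecasts and explains its work is a linear one, where the fitted coefficients name the variables driving the prediction. A hedged sketch with synthetic data (the feature names and numbers are hypothetical, not Novi Labs' inputs or models):

```python
import numpy as np

# Fit a simple linear forecast of well production on synthetic data,
# then read off which inputs drive the forecast.
rng = np.random.default_rng(0)
features = ["lateral_length", "proppant_per_ft", "well_spacing"]
X = rng.normal(size=(200, 3))
true_w = np.array([0.8, 0.5, -1.2])           # e.g. tighter spacing hurts production
y = X @ true_w + rng.normal(scale=0.1, size=200)

w, *_ = np.linalg.lstsq(X, y, rcond=None)     # recover the coefficients from data

forecast = X[0] @ w                           # a point forecast for one well...
drivers = sorted(zip(features, w), key=lambda t: -abs(t[1]))
print(drivers[0][0])  # ...plus the variable driving it most: well_spacing
```

Production models in practice are nonlinear, so explanation tools like feature attributions play the role the coefficients play here, but the principle is the same: a forecast paired with the variables responsible for it.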

LG CNS Recognized by Google Cloud with Machine Learning Specialization – The Korea Bizwire

This photo provided by LG CNS Co. on July 29, 2021, shows LG CNS workers showing Google Clouds Machine Learning Specialization distinction and Tensorflow Developer certificate.

SEOUL, July 29 (Korea Bizwire) LG CNS Co., a major IT service provider under LG Group, said Thursday it has earned a Machine Learning Specialization distinction from Google Cloud as the company strives to expand its presence in the artificial intelligence (AI) sector.

LG CNS said it became the first South Korean company to achieve the machine learning distinction in the Google Cloud Partner Advantage program for its expertise in the sector.

Google Cloud offers 17 types of specialization certifications, awarded to partner companies that demonstrate expertise in specific technology sectors.

To earn a Machine Learning Specialization distinction, a company has to meet Google Cloud's requirements in 33 categories across six fields, in which a firm is assessed on areas ranging from machine learning models to investment plans.

Leveraging Google Cloud's technology, LG CNS said it has established AI-powered services for LG Electronics Inc. and AEON Corp., one of Japan's largest chains of language learning institutions.

Seven LG CNS professional machine learning engineers were certified by Google, and some 170 of its workers were also recognized with TensorFlow Developer Certificates from the company.

To beef up its AI capabilities, LG CNS said it has 35 working teams dedicated to its AI business.

(Yonhap)

Go here to see the original:
LG CNS Recognized by Google Cloud with Machine Learning Specialization - The Korea Bizwire


Cassie the bipedal robot uses machine learning to complete a 5km jog – New Atlas

Four years is a long time in robotics, especially so for a bipedal robot developed at Oregon State University (OSU) named Cassie. Dreamt up as an agile machine to carry packages from delivery vans to doorsteps, Cassie has recently developed an ability to run, something its developers have now shown off by having it complete what they say is the first 5-km (3.1-mi) jog by a bipedal robot.

We first took a look at Cassie the bipedal robot back in 2017, when OSU researchers revealed an ostrich-like machine capable of waddling along at a steady pace. It is based on the team's previously developed ATRIAS bipedal robot, but features steering feet and sealed electronics so it can function in rain and snow and navigate outdoor terrain.

The team has since used machine learning to equip Cassie with an impressive new skill: the ability to run. This involved what they call a deep reinforcement learning algorithm, which Cassie combines with its unique biomechanics, including knees that bend like an ostrich's, to make the fine adjustments that keep it upright on the move.

Deep reinforcement learning is a powerful method in AI that opens up skills like running, skipping and walking up and down stairs, says team member Yesh Godse.
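Cassie's controller uses deep reinforcement learning over a neural-network policy. As a much-simplified illustration of the same learn-from-reward loop, here is tabular Q-learning on a toy "stay upright" task; the states, dynamics, and rewards are all invented for illustration and bear no relation to the OSU controller.

```python
# Heavily simplified illustration of the reinforcement-learning loop:
# tabular Q-learning on a toy "stay upright" balance task. The real
# Cassie controller uses deep RL; everything here is a hypothetical
# stand-in for illustration.
import random

random.seed(1)

ACTIONS = (-1, 1)                 # lean left / lean right
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2
Q = {(s, a): 0.0 for s in range(-2, 3) for a in ACTIONS}

def step(tilt, action):
    """Apply the action plus a random disturbance; falling is terminal."""
    nxt = tilt + action + random.choice((-1, 0, 1))
    if abs(nxt) > 2:
        return None, -10.0        # fell over
    return nxt, 1.0               # still upright, earn reward

for _ in range(3000):             # training episodes
    tilt = 0
    for _ in range(50):           # step limit per episode
        if random.random() < EPS:             # epsilon-greedy exploration
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(tilt, a)])
        nxt, reward = step(tilt, action)
        target = reward if nxt is None else reward + GAMMA * max(Q[(nxt, a)] for a in ACTIONS)
        Q[(tilt, action)] += ALPHA * (target - Q[(tilt, action)])
        if nxt is None:
            break
        tilt = nxt

# The learned policy should lean against the tilt at the extremes
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(-2, 3)}
print(policy)
```

Deep RL replaces the lookup table with a neural network so the same trial-and-reward idea can scale to the continuous, high-dimensional state of a real biped.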

Running robots are of course nothing new. Honda's ASIMO robot has been jogging along at speeds of up to 6 km/h (3.7 mph) since 2004, and in 2011 we looked at a machine called MABEL with a peak pace of 10.9 km/h (6.8 mph), which was billed as the world's fastest bipedal robot with knees. More recently, the Atlas humanoid robot from Boston Dynamics has wowed us not just by running through the woods, but by performing backflips and parkour.

The OSU team were keen to show off the endurance capabilities of Cassie, by having it use its machine learning algorithms to maintain balance across a 5-km run around the university campus, while untethered and on a single charge of its batteries. It wasn't all smooth sailing, with Cassie falling down twice due to an overheated computer and a high-speed turn gone wrong. But following a couple of resets, the run was completed in a total time of 53 minutes and 3 seconds.

Cassie is a very efficient robot because of how it has been designed and built, and we were really able to reach the limits of the hardware and show what it can do, said Jeremy Dao, a Ph.D. student in the Dynamic Robotics Laboratory.

According to the researchers, this is the first time a bipedal robot has finished a 5-km run, though it took place at walking speed and needed a little help along the way. It is possible that other bipedal robots are capable of covering such distances, but it is also possible that no one has thought to try. Either way, the run is an impressive demonstration of the progress being made by the team. Check it out below.

OSU Bipedal Robot First to Run 5K

Source: Oregon State University

See original here:
Cassie the bipedal robot uses machine learning to complete a 5km jog - New Atlas


Global AI in Information and Communications Technology (ICT) Report 2021: AI and Cognitive Computing in Communications, Applications, Content, and…

DUBLIN--(BUSINESS WIRE)--The "AI in Information and Communications Technology 2021-2026: AI and Cognitive Computing in Communications, Applications, Content, and Commerce" report has been added to ResearchAndMarkets.com's offering.

This report assesses AI in the ICT ecosystem, including technologies, solutions, and players. Application areas covered include marketing and business decision-making, workplace automation, predictive analysis and forecasting, and fraud detection and mitigation.

The report provides detailed forecasts globally, regionally, and across market segments from 2021 to 2026. It also covers AI subset technologies, AI embedded in other technologies, and cognitive computing in key industry verticals.

While the opportunities for artificial intelligence in the information and communications technology industry are virtually limitless, we focus on a few key opportunities, including AI in big data, chatbots, chipsets, cybersecurity, IoT, smart machines, and robotics. AI is poised to fundamentally reshape the Information and Communications Technology (ICT) industry as technologies such as machine learning, natural language processing, and deep learning mature.

AI will dramatically enhance the performance of communications, apps, content, and digital commerce. It will also drive new business models and create entirely new business opportunities as AI-enabled interfaces and efficiencies facilitate engagement that was previously out of reach.

Many other industry verticals will be transformed through this evolution as ICT and digital technologies support many aspects of industry operations including supply chains, sales and marketing processes, product and service delivery and support models.

For example, we see particularly substantial impacts on the medical and bioinformatics segments as well as financial services. Workforce automation will affect many different industry verticals as AI enhances workflows and processes and accelerates the ROI for smart workplace investments.

Key Topics Covered:

1 Executive Summary

2 Introduction

2.1 Artificial Intelligence Overview

2.1.1 Intelligent Software Agent

2.1.2 Problem Solving

2.1.3 Practical Approaches to AI

2.2 Machine Learning

2.2.1 Supervised Learning

2.2.2 Unsupervised Learning

2.2.3 Semi-Supervised Learning

2.2.4 Reinforcement Learning

2.3 Deep Learning

2.3.1 Artificial Neural Networks

2.3.2 Artificial Neural Network Deployment

2.4 Cognitive Computing

2.5 AI Algorithms in Applications

2.5.1 Natural Language Processing

2.5.2 Machine Perception

2.5.3 Data Mining

2.5.4 Motion and Manipulation

2.6 Limitations and Challenges for AI Expansion

2.7 AI in Information and Communications Technology Industry

2.7.1 AI Market Drivers in ICT

2.7.2 Key AI Opportunities in ICT

2.7.2.1 Artificial Intelligence and Big Data

2.7.2.2 Artificial Intelligence in Chatbots and Virtual Private Assistants

2.7.2.3 Artificial Intelligence in Chipsets and Microelectronics

2.7.2.4 Artificial Intelligence and Cybersecurity

2.7.2.5 Artificial Intelligence and Internet of Things

2.7.2.6 Artificial Intelligence in Network Management and Optimization

2.7.2.7 Artificial Intelligence in Smart Machines and Robotics

3 AI Intellectual Property Leadership by Country and Company

3.1 Global AI Patents

3.2 AI Patents by Leading Countries

3.3 Global Machine Learning Patents

3.4 Machine Learning Patents by Leading Countries

3.5 Machine Learning Patents by Leading Companies

3.6 Global Deep Learning Patents

3.7 Deep Learning Patents by Leading Countries

3.8 Global Cognitive Computing Patents

3.9 Cognitive Computing Patents by Leading Countries

3.10 AI and Cognitive Computing Innovation Leadership

4 AI in ICT Market Analysis and Forecasts 2021-2026

4.1 Global Markets for AI 2021-2026

4.2 Global Market for AI by Segment 2021-2026

4.3 Regional Markets for AI 2021-2026

4.4 AI Market by Key Application Area 2021-2026

4.4.1 AI Markets for Predictive Analysis and Forecasting 2021-2026

4.4.2 AI Market for Marketing and Business Decision Making 2021-2026

4.4.3 AI Market for Fraud Detection and Classification 2021-2026

4.4.4 AI Market for Workplace Automation 2021-2026

5 AI in Select Industry Verticals

5.1 Market for AI by Key Industry Verticals 2021-2026

5.1.1 AI Market for Internet-related Services and Products 2021-2026

5.1.2 AI Market for Telecommunications 2021-2026

5.1.3 AI Market for Medical and Bioinformatics 2021-2026

5.1.4 AI Market for Financial Services 2021-2026

5.1.5 AI Market for Manufacturing and Heavy Industries 2021-2026

5.2 AI in other Industry Verticals

6 AI in Major Market Segments

6.1 AI Market by Product Segment 2021-2026

6.2 Market for Embedded AI within other Technologies 2021-2026

6.2.1 AI Algorithms in Data Mining 2021-2026

6.2.2 AI in Machine Perception Technology 2021-2026

6.2.3 Market for AI Algorithms in Pattern Recognition Technology 2021-2026

6.2.4 Market for AI Algorithm in Intelligent Decision Support Systems Technology 2021-2026

6.2.5 Market for AI Algorithms in Natural Language Processing Technology 2021-2026

7 Important Corporate AI M&A

7.1 Apple Inc.

7.2 Facebook

7.3 Google

7.4 IBM

7.5 Microsoft

8 AI in ICT Use Cases

8.1 Verizon Uses AI and Machine Learning To Improve Performance

8.2 Deutsche Telekom Uses AI

8.3 H2O.ai Use-cases in Telecommunications powered by AI

8.4 KDDI R&D Laboratories Inc., AI-assisted Automated Network Operation System

8.5 Telefonica AI Use Cases

8.6 Brighterion AI, Worldpay Use cases

9 AI in ICT Vendor Analysis

9.1 IBM Corporation

9.1.1 Company Overview

9.1.2 Recent Developments

9.2 Intel Corporation

9.3 Microsoft Corporation

9.4 Google Inc.

9.5 Baidu Inc.

9.6 H2O.ai

9.7 Hewlett Packard Enterprise

9.8 Apple Inc.

9.9 General Electric

Read more:
Global AI in Information and Communications Technology (ICT) Report 2021: AI and Cognitive Computing in Communications, Applications, Content, and...
