
OAG adds airfare data with acquisition of Infare – PhocusWire

OAG, a data platform for the global travel industry, has acquired Infare, a provider of competitor air travel data, from Ventiga Capital. Terms of the acquisition have not been disclosed, but OAG says the deal values the combined entity at over $500 million.

Together, the two businesses say they can provide customers with a broader picture of airfare data, enabling them to forecast resources, evaluate travel demand and competition and build more complex and innovative models to drive revenue and profitable growth. Both management teams will continue at the combined company and retain a shareholding interest with new funding (amount undisclosed) from Vitruvian Partners.


Phil Callow, CEO of OAG, said the increasing dynamism in global travel and technology is fueling a need for more sophisticated, granular data to understand, manage and unlock growth in air travel.

"The acquisition of Infare strengthens our ability to deliver consistent and accurate information across the wider supply and demand value chain," Callow said. "Together, we are enabling new and existing customers to thrive and innovate ahead of their counterparts."

Nils Gelbjerg-Hansen, CEO of Denmark-based Infare, said airlines rely on comprehensive and accurate data to make informed business decisions.

"Our technology platform, data sets and intelligence software complement OAG's and will greatly benefit our customers worldwide," he said. "We see this as a unique opportunity to expand our services and introduce new innovative products for our customers. We are excited about the journey ahead together."

In 2021, Infare acquired Air Cube, a business intelligence and data-mining platform for the airline industry.


Global Data Mining Tools Market Size and Forecast | IBM, SAS … – Glasgow West End Today

New Jersey, United States - The Global Data Mining Tools Market is comprehensively and accurately detailed in the report, taking into consideration various factors such as competition, regional growth, segmentation, and market size by value and volume. This is an excellent research study specially compiled to provide the latest insights into critical aspects of the Global Data Mining Tools market. The report includes different market forecasts related to market size, production, revenue, consumption, CAGR, gross margin, price, and other key factors. It is prepared with the use of industry-best primary and secondary research methodologies and tools. It includes several research studies such as manufacturing cost analysis, absolute dollar opportunity, pricing analysis, company profiling, production and consumption analysis, and market dynamics.

The competitive landscape is a critical aspect every key player needs to be familiar with. The report throws light on the competitive scenario of the Global Data Mining Tools market to know the competition at both the domestic and global levels. Market experts have also offered the outline of every leading player of the Global Data Mining Tools market, considering the key aspects such as areas of operation, production, and product portfolio. Additionally, companies in the report are studied based on key factors such as company size, market share, market growth, revenue, production volume, and profits.

Get Full PDF Sample Copy of Report (Including Full TOC, List of Tables & Figures, Chart) @ https://www.verifiedmarketresearch.com/download-sample/?rid=8921

Leading Companies in the Global Data Mining Tools Market Research Report:

IBM, SAS Institute, Oracle, Microsoft, Teradata, MathWorks, Intel, Alteryx, SAP.

Global Data Mining Tools Market Segmentation:

Data Mining Tools Market, By Component

Tools
Services

Data Mining Tools Market, By Service

Managed services
Consulting and implementation
Others

Data Mining Tools Market, By Business Function

Marketing
Finance
Supply chain and logistics
Operations

Data Mining Tools Market, By Deployment Type

On-premises
Cloud

Data Mining Tools Market, By Industry Vertical

Retail
Banking, Financial Services, and Insurance (BFSI)
Healthcare and life sciences
Telecom and IT
Government and defense
Energy and Utilities
Manufacturing
Others (Education, and Media and Entertainment)

The report comes out as an accurate and highly detailed resource for gaining significant insights into the growth of different product and application segments of the Global Data Mining Tools market. Each segment covered in the report is exhaustively researched on the basis of market share, growth potential, drivers, and other crucial factors. The segmental analysis provided in the report will help market players to know when and where to invest in the Global Data Mining Tools market. Moreover, it will help them to identify key growth pockets of the Global Data Mining Tools market.

The geographical analysis of the Global Data Mining Tools market provided in the report is just the right tool that competitors can use to discover untapped sales and business expansion opportunities in different regions and countries. Each regional and country-wise Global Data Mining Tools market considered for research and analysis has been thoroughly studied based on market share, future growth potential, CAGR, market size, and other important parameters. Not all regional markets are impacted by the same trends; each has its own. Taking this into consideration, the analysts authoring the report have provided an exhaustive analysis of the specific trends of each regional Global Data Mining Tools market.

Inquire for a Discount on this Premium Report @ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=8921

What to Expect in Our Report?

(1) A complete section of the Global Data Mining Tools market report is dedicated to market dynamics, which include influencing factors, market drivers, challenges, opportunities, and trends.

(2) Another broad section of the research study is reserved for regional analysis of the Global Data Mining Tools market where important regions and countries are assessed for their growth potential, consumption, market share, and other vital factors indicating their market growth.

(3) Players can use the competitive analysis provided in the report to build new strategies or fine-tune their existing ones to rise above market challenges and increase their share of the Global Data Mining Tools market.

(4) The report also discusses the competitive situation and trends and sheds light on company expansions and mergers and acquisitions taking place in the Global Data Mining Tools market. Moreover, it brings to light the market concentration rate and market shares of the top three and top five players.

(5) Readers are provided with the findings and conclusions of the research study in the Global Data Mining Tools Market report.

Key Questions Answered in the Report:

(1) What are the growth opportunities for the new entrants in the Global Data Mining Tools industry?

(2) Who are the leading players functioning in the Global Data Mining Tools marketplace?

(3) What are the key strategies participants are likely to adopt to increase their share in the Global Data Mining Tools industry?

(4) What is the competitive situation in the Global Data Mining Tools market?

(5) What are the emerging trends that may influence the Global Data Mining Tools market growth?

(6) Which product type segment will exhibit high CAGR in future?

(7) Which application segment will grab a handsome share in the Global Data Mining Tools industry?

(8) Which region is lucrative for the manufacturers?

For More Information or Query or Customization Before Buying, Visit @ https://www.verifiedmarketresearch.com/product/data-mining-tools-market/

About Us: Verified Market Research

Verified Market Research is a leading Global Research and Consulting firm that has been providing advanced analytical research solutions, custom consulting and in-depth data analysis for 10+ years to individuals and companies alike that are looking for accurate, reliable and up-to-date research data and technical consulting. We offer insights into strategic and growth analyses, the data necessary to achieve corporate goals, and help make critical revenue decisions.

Our research studies help our clients make superior data-driven decisions, understand market forecasts, capitalize on future opportunities and optimize efficiency by working as their partner to deliver accurate and valuable information. The industries we cover span a large spectrum, including Technology, Chemicals, Manufacturing, Energy, Food and Beverages, Automotive, Robotics, Packaging, Construction, and Mining & Gas.

We, at Verified Market Research, assist in understanding holistic market-indicating factors and the most current and future market trends. Our analysts, with their high expertise in data gathering and governance, utilize industry techniques to collate and examine data at all stages. They are trained to combine modern data collection techniques, superior research methodology, subject expertise and years of collective experience to produce informative and accurate research.

Having served 5,000+ clients, we have provided reliable market research services to more than 100 Global Fortune 500 companies such as Amazon, Dell, IBM, Shell, Exxon Mobil, General Electric, Siemens, Microsoft, Sony and Hitachi. We have co-consulted with some of the world's leading consulting firms, such as McKinsey & Company, Boston Consulting Group and Bain & Company, on custom research and consulting projects for businesses worldwide.

Contact us:

Mr. Edwyne Fernandes

Verified Market Research

US: +1 (650)-781-4080
UK: +44 (753)-715-0008
APAC: +61 (488)-85-9400
US Toll-Free: +1 (800)-782-1768

Email: sales@verifiedmarketresearch.com

Website: https://www.verifiedmarketresearch.com/


Datamining Report: All of the GO Fest Special Research Texts … – Pokémon GO Hub

Hello Trainers! Pokémon GO has had a lot of texts pushed pertaining to GO Fest, both in-person and Global, along with the texts for Poliwag Community Day and Pokémon Air Adventures.

Please read through all of this with a grain of salt; we often post data mining reports that take months to release, and we don't want our readers disappointed. Be smart, read this as speculation, and be happy once it goes live.

Disclaimer: You know the drill by now; everything in this article is data mined, and therefore subject to change or to not being released at all. All this information is publicly provided by the PokéMiners and includes some of my commentary.

Final Warning Disclaimer: There are a lot of spoilers here. I mean a lot. So if you're planning on participating in GO Fest and don't want the research texts spoiled, then I suggest you click away now. Then again, if you didn't want spoilers, you probably wouldn't have clicked on this article in the first place. So let's go!

Generic Texts for Go Fest

Texts for the Add Ons for in-person Go Fest

Texts for the Go Fest City and Park Experiences for the three host cities.

It looks like we'll be getting reminders of when our Go Fest tickets are for.

All of the announced habitats for in-person Go Fests for London and Osaka

All of the announced habitats for in-person Go Fests for New York City

This looks like field research based on specific Go Fest Habitats for Osaka and London

The GO Fest Park Experience Special Research will be called Park Adventure

Willow found a map and wants us to use an Incense.

Get that camera out! Willow wants pictures.

Glimmering Jewel?

Now Willow wants Stardust.

Apparently, Willow wants to know the second you Mega evolve Diancie!

The GO Fest City Experience Special Research will be called City Sights

Mega Rayquaza is coming!

Willow's team is ready. Is yours?

Free boba on Willow!

Willow: Science! You: Intuition

Willow out here imparting knowledge of Rayquaza and their choice of habitats

Short and sweet research is only 2 steps.

Willow's back! I wonder if he gets paid overtime for all this.

Power up some Flying and Dragon types. Maybe a Flying/Dragon dual type to save yourself some Stardust.

Mega Rayray!!

Willow wants you to let your team have a breather.

Willow says he wants to celebrate again next year, like you won't see him 147 times between now and then. Odd, but alright.

Already announced rotating habitats for global GO Fest

Collection challenges for the rotating habitats

Go spin some stops already; you're gonna need Potions and Revives.

Looks like making new friends is part of the GO Fest research again.

Willow's got jokes.

Pop a lucky egg and get evolving

Dropping Diancie hints

Willow always has nice things to say about us, doesn't he? Such a nice guy, that Willow.

Texts for Poliwag community day, photobombs, and the like.

The special research is called Slippery Swirls

Quests for Poliwhirl Community Day

Texts for the Pokemon Air Adventures event in South Korea

These may be GO Fest quests.

May be GO Fest related, may not be. Probably are

Text for a potential shop discount

That was a lot!! Are you excited about GO Fest? I sure am; getting back to an in-person GO Fest is going to be amazing! The special research looks amazing as well. I can't wait. Let me know what you're thinking in the comments.

Until next time, trainers, stay safe out there.


Navigating the Future of Technology: Key Trends in Global Business … – Fagen wasanni

Exploring the Future: Key Trends Shaping Global Business Intelligence and Analytics Software

As we navigate the future of technology, it's clear that business intelligence (BI) and analytics software are playing an increasingly pivotal role in global business operations. These tools are not only transforming the way businesses operate but are also shaping the future of industries across the globe.

Firstly, the rise of artificial intelligence (AI) and machine learning (ML) is a key trend that is revolutionizing the BI and analytics software landscape. These technologies are enabling businesses to automate complex processes, make accurate predictions, and gain deeper insights into their operations. AI and ML are being integrated into BI tools to provide advanced analytics capabilities, such as predictive analytics, prescriptive analytics, and data mining. This integration is helping businesses to make data-driven decisions, optimize their operations, and improve their bottom line.
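As a toy illustration of the predictive-analytics capability described above, the sketch below fits a least-squares trend line to hypothetical monthly sales figures and forecasts the next month. This is only a minimal example of the general idea; all numbers are invented, and production BI tools use far richer models.

```python
# Minimal predictive-analytics sketch: fit a least-squares trend line to
# hypothetical monthly sales and forecast the next period.
# All data values are invented for illustration.

def fit_trend(values):
    """Return (slope, intercept) of the least-squares line through
    the points (0, values[0]), (1, values[1]), ..."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

def forecast_next(values):
    """Predict the value one step past the end of the series."""
    slope, intercept = fit_trend(values)
    return slope * len(values) + intercept

monthly_sales = [102.0, 108.0, 113.0, 121.0, 125.0, 133.0]
prediction = forecast_next(monthly_sales)
print(f"forecast for month 7: {prediction:.1f}")
```

The same shape of computation underlies the trend forecasts a BI tool would surface, just applied at much larger scale and with more sophisticated models.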

Secondly, the increasing adoption of cloud-based BI and analytics software is another trend that is shaping the future of this sector. The cloud offers numerous benefits, including scalability, flexibility, cost-effectiveness, and easy access to data from anywhere, at any time. As a result, more and more businesses are migrating their BI and analytics tools to the cloud, which is driving the growth of the global cloud-based BI and analytics software market.

Thirdly, the growing importance of data visualization is another trend that is influencing the BI and analytics software industry. Data visualization tools are becoming increasingly sophisticated, enabling businesses to present complex data in a visually appealing and easy-to-understand format. This is helping businesses to communicate their data more effectively, make better decisions, and gain a competitive edge in the market.

Moreover, the rise of self-service BI and analytics tools is another trend that is shaping the future of this sector. These tools are designed to be user-friendly and require minimal technical expertise, enabling non-technical users to analyze data, generate reports, and make informed decisions. This is democratizing access to BI and analytics, empowering all employees to become data-driven decision-makers.

Lastly, the increasing focus on data privacy and security is a trend that cannot be overlooked. With the growing volume of data being generated and processed by businesses, the risk of data breaches and cyber-attacks is also increasing. Therefore, businesses are investing heavily in advanced security measures to protect their data and comply with data privacy regulations. This is driving the demand for BI and analytics software with robust security features.

In conclusion, the future of technology is being shaped by key trends in the global business intelligence and analytics software industry. The rise of AI and ML, the adoption of cloud-based solutions, the importance of data visualization, the democratization of BI and analytics through self-service tools, and the focus on data privacy and security are all playing a crucial role in this transformation. As we continue to navigate the future of technology, these trends will undoubtedly continue to evolve and shape the way businesses operate, make decisions, and drive growth.


Making machine learning accessible to all @theU – @theU

"Many call this the age of information," said Rajive Ganguli, the Malcolm McKinnon Professor of Mining Engineering at the University of Utah. "It is perhaps more accurate to call it the age of data, since not everyone has the ability to truly gain from all the data they collect. Many are either lost in the data or misled by it. Yet, the promise of being informed by data remains."

Ganguli, who is also the College of Mines and Earth Sciences associate dean, is launching UteAnalytics, a free analytics software that makes artificial intelligence (AI) or machine learning (ML) accessible to all.

Founder of the ai.sys group at the U, Ganguli said that as long as clients know their data, they can use UteAnalytics to better understand the problems they are trying to solve. The research group's mission is to seek insight from data, model systems and develop computational tools for education and research.

At various points in time, Ganguli has developed ML tools that his students could use in class. Years ago, it occurred to him that more people could benefit from ML if only his workflow and tools were more user-friendly. Graduate student Lewis Oduro brought his vision to fruition by leveraging the numerous public-domain ML tools available to programmers and converting them into Windows-based software.

"The tool is problem agnostic," Ganguli said. "Hence it can have a broad group of users. I have used it for a variety of projects I am involved in, including mining, atmospheric sciences/air quality and COVID/hospital admissions."

PHOTO CREDIT: Rajive Ganguli

Lewis Oduro (right) and Rajive Ganguli (left).

He reports that tens of subject matter experts (SMEs) who are non-coders have already subscribed to receive the software in advance of its formal release. Many are professionals across a broad spectrum of fields from social science to business, along with scientists and engineers.

Designed to empower the domain expert, UteAnalytics allows a client to clean their data and conduct exploratory data analysis in various ways. The software also allows users to estimate the effect of each input on the output, as well as develop models in advance of predicting on a new dataset.
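One common way to estimate the effect of each input on an output, as described above, is one-at-a-time sensitivity analysis. The sketch below is not UteAnalytics code, only an illustration of the general technique on an invented toy model: perturb each input in turn and record how much the output moves.

```python
# One-at-a-time sensitivity analysis on a toy model. This is NOT
# UteAnalytics code; the model and values are invented for illustration.

def toy_model(temperature, humidity, wind):
    """Hypothetical model of a pollutant reading (invented formula)."""
    return 2.0 * temperature + 0.5 * humidity - 1.5 * wind

def sensitivity(model, baseline, delta=1.0):
    """Perturb each input by +delta and record the change in output."""
    base_out = model(**baseline)
    effects = {}
    for name in baseline:
        perturbed = dict(baseline, **{name: baseline[name] + delta})
        effects[name] = model(**perturbed) - base_out
    return effects

baseline = {"temperature": 20.0, "humidity": 40.0, "wind": 3.0}
effects = sensitivity(toy_model, baseline)
print(effects)  # for a linear model, each value equals that input's coefficient
```

For nonlinear models the measured effects depend on the baseline and step size, which is why tools typically report them alongside exploratory plots rather than as a single number.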

Daniel Mendoza, who holds faculty appointments in the Department of Atmospheric Sciences and elsewhere at the U, is an early adopter of the software. Through his work with air quality monitors on UTA trains and electric buses in Salt Lake Valley, he and his team have successfully collected more than eight years of data for particulate matter and ozone levels, and recently, for nitrogen oxides.

"When we look at neighborhood-specific data we can drill in and really see some social justice impacts," Mendoza reported last year. Today, he is using UteAnalytics "to quickly and efficiently analyze the temperature data that we'll be collecting in real-time from our mobile and stationary sensors. UA gives researchers the power to look at data in a very streamlined way without endless hours of coding. The included tools facilitate a thorough interpretation of data and save time without compromising reliability."

The difference that data, assisted by UteAnalytics tools, makes in Mendoza's work on air quality is most recently seen in the Urban Heat Watch campaign, involving citizen scientists who are helping collect data along the streets of Salt Lake Valley. As one of the top three urban heat islands in the nation, the Salt Lake City metropolitan area features a groundbreaking monitoring program; nowhere else in the world does an initiative exist at this density and scale.

UteAnalytics is just the latest deliverable for Ganguli, who has led approximately $13 million in projects as principal investigator. He is currently involved in several projects in five countries (U.S., Denmark/Greenland, Mongolia, Saudi Arabia and Mexico) on topics ranging from ML to training.

Meanwhile, graduate student Lewis Oduro, who defended his thesis this past spring, has since taken a job near Phoenix, Arizona as a mining engineer at Freeport-McMoRan, a leading international mining company. A native of Ghana, Oduro said of his mentor, "He gave me the chance to work under him and provided me with the kind of relationship only evident between a father and a son."

Under Ganguli's tutelage and support, Oduro was the principal player in building UteAnalytics as desktop software used for data analytics and building predictive ML models.

"I will forever be indebted to him and to the entire faculty at the University of Utah's Mining Engineering Department," the young scientist said on his LinkedIn page.


Advance your Career with the 3rd Best Online Masters in Data … – KDnuggets

Sponsored Post

Go beyond business analytics with Bay Path University's MS in Applied Data Science. Data science teams need general industry experts who understand data science and technical specialists who can make it happen. Bay Path University will provide you with a career path in data science, regardless of your background and experience. We were one of the first institutions to develop two tracks to complete the Master of Science (MS) in Applied Data Science degree. Which one is right for you?

Generalist Track - This track prepares students to be well-rounded, collaborative, and skilled data scientists and analysts regardless of their background or area of expertise. Coursework in this track provides the foundation needed for breaking into the fast-growing field of data science.

Specialist Track - This track prepares students to take on more technical roles on data science teams, such as data modeler, data mining engineer, or data warehouse architect.

Our MS in Applied Data Science Degree Program Provides:


The Rise of AI Data Mining: Challenges and Opportunities – Clayton County Register

The rise of artificial intelligence (AI) has brought about a new era of data mining, transforming the way businesses and organizations collect, analyze, and utilize vast amounts of information. As AI-powered data mining tools become increasingly sophisticated, they offer unprecedented opportunities for companies to gain insights, streamline operations, and drive innovation. However, this rapid growth also presents significant challenges, particularly in terms of privacy, security, and ethical considerations.

AI data mining refers to the process of using advanced algorithms and machine learning techniques to automatically extract valuable patterns and insights from large datasets. This approach has proven to be highly effective in various industries, including finance, healthcare, marketing, and manufacturing. For instance, AI-driven data mining can help banks detect fraudulent transactions, enable healthcare providers to predict patient outcomes, and allow retailers to optimize pricing strategies based on consumer behavior.
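As a minimal sketch of the fraud-detection example above, one of the simplest pattern-extraction techniques is to flag transactions whose amounts deviate strongly from the rest. This is only an illustration of the idea with invented data; real systems use far richer features and models than a z-score rule.

```python
# Toy anomaly-detection sketch in the spirit of the fraud-detection
# example: flag transactions whose amount is far from the mean.
# Data values are invented for illustration.
import statistics

def flag_anomalies(amounts, threshold=2.0):
    """Return indices of amounts more than `threshold` sample standard
    deviations from the mean (a simple z-score rule)."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [i for i, a in enumerate(amounts)
            if abs(a - mean) / stdev > threshold]

transactions = [12.5, 9.9, 14.2, 11.0, 10.7, 13.1, 950.0, 12.2]
print(flag_anomalies(transactions))  # flags the 950.0 outlier
```

The same principle, automatically surfacing records that do not fit the learned pattern of the data, scales up to the ML-based fraud and outcome-prediction systems described in the article.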

One of the key advantages of AI data mining is its ability to process and analyze massive amounts of data at a scale and speed that would be impossible for humans. This capability is particularly crucial in today's data-driven world, where the volume of information generated by digital devices and online platforms is growing exponentially. By leveraging AI-powered data mining tools, companies can quickly identify trends, correlations, and anomalies that may otherwise remain hidden, giving them a competitive edge in their respective markets.

Moreover, AI data mining can also help organizations become more efficient and cost-effective. By automating routine tasks and enabling data-driven decision-making, AI-driven data mining tools can significantly reduce the time and resources required for data analysis. This, in turn, allows businesses to focus on more strategic initiatives and drive innovation.

Despite these promising benefits, the rise of AI data mining also raises several challenges that need to be addressed. One of the most pressing concerns is the potential impact on privacy and data security. As AI-driven data mining tools become more adept at extracting information from various sources, there is a growing risk that sensitive personal data may be inadvertently exposed or misused. This issue has already sparked debates around the world, with regulators and policymakers grappling with the question of how to balance the benefits of AI data mining with the need to protect individual privacy.

Another challenge associated with AI data mining is the potential for biased or discriminatory outcomes. Since AI algorithms rely on historical data to make predictions and draw conclusions, they may inadvertently perpetuate existing biases and inequalities. For example, if a machine learning model is trained on a dataset that contains biased information, it may generate biased results that unfairly disadvantage certain groups or individuals. To mitigate this risk, it is essential for organizations to carefully evaluate their data sources and ensure that their AI-driven data mining tools are designed to be as fair and unbiased as possible.
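One simple, concrete way to check for the kind of biased outcome described above is to compare the rate of positive model decisions across groups (often called demographic parity). The sketch below uses invented predictions and group labels purely to illustrate the check, not any particular organization's fairness methodology.

```python
# Sketch of a basic fairness check: compare positive-outcome rates
# across groups (demographic parity). Data is invented for illustration.

def positive_rates(predictions, groups):
    """Fraction of positive (1) predictions per group."""
    rates = {}
    for g in set(groups):
        preds = [p for p, gg in zip(predictions, groups) if gg == g]
        rates[g] = sum(preds) / len(preds)
    return rates

predictions = [1, 0, 1, 1, 0, 0, 0, 1, 0, 0]
groups =      ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
rates = positive_rates(predictions, groups)
gap = abs(rates["A"] - rates["B"])
print(rates, f"parity gap = {gap:.1f}")  # a large gap warrants investigation
```

A large parity gap does not by itself prove discrimination, but it is the kind of signal that should trigger a closer audit of the training data and model, as the article recommends.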

Finally, the rise of AI data mining also raises ethical questions about the appropriate use of these powerful tools. As AI-driven data mining becomes more pervasive, there is a risk that it may be used for malicious purposes or to manipulate public opinion. To address these concerns, it is crucial for governments, businesses, and civil society to engage in open and transparent discussions about the ethical implications of AI data mining and to develop guidelines and best practices that promote responsible use.

In conclusion, the rise of AI data mining presents both significant opportunities and challenges for businesses and society as a whole. By harnessing the power of AI-driven data mining tools, companies can unlock valuable insights, improve efficiency, and drive innovation. However, it is also essential to address the privacy, security, and ethical concerns associated with this rapidly evolving technology. By striking the right balance, we can ensure that the benefits of AI data mining are realized while minimizing the potential risks.


Prediction of stability coefficient of open-pit mine slope based on … – Nature.com

Side stability of open pit mine

When developing open-pit mines, the limitation of rock slope safety is the main factor affecting mine production efficiency. The proportion of open-pit mining in China is very large; the mining of iron ore and fossil raw materials is almost always carried out in open pits. During the mining process, the safety of the slope body is paramount. Therefore, it is desirable to increase the final slope angle while still ensuring the stability of the slope. This creates a sharp contradiction in the mining process: the larger the final slope angle, the more unstable the slope becomes. If this problem is not handled properly, it will seriously affect the safe production of the mine and its economic benefits. Open-pit mining is also very likely to make the surrounding environment unsafe. Therefore, mining must ensure safety without reducing the mining speed, while minimizing the impact on the surrounding environment and achieving an economical and efficient stripping ratio [10]. The mining of an open pit mine is shown in Fig. 1.

Mining of an open pit mine.

As shown in Fig. 1, professional tools such as excavators need to be used in the mining process [11], and a first-order slope is formed. Open-pit mines are still relatively common in China, with many mines in Inner Mongolia and Xinjiang. These mines have very serious stability problems, and many large landslides have occurred, causing great economic losses to the mines. So the issue of stability has always been essential. At the same time, shallow resources are constantly decreasing as they are developed, so new technologies have been created to exploit deep resources. However, these technologies have a very large impact on the entire mine rock formation; the environment is also damaged, and safety and stability are reduced. Although many scholars have improved mining stability in different ways, the stability of each mine is an independent quantity with its own characteristics. Therefore, the prediction method must be chosen according to the on-site assessment of the mine, and further research is needed to find better ways to mine [12].

If the stability of the slope is poor, a landslide disaster can easily occur, which is a very serious problem [13] that threatens people's property and lives. Landslides generally result from the destruction of rock slopes. Rock avalanches and rockslides are the main types of rock slope damage, as shown in Fig. 2:

As shown in Fig. 2, there are two main types of landslides [14]. The first is the rock avalanche. It mostly happens on very steep slopes, where the rock breaks apart in chunks and then collapses, tumbling forward. The rock body at the top is often detached and falls off due to some factor, accumulating at the foot of the slope. These situations often occur where there are cracks on the top of the slope. Cracks are created by weathering of rock over time, or by the intrusion of rainwater and prolonged soaking; changes in temperature, whether high heat or shading, may also loosen the rock. The protective measure generally taken by experts is to use artificially reinforced building materials, that is, anchor cables, so that the impact force of rock mass collapse and sliding can be minimized. The second is the rockslide, a phenomenon in which the rock mass slides along a certain surface [15]. The main cause of rock slip is excessive rainfall. After surface water seeps into the cracks, it generates hydrostatic pressure, which promotes the sliding of the soil slope and is detrimental to its stability. Due to the infiltration of rainwater, the rise of river water levels, or the impoundment of reservoirs, the groundwater level rises, causing static water pressure to act on the impermeable structural surface of the slope. This pressure acts perpendicular to the structural surface, weakening the normal stress generated by the weight of the sliding mass on the surface and thereby reducing the anti-sliding resistance of the soil. There are several types of rockslides, which will not be introduced one by one here. Generally speaking, rock sliding is plane sliding: when the inclination angle of the sliding surface is greater than the internal friction angle, the rock is more prone to plane sliding.
Two conditions need to be satisfied for plane sliding of the slope rock mass: the resistance on both sides and the resistance at the bottom must both be overcome. In soft rock, when the bottom inclination angle of the slope rock mass in an open-pit mine is much larger than the internal friction angle of the rock mass, the lateral restraint cannot provide enough force to prevent failure, and the rock will detach from the slope rock mass to produce plane sliding. In a hard rock slope, only when the discontinuous surface of the slope rock mass crosses the top of the slope, and the rock on the slope is separated from the rock on both sides, may the slope rock mass without lateral restraint also slide in a plane16.
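The plane-sliding condition described above (sliding becomes possible once the dip angle of the sliding surface exceeds the internal friction angle) can be illustrated with a standard limit-equilibrium factor-of-safety calculation. The function below is an illustrative sketch, not code from the paper; the block weight, angles, and cohesion values are assumed examples.

```python
import math

def factor_of_safety(weight, slope_angle_deg, friction_angle_deg,
                     cohesion=0.0, contact_area=0.0):
    """Limit-equilibrium factor of safety for planar sliding of a block.

    FS = resisting force / driving force
       = (c*A + W*cos(theta)*tan(phi)) / (W*sin(theta))

    Sliding is possible when FS < 1; for a cohesionless surface (c = 0)
    this reduces to the condition theta > phi stated in the text.
    """
    theta = math.radians(slope_angle_deg)
    phi = math.radians(friction_angle_deg)
    driving = weight * math.sin(theta)
    resisting = cohesion * contact_area + weight * math.cos(theta) * math.tan(phi)
    return resisting / driving

# Cohesionless example: a 40-degree sliding surface with a 30-degree
# friction angle gives FS < 1 (unstable); a 20-degree surface gives FS > 1.
print(factor_of_safety(weight=1000.0, slope_angle_deg=40, friction_angle_deg=30))
print(factor_of_safety(weight=1000.0, slope_angle_deg=20, friction_angle_deg=30))
```

In practice, cohesion, water pressure on the sliding surface, and the lateral restraints mentioned above would all enter the resisting and driving terms; this sketch keeps only the dip-versus-friction comparison.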

Rock mass characteristics are another tool for classifying slopes, especially in mines. SMR (slope mass rating) is the most common classification scheme and is often used by different researchers to analyze the stability of cut slopes in different mines. Slope quality rating is the main tool for understanding the rock mass behavior of open-pit mine slopes. As depth and slope angle increase, slope quality rating always reveals serious problems, and due to various geological complexities, stability issues become more severe. The stability analysis of the moving slope was conducted using the stereonet diagram, a simple tool for analyzing wedge failure in planar and rock slopes. The structural data are plotted geometrically to establish the failure probability on the equal-area net17. There are also many ways to control a slope; generally, there are three methods, as shown in Fig.3.

Methods of treating slopes.

Fig.3 clearly shows these three methods of slope management18. The first method is to excavate and backfill: many rock masses with poor stability lie near the upper part of the slope, and these can be dug out, transported to the foot of the slope, and compacted there, which effectively enhances stability. However, because the traction between rock masses is still very strong, only the rock masses with poor stability can be dug out. The second method is drainage, because rain is an essential factor affecting stability. The accumulation of rainwater promotes slippage along the cracks on the rock surface, resulting in landslides; especially in the treatment of high, steep, and large slopes, drainage is particularly important. The third method is to use artificial structures for reinforcement. Anchor cables are generally used for protection and reinforcement, and there are also retaining walls and anti-slide piles. All three methods work well, and when controlling slopes they can be used in combination to achieve better results19.

Moreover, both the slope material and the slope geometry are important. Classified by stratigraphic lithology, slopes can be divided into soil slopes and rock slopes. (a) According to the rock structure, slopes are divided into layered-structure, block-structure, and network-structure slopes; (b) according to the relationship between the strata inclination and the slope direction, they can be divided into forward slopes, reverse slopes, and vertical slopes. All slope instability involves the failure of slope rock and soil under shear stress. Therefore, the factors that affect the shear stress and the shear strength of rock and soil all affect the stability of the slope.

Deep learning is a kind of machine learning method whose predecessors are machine learning and artificial neural networks20. With the passage of time, the method has been continuously developed and optimized, and its fields of application are very wide, as shown in Fig.4.

Application areas of deep learning.

As shown in Fig.4, this method has many application fields; the author lists nine areas in total. First, in the field of computer vision, this method helps computers process image data and recognize text, converting these images or text in a way that is very intelligent and convenient21. In the field of speech recognition, with the support of this algorithm, the efficiency of speech recognition has been greatly improved: just as with image data in computer vision, the method can turn sounds into recognizable models very quickly. In the field of audio recognition, this method likewise improves recognition efficiency. In social network filtering, the components of the network are very messy and carry all kinds of information, but this method is very good at information classification, so it is well suited to filtering social networks. In machine translation, using this method improves translation quality and brings machine translation closer to the level of an ordinary human translator. In drug design, this method can assist the development of small-molecule drugs, provide new computational decisions for pharmaceutical work, and process more chemical data. In bioinformatics, the method can bring new changes to the discipline: because it is so good at mining data, it is well suited to solving genomics problems. In the field of medical image analysis, applying this method allows a fast and very detailed analysis of medical images, because it has already achieved good results in image segmentation22.

The deep learning method not only has a wide range of applications, but also has many advantages, as shown in Fig.5.

Advantages of deep learning.

As shown in Fig.5, its first advantage is high versatility. The data we deal with are generally multidimensional ordered data, and owing to the rapid development of big data, deep learning has been applied very well in various fields23. In addition to speech recognition and image classification, it also performs very well in data mining, data processing, and data prediction, so its versatility is high. The second advantage is robustness, which means the method is smarter and more stable24: it can automatically adjust its parameters and adapt to changes in the data. The third advantage is good generalization: after the data are increased, it still generalizes well, and its performance is not weakened but enhanced. The fourth advantage is scalability. When too many neural network layers are stacked, gradients tend to vanish or explode, and this method solves that problem very well; it also scales well in the number of layers and structural parameters, which can be freely combined to achieve a better learning effect25.

In addition, the method generalizes to neural network structures trained in different fields and can achieve a good training effect even when data are insufficient. We can compare the performance of machine learning and deep learning during training, as shown in Fig.6.

The relationship between the amount of training data and training performance.

As shown in Fig.6, it is obvious that earlier machine learning methods have too few parameters, so when the training data increase, their generalization ability declines26. With the method proposed in this paper, generalization improves as the training data increase. This shows that the proposed method not only has good stability in data processing but also handles data very well regardless of the data volume.

Since deep learning is widely used, its frameworks have also been introduced by scholars. The code of these frameworks is very concise, the supported language types are quite rich, the technical documentation is complete, and maintenance and operation are in good condition. Five of the more popular frameworks are listed below, as shown in Table 1.

As shown in Table 1, there are actually many mainstream frameworks, but for the convenience of analysis, the author lists the five most popular ones. The first framework is TensorFlow, whose core code is written in C++. It is generally used to deal with multidimensional vectors, and it provides visualization tools that can fully display the structure and data flow of the neural network. It contains many mainstream algorithms, and the overall design process is comprehensive, which makes it very suitable for prediction in industry. The second framework is Caffe, in which each neural network layer is defined; after ensuring that the layers dock normally, building a network is just a matter of stacking them, and a model can participate in training as soon as it is defined, with very good training performance. The third framework is Torch, whose popularity is mainly due to the support of Facebook. This framework supports a great deal of scientific computing and is generally the first choice for scientific research in academia. The fourth is CNTK, introduced by Microsoft. It also has many features: the network structure is very fine, the code is product-grade, and it can be trained on a variety of hardware. The fifth framework is Keras, whose components are highly encapsulated. It is generally used by beginners and is relatively quick to get started with; after understanding the principles, one can readily build an initial network.

(1) Recurrent neural network (RNN).

It is a neural network designed mainly to process sequence data.

Assuming the time is Y, the model output can be obtained as:

$${P}_{Y}=B\cdot {J}_{Y}+{N}_{P}$$

(1)

When the model predicts the output value at time Y, we analyze the loss function; backpropagation starts from the final loss value. During backpropagation, R represents the cost function, and the defined objective function is:

$$R=V\sum_{Y=1}^{Y}{\Vert {A}_{Y}-{U}_{Y}\Vert }^{2}=V\sum_{Y=1}^{Y}\sum_{K=1}^{A}{\left({A}_{Y}\left(K\right)-{U}_{Y}\left(K\right)\right)}^{2}$$

(2)

The weight E can be updated so that the cost function becomes smaller:

$${E}^{NEW}=E-\rho \frac{\vartheta R}{\vartheta E}$$

(3)

Here, $\rho$ represents the learning rate, which controls the speed of parameter updates; if it is not properly controlled, the optimization speed will not keep pace. To calculate the gradient, the errors can be formulated as:
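The update rule of Eq. (3) can be sketched on a toy cost function. The quadratic cost and the learning-rate value below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the update rule in Eq. (3): E_new = E - rho * dR/dE.
# The quadratic cost R(E) = (E - 3)^2 stands in for the network loss;
# it and the learning rate rho = 0.1 are illustrative choices.

def dR_dE(E):
    # gradient of R(E) = (E - 3)^2
    return 2.0 * (E - 3.0)

E, rho = 0.0, 0.1
for _ in range(100):
    E = E - rho * dR_dE(E)   # Eq. (3)

print(E)  # converges toward the minimizer E = 3
```

If rho is too large the iterates overshoot and diverge, and if it is too small convergence is slow, which is the control problem the text refers to.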

$${\varepsilon }_{Y}^{U}\left(K\right)=-\frac{\vartheta R}{\vartheta {B}_{Y}\left(K\right)}$$

(4)

$${\varepsilon }_{Y}^{J}\left(K\right)=-\frac{\vartheta R}{\vartheta {I}_{Y}\left(K\right)}$$

(5)

Then recursively calculate them, and the new formula can be obtained as:

$${U}_{Y}=H\left({E}_{JU}G\left({E}_{CJ}{C}_{Y}+{E}_{JJ}{J}_{Y-1}\right)\right)$$

(6)

$$R=V\sum_{Y=1}^{Y}{\Vert {A}_{Y}-{U}_{Y}\Vert }^{2}=V\sum_{Y=1}^{Y}\sum_{K=1}^{A}{\left({A}_{Y}\left(K\right)-{U}_{Y}\left(K\right)\right)}^{2}$$

(7)

Y represents the last time point, at which the hidden layer can be expressed as:

$${\varepsilon }_{Y}^{J}\left(K\right)=-\left(\sum_{O=1}^{A}\frac{\vartheta R}{\vartheta {B}_{Y}\left(O\right)}\frac{\vartheta {B}_{Y}\left(O\right)}{\vartheta {J}_{Y}\left(O\right)}\frac{\vartheta {J}_{Y}\left(K\right)}{\vartheta {I}_{Y}\left(K\right)}\right)$$

(8)

By derivation of this formula, the error formula at other time points can be obtained as:

$${\varepsilon }_{Y}^{J}\left(K\right)=\left({O}_{Y}\left(K\right)-{U}_{Y}\left(K\right)\right){H}^{\prime}\left({B}_{Y}\left(K\right)\right)$$

(9)

The error formulas in the output layer and hidden layer are:

$${\varepsilon }_{Y}^{J}\left(K\right)=\left[\sum_{O=1}^{M}{\varepsilon }_{Y+1}^{J}\left(O\right){E}_{JJ}\left(O,K\right)+\sum_{O=1}^{L}{\varepsilon }_{Y}^{U}\left(O\right){E}_{JU}\left(O,K\right)\right]{G}^{\prime}\left({I}_{Y}\left(K\right)\right)$$

(10)

$${\varepsilon }_{Y}^{J}=\left[{E}_{JJ}^{Y}{\varepsilon }_{Y+1}^{J}+{E}_{JU}^{Y}{\varepsilon }_{Y}^{U}\right]\cdot {G}^{\prime}\left({I}_{Y}\right)$$

(11)

Here, ${\varepsilon }_{Y}^{J}$ represents the error at time point Y, and ${\varepsilon }_{Y+1}^{J}$ represents the error at time Y+1.

This way, the weights of the output layer can be updated as:

$${E}_{JU}^{NEW}\left(O,K\right)={E}_{JU}\left(O,K\right)-\beta \sum_{Y=1}^{Y}{\varepsilon }_{Y}^{U}\left(O\right){J}_{Y}\left(K\right)$$

(12)

The weights of the input layer can be updated as:

$${E}_{CJ}^{NEW}\left(O,K\right)={E}_{CJ}\left(O,K\right)-\beta \sum_{Y=1}^{Y}{\varepsilon }_{Y}^{J}\left(O\right){C}_{Y}\left(K\right)$$

(13)

The weights of the recurrent layer can be updated as:

$${E}_{JJ}^{NEW}\left(O,K\right)={E}_{JJ}\left(O,K\right)-\beta \sum_{Y=1}^{Y}{\varepsilon }_{Y}^{J}\left(O\right){J}_{Y-1}\left(K\right)$$

(14)
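Eqs. (1) through (14) together describe a forward pass over the sequence followed by backpropagation through time and the three weight updates. The NumPy loop below is a minimal sketch of that procedure: the dimensions, random data, tanh/identity activations, and learning rate are all illustrative choices (constant factors are absorbed into the learning rate), not values from the paper.

```python
import numpy as np

# Toy RNN matching the text's notation: hidden state
# J_Y = G(E_CJ C_Y + E_JJ J_{Y-1}), output U_Y = E_JU J_Y,
# trained by backpropagation through time on a sum-of-squares cost.
rng = np.random.default_rng(0)
n_in, n_hid, n_out, T = 2, 4, 1, 5
E_CJ = rng.normal(0, 0.5, (n_hid, n_in))    # input weights
E_JJ = rng.normal(0, 0.5, (n_hid, n_hid))   # recurrent weights
E_JU = rng.normal(0, 0.5, (n_out, n_hid))   # output weights

C = rng.normal(size=(T, n_in))              # inputs C_1..C_T (illustrative)
A = rng.normal(size=(T, n_out))             # targets A_1..A_T (illustrative)
beta = 0.02                                 # learning rate

losses = []
for _ in range(200):
    # forward pass, Eqs. (1)/(6)
    J = np.zeros((T + 1, n_hid))            # J[0] is the initial state
    U = np.zeros((T, n_out))
    for t in range(T):
        J[t + 1] = np.tanh(E_CJ @ C[t] + E_JJ @ J[t])
        U[t] = E_JU @ J[t + 1]
    losses.append(float(np.sum((U - A) ** 2)))   # cost R, Eq. (2)

    # backward pass (BPTT), errors in the spirit of Eqs. (9)-(11)
    dE_CJ = np.zeros_like(E_CJ)
    dE_JJ = np.zeros_like(E_JJ)
    dE_JU = np.zeros_like(E_JU)
    eps_next = np.zeros(n_hid)              # error propagated from time t+1
    for t in reversed(range(T)):
        eps_U = U[t] - A[t]                 # output-layer error
        eps_J = (E_JJ.T @ eps_next + E_JU.T @ eps_U) * (1 - J[t + 1] ** 2)
        dE_JU += np.outer(eps_U, J[t + 1])
        dE_CJ += np.outer(eps_J, C[t])
        dE_JJ += np.outer(eps_J, J[t])
        eps_next = eps_J

    # weight updates, Eqs. (12)-(14)
    E_JU -= beta * dE_JU
    E_CJ -= beta * dE_CJ
    E_JJ -= beta * dE_JJ

print(losses[0], losses[-1])  # the cost decreases over training
```

The error for the hidden layer at each step combines the contribution flowing back from the next time step with the contribution from the current output, exactly the two bracketed terms of Eq. (11).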

(2) Long short-term memory neural network (LSTM)

Although the LSTM is similar in structure to the RNN, it adds a cell state that changes according to how long information has existed, and this cell state is a long-term memory27. The LSTM algorithm is often used for operations such as predicting various kinds of data or recognizing images. Its forget gate determines which of the knowledge already learned is useful and which part to discard; the input gate determines whether newly received knowledge is useful and which of it to accept; and the output gate integrates the current knowledge to decide what to report28. The theory and learning process of the LSTM are valuable, and it has good problem-solving ability for practical problems.

Suppose ${P}_{Y}$ is the parameter of the output gate and ${D}_{Y}$ is the output of the hidden layer. The formula of the input gate can be obtained as:

$${O}_{Y}=\tau \left({E}_{CO}{C}_{Y}+{E}_{DO}{D}_{Y-1}+{E}_{VO}{V}_{Y-1}+{N}_{O}\right)$$

(15)

$${D}_{Y}={P}_{Y}\tanh\left({V}_{Y}\right)$$

(16)

At this time, if you want to update the cell state, the formula can be expressed as:

$${V}_{Y}={G}_{Y}{V}_{Y-1}+{O}_{Y}\overline{{V}_{Y}}$$

(17)

$$\overline{{V}_{Y}}=\tau \left({E}_{O}\cdot \left[{J}_{Y-1},{C}_{Y}\right]+{N}_{O}\right)$$

(18)

The final output value can be obtained by calculating from the state at the current time:

$${P}_{Y}=\tau \left({E}_{CP}{C}_{Y}+{E}_{DP}{D}_{Y-1}+{N}_{P}\right)$$

(19)

$${D}_{Y}={P}_{Y}\tanh\left({V}_{Y}\right)$$

(20)
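A single step of the gate equations (15) through (20) can be sketched in NumPy using the text's symbols: G is the forget gate, O the input gate, V̄ the candidate state, V the cell state, P the output gate, and D the hidden output. The sizes and random weights are illustrative, and the candidate state uses tanh, as in the standard LSTM formulation, where Eq. (18) writes τ.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid = 3, 4

def tau(x):
    # logistic sigmoid, the gate activation the text calls tau
    return 1.0 / (1.0 + np.exp(-x))

# one illustrative weight matrix and bias per gate,
# each acting on the concatenation [D_{Y-1}, C_Y]
W = {g: rng.normal(0, 0.5, (n_hid, n_hid + n_in)) for g in "GOVP"}
b = {g: np.zeros(n_hid) for g in "GOVP"}

def lstm_step(C_Y, D_prev, V_prev):
    z = np.concatenate([D_prev, C_Y])
    G = tau(W["G"] @ z + b["G"])           # forget gate
    O = tau(W["O"] @ z + b["O"])           # input gate, Eq. (15)
    V_bar = np.tanh(W["V"] @ z + b["V"])   # candidate state, cf. Eq. (18)
    V = G * V_prev + O * V_bar             # cell-state update, Eq. (17)
    P = tau(W["P"] @ z + b["P"])           # output gate, Eq. (19)
    D = P * np.tanh(V)                     # hidden output, Eqs. (16)/(20)
    return D, V

D, V = np.zeros(n_hid), np.zeros(n_hid)
for C_Y in rng.normal(size=(6, n_in)):     # run 6 time steps
    D, V = lstm_step(C_Y, D, V)
print(D.shape, V.shape)
```

Because every gate output lies in (0, 1) and tanh lies in (-1, 1), the hidden output D is always bounded in magnitude by 1, which is part of what keeps LSTM training stable over long sequences.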

Continued here:

Prediction of stability coefficient of open-pit mine slope based on ... - Nature.com


The Role of Robotic Vision in Shaping Global Internet Technologies – Fagen wasanni

Exploring the Impact of Robotic Vision on the Evolution of Global Internet Technologies

The role of robotic vision in shaping global internet technologies is a fascinating and rapidly evolving field. As we delve into the impact of this technology, it becomes clear that robotic vision is not just a futuristic concept, but a reality that is transforming the digital landscape.

Robotic vision, a technology that enables machines to see and interpret the world around them, is revolutionizing the way we interact with the internet. It is a key component in the development of autonomous systems, such as self-driving cars and drones, which rely on the ability to perceive and understand their environment to operate safely and efficiently.

The integration of robotic vision into these systems is made possible through advancements in artificial intelligence (AI) and machine learning. These technologies allow machines to process and analyze visual data, enabling them to make decisions based on what they see. This is a significant leap forward in the evolution of the internet, as it opens up new possibilities for automation and efficiency.

One of the most significant impacts of robotic vision on global internet technologies is in the realm of data collection and analysis. With the ability to see and interpret the world, robots can gather vast amounts of visual data. This data can then be analyzed using AI and machine learning algorithms to extract valuable insights. This process, known as data mining, is becoming increasingly important in a range of industries, from healthcare to retail.

For instance, in the healthcare sector, robotic vision is being used to analyze medical images, such as X-rays and MRIs, to detect diseases at an early stage. In retail, it is being used to track customer behavior and preferences, enabling businesses to offer personalized shopping experiences.

Moreover, the advent of robotic vision is also driving the development of new internet technologies. One such technology is the Internet of Things (IoT), a network of interconnected devices that communicate and exchange data with each other. Robotic vision plays a crucial role in the IoT, as it allows devices to perceive their environment and interact with it in a meaningful way.

For example, a smart home system equipped with robotic vision can monitor its surroundings and adjust its settings based on what it sees. If it detects that it's getting dark outside, it can automatically turn on the lights. If it sees that no one is home, it can lower the thermostat to save energy.

In conclusion, the role of robotic vision in shaping global internet technologies is profound. It is driving the development of autonomous systems, revolutionizing data collection and analysis, and paving the way for new technologies like the IoT. As we continue to explore the potential of this technology, we can expect to see even more exciting advancements in the digital landscape. The future of the internet, it seems, is not just about connecting people and information, but also about giving machines the ability to see and understand the world around them.


A New Approach to Space Weather Forecasting: The Power of … – Fagen wasanni

A new approach to space weather forecasting is being developed, utilizing predictive analytics to improve accuracy and provide timely predictions of potentially harmful space weather events. Space weather refers to the dynamic conditions in Earth's outer space environment, influenced by the Sun's activity, such as solar flares and coronal mass ejections (CMEs), which can impact satellite communications, navigation systems, and power grids.

Previously, space weather forecasting relied on observations of the Sun, solar wind measurements, and Earth's magnetic field. However, these methods had limitations in accuracy and lead time, making it difficult to provide reliable forecasts.

Predictive analytics, a branch of advanced analytics, utilizes data mining, machine learning, and artificial intelligence techniques to predict future events. In space weather forecasting, this involves analyzing historical and real-time data from satellites and observatories.

The power of predictive analytics lies in its ability to quickly and efficiently analyze vast amounts of data. With numerous satellites and observatories monitoring the Sun and Earth's space environment, processing this data is crucial. Advanced algorithms and machine learning techniques assist forecasters in identifying relevant information for accurate predictions.

Predictive analytics also improves over time through machine learning. As more data is collected and analyzed, the algorithms can be refined, enhancing accuracy and reliability. This iterative process enables predictive analytics to become more effective in forecasting space weather events.

The use of predictive analytics in space weather forecasting offers significant benefits. Improved accuracy enables stakeholders, such as satellite operators and power grid managers, to take proactive measures in protecting their systems from space weather effects. This minimizes disruptions to communications, navigation, and power systems, and reduces the risk of satellite damage.

In conclusion, predictive analytics represents a new frontier in space weather forecasting, providing enhanced accuracy and timely predictions. By employing advanced data mining techniques and sophisticated algorithms, predictive analytics transforms our understanding of the cosmos and its impact on daily life. The potential benefits are clear, paving the way for increased space weather preparedness and resilience.
