
Moon mining and satellite collisions make list of DoD concerns in space – Federal News Network

Outer space is becoming increasingly militarized as China, Russia, the United States and other countries continue to vie for dominance in the domain and even consider mining off-planet assets.

The Defense Department identified its two biggest competitors, along with increasing congestion in the area just outside the Earth's atmosphere, as some of the largest threats to the United States' space interests.

"China and Russia value superiority in space, and as a result, they'll seek ways to strengthen their space and counterspace programs and determine better ways to integrate them within their respective militaries," Kevin Ryder, Defense Intelligence Agency senior analyst for space and counterspace, said Tuesday at the Pentagon. "Both nations seek to broaden their space exploration initiatives together and individually with plans to explore the moon and Mars during the next 30 years. If successful, these efforts will likely lead to attempts by Beijing and Moscow to exploit the moon's natural resources."

In a new report from the DIA, the organization found that since 2019 competitor space operations have increased in pace and scope across nearly all major categories including communications, remote sensing, navigation, and science and technology demonstration.

Ryder said that China and Russia intend to undercut the United States and its allies in space.

The report states the two nations increased their number of satellites around the Earth by 70% in the last two years.

Other advancements include China landing a rover on Mars and a robotic spacecraft on the far side of the moon.

"What we've seen so far has been more civilian in nature," Ryder said. "However, China emphasizes in their writings civil-military integration and dual-use purpose space capabilities. While we do understand that right now it is civil in nature, we continue to monitor for any possibility of military activity."

It's not just competition that DIA is outlining as a threat to U.S. space efforts. The intelligence agency noted that the probability of collisions of massive derelict objects in low Earth orbit is growing and will continue to grow through at least 2030.

"As of January 2022, more than 25,000 objects of at least 10 centimeters in size were tracked and cataloged in Earth's orbit, including active satellites," the report states. The primary risk to spacecraft in orbit is from uncataloged lethal nontrackable debris (LNT), objects between 5 millimeters and 10 centimeters in size. An estimated 600,000 to 900,000 pieces of uncataloged LNT are in low Earth orbit.

Looking to the future, the U.S. is now considering deep space operations and the challenges they will present for tracking and monitoring spacecraft.

DoD has outlined space as a crucial domain for the United States, and the Pentagon is increasing its investments in space capabilities. The 2023 budget request asks for $27.6 billion for space capabilities, command and control and resilient architectures.

That's not to mention that Congress and the department set up the Space Force in the last couple of years, a military branch solely focused on operations outside of Earth.

The Biden administration is asking for the largest ever Space Force budget next year at $24.5 billion.

Read the original post:

Moon mining and satellite collisions make list of DoD concerns in space - Federal News Network


Central Michigan University professor receives Professor of the Year award – The Morning Sun

Central Michigan University Professor Carl Lee has been recognized as one of the state's three recipients of the Michigan Distinguished Professor of the Year award.

This award recognizes the outstanding contributions and dedication exhibited by the faculty from Michigan's 15 public universities to the education of undergraduate students.

The Academic Affairs Officers committee of the Michigan Association of State Universities will recognize the nominees and recipients of this annual award on April 15 during a luncheon at the Lansing Center. The other two winners are Michigan State University Professor Vashti Sawtelle and Wayne State University Professor Sandra Gonzales.

Supporting and preparing future educators

"Professor Lee is among the award nominees who continue to bring new scholarship and innovation in teaching and learning to Michigan's public universities," said Dr. Daniel J. Hurley, CEO of the Michigan Association of State Universities. "These professors have the highest dedication to their students, ensuring that they are well prepared to make a meaningful impact in their careers and in their communities."

Dr. Carl Lee is Founding Chair and Professor of the Department of Statistics, Actuarial and Data Sciences at Central Michigan University, where he has taught for more than 38 years, developing undergraduate and graduate programs and teaching courses ranging from the introductory to the Ph.D. level. Dr. Lee earned his B.S. in agronomy from National Taiwan University, his M.A. in mathematics at West Florida State University and his Ph.D. in statistics at Iowa State University.

He is a Fellow of the American Statistical Association (ASA) and a recipient of the Haimo Distinguished Teaching of Mathematics Award from the Mathematical Association of America (MAA), the Distinguished Teaching Award from the Michigan Section of the MAA, CMU's University Distinguished Service Award, and numerous other honors.

Dr. Lee's dedication to undergraduate education goes beyond his classroom rapport with students. He has consistently been innovative in his approach to pedagogy, emphasizing projects, hands-on activities, cooperative learning, and exercises, or what he calls a PACE model. He has authored an extensive array of scholarly papers on the topic of teaching and learning in the field of statistics. Because statistics is considered by many students to be among the most difficult subjects, Dr. Lee has been motivated to conduct research investigating how students learn quantitative concepts, the misconceptions and difficulties they encounter, and the effect of technology on learning. Among his 150 publications, 42 papers are associated with teaching and student learning.


Dr. Lee has also made substantial contributions to the teaching of statistics at CMU, developing statistics programs at the undergraduate and graduate levels, and nationally through his involvement with ASA's Undergraduate Statistics Education Initiatives and the National Consortium for the Advancement of Undergraduate Statistics Education. At CMU, he helped design the undergraduate statistics and actuarial science program, initiated and developed a graduate certificate program in data mining and the MS in Applied Statistics and Analytics, and completely revised the doctoral curriculum. For the past two years, Dr. Lee has worked tirelessly with colleagues from across the university to create the most interdisciplinary program on campus in data science.

The monumental task has involved coordinating a new degree and major with colleagues from four colleges and nine academic departments, continuing to inspire fellow faculty and administrators at CMU.

"Throughout his career, Dr. Carl Lee has consistently demonstrated his devotion to students and high-quality undergraduate education through his engaging classroom approach, innovative use of pedagogy, high-quality scholarship in teaching and learning of statistics, and his contributions to program development and curricular reform at CMU and in his professional field," said Richard Rothaus, Central Michigan University Interim Executive Vice President and Provost.

View original post here:

Central Michigan University professor receives Professor of the Year award - The Morning Sun


Overcoming the Challenges of Mining Big Data – Tahawul Tech

by Bill Scudder, Senior Vice President and General Manager, AIoT Solutions, AspenTech

Big Data remains of critical importance to every organisation operating today, no matter its size or location. Data volumes continue to rise: Big Data growth statistics from Statista indicate that data creation will exceed 180 zettabytes by 2025, about 118.8 zettabytes more than in 2020. Industrial organisations are among those most significantly impacted by this growth. In the Middle East, Turkey and Africa, IDC expects big data analytics spending to grow by 8% to a value of $3bn.

Making the most of data

The impact is a mixture of opportunity and challenge. On the one hand, Big Data has an important role to play in arming organisations with the resources and information they need to make data-driven decisions that improve business outcomes. When analysed properly, the benefits of Big Data can include optimised production, real-time visibility and enhanced decision-making, allowing teams to be more productive, effective and innovative. These benefits can make the difference in an organisation's success over its competitors.

Indeed, for capital-intensive industries such as manufacturing and industrial facilities, Big Data is essential to operations. It can help with predictive maintenance, so supervisors can schedule plant downtime to repair assets before unexpected, costly breakdowns occur; provide anomaly detection to alert workers to small deviations from quality norms; and give greater certainty around international supply chain management challenges.
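To make the anomaly-detection idea above concrete, here is a minimal sketch of one common approach: flag sensor readings that drift several standard deviations away from a rolling baseline of recent values. The column name, window size and threshold are illustrative assumptions, not details of any specific AspenTech product.

```python
import pandas as pd

def flag_anomalies(readings: pd.Series, window: int = 60, z_threshold: float = 3.0) -> pd.Series:
    """Mark readings that deviate strongly from a rolling baseline of prior values."""
    baseline = readings.shift(1).rolling(window, min_periods=window)  # baseline excludes the current point
    z = (readings - baseline.mean()) / baseline.std()
    return z.abs() > z_threshold

# Hypothetical hourly temperature readings from a single asset tag
temps = pd.Series([70.1, 70.3, 69.8, 70.0, 75.9], name="asset1.temperature")
print(flag_anomalies(temps, window=3))  # only the final 75.9 reading is flagged
```

In practice the same pattern is applied per tag across thousands of sensors, with the baseline window tuned to each asset's normal operating envelope.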

That's essentially the opportunity. However, the problem often lies in harnessing this data and then making use of it to advance the organisation's goals. With the Industrial Internet of Things (IIoT) and capital-intensive industries amassing more and more data, they are facing the growing challenge of being unable to manage or leverage it effectively.

Starting to address the challenge

More data isn't always better data. With the influx of data that organisations have received through digitalisation efforts, many have found themselves in the middle of a data swamp, with every piece of possible data included.

We are seeing signs of progress. COVID-19 undoubtedly accelerated the digitalisation of organisations across the world, down to the way they store and access their data. However, this transformation also revealed the limitations of the traditional model of data management, where data is siloed by teams, sources, and locations. This kind of data gatekeeping significantly hinders visibility, as only certain people with unique access or domain expertise can understand or even access data sets that may be relevant to others across the enterprise.

Industrial organisations must switch focus from mass data accumulation to strategic industrial data management, specifically homing in on data integration, data mobility and data accessibility across the organisation. They need to bring together data stored in various silos, often at a range of different facilities worldwide, with the goal of using AI-enabled technologies to unlock hidden value in previously unoptimised and undiscovered sets of industrial data. In effect, they need to start moving to an approach based on big data mining.

Debunking the myths

Before embarking on this approach, however, capital-intensive organisations need to counter the myths that have hampered the approach in the past. When it comes to Big Data mining specifically, one of the biggest challenges organisations face is operating under the assumption that there is a one-size-fits-all solution.

This is not true: organisations must continuously re-evaluate their workflows and processes for collecting, storing, optimising and presenting data to ensure they're reaping the greatest business value from it. This shows up in practice when thinking about auto-discovery, for example. Many IT leaders believe that there are tools that will auto-discover relevant information across all data, when in fact there is an age limit on what these tools can work with; data that is past a certain time frame is usually undetectable, for example.

Another challenge comes from a generational gap in operational expertise. Many organisations are having difficulty finding the right people who know where data is stored and what format it's in. This all circles back to making sure they have the correct data integration strategy in place; it is infinitely easier on employees when a set plan has been made and executed.

Executing on a strategy

Once organisations are fully aware of the myths of data mining and prepared to counter the challenges, they can start on the process of making more of their Big Data stores.

To help facilitate this change, enterprise IT teams should put a clear strategy in place to ensure that they're implementing all the proper tools they need to get adequate information from all sources. A big part of this strategy should be to implement a data historian. These tools have evolved, moving beyond standardised aggregations of process data to become the anchor technology for industrial data management strategies. In today's world, the data historian serves as a democratising force, making it possible for data to be accessed and acted on by anyone in the organisation.
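As a rough illustration of what a process data historian does at its core, the sketch below stores time-stamped samples per tag and answers time-range queries. The class and method names are assumptions for illustration only; production historians layer compression, interpolation, buffering and high-availability storage on top of this idea.

```python
from bisect import bisect_left, bisect_right
from collections import defaultdict
from datetime import datetime

class MiniHistorian:
    """Toy tag-based time-series store: append samples, query by time range."""

    def __init__(self):
        self._times = defaultdict(list)   # tag -> ordered timestamps
        self._values = defaultdict(list)  # tag -> values parallel to timestamps

    def append(self, tag: str, ts: datetime, value: float) -> None:
        # Samples are assumed to arrive in time order per tag.
        self._times[tag].append(ts)
        self._values[tag].append(value)

    def query(self, tag: str, start: datetime, end: datetime):
        """Return (timestamp, value) pairs for a tag within [start, end]."""
        times, values = self._times[tag], self._values[tag]
        lo, hi = bisect_left(times, start), bisect_right(times, end)
        return list(zip(times[lo:hi], values[lo:hi]))

h = MiniHistorian()
h.append("reactor1.temp", datetime(2022, 4, 11, 8, 0), 180.2)
h.append("reactor1.temp", datetime(2022, 4, 11, 8, 1), 181.0)
print(h.query("reactor1.temp", datetime(2022, 4, 11, 8, 0), datetime(2022, 4, 11, 8, 30)))
```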

Next, organisations need to start applying Industrial AI to make data more visible, accessible and actionable across the enterprise. A cloud-ready industrial architecture that connects the latest AI technologies with the IIoT can not only collect and transmit data but also turn it into intelligence that drives smarter decisions.

This emerging confluence of AI and IIoT offers capital-intensive businesses a range of benefits: they can create a sustainable advantage by analysing large volumes of industrial data for real-time reporting, automation and decision-making and then integrating it across assets, plants and sites. They can build on that advantage by visualising data to identify trends, outliers and patterns that drive mission-critical apps and actionable business workflows, and by collaborating across AI-powered apps to help achieve safety, sustainability and profitability goals.

The strategic data management approach that this combination of data historians and Industrial AI delivers also helps to bridge a growing skills gap. As veteran employees with years of expertise retire and are replaced by younger employees with much less experience, an AI-powered, data-driven approach ensures that critical historic knowledge is preserved and shared widely across the organisation, regardless of team, geographical location or silo.

A positive future approach

Big Data will continue to play a mission-critical role in arming industrial organisations with the resources and insights needed for making data-driven decisions tied to concrete business value outcomes.

This could be about helping with predictive maintenance so supervisors can schedule plant downtime to repair assets before unexpected, costly breakdowns occur, providing anomaly detection to alert workers to small deviations from quality norms, and giving greater certainty around supply chain management challenges.

It could also mean anything from optimising production lines to providing real-time process visibility, all to help teams become more productive, effective, and innovative. But to reap the most value from Big Data and apply it meaningfully to industrial applications, capital-intensive businesses must switch their focus from mass data accumulation to more thoughtful, strategic industrial data management, homing in on data integration, mobility, and accessibility across the organisation. By deploying technologies like next-gen data historians and Industrial AI, these businesses can unlock new, hidden value from previously unoptimised and undiscovered sets of industrial data.

More here:

Overcoming the Challenges of Mining Big Data - Tahawul Tech


Read This Engaging Story Of How The Computer Has Been Eating The World For 75 Years – Forbes

A New History of Modern Computing by Thomas Haigh and Paul E. Ceruzzi is a must-read for investors, entrepreneurs, executives, and anyone interested in understanding the technology that is embedded in the lives of most of the world's population.

A New History of Modern Computing by Thomas Haigh and Paul E. Ceruzzi

Haigh and Ceruzzi tackled the challenge of writing a definitive, comprehensive history of an ever-changing technology, by breaking the seventy-five years (1945-2020) of modern computing into about fifteen distinct themes, each focusing on a specific group of users and applications around which the computer is redefined with the addition of new capabilities. Along the way, they trace the transformation of computing from scientific calculations to administrative assistance to personal appliances to a communications medium, a constant reinvention that continues today.

"Computers made an astounding variety of other technologies vanish into itself," write Haigh and Ceruzzi. "We conceptualize this convergence of tasks on a single platform as a dissolving of those technologies, and in many cases, their business models, by a device that comes ever closer to the status of a universal technological solvent."

In Silicon Valley parlance, dissolving is disrupting. In the dominant tech zeitgeist (to some extent since the 1950s, without exception since the 1990s), each computer transformation is a revolution. Which is why history, and understanding the actual (factual) details of past transformations, is typically of no interest to the denizens of the next big thing.

Haigh and Ceruzzi deftly demonstrate why it is important to understand the evolution of computing, why knowing where you came from is a foundation of success, why tradition is a key to innovation. "Architectural advances pioneered by Cray supercomputers now help your phone to play Netflix video more effectively" is one example highlighting the remarkable continuity of computing, as opposed to the make-believe disruptive innovations. "Whenever the computer became a new thing, it did not stop being everything it had been before," Haigh and Ceruzzi sum up the real business of innovating while standing on the shoulders of giants.

Possibly reacting to the endless pronouncements that this or that new computing innovation is changing the world, Haigh and Ceruzzi remind us that the computer's influence on our lives has so far been less fundamental than that of industrial age technologies such as electric light or power, automobiles or antibiotics. Armed with this useful historical perspective, they have tried to give a reasonably comprehensive answer to a more tractable question: How did the world change the computer?

Numerous inventors, engineers, programmers, entrepreneurs and users have been responsible for the rapid and reliable change in the scale and scope of computing, not any inherent laws or some kind of inevitable, deterministic technology trajectory. In the process, they have changed the computer industry, what we mean by industry, and what we perceive as the essence of computing.

Just like the technology around which it has grown by leaps and bounds, the computer industry has gone through multiple transformations: from a handful of vertically integrated companies (primarily IBM and DEC); to a number of companies focusing on horizontal industry segments such as semiconductors, storage, networking, operating systems, and databases (primarily Intel, EMC, Cisco, Microsoft, and Oracle); to companies catering mostly to individual consumers (primarily Apple, Google, Facebook, and Amazon). To this latter group we may add Tesla, which Haigh and Ceruzzi discuss as a prime example of the convergence of computing and transportation. Just like computing technology, the ever-changing computer industry has not stopped being what it was earlier when it moved into a new phase of its life, preserving at least some elements of previous stages in its evolution.

Still, the new stages eventually dissolved the business models of the past, leading to today's reliance by many large and small computer companies on new (to the industry) sources of revenue such as advertising. Eating other industries, especially media businesses, brought on huge profits and, eventually, serious indigestion.

While swallowing other industries, the computer industry has also made the very term industry quite obsolete. The digitization of all analog devices and channels for the creation, communications, and consumption of information, spurred by the invention of the Web, shattered the previously rigid boundaries of economic sectors such as publishing, film, music, radio, and television. In 2007, 94% of storage capacity in the world was digital, a complete reversal from 1986, when 99.2% of all storage capacity was analog.

I would argue that the data resulting from the digitization of everything is the essence of computing, of why and how high-speed electronic calculators were invented seventy-five years ago and of their transformation over the years into a ubiquitous technology, embedded, for better or worse, in everything we do. This has been a journey from data processing to big data.

As Haigh and Ceruzzi write, early computers wasted much of their incredibly expensive time waiting for data to arrive from peripherals. This problem of latency, of efficient access to data, played a crucial role in the computing transformations of subsequent years, but it has been overshadowed by the dynamics of an industry driven by rapid and reliable advances in processing speeds. Responding (in the 1980s) to computer vendors telling their customers to upgrade to a new, faster processor, computer storage professionals wryly noted that "they are all waiting [for data] at the same speed."

The rapidly declining cost of computer memory (driven by the scale economies of personal computers) helped address latency issues in the 1990s, just as business executives started to use the data captured by their computer systems for more than accounting and other internal administrative processes. They stopped deleting the data, instead storing it for longer periods of time, and started sharing it among different business functions and with their suppliers and customers. Most important, they started analyzing the data to improve various business activities, customer relations, and decision-making. Data mining became the 1990s' new big thing, as the business challenge shifted from "how to get the data quickly?" to "how to make sense of the data?"

A bigger thing that decade, with much larger implications for data and its uses, and for the definition of computing, was the invention of the Web and the companies it begat. Having been born digital, living the online life, meant not only excelling in hardware and software development (and building their own clouds), but also innovating in the collection and analysis of the mountains of data produced by the online activities of millions of individuals and enterprises. Data has taken over from hardware and software as the center of everything computing, the lifeblood of tech companies. And increasingly, the lifeblood of any type of business.

In the last decade or so, the cutting edge of computing became big data and AI (more accurately labeled deep learning), the sophisticated statistical analysis of lots and lots of data, the merging of software development and data mining skills (data science).

As Haigh and Ceruzzi suggest, we can trace how the world has changed the computer rather than how computer technology has changed the world: for example, tracing the changes in how we describe what we do with computers, the prestige-chasing transformations from data processing to information technology (IT), from computer engineering to computer science, and from statistical analysis to data science. The computer, and its data, have brought many changes to our lives, but have not changed much of what drives us, what makes humans tick. Among many other things, it has not influenced at all, it could not have influenced at all, the all-consuming desire for prestige and status, whether as individuals or as nations.

See the article here:

Read This Engaging Story Of How The Computer Has Been Eating The World For 75 Years - Forbes


Medicare is cleaning up the FDA’s mess on Biogen’s Alzheimer’s drug – Kingsport Times News

Medicare has decided once and for all not to pay for Biogen's new Alzheimer's drug Aduhelm unless patients are enrolled in a clinical study.

The agency's final call was unsurprising, but blessedly rational. It corrects the Food and Drug Administration's mistake in letting Aduhelm onto the market. At the same time, it leaves room for future Alzheimer's drugs to be covered as long as studies show they are safe and effective.

This will encourage beneficial innovation in Alzheimer's drug development and ensure that patients get medicines that can truly help them.

The decision by the Centers for Medicare & Medicaid Services marks a turning point in Aduhelm's long and contentious journey. In 2014, the drug raised hopes among Alzheimer's doctors and patients when, in a small phase 2 trial, it appeared to clear amyloid plaques in patients' brains and, in a first for the field, ease their cognitive decline. Biogen promptly began a large, expensive phase 3 study to confirm those results and, to prepare for the drug's eventual approval, invested $2.5 billion in manufacturing capacity.

In larger trials, however, the stunning early results couldn't be replicated. And that seemed to end all hope for the drug, until Biogen said it found, buried in the data, a signal that the drug could still be effective. Then, according to an investigation by Stat News reporters, the company secretly lobbied the FDA for Aduhelm's approval.

In 2020, the FDA's scientific advisory committee harshly criticized the company's data mining and overwhelmingly recommended against approving Aduhelm. Then the agency stunned everyone by approving the drug anyway, based on its ability to clear amyloid plaques, with the proviso that Biogen would run another trial to prove that the plaque-clearing would slow cognitive decline.

Biogen audaciously priced the drug at $56,000 per year. And Medicare, faced with the possibility of paying for treatment for millions of qualified Americans, had to schedule a big rise in monthly premiums for Part B coverage. (After an outcry, Biogen eventually halved the price.)


Now that CMS has settled on a way to limit spending on the drug until its benefit is proved, Medicare will be able to dial back that premium increase. The decision also likely spells the end of Aduhelm, which doctors were already shunning. In 2021, it brought in only $3 million in sales.

Biogen, patient advocacy groups and even some members of Congress have suggested that CMS's refusal to cover Aduhelm could have a chilling effect on innovation in Alzheimer's. They have argued that drug companies will have no incentive to develop new medicines if insurers won't cover them.

But in a clear and sober explanation of its thinking on Aduhelm, CMS pointed out that the opposite is true: "The CMS final decision provides clarity on the criteria to receive coverage for any drug in this class (and thus what evidence is necessary to meet the standard for reasonable and necessary for this particular treatment)."

A drug can be considered innovative only if it actually improves patients' lives. In a disease as devastating as Alzheimer's, even marginal improvements matter. But evidence from several large clinical studies indicates that Aduhelm fails to offer that.

Medicare has laid a path for other companies to understand where the bar for coverage is set: A drug must be safe and offer a meaningful benefit to patients, and it must do so over time. This is good news for Eli Lilly & Co. and Roche, both of which have Alzheimer's therapies that will soon be up for approval.

CMS, which is expected to foot the bill for Medicare patients' drugs, perhaps had greater incentive than the FDA to make sure the drug works. But the FDA is the agency that should have set the bar; the FDA's mandate is to follow the science.

As it weighs other loaded decisions, particularly for neurodegenerative diseases, it should make sure that Medicare never again has to correct its mistakes.

Lisa Jarvis, the former executive editor of Chemical & Engineering News, writes about biotech, drug discovery and the pharmaceutical industry for Bloomberg Opinion.

Excerpt from:

Medicare is cleaning up the FDA's mess on Biogen's Alzheimer's drug - Kingsport Times News


Payor audits on the rise. How private practices can get ready. – American Medical Association

Claims processing and payments from health insurers should flow easily under negotiated contracts, but that isn't always the case. Payment and claims audits are more common than ever, and physician private practices should know what to anticipate, according to experts from one of the nation's preeminent health care law firms.

Ross Burris and Sean Timmons of the Polsinelli law firm offered their expert advice and detailed the trends in payor audits and disputes in the first of a two-part AMA webinar. You can watch part two for even more audit-response strategies.

Medicare's auditing process

The Centers for Medicare & Medicaid Services has introduced an audit process called Targeted Probe and Educate that was designed to help health care claimants reduce denials and appeals and improve their administrative process.

But the process hasn't lived up to its goals, Burris said. The new system ties physicians and their practices to Medicare Administrative Contractors (MACs), who are tasked with helping claimants identify and correct errors.

"We are seeing them a lot, especially for physicians. It sounds really nice," Burris said. But the process is onerous, requiring a lot of time to prepare supporting documents for MAC review, which can then lead to broader and more complicated audits. The Polsinelli lawyers said the audits can take up to two years to resolve, and they have seen a recent increase in these types of audits. Learn more with the AMA about the MAC review process.

Commercial payor audits, meanwhile, can be even harder to navigate. Many contracts have not been updated in years, and commercial audits are based on contracts with unique opportunities for denial, rate changes and termination, Burris said.

"Payors may request itemized bills and medical records before payment and, as a result, we are seeing a lot of claims being held up," he said.

Contract termination is a capital punishment outcome for audits, and physicians should know that terminations can come at almost any point in an audit or negotiation, the Polsinelli experts said.

The AMA Payor Audit Checklist (PDF) helps practices respond effectively to payor records requests while minimizing the administrative burden associated with responding to such requests. A thorough and timely response could reduce the likelihood that a practice will have to return money to the payor, pay a penalty or lose access to the plan's beneficiaries.

Mining data to ID outliers

Payors are getting more aggressive in terminating physicians and practices that they identify as outliers, using data mining and claims analysis to single out physicians they deem to be performing and billing for procedures outside of what they see as normal in their coverage regions, the lawyers from Polsinelli said.
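The article does not describe the payors' actual models, but the sketch below illustrates the general kind of outlier analysis being referred to: compare each physician's billing rate for a procedure against regional peers and flag large statistical deviations. The data, column names and 2-sigma cutoff are hypothetical.

```python
import pandas as pd

claims = pd.DataFrame({
    "physician": list("ABCDEFGHIJ"),
    "region": ["west"] * 10,
    "procedures_per_100_visits": [12.0, 14.0, 11.5, 13.0, 12.5, 13.5, 12.0, 14.5, 11.0, 42.0],
})

# Peer statistics within each coverage region
peer = claims.groupby("region")["procedures_per_100_visits"].agg(["mean", "std"])
claims = claims.join(peer, on="region")
claims["z"] = (claims["procedures_per_100_visits"] - claims["mean"]) / claims["std"]

# Physician J bills far above regional peers and would be flagged for review
print(claims.loc[claims["z"].abs() > 2, ["physician", "procedures_per_100_visits", "z"]])
```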

"We had a case last year when a client had a pretty favorable rate they had negotiated, and [the insurer] came in and decided that day they wanted to amend the contract and change the rate," Burris said.

When the physician practice reminded the insurer that a change was not within the period allowed for rate changes, the payor announced it would terminate the contract completely, which was allowed by the contract.

Payors often make little or no effort to understand the reason for the billings that led to the audit. Physician practices should be mindful that termination can happen without warning, leaving their doctors out of network, the Polsinelli experts said.

"It's a draconian tactic, but it happens from time to time," Burris said.

State insurance regulators may have the power to intercede in insurance company behaviors such as contracts and audits, but are unlikely to do so because they are more interested in protecting beneficiaries, Timmons said. It is difficult to get their attention. That leaves insurers with a lot more options to pressure physician practices.

It takes astute clinical judgment as well as a commitment to collaboration and solving challenging problems to succeed in independent settings that are often fluid, and the AMA offers the resources and support physicians need to both start and sustain success in private practice.

As physicians strive to continue to provide care to patients and maintain their practices during the ongoing COVID-19 pandemic, the AMA is providing an updated guide to help doctors keep their practices open. Find out more about the AMA Private Practice Physicians Section, which seeks to preserve the freedom, independence and integrity of private practice.

See the rest here:

Payor audits on the rise. How private practices can get ready. - American Medical Association


IIIT-HYDERABAD HOSTS 27TH INTERNATIONAL CONFERENCE ON DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 22) – Devdiscourse

Conference includes talks, research papers, tutorials, demos, industry presentations, panel discussions and workshops on database systems

HYDERABAD, India, April 11, 2022 /PRNewswire/ -- IIIT Hyderabad is hosting the 27th International Conference on Database Systems for Advanced Applications (DASFAA 22) online from 11-14 April.

The Conference provides a leading international forum for discussing the latest research on database systems and advanced applications. DASFAA is a well-established international conference series that provides a forum for technical presentations and discussions among database researchers, developers, and users from academia, business, and industry, which showcases state-of-the-art R&D activities in the general areas of database systems, Web information systems, and their applications. The conference's long history has established the event as the premier research conference in the database area.

It includes 5 keynote talks, 143 research papers, 5 tutorials, 11 demos, 12 industry presentations, 6 workshops and 2 panel discussions.

The dominant topics include big data management, machine learning for database, graph data management, graph and social network analysis followed by text and data mining, data management in social networks, recommendation systems, search and recommendation technology, data semantics and data integration, crowdsourcing, spatial data management, network embedding, sequence and temporal data processing, temporal and spatial databases, large-scale knowledge management, RDF and knowledge graphs, social network and security, security, privacy and trust, medical data mining, bio and health informatics, query processing and optimization, text databases, search and information retrieval, information integration, information recommendation, multimedia databases, multimedia data processing, distributed computing, cloud data management, data archive and digital library, data mining and knowledge discovery, data semantics and integrity constraints, data model and query language, data quality and credibility, data streams and time-series data, data warehouse and OLAP, embedded and mobile databases, databases for emerging hardware, database usability and HCI, HCI for modern information systems, index and storage systems, information extraction and summarization, blockchain, parallel & distributed systems, parallel, distributed & P2P systems, parallel and distributed databases, probabilistic and uncertain data, real-time data management, Semantic Web and triple data management, Semantic Web and knowledge management, sensor data management, statistical and scientific databases, transaction management, Deep Web, Web data management, Web information systems, advanced database and Web applications, and XML, RDF and semi-structured data.

More details at https://www.dasfaa2022.org/

About IIIT-Hyderabad

The International Institute of Information Technology, Hyderabad (IIIT-H) is an autonomous research university founded in 1998 that focuses on the core areas of Information Technology, such as Computer Science, Electronics and Communications, and their applications in other domains through inter-disciplinary research that has a greater social impact. Some of its research domains include Visual Information Technologies, Human Language Technologies, Data Engineering, VLSI and Embedded Systems, Computer Architecture, Wireless Communications, Algorithms and Information Security, Robotics, Building Science, Earthquake Engineering, Computational Natural Sciences and Bioinformatics, Education Technologies, Power Systems, IT in Agriculture and e-Governance.

Website: http://www.iiit.ac.in

For further information, please contact: Sunory Dutt, Head of Communications, IIIT-Hyderabad, E-mail: sunory.dutt@iiit.ac.in

Logo: https://mma.prnewswire.com/media/600789/IIIT_Hyderabad_Logo.jpg

(This story has not been edited by Devdiscourse staff and is auto-generated from a syndicated feed.)

Excerpt from:

IIIT-HYDERABAD HOSTS 27TH INTERNATIONAL CONFERENCE ON DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 22) - Devdiscourse


Global Coated Recycled Paperboard Markets, 2022-2027 – Rise in Awareness of Eco-Friendly Packaging – ResearchAndMarkets.com – Business Wire

DUBLIN--(BUSINESS WIRE)--The "Global Coated Recycled Paperboard Market to 2027" report has been added to ResearchAndMarkets.com's offering.

Rise in awareness of eco-friendly packaging is the key driving factor expected to boost global coated recycled paperboard market growth. Furthermore, increasing technological developments and innovations are expected to propel the growth of the market.

Also, increasing demand for coated recycled paperboard in the printing and packaging sector is expected to fuel the growth of the global coated recycled paperboard market.

However, stringent government rules and regulations, and the inability of the products to handle the mechanical stress of heavy materials, are the challenges expected to hamper global coated recycled paperboard market growth.

Market Key Players

Key Topics Covered:

1 Introduction

1.1 Objective of the Study

1.2 Market definition

1.3 Market Scope

2 Research Methodology

2.1 Data Mining

2.2 Validation

2.3 Primary Interviews

2.4 List of Data Sources

3 Executive Summary

4 Global Coated Recycled Paperboard Market Outlook

4.1 Overview

4.2 Market Dynamics

4.2.1 Drivers

4.2.2 Restraints

4.2.3 Opportunities

4.3 Porter's Five Forces Model

4.4 Value Chain Analysis

5 Global Coated Recycled Paperboard Market, By Type

5.1 Y-o-Y Growth Comparison, By Type

5.2 Global Coated Recycled Paperboard Market Share Analysis, By Type

5.3 Global Coated Recycled Paperboard Market Size and Forecast, By Type

5.3.1 Calcium Carbonate

5.3.2 Kaolin Clay or China Clay

5.3.3 Titanium Dioxide

6 Global Coated Recycled Paperboard Market, By Application

6.1 Y-o-Y Growth Comparison, By Application

6.2 Global Coated Recycled Paperboard Market Share Analysis, By Application

6.3 Global Coated Recycled Paperboard Market Size and Forecast, By Application

6.3.1 Bakery Products

6.3.2 Home & Garden

6.3.3 Pet Food

6.3.4 Dry Foodstuffs

6.3.5 Cereal Cartons

6.3.6 Personal Care

6.3.7 Others

7 Global Coated Recycled Paperboard Market, By Region

7.1 Global Coated Recycled Paperboard Market Share Analysis, By Region

7.2 Global Coated Recycled Paperboard Market Share Analysis, By Region

7.3 Global Coated Recycled Paperboard Market Size and Forecast, By Region

For more information about this report visit https://www.researchandmarkets.com/r/ry06lm

Read the original post:

Global Coated Recycled Paperboard Markets, 2022-2027 - Rise in Awareness of Eco-Friendly Packaging - ResearchAndMarkets.com - Business Wire


Global MABS Resin Markets, 2022-2027 – Focus on Medical Devices, Reusable Drinkware, Industrial Housing & Covers, Office Accessories and Toys -…

DUBLIN, April 11, 2022--(BUSINESS WIRE)--The "Global MABS Resin Market to 2027" report has been added to ResearchAndMarkets.com's offering.

MABS stands for methyl methacrylate-acrylonitrile-butadiene-styrene, a transparent engineering and commodity thermoplastic with excellent transparency, good stiffness, good insulation and high strength. MABS is classified into various types such as High Impact Grade, High Rigidity Grade, General Purpose Grade, and Others.

Market Drivers

MABS is commonly described as an amorphous thermoplastic. It is a copolymer of styrene, butadiene, acrylonitrile and methyl methacrylate, and it offers attractive mechanical properties such as high impact strength and transparency.

Increasing demand for MABS resin, owing to properties such as high impact strength, good chemical resistance and excellent transparency, is expected to boost global MABS resin market growth.

Market Restraint

However, the availability of substitutes is the major challenge expected to hamper global MABS resin market growth.

Market Key Players

Key Topics Covered:

1 Introduction

1.1 Objective of the Study

1.2 Market definition

1.3 Market Scope

2 Research Methodology

2.1 Data Mining

2.2 Validation

2.3 Primary Interviews

2.4 List of Data Sources

3 Executive Summary

4 Global MABS Resin Market Outlook

4.1 Overview

4.2 Market Dynamics

4.2.1 Drivers

4.2.2 Restraints

4.2.3 Opportunities

4.3 Porter's Five Forces Model

4.4 Value Chain Analysis

5 Global MABS Resin Market, By Type

5.1 Y-o-Y Growth Comparison, By Type

5.2 Global MABS Resin Market Share Analysis, By Type

5.3 Global MABS Resin Market Size and Forecast, By Type

5.3.1 High Rigidity Grade

5.3.2 High Impact Grade

5.3.3 General Purpose Grade

5.3.4 Others

6 Global MABS Resin Market, By Application

6.1 Y-o-Y Growth Comparison, By Application


6.2 Global MABS Resin Market Share Analysis, By Application

6.3 Global MABS Resin Market Size and Forecast, By Application

6.3.1 Medical devices

6.3.2 Reusable Drinkware

6.3.3 Industrial Housing & Covers

6.3.4 Office accessories

6.3.5 Toys

7 Global MABS Resin Market, By Region

7.1 Global MABS Resin Market Share Analysis, By Region

7.2 Global MABS Resin Market Share Analysis, By Region

7.3 Global MABS Resin Market Size and Forecast, By Region

For more information about this report visit https://www.researchandmarkets.com/r/3kuuvg

View source version on businesswire.com: https://www.businesswire.com/news/home/20220411005763/en/

Contacts

ResearchAndMarkets.com
Laura Wood, Senior Press Manager
press@researchandmarkets.com

For E.S.T Office Hours Call 1-917-300-0470
For U.S./CAN Toll Free Call 1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

The rest is here:

Global MABS Resin Markets, 2022-2027 - Focus on Medical Devices, Reusable Drinkware, Industrial Housing & Covers, Office Accessories and Toys -...


Evaluation of the Padua Prediction Score ability to predict venous thromboembolism in Israeli non-surgical hospitalized patients using electronic…

We evaluated the ability of the PPS to predict VTE in non-surgical hospitalized patients. The results did not demonstrate significant benefit of the PPS or the prophylactic anticoagulation based on it. The occurrence of recorded clinically significant VTE events in non-surgical patients during hospitalization and up to 90 days thereafter was relatively low, about 1 event in 400 admissions, compared to previously reported estimations5,11.

The study results demonstrated a difference of 0.01% between the risk groups (0.27% vs. 0.28%) and the statistical analysis indicated that this difference in the studied sample size of about 15,000 patients is not statistically significant. The calculated sample size to show statistical significance of a difference of 0.1% (tenfold higher than the observed difference) is about 80,000 patients and for a difference of 0.01% (as was observed in the study) it is over 8 million patients. This means that a profoundly high number of patients would have to be treated to prevent a single VTE case, which exclude any clinical significance. The potential harm and high cost dramatically outweigh any benefit in this case.
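For readers who want to see where figures of this kind come from, below is a minimal sketch of a standard two-proportion sample-size calculation, assuming a two-sided test at alpha = 0.05 with 80% power. The study's own power and event-rate assumptions are not stated here, so the resulting numbers are of the same order but will not reproduce its figures exactly.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.0028                 # roughly the 0.28% VTE rate observed in one group
for delta in (0.001, 0.0001):     # absolute differences of 0.1% and 0.01%
    effect = proportion_effectsize(baseline + delta, baseline)   # Cohen's h
    n_per_group = NormalIndPower().solve_power(effect_size=effect, alpha=0.05,
                                               power=0.80, alternative="two-sided")
    print(f"difference of {delta:.2%}: about {2 * n_per_group:,.0f} patients in total")
```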

The use of the PPS was also not related to improved or worsened safety regarding bleeding events. The routine performance of the PPS and the implementation of pharmacologic prophylaxis following PPS estimations involve substantial resources (medical personnel time, medication cost) and may distract physicians. The lack of beneficial effect calls for considering discontinuation of this practice to save resources.

While the presented study followed the Padua study outline, there is a marked difference, over tenfold, between the VTE incidences observed in the studies. The diagnosis of VTE events in the Padua study was mainly based on positive imaging and laboratory (d-dimer) results and not on clinical symptoms5. A high percentage of the Padua study participants was referred to testing. Data regarding the clinical significance of the VTE diagnoses in the Padua study are limited. In contrast, the presented study identified recorded symptomatic, clinically meaningful VTE events that demanded intervention or readmission. The different methods may have led to the different outcome rates. In addition, the duration of hospitalization was relatively longer in the Padua study, which may have contributed to the elevated VTE prevalence.

Others have also found low efficiency of the PPS. Saliba et al. measured single thrombin generation in acutely hospitalized patients and found no correlation with the PPS12. Vardi et al. showed that PPS estimation lacks granularity in detecting non-surgical septic patients at risk of acquiring VTE13. Depietri et al. assessed the application of risk assessment models, including the PPS, to VTE and major hemorrhage in internal medicine hospitalized patients and found it statistically ineffective14.

The PPS has been compared to other risk assessment tools. The Caprini risk assessment model (RAM) performed better than the PPS in 2 studies. Zhou et al. assessed the validity of the Caprini RAM in Chinese hospitalized patients with VTE and compared it to the Kucher tool and the PPS15. The Caprini model was found to classify many more VTE patients into the high or highest risk level than the other models, with statistically significant differences (Caprini model vs Kucher model, p<0.0001; Caprini model vs the PPS, p<0.0001). In another study by Zhou et al., the Caprini RAM was again compared to the PPS16. The Caprini RAM defined 82.3% of VTE cases as high risk, while the PPS defined only 30.1% of these same cases. Nendaz et al. investigated the Geneva risk score among Swiss hospitalized medical patients17. They found that for VTE prediction, the Geneva Risk Score compared favourably with the PPS, particularly in its accuracy at identifying low-risk patients who do not require thromboprophylaxis. The main practical limitation of the Caprini and Geneva models is their relative complexity, as they include 39 and 19 criteria, respectively, compared to the 11 criteria of the PPS.

PPS is a mathematical scoring method based on the models of Kucher et al. and Lecumberri et al. that uses some of the criteria in the original models18,19. Its ability to capture the full clinical complexity of different patients and become a globally generalized valid assessment tool is limited. Risk assessment should be comprehensive and appropriately tailored to the patients and the relevant population. In addition, risk should be continuously evaluated and not assessed only at a single time point, such as admission.
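As a point of reference for the discussion of the 11-item score, the sketch below computes a Padua score from coded patient data using the item weights as commonly published (active cancer, previous VTE, reduced mobility and known thrombophilia score 3 points; recent trauma or surgery scores 2; the remaining items score 1; a total of 4 or more is conventionally high risk). The field names are illustrative assumptions, not the study hospital's actual schema.

```python
PADUA_WEIGHTS = {
    "active_cancer": 3,
    "previous_vte": 3,
    "reduced_mobility": 3,
    "known_thrombophilia": 3,
    "recent_trauma_or_surgery": 2,
    "age_70_or_older": 1,
    "heart_or_respiratory_failure": 1,
    "acute_mi_or_ischemic_stroke": 1,
    "acute_infection_or_rheumatologic_disorder": 1,
    "bmi_30_or_more": 1,
    "ongoing_hormonal_treatment": 1,
}

def padua_score(patient: dict) -> tuple:
    """Return (score, high_risk); high risk is conventionally a score of 4 or more."""
    score = sum(weight for item, weight in PADUA_WEIGHTS.items() if patient.get(item, False))
    return score, score >= 4

# Example: an elderly patient with active cancer reaches the high-risk threshold
print(padua_score({"active_cancer": True, "age_70_or_older": True}))  # (4, True)
```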

The issue of any VTE diagnosis vs. symptomatic VTE is paramount. The practical contribution of asymptomatic VTE diagnosis to patients' health and prognosis can be debated. Hospital-related VTE was considered a safety concern following 2 studies published in the late 1980s20,21. Both were based on autopsies of patients that died during hospitalization or shortly after. These studies only demonstrated the presence of VTE in the body; there was no proof of causality or any linkage between post-mortem VTE presence and either morbidity or mortality. The Worcester study from 1991 showed higher VTE rates in hospitalized patients compared to the community; this study also reported mainly asymptomatic cases22. About 10 years later (2002-2003), 3 large-scale clinical trials sponsored by pharmaceutical companies were published almost simultaneously23,24,25. They all demonstrated significant improvement in VTE incidence following prophylactic anticoagulants. A critical review of the studies by Vardi and Haran emphasized that most of the VTE cases in the clinical trials were asymptomatic26. Symptomatic VTE rates were low and there was no benefit for the prophylaxis when observing only symptomatic cases.

During the years, there were several other publications that supported the importance of VTE risk during hospitalization and immediately after, but they based their conclusions on previous papers that mainly cited the old autopsy studies. Prandoni, coauthor of the Padua study, cited Leizorovicz and Mismetti in his paper to support the claim regarding elevated VTE risk in non-surgical hospitalized patients27,28. Leizorovicz and Mismetti themselves based this claim on the THRIFT study paper29, which provides as references the 1980s autopsy studies, again20,21. The claim that the risk for clinically meaningful VTE in non-surgical hospitalized patients is significantly elevated is poorly substantiated by clear clinical data.

Diagnosis of asymptomatic VTE is increasing with advances in diagnostic technologies. Computed tomography pulmonary angiography is able to pick up pulmonary emboli as small as 2-3 mm in diameter, but their clinical relevance is questioned30,31. Unnecessary anticoagulation increases the risk for major bleeding32.

The pathophysiology of VTE is related to an imbalance of the 3 components of Virchow's triad: hypercoagulability, stasis, and endothelial impairment33. This imbalance does not necessarily worsen upon admission for most non-surgical patients; it can even improve. The risk for VTE is inherent and relatively fixed, reflecting the patient's prolonged personal medical and functional status before admission. Many of these patients are functionally impaired and immobile long before their hospitalization, while immobility of hospitalized patients is usually properly addressed by repeated position changes and other means and procedures, as instructed by current guidelines and hospital accreditation standards34. Blood viscosity is improved upon admission of deteriorated or dehydrated patients using fluid therapy. Infection and inflammation are appropriately and rapidly managed in most admitted patients. These interventions during hospitalization, which are nowadays common practice, reduce the risk for VTE and the need for pharmacological prophylaxis. Shortening of the hospitalization is another key element of mitigating the risk.

Recent studies have indicated a low incidence of symptomatic VTE in hospitalized non-surgical patients. Fritz et al. found that only 0.3% of general medicine hospitalized patients developed VTE35. Koren et al. evaluated over 500 Israeli medical hospitalized patients and did not find any hospital acquired symptomatic VTE36. Vardi et al. studied septic patients in Israeli internal medicine departments and found a low rate of symptomatic VTE13. Kolomansky et al. evaluated prospectively the occurrence of VTE in Israeli hospitalized patients. Overall, VTE was diagnosed in 0.25% of these patients37. These reports support the presented study results.

Another support for the low incidence of clinically significant VTE in hospitalized patients comes from a recently published OECD report on VTE rates38. The recorded multinational, multicenter rate of hospital-related VTE diagnosis is 0.3% (189.3 DVT cases and 175.3 PE cases per 100,000 discharges), similar to the present study results. It represents the true extent of the clinically meaningful VTE cases.

The study has limitations. It is retrospective and based on computerized retrieval of digital data. It depends on the heterogeneous quality of medical information recorded by numerous staff members. It is a single-center study, and the generalization of its conclusions should be done carefully.

Nevertheless, the study holds several strengths. It has a large patient sample, over 5,000, more than twice the sample of the Padua study, which was also a single-center study. This study outline replicated the Padua study group design and patient selection to enable a fair chance for comparison. The study period is relatively long, over 2 years, and the outcomes were measured up to 90 days after discharge. The data were pooled from 2 medical databases providing comprehensive information from the hospital and community clinics; this gives high credence to the results. An independent focused validation was performed to ensure the appropriateness of the VTE diagnoses.

In conclusion, the PPS was not found to be an efficient tool for identifying non-surgical hospitalized patients at high risk for clinically significant VTE events. Prophylactic anticoagulation based on the PPS did not provide significant clinical benefit. The study results indicate the need to consider stopping the use of the PPS as a mandatory or preferred VTE assessment tool in Israeli non-surgical hospitalized patients. Until better, more efficient assessment tools are confirmed and implemented, it is recommended to clinically assess VTE risk and the need for prophylaxis in hospitalized non-surgical patients on a case-by-case basis.

Read more:

Evaluation of the Padua Prediction Score ability to predict venous thromboembolism in Israeli non-surgical hospitalized patients using electronic...
