
BSC Leads Multi-Partner Consortium in Innovative Data Mining … – HPCwire

March 22, 2023 The Barcelona Supercomputing Center (BSC) is the coordinator of the EU-funded EXTRACT project which began on January 1, 2023, bringing together a 10-partner consortium from Spain, France, Italy, Finland, Israel and Switzerland.

This three-year project will work to provide a distributed data-mining software platform for extreme data across the compute continuum. It pursues an innovative and holistic approach to data mining workflows across edge, cloud and high-performance computing (HPC) environments and will be validated through two use cases that require extreme data: crisis management in the City of Venice and an astrophysics use case.

Data has become one of the most valuable assets worldwide due to its ubiquity in the thriving technologies of Cyber-Physical Systems (CPS), Internet of Things (IoT) and Artificial Intelligence (AI). While these technologies provide vast data for a variety of applications, deriving value from this raw data requires the ability to extract relevant and secure knowledge that can be used to form advanced decision-making strategies.

The BSC researchers will play a critical role in the project by developing data-driven deployment and scheduling methods required to select the most appropriate computing resource. They will also develop a distributed monitoring architecture capable of securely observing the performance, security, and energy consumption of data-mining workflow execution. Moreover, the BSC will explore various strategies, including AI-based orchestration for deploying and scheduling workflows, to ensure that the various goals are optimized holistically while respecting the constraints imposed by extreme data characteristics.

Current practices and technologies are only able to cope with some data characteristics independently and uniformly. EXTRACT will create a complete edge-cloud-HPC continuum by integrating multiple computing technologies into a unified secure compute-continuum. It will do so by considering the entire data lifecycle, including the collection of data across sources, the mining of accurate and useful knowledge and its consumption.

The EXTRACT platform will be validated in two real-world use-cases, each having distinct extreme data and computing requirements.

This deployment and scheduling work will also address the hitherto unaddressed challenge of orchestration across the edge-cloud-HPC continuum, ensuring that orchestration technologies are explicitly aware of extreme data characteristics and workflow descriptions.

Eduardo Quiñones, established researcher at the Barcelona Supercomputing Center and EXTRACT coordinator, is confident that: "By seamlessly integrating major open-source AI and Big Data frameworks, EXTRACT technology will contribute to providing the technological solutions Europe needs to effectively deal with extreme data. It will go beyond facilitating the wider and more effective use of data to reinforce Europe's ability to manage urgent societal challenges."

About EXTRACT

The EXTRACT project (A distributed data-mining software platform for extreme data across the compute continuum) is funded under Horizon Europe Research and Innovation Action number 101093110. The project began on 1 January 2023 and will end 31 December 2025. The consortium, formed of 10 partners, is coordinated by the Barcelona Supercomputing Center (BSC). Consortium members include: Ikerlan (Spain), Universitat Rovira i Virgili (Spain), Observatoire de Paris (France), the Centre National de la Recherche Scientifique (France), Université Paris Cité (France), Logos Ricerca e Innovazione (Italy), City of Venice (Italy), Binare (Finland), Mathema srl (Italy), IBM Israel (Israel), sixsq (Switzerland).

Source: BSC-CNS

More:

BSC Leads Multi-Partner Consortium in Innovative Data Mining ... - HPCwire


Insurer Zurich experiments with ChatGPT for claims and data mining – Financial Times


Go here to read the rest:

Insurer Zurich experiments with ChatGPT for claims and data mining - Financial Times


From Text Mining to Abstract Mining – Drew Today

Tags: Caspersen, data analytics, Homepage, Professors

March 2023 What are text mining and abstract mining, and how are they useful tools in medical research?

These questions and many others were answered during a hybrid event featuring Drew University's Dr. Ellie Small, Norma Gilbert Junior Assistant Professor of Mathematics and Computer Science.

At the event, hosted by Drew's Caspersen School of Graduate Studies Data Analytics program, attendees had the privilege of learning about text mining, abstract mining, and new methods developed to identify novel ideas for medical research.

Data analytics has seen a surge in growth opportunities, largely due to the availability of data, the need to analyze the data in various ways, and the increased ability to store and analyze data.

The cost of computer storage has decreased, while computation power has increased, said Small, who specializes in data science and has completed research papers in networks and text mining.

Small explained the difference between various types of data analysis: text mining is analyzing the text in documents, from a single document to thousands of documents, and abstract mining is the ability to analyze multiple words or short phrases across documents.

Utilizing PubMed, a biomedical literature database, Small developed logic to extract frequently occurring phrases from the housed papers and cluster them according to the frequency of the phrases within the papers.
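The core of that approach can be sketched in a few lines of Python. The sample abstracts, the two-word phrase length, and the frequency threshold below are illustrative assumptions, not details of Dr. Small's actual method:

```python
# Illustrative sketch: extract frequently occurring phrases from a set of
# abstracts and group the ones that co-occur. Sample data is invented.
import re
from collections import Counter
from itertools import combinations

abstracts = [
    "gene expression profiling of tumor cells reveals gene expression changes",
    "tumor cells show altered gene expression under hypoxia",
    "hypoxia induces gene expression changes in tumor cells",
]

def phrases(text, n=2):
    """Return every n-word phrase (n-gram) in a lowercased text."""
    words = re.findall(r"[a-z]+", text.lower())
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

# Count how many abstracts each two-word phrase appears in.
doc_freq = Counter()
for abstract in abstracts:
    doc_freq.update(set(phrases(abstract)))

# Keep phrases occurring in at least 2 of the 3 abstracts.
frequent = {p for p, c in doc_freq.items() if c >= 2}
print(sorted(frequent))  # ['expression changes', 'gene expression', 'tumor cells']

# Group ("cluster") frequent phrases by counting how often they co-occur
# in the same abstract.
cooccur = Counter()
for abstract in abstracts:
    present = frequent & set(phrases(abstract))
    cooccur.update(combinations(sorted(present), 2))
```

Real abstract mining would operate on thousands of PubMed abstracts and use a more principled clustering step, but the core idea is the same: count phrases across documents, keep the frequent ones, and group those that co-occur.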

This application of data greatly simplifies medical research for students and the medical community at large.

Alex Rudniy, assistant professor of data analytics, also offered an overview of the many uses of data analytics tools and the industries that utilize them, from marketing and travel to health care and beyond.

Link:

From Text Mining to Abstract Mining - Drew Today


Bitcoin mining booms in Texas – Reuters

MCCAMEY, Texas, March 23 (Reuters) - Cryptocurrency bankruptcies and worries over electric power consumption have failed to dent the industry's growth in Texas, a top trade group said, citing the rise in miners' power demands.

Bitcoin miners consume about 2,100 megawatts of the state's power supplies, said Lee Bratcher, president of industry group Texas Blockchain Council. That power usage rose 75% last year and was nearly triple that of the prior 12 months, Bratcher said.

Those demands amount to about 3.7% of the state's lowest forecast peak load this year, according to data from grid operator Electric Reliability Council of Texas (ERCOT).

"There's been some challenges with the Bitcoin mining industry," Bratcher said, noting his group recently saw two prominent bankruptcies and other miners scaling back expansions.

The industry also faces new federal regulations, including a proposed 30% tax on electricity usage for digital mining and calls by the U.S. Treasury secretary and commodities regulator for a regulatory framework.

New York this year imposed a ban on some cryptocurrency mining that runs on fossil fuel-generated power. Other states are expected to follow suit.

But in Texas, some counties have offered tax incentives and miners continue to be drawn to its wind and solar power, which could supply about 39% of ERCOT's energy needs in 2023.

"Bitcoin mining is a very energy intensive business, which is why we tend to find places like West Texas to be full of Bitcoin miners," said Matt Prusak, chief commercial officer at cryptocurrency miner U.S. Bitcoin Corp, which has one of its mining operations in a 280-megawatt wind farm in Texas.

Its McCamey, Texas, site last month consumed 173,000 megawatt hours of power, about 60% of it provided by the grid and nearly 40% by the nearby wind farm. The average American home uses about 10 MWh in a year, according to the Energy Information Administration.
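The article's power figures can be cross-checked with a little arithmetic. All inputs below are numbers quoted in the text; the implied ERCOT peak load is a derived estimate, not a reported figure:

```python
# Sanity-checking the article's figures (inputs taken from the text above).
miner_demand_mw = 2_100      # Texas bitcoin miners' total power draw
share_of_peak = 0.037        # ~3.7% of ERCOT's lowest forecast peak load

# Implied ERCOT peak load, derived from the two figures above.
implied_peak_mw = miner_demand_mw / share_of_peak
print(f"Implied peak load: {implied_peak_mw / 1000:.1f} GW")  # ~56.8 GW

# One site's monthly consumption vs. household annual use (10 MWh/home/year).
site_monthly_mwh = 173_000
homes_equivalent = site_monthly_mwh / 10
print(f"Equivalent to {homes_equivalent:,.0f} homes' annual usage")  # 17,300
```

In other words, one mining site's single-month consumption equals the annual usage of roughly 17,300 average American homes.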

In Texas, where about 250 people died during a winter storm blackout that exposed the fragility of the state's grid, the prospect of higher crypto demand has raised alarms.

"There are a lot of Bitcoin mines that are trying to connect to the system," said Joshua Rhodes, a research scientist at the University of Texas at Austin. "If all of them were to connect in the timelines that they are looking to connect, then it probably would present an issue to the grid because that load would be growing way faster than it ever has before."

Reporting by Evan Garcia and Dan Fastenberg; writing by Laila Kearney; Editing by Chizu Nomiyama

Our Standards: The Thomson Reuters Trust Principles.

Read more here:

Bitcoin mining booms in Texas - Reuters


An In-Depth Examination of Business Intelligence and Data Integration – CIOReview

The purpose of data analytics is to collect, analyze, and visualize data so businesses can make data-driven decisions about their operations

Fremont, CA: As organizations have become more automated and big data-driven, data integration and business intelligence have become increasingly important. Businesses must consolidate and process vast amounts of data from various sources, including internal systems, cloud-based solutions, and third-party sources. A central repository for analyzing data can be created through data integration tools that help businesses bring data from several sources together.

In addition to automation, the need for accurate and timely data has also grown. Data integration is necessary for businesses to run their operations efficiently.

Data integration

The data integration process combines data from different sources into a unified view. An easy way to access and analyze data is by converting and loading it into an easily accessible central repository or data warehouse. Integrating accurate and timely data is critical to making informed decisions for any data-driven organization. Integration of data from different sources involves a complex process. In many cases, enterprises integrate their data using ETL (Extract, Transform, Load) tools, which convert data from disparate sources into a consistent format, then load it into a centralized repository. In addition to improving data quality and reducing redundancy, data integration can also streamline data management methods. A few popular solutions for data integration include Informatica PowerCenter, Talend Open Studio, and Microsoft SQL Server Integration Services (SSIS).
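The extract-transform-load pattern described above can be illustrated with a short, self-contained sketch. The source records, schema, and company names are invented for the example; production tools such as Informatica PowerCenter, Talend, or SSIS wrap the same basic flow in connectors, scheduling, and error handling:

```python
# Minimal ETL sketch using only the standard library. Sample data is invented.
import sqlite3

# Extract: pull records from two disparate "sources" with different formats.
crm_rows = [{"name": "Acme Corp", "revenue": "120000"}]        # strings, dicts
erp_rows = [("Acme Corp", 120_000.0), ("Globex", 95_500.0)]    # floats, tuples

# Transform: normalize both into one consistent (name, revenue) shape,
# deduplicating on company name.
unified = {}
for row in crm_rows:
    unified[row["name"]] = float(row["revenue"])
for name, revenue in erp_rows:
    unified.setdefault(name, revenue)

# Load: write the unified view into a central repository (here, SQLite).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE companies (name TEXT PRIMARY KEY, revenue REAL)")
conn.executemany("INSERT INTO companies VALUES (?, ?)", unified.items())

print(conn.execute("SELECT COUNT(*) FROM companies").fetchone()[0])  # 2
```

The transform step is where most real ETL complexity lives: reconciling formats, deduplicating, and validating data before it reaches the warehouse.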

Business intelligence

A business intelligence system analyzes and interprets data to provide valuable insights that can help in making decisions. The purpose of data analytics is to collect, analyze, and visualize data so businesses can make data-driven decisions about their operations.

Data is used for business intelligence to gain insights into business operations, processes, and performance. There are many steps involved in the process of business intelligence, including data warehousing, data mining, reporting, and analysis.

Benefits of data integration

Integrating data helps make decisions more informed because the data is accurate, consistent, and up-to-date. Businesses can reduce costs and save time by integrating data from multiple sources. Integrating data can make data management easier and faster, improving access and analysis speed. Businesses can gain deeper insights into their operations by combining data into a unified view.

Go here to see the original:

An In-Depth Examination of Business Intelligence and Data Integration - CIOReview


PAAB Publishes Draft Guidance Document on Use of Real-World … – Fasken

The Pharmaceutical Advertising Advisory Board (PAAB) has published a draft guidance document on the use of real-world evidence (RWE) in advertising directed to healthcare professionals (HCPs). The draft guidance document is available upon request from PAAB. PAAB is inviting industry stakeholders to provide feedback on the draft guidance document until the consultation period closes on April 3, 2023.

According to the draft guidance document, PAAB recognizes that not all clinical data relevant to HCPs' clinical decisions can be supported by controlled and well-designed clinical trials with demonstrated statistical significance (so-called "gold standard" data). PAAB's aim is to create a framework for the use of RWE in advertising to facilitate delivery of the best data currently available to HCPs, even in the absence of gold-standard data, provided that the RWE is sufficiently robust to be relevant and valuable to clinical practice.

Under PAAB's proposed approach outlined in the draft guidance document, RWE may be used in advertising in addition to gold-standard data, provided that the RWE meets certain base requirements for validity and relevance and is presented in the advertisement in alignment with certain formatting principles, each of which is described below.

The draft guidance document provides the following nine criteria to be used to ascertain whether RWE meets basic requirements for validity and relevance:

The draft guidance document provides the following five principles for presentation of RWE data in advertising:

The draft guidance document provides examples of the application of these principles to visual advertisements, as well as adaptations of these principles for advertisements in video or audio formats.

Our life sciences team has significant expertise advising the pharmaceutical industry on advertising compliance and other matters and is available to consult on the draft guidance document.

Read the original post:

PAAB Publishes Draft Guidance Document on Use of Real-World ... - Fasken


Argentine mining exports hit 10-year high as lithium, EVs take off – Reuters

BUENOS AIRES, March 21 (Reuters) - Argentina's mining exports hit historic levels last year, the government said on Tuesday, powered by surging lithium income as the South American agricultural powerhouse targets profits from the metal key to meeting booming electric vehicle (EV) demand.

Even as Latin America's third-largest economy suffers triple-digit inflation and the fallout of a devastating drought afflicting top farmland, lithium exports helped push up the country's mining exports to $3.86 billion last year - the highest level in a decade, according to economy ministry data.

Argentina's lithium riches, like those in neighboring Chile, are extracted from brine in sprawling salt flats that use the power of the sun to concentrate the ultra-light metal in evaporation pools.

In 2022, lithium exports surged 234% from a year earlier, accounting for nearly a fifth of all Argentine mining shipments.

In a rare bright spot for the country's ailing economy, the trend shows no sign of slowing.

During the first two months of this year, exports of the white metal more than doubled from a year earlier, with February shipments pulling in a record $58 million.

The ministry sees mining revenues of $6 billion this year, in part boosted by two new lithium projects set to launch as well as a pair of major expansions underway.

A scramble for the metal has caused its price to skyrocket, which in turn has motivated companies and investors alike.

Some of the world's largest mining companies have operations in northern Argentina, including China's Ganfeng Lithium (002460.SZ) and U.S. miner Livent Corp (LTHM.N), which will supply lithium for rechargeable batteries in BMW (BMWG.DE) vehicles.

Mining industry investment since 2020 totals some $11.3 billion, the ministry data showed, including $5.1 billion for lithium and $4.9 billion for copper, also heavily used in EVs.

Spanning Chile, Argentina and Bolivia, South America's so-called "lithium triangle" accounts for more than half of global lithium supplies.

Reporting by Lucila Sigal; Writing by Sarah Morland; Editing by David Alire Garcia and Jonathan Oatis

Our Standards: The Thomson Reuters Trust Principles.

Originally posted here:

Argentine mining exports hit 10-year high as lithium, EVs take off - Reuters


National Reconnaissance Office (NRO) Awards Pixxel with 5-year … – Space Ref

Pixxel, a leader in cutting-edge earth-imaging technology, has been awarded a 5-year contract by the NRO Commercial Systems Program Office (CSPO) under the Strategic Commercial Enhancements Broad Agency Announcement for Commercial Hyperspectral Capabilities.

Pixxel will provide technical hyperspectral imagery (HSI) remote sensing capabilities via modeling and simulation and data evaluation. Using its currently on-orbit pathfinder systems and future HSI constellations, Pixxel will demonstrate its capabilities through end-to-end tasking, collection, and product dissemination and respond to ad-hoc product ordering and delivery requests from the NRO and its partners.

"The entire team here at Pixxel is excited to begin this journey with the NRO," said Awais Ahmed, CEO and co-founder of Pixxel. "We are fully committed to this fantastic opportunity to offer our imaging capabilities to the organization, its partners, and the U.S. geospatial intelligence community."

"We look forward to a collaboration with NRO CSPO and our esteemed partners, Riverside Research, the Rochester Institute of Technology (RIT) Center for Imaging Science (CIS), and Labsphere, to advance this promising new space-based commercial remote sensing technology," said Pixxel Vice President Skip Maselli.

Pixxel's hyperspectral satellites capture images at hundreds of wavelengths in the electromagnetic spectrum and reveal key data about the health of our planet. Pixxel has seen a landmark year of growth, launching three pathfinder missions into orbit and growing its customer base across agriculture, mining, climate, oil and gas, government, and more. This agreement marks Pixxel's first publicly announced government customer and aligns with growing public sector interest in climate monitoring tools.

About Pixxel

Pixxel is a space data company building a constellation of the world's highest-resolution hyperspectral earth imaging satellites and the analytical tools to mine insights from the data. The constellation will aim to provide global coverage every 24 hours and help detect, monitor, and predict global phenomena across agriculture, mining, environment and energy use cases.

Co-founded by then-20-year-olds Awais Ahmed and Kshitij Khandelwal, the space tech startup aims to build a health monitor for the planet by 2024. Pixxel has worked with notable organizations such as the Indian Space Research Organisation, NASA JPL, and SpaceX, among other space stalwarts. The organization is backed by Lightspeed, Radical Ventures, Relativity's Jordan Noone, Seraphim Capital, Ryan Johnson, Blume Ventures, Sparta LLC and Accenture, among others. For more information visit http://www.pixxel.space or follow Pixxel on Twitter and LinkedIn.

About NRO

The NRO develops, acquires, launches, and operates the world's best intelligence, surveillance, and reconnaissance satellites to secure and expand America's advantage in space. We are building a diversified and resilient architecture of spacecraft and ground systems designed to meet the challenges of a changing space environment by accelerating innovation and leveraging strategic partnerships, backed by a diverse and highly skilled workforce. At NRO, we see it, hear it, and sense it so our nation's warfighters and policymakers have decision advantage amid increasing global competition. Learn more at NRO.gov.

Contact: John O'Brien

Read more:

National Reconnaissance Office (NRO) Awards Pixxel with 5-year ... - Space Ref


The human labor behind AI chatbots and other smart tools – Marketplace

Every week it seems the world is stunned by another advance in artificial intelligence, including text-to-image generators like DALL-E and the latest chatbot, GPT-4.

What makes these tools impressive is the enormous amount of data theyre trained on, specifically the millions of images and words on the internet.

But the process of machine learning relies on a lot of human data labelers.

Marketplace's Meghan McCarty Carino spoke to Sarah T. Roberts, a professor of information studies and director of the Center for Critical Internet Inquiry at UCLA, about how this work is often overlooked. The following is an edited transcript of their conversation.

Sarah T. Roberts: In the case of something like ChatGPT and the engines that it's using, it's really going out and pretty much data mining massive portions of the internet. Now, we all know that the internet is filled with the best information and the greatest stuff all the time, right? So what's required for something like that is to have human beings, with their special ability of discernment and good judgment, and sometimes visceral reactions to material, and in the case of ChatGPT, to cull material out: material that users, or more importantly, companies, would not want inside of their products as a potential output. And so that means these data labelers, much like content moderators, spend their days working on some of the worst stuff that we can imagine. And in this case, they're trying to build models to cull that out automatically. But it always starts and ends with human engagement.

Meghan McCarty Carino: What do we know about the people who are doing this really key work of data labeling?

Roberts: So taking a page from the content moderation industry, much of this work is outsourced to third-party companies that provide large labor pools. Often these data labelers are at a great remove from where we might imagine the work of engineering these products goes on. They might be in other parts of the world. There was a great article by Billy Perrigo in Time magazine in January of 2023 about a place in Kenya that was doing data labeling. It was a really hard, upsetting job, and folks were being paid at most $2 an hour to be confronted with that material. Unfortunately, this is an industry that is reliant upon human intervention and human discernment, but once again, takes it for granted and pays very little and puts people in harm's way.

McCarty Carino: Right, very similar to, you know, what we've learned about content moderation, which, as you said, happens in a similar sort of outsourced way, where these people are sort of the front lines of everything that we don't want showing up in our end product, and it runs through these workers.

Roberts: Yeah, that's right. And for years, I've been listening to industry figures and other pundits tell me that my concern about the welfare of content moderation workers was appropriate, but it was finite, and that in just a few years, AI technologies would be such that we could eliminate that work. In fact, what's happening is just the opposite. We are expanding, greatly expanding at an exponential pace, the number of people who are doing work like this. I think of data labeling, frankly, as content moderation before the fact, both in practice but also in the material conditions of the work.

McCarty Carino: When we think about how these technologies are often described, or characterized by the companies that put them out or, you know, in the press, I mean, what is important to keep in mind as we think about this type of labor and its relationship to those products?

Roberts: I think what we have to remember is that AI is artificial intelligence leaning heavily on the artificial. And what it's doing at best is imitating human discernment, thinking and processes, but it is only as good as the material that goes into it. You know, there's an old adage in programming, garbage in, garbage out, that goes maybe even more so for applications like these AI tools that we've been discussing. Emily Bender and her colleagues wrote a great paper called "Stochastic Parrots," which is how she and her colleagues describe what ChatGPT is actually doing. And for those who aren't familiar with that term, basically, what she's saying is that you can use ChatGPT, it's incredible, I've used it as well. But you have to keep in mind that what you're seeing as its output is at best mimicking humans in the same way that a parrot might copy our pattern of speech, a series of words or phrases, even our inflection, but really has no cognitive ability to necessarily understand what those things mean.

And in fact, I would give a parrot a better chance of having that kind of cognition than I would a machine. So in a way, I've been thinking about ChatGPT and other tools like it really as vanity machines. Just as an example, I requested it to generate an annotated bibliography for me the other day in my own field. I picked something that I thought I would have some expertise in, in order to evaluate the output. And it gave me about 10 answers. The first one it gave me was something I would have chosen as well, a book by a colleague. Perfect response. And then it started producing a bunch of new papers and books in my area of study that I'd never heard of. And I really thought, "Wow, have I really been underwater that much during COVID? Like, all this stuff is coming out and I'm missing it?" Turns out, those were fake citations, fake authors, fake books on legitimate presses, fake papers, but using legitimate journal titles, with even page numbers given. Imagine if I hadn't had the expertise to know that those were bogus. That's just one example of the way that this stochastic parrot or this mimicry might reproduce. And, of course, to be fair, I didn't ask it to give me real citations or truthful information. It gave me its best guess at what an annotated bibliography would look like in my field. But none of it was real.

McCarty Carino: What gets lost when tools like this are thought of as these sort of genius technological achievements without considering all of the human labor that went into them?

Roberts: They could have really chosen any model. They could have decided, you know, an infinite number of possibilities of how to set up that work and how to treat those workers. And I think it says something about tech companies. The actual intelligence that they are mining, the very essence of what makes these tools appear to have this human element (in other words, the humans that work on the labeling, work on the moderation, work on these inputs) is erased from the process. And I think the erasure of the humanity that goes into these tools is to all of our detriment, if for no other reason than we can't really fully appreciate the labor that goes into creating them or the limits of the tools and how they should be applied.

A report from Grand View Research valued the global data collection and labeling market at over $2.2 billion in 2022. It's a huge sector.

And it's important to understand it's not just this new generative AI that requires this kind of work. For example, my colleague Jennifer Pak reported a couple of years ago on a data labeling center in China that contracts with big companies like Baidu and Alibaba.

One of the workers Jennifer spoke to said he was making twice the average salary in his local province, roughly $11,000 a year, plus commission.

The operation had workers labeling street data for an autonomous vehicle project: basically, "That's a bike, that's a pedestrian, that's a baby stroller."

The same type of labor is used to label faces to train facial recognition software or to help robot vacuums navigate their way around your home.

Earlier this year, we spoke to MIT Technology Review reporter Eileen Guo about her story on how sensitive personal images taken by robot vacuums inside people's homes ended up online.

It's a winding path, but it runs through a group of outsourced data labelers in Venezuela that iRobot contracted.

More here:

The human labor behind AI chatbots and other smart tools - Marketplace


Rape cases incorrectly cleared by police. Here’s why suspects are walking free – 11Alive.com WXIA

CLAXTON, Ga. For more than a year, 11Alive has investigated a little-known way police can clear crimes without ever making an arrest. It's called exceptional clearance, and we found that when it comes to rape, it's far from an exception.

It's likely you've never heard of exceptional clearance. Neither had Robin Smith-Bright, even though it is why her daughter's rape investigation is closed.

"What is that?" Smith-Bright asked.

It's supposed to mean police couldn't arrest a suspect for reasons beyond their control, even though they have the evidence they need. Those reasons can be anything from the death of the suspect, to the victim not wanting to prosecute, to the suspect already being in custody for another crime. That's not how Evans County used it in Smith-Bright's daughter's case. They exceptionally cleared it citing a lack of evidence. Sheriff Mac Edwards didn't see a problem with that.

"My exceptionally cleared, the previous sheriff's exceptionally cleared might be two different things," Edwards said.

We explained there is only one definition for exceptional clearance. Edwards said, "It's a play on words, I guess."

No, it's not.

Credit: WXIA

11Alive's Kristin Crowley with Evans County Sheriff Mac Edwards

RELATED: An Exceptional Problem: How police are clearing rape cases without making arrests

Exceptional clearance is the same no matter your state, city or county. The FBI lays it out clearly: every agency must determine who the suspect is, know where the suspect is, and have enough evidence to make an arrest. Then, they need an exceptional reason why they can't arrest the suspect. If they can't do that, the case doesn't qualify.

There's a high threshold because when police report their numbers to the public, exceptionally clearing a case counts the same as solving a case with an arrest. That's why it should be relatively rare, but we found that's not the case in rape investigations.

In 2021, Evans County exceptionally cleared 40% of its rape cases compared to 24% of its burglary cases and 0% for homicide and robbery.

Atlanta Police have a 20% ex-clearance rate for rape compared to 5% for robbery, 3% for homicides and just 2% for burglaries.

Its an issue across the entire state of Georgia. Eight agencies had an exceptional clearance rate of 40% or higher. Another 24 agencies had a rate of at least 18% or higher.

Dave Thomas, with the International Association of Chiefs of Police, said anything over 9% is alarming.

"When we start getting into higher percentiles, I want to go in and do some data mining and look at the cases. Look and see whether they met the criteria," Thomas said.

Credit: WXIA

Dave Thomas

That's exactly what 11Alive Investigates did.

We filed open records requests with departments in 48 counties across Georgia. We found agencies that exceptionally cleared cases for insufficient evidence, because a story was fabricated, or because a case was transferred, all of which is a misuse of the designation.

Nearly every department that used exceptional clearance for rapes used it incorrectly at least once.

Multiple agencies acknowledged they made a mistake once we brought it to their attention. But for Smith-Bright it's not enough. She wants her daughter's case reopened, and she wants the suspect charged.

Read the original here:

Rape cases incorrectly cleared by police. Here's why suspects are walking free - 11Alive.com WXIA
