
Machine learning reduced workload for the Cochrane COVID-19 Study Register: development and evaluation of the Cochrane COVID-19 Study Classifier -…


Syst Rev. 2022 Jan 22;11(1):15. doi: 10.1186/s13643-021-01880-6.

ABSTRACT

BACKGROUND: This study developed, calibrated and evaluated a machine learning (ML) classifier designed to reduce study identification workload in maintaining the Cochrane COVID-19 Study Register (CCSR), a continuously updated register of COVID-19 research studies.

METHODS: An ML classifier for retrieving COVID-19 research studies (the Cochrane COVID-19 Study Classifier) was developed using a data set of title-abstract records included in, or excluded from, the CCSR up to 18th October 2020, manually labelled by information and data curation specialists or the Cochrane Crowd. The classifier was then calibrated using a second data set of similar records included in, or excluded from, the CCSR between October 19 and December 2, 2020, aiming for 99% recall. Finally, the calibrated classifier was evaluated using a third data set of similar records included in, or excluded from, the CCSR between the 4th and 19th of January 2021.
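As a rough illustration of the calibration step (choosing the lowest score threshold that still keeps recall at or above 0.99 on a held-out set), the sketch below shows one way this can be done. It is not the authors' code; the function and variable names, and the use of NumPy score arrays, are assumptions for illustration.

```python
# Illustrative sketch of recall-targeted threshold calibration (not the authors' code).
import numpy as np

def calibrate_threshold(scores, labels, target_recall=0.99):
    """Return the highest score cut-off whose recall on the calibration set is
    at least target_recall. scores are classifier probabilities; labels are 1
    for records included in the register, 0 otherwise."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    order = np.argsort(-scores)                  # highest-scoring records first
    sorted_scores, sorted_labels = scores[order], labels[order]
    cumulative_tp = np.cumsum(sorted_labels)     # true positives if the top k records are kept
    recall = cumulative_tp / labels.sum()
    k = int(np.searchsorted(recall, target_recall))  # smallest k that reaches the target recall
    k = min(k, len(sorted_scores) - 1)
    return sorted_scores[k]

# Records scoring at or above the returned threshold go to manual screening; records
# below it are excluded automatically, which is where the workload reduction comes from.
```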

RESULTS: The Cochrane COVID-19 Study Classifier was trained using 59,513 records (20,878 of which were included in the CCSR). A classification threshold was set using 16,123 calibration records (6005 of which were included in the CCSR) and the classifier had a precision of 0.52 in this data set at the target threshold recall >0.99. The final, calibrated COVID-19 classifier correctly retrieved 2285 (98.9%) of 2310 eligible records but missed 25 (1%), with a precision of 0.638 and a net screening workload reduction of 24.1% (1113 records correctly excluded).

CONCLUSIONS: The Cochrane COVID-19 Study Classifier reduces manual screening workload for identifying COVID-19 research studies, with a very low and acceptable risk of missing eligible studies. It is now deployed in the live study identification workflow for the Cochrane COVID-19 Study Register.

PMID:35065679 | DOI:10.1186/s13643-021-01880-6

Read the original here:
Machine learning reduced workload for the Cochrane COVID-19 Study Register: development and evaluation of the Cochrane COVID-19 Study Classifier -...


Getting a Read on Responsible AI | The UCSB Current – The UCSB Current

There is great promise and potential in artificial intelligence (AI), but if such technologies are built and trained by humans, are they capable of bias?

"Absolutely," says William Wang, the Duncan and Suzanne Mellichamp Chair in Artificial Intelligence and Designs at UC Santa Barbara, who will give the virtual talk "What is Responsible AI" at 4 p.m. Tuesday, Jan. 25, as part of the UCSB Library's Pacific Views speaker series (register here).

"The key challenge for building AI and machine learning systems is that when such a system is trained on datasets with limited samples from history, they may gain knowledge from the protected variables (e.g., gender, race, income, etc.), and they are prone to produce biased outputs," said Wang, also director of UC Santa Barbara's Center for Responsible Machine Learning.

"Sometimes these biases could lead to the 'rich getting richer' phenomenon after the AI systems are deployed," he added. "That's why, in addition to accuracy, it is important to conduct research in fair and responsible AI systems, including the definition of fairness, measurement, detection and mitigation of biases in AI systems."

Wang's examination of the topic serves as the kickoff event for UCSB Reads 2022, the campus- and community-wide reading program run by UCSB Library. Their new season is centered on Ted Chiang's Exhalation, a short story collection that addresses essential questions about human and computer interaction, including the use of artificial intelligence.

Copies of Exhalation will be distributed free to students (while supplies last) on Tuesday, Feb. 1 outside the Library's West Paseo entrance. Additional events announced so far include on-air readings from the book on KCSB, a faculty book discussion moderated by physicist and professor David Weld and a sci-fi writing workshop. It all culminates May 10 with a free lecture by Ted Chiang in Campbell Hall.

First, though: William Wang, an associate professor of computer science and co-director of the Natural Language Processing Group.

"In this talk, my hope is to summarize the key advances of artificial intelligence technologies in the last decade, and share how AI can bring us an exciting future," he noted. "I will also describe the key challenges of AI: how we should consider the research and development of responsible AI systems, which not only optimize their accuracy performance, but also provide a human-centric view to consider fairness, bias, transparency and energy efficiency of AI systems."

"How do we build AI models that are transparent? How do we write AI system descriptions that meet disclosive transparency guidelines? How do we consider energy efficiency when building AI models?" he asked. "The future of AI is bright, but all of these are key aspects of responsible AI that we need to address."

See more here:
Getting a Read on Responsible AI | The UCSB Current - The UCSB Current


How quantum computing is helping businesses to meet objectives – Information Age

Johannes Oberreuter, Quantum Computing practice lead and data scientist at Reply, spoke to Information Age about how quantum computing is helping businesses to meet objectives

Quantum is emerging as a new vehicle for business problem solving.

Quantum computing is an evolving technology that promises to enhance an array of business operations. Based on quantum mechanics, which describes nature at its smallest scales (molecules, atoms and subatomic particles), quantum computers are set to provide faster solutions to complex business problems by testing multiple possible solutions to a problem simultaneously.

The basis for quantum computing is a unit of information known as a qubit. Unlike a bit, which can only take the value zero or one, a qubit can also exist in a blend of both at once, a state called a superposition, and this is what makes the new approach possible. Combined, multiple qubits can represent many outcomes at the same time: every extra qubit doubles the search space, which therefore grows exponentially.
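In standard notation (a textbook fact rather than anything from the article), a single qubit state is a weighted superposition of the two basis states, and a register of n qubits is described by 2^n complex amplitudes, which is the per-qubit doubling mentioned above:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\qquad |\Psi_n\rangle = \sum_{x \in \{0,1\}^n} c_x\,|x\rangle \quad (2^n \text{ amplitudes } c_x).
```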

Many companies are looking into how quantum can bolster industries and provide new use cases for businesses. One organisation that's exploring this space is Reply, which has been developing solutions for optimisation in logistics, portfolio management and fault detection, among other areas.

Discussing how Reply is helping to provide possible use cases to its clients, quantum computing expert Johannes Oberreuter said: "We work on a level which translates the problem into a quantum language that is as universal as possible, and doesn't go too deep into the hardware."

"The first thing we've found that's delivering value now is the domain of optimisation problems. An example is the travelling salesman problem, which has lots of applications in logistics, where complexities and constraints also need to be accounted for, like during the pandemic."

"Very often, problems that are too complex to be optimised on common hardware are tackled with heuristics. Usually, there's a team or a person with experience in the domain who can help with this, but they don't know yet that there are better solutions out there now. Quantum computing allows problems to be presented in a structured way, similar to a wish list containing all the business complexities. They are all encoded into a so-called objective function, which can then be solved in a structured way."

"Companies have used all sorts of algorithms and brainpower to try to solve optimisation problems. Finding the optimum with an objective function is still a difficult problem to solve, but here a quantum computer can come to the rescue."
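As a rough illustration of what encoding a business "wish list" into an objective function can look like, the toy sketch below builds a penalty-based objective over binary decision variables, the kind of formulation (a QUBO) that quantum annealers accept. It is a generic example with made-up numbers, not Reply's actual formulation.

```python
# Toy sketch: encode "pick exactly 2 of 4 depots at minimum cost" as an
# objective function over binary variables. Illustrative only.
import itertools
import numpy as np

costs = np.array([3.0, 1.0, 4.0, 1.5])   # assumed cost of operating each depot
penalty = 10.0                            # weight of the business constraint

def objective(x):
    x = np.asarray(x)
    cost_term = costs @ x                             # total cost of the chosen depots
    constraint_term = penalty * (x.sum() - 2) ** 2    # "exactly 2 depots" as a penalty
    return cost_term + constraint_term

# Brute force over all 2^4 assignments; a quantum annealer samples this space instead.
best = min(itertools.product([0, 1], repeat=4), key=objective)
print(best, objective(best))   # -> (0, 1, 0, 1) with cost 2.5
```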

Pushing parameters

According to Oberreuter, once a quantum computer becomes involved in the problem-solving process, the optimal solution can really be found, allowing businesses to find the best arrangements for the problem. While the current quantum computers suited to this kind of problem, called quantum annealers, now have over 5,000 qubits, many companies that enlist Reply's services find that their problems require more than 16,000-20,000 variables, which calls for more progress to be made in the space.

"You can solve this by making approximations," commented the Reply data scientist. "We've been writing a program that determines an approximate solution of this objective function, and we have tested it beyond the usual number of qubits needed."

"The system is set up in a way that prevents running time from increasing exponentially, which results in a business-friendly running time of a couple of seconds. This reduces the quality of the solution, but we get a 10-15% better result than what business heuristics are typically providing."

Through proofs of concept, Reply has been able to help clients overcome the challenge of a lack of expertise in quantum. By utilising and building up experience in the field, a shoulder-to-shoulder approach helps to clarify how solutions can be developed more efficiently.

Machine learning has risen in prominence over the last few years to aid automation of business processes with data, and to help organisations meet goals faster. However, machine learning projects can sometimes suffer from a lack of data and from computational expense. To combat this, Reply has been looking to the problem-solving capabilities brought by quantum computing.

Oberreuter explained: "What we've discovered with quantum machine learning is you can find better solutions, even with the limited hardware that's accessible currently. While there will probably never be an end-to-end quantum machine learning workflow, integration of quantum computing into the current machine learning workflow is useful."

"Some cloud vendors now offer quantum processing units (QPUs). In a deep learning setup for complex tasks, you could easily rent one from the cloud providers by individual calls to experiment with whether it improves your current model."

"What we've found interesting from our contribution towards the quantum challenge undertaken by BMW and AWS is the marriage of classical machine learning models with quantum models. The former is really good at extracting attributes from unstructured data such as images, which are then joined by a quantum representation which provides an advantage for classification."


Additionally, quantum technologies are being explored for cyber security, with the view that quantum computers will soon be able to solve problems that are currently insurmountable for today's technologies. A particular algorithm cited by Reply that could be broken by quantum computing is the one used for RSA key cryptography, which, while trusted to be secure now, is estimated to require 6,000 error-free qubits to be cracked in the space of two weeks.

"Quantum technology for cyber security is now on the shelf, and we're offering this to our clients to defend against this threat," said Oberreuter. "Quantum mechanics has a so-called no-cloning theorem, which prevents users from copying messages sent across a communication channel. The crux is that in order for this to work, you need a specialised quantum channel."

"We have experts who specialise in cyber security, who have been leading the effort to craft an offering for this."

Reply is a network of highly specialised industry companies that helps clients across an array of sectors to optimise and integrate processes, applications and devices using the latest technologies. Established in 1996, the organisation offers services for capabilities including quantum, artificial intelligence (AI), big data, cloud and the Internet of Things (IoT). More information on the services that Reply provides can be found here.

This article was written as part of a paid-for content campaign with Reply

View post:
How quantum computing is helping businesses to meet objectives - Information Age


A machine learning model based on tumor and immune biomarkers to predict undetectable MRD and survival outcomes in multiple myeloma – DocWire News


Clin Cancer Res. 2022 Jan 21:clincanres.3430.2021. doi: 10.1158/1078-0432.CCR-21-3430. Online ahead of print.

ABSTRACT

PURPOSE: Undetectable measurable residual disease (MRD) is a surrogate of prolonged survival in multiple myeloma (MM). Thus, treatment individualization based on the probability that a patient will achieve undetectable MRD with a singular regimen could represent a new concept towards personalized treatment, with fast assessment of its success. This has never been investigated; therefore, we sought to define a machine learning model to predict undetectable MRD at the onset of MM.

EXPERIMENTAL DESIGN: This study included 487 newly-diagnosed MM patients. The training (n=152) and internal validation cohort (n=149) consisted of 301 transplant-eligible active MM patients enrolled in the GEM2012MENOS65 trial. Two external validation cohorts were defined by 76 high-risk transplant-eligible smoldering MM patients enrolled in the GEM-CESAR trial, and 110 transplant-ineligible elderly patients enrolled in the GEM-CLARIDEX trial.

RESULTS: The most effective model to predict MRD status resulted from integrating cytogenetic [t(4;14) and/or del(17p13)], tumor burden (bone marrow plasma cell clonality and circulating tumor cells) and immune-related biomarkers. Accurate predictions of MRD outcomes were achieved in 71% of cases in the GEM2012MENOS65 trial (n=214/301), and 72% in the external validation cohorts (n=134/186). The model also predicted sustained MRD negativity from consolidation onto 2-years maintenance (GEM2014MAIN). High-confidence prediction of undetectable MRD at diagnosis identified a subgroup of active MM patients with 80% and 93% progression-free and overall survival rates at five years.
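As a purely illustrative sketch of the kind of integrative, weighted model the abstract describes (this is not the study's published pipeline; the feature encoding and the choice of logistic regression are assumptions), the three biomarker groups could be combined like this:

```python
# Illustrative only: a weighted model combining cytogenetic, tumor-burden and
# immune-related features to predict undetectable MRD. Not the published model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical feature columns: [t(4;14)/del(17p13) flag, bone marrow plasma cell
# clonality, circulating tumor cells, immune biomarker score]
X_train = np.array([[1, 0.42, 0.08, 0.63],
                    [0, 0.11, 0.01, 0.88],
                    [0, 0.30, 0.05, 0.71],
                    [1, 0.55, 0.12, 0.40]])
y_train = np.array([0, 1, 1, 0])   # 1 = undetectable MRD achieved

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Predicted probability of undetectable MRD for a new, hypothetical patient
print(model.predict_proba([[0, 0.20, 0.02, 0.80]])[0, 1])
```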

CONCLUSION: It is possible to accurately predict MRD outcomes using an integrative, weighted model defined by machine learning algorithms. This is a new concept towards individualized treatment in MM.

PMID:35063966 | DOI:10.1158/1078-0432.CCR-21-3430

Link:
A machine learning model based on tumor and immune biomarkers to predict undetectable MRD and survival outcomes in multiple myeloma - DocWire News


Associate / Full Professor of Theoretical Biophysics and Machine Learning job with RADBOUD UNIVERSITY NIJMEGEN | 278686 – Times Higher Education (THE)

Associate / Full Professor of Theoretical Biophysics and Machine Learning

A world from which we demand more and more requires people who can make a contribution. Critical thinkers who will take a closer look at what is really important. As a Professor, you will perform leading research and teach students in the area of theoretical biophysics and physics-based machine learning, to strengthen the role and visibility of the international Theoretical Biophysics landscape.

As a successful candidate you will join the Department of Biophysics at the Donders Center for Neuroscience (DCN) and perform internationally leading theoretical research in an area of theoretical biophysics or physics-based machine learning. You are interested in applications of theoretical biophysics methods to neuroscience problems studied in the DCN, and you will engage actively in interdisciplinary research collaborations with other physicists in the Faculty of Science and with external partners. You will contribute to the teaching and the innovation of Radboud's popular theoretical machine learning and biophysics courses, and possibly contribute to other core undergraduate physics subjects taught at the Faculty of Science. You will supervise students' research projects at the Bachelor's, Master's and PhD levels. Finally, you will contribute to the effective administration of Radboud University and the acquisition of research funding, and will strengthen the role and visibility of Radboud University in the international Theoretical Biophysics landscape.

Profile

We are

The Donders Institute for Brain, Cognition and Behaviour of Radboud University seeks to appoint a Professor of Theoretical Biophysics and Machine Learning. The Donders Institute is a world-class research institute, housing more than 700 researchers devoted to understanding the mechanistic underpinnings of the human mind/brain. Research at the Donders Institute focuses on four themes:

Language and Communication

Perception, Action, and Decision-making

Development and Lifelong Plasticity

Natural Computing and Neurotechnology.

We have excellent and state-of-the-art research facilities available for a broad range of neuroscience research. The Donders Institute fosters a collaborative, multidisciplinary, supportive research environment with a diverse international staff. English is the lingua franca at the Institute.

You will join the academic staff of the Donders Center for Neuroscience (DCN) - one of the four Donders Centers at Radboud University's Faculty of Science. The Biophysics Department is part of the DCN. Neurophysicists at DCN mainly conduct experimental, theoretical and computational research into the principles of information processing by the brain, with particular focus on the mammalian auditory and visual systems. The Physics of Machine Learning and Complex Systems Group studies a broad range of theoretical topics, ranging from physics-based machine learning paradigms and quantum machine learning, via Bayesian inference and applications of statistical mechanics techniques in medical statistics, to network theory and the modelling of heterogeneous many-variable processes in physics and biology. The group engages in multiple national and international research collaborations, and participates in several multidisciplinary initiatives that support theoretical biophysics and machine learning research and teaching at Radboud University.

Radboud University actively supports equality, diversity and inclusion, and encourages applications from all sections of society. The university offers customised facilities to better align work and private life. Parents are entitled to partly paid parental leave and Radboud University employees enjoy flexibility in the way they structure their work. The university highly values the career development of its staff, which is facilitated by a variety of programmes. The Faculty of Science is an equal opportunity employer, committed to building a culturally diverse intellectual community, and as such encourages applications from women and minorities.

Radboud University

We want to get the best out of science, others and ourselves. Why? Because this is what the world around us desperately needs. Leading research and education make an indispensable contribution to a healthy, free world with equal opportunities for all. This is what unites the more than 24,000 students and 5,600 employees at Radboud University. And this requires even more talent, collaboration and lifelong learning. You have a part to play!

We offer

Additional employment conditions

Work and science require good employment practices. This is reflected in Radboud University's primary and secondary employment conditions. You can make arrangements for the best possible work-life balance with flexible working hours, various leave arrangements and working from home. You are also able to compose part of your employment conditions yourself, for example, exchange income for extra leave days and receive a reimbursement for your sports subscription. And of course, we offer a good pension plan. You are given plenty of room and responsibility to develop your talents and realise your ambitions. Therefore, we provide various training and development schemes.

Would you like more information?

For questions about the position, please contact Ton Coolen, Professor at +31 24 361 42 45 or ton.coolen@donders.ru.nl.

Practical information and applications

You can apply until 25 February 2022, exclusively using the button below. Kindly address your application to Ton Coolen. Please fill in the application form and attach the following documents:

The first round of interviews will take place around the end of March. You would preferably begin employment on 1 September 2022.

This vacancy was also published in a slightly modified form in 2021. Applicants who were rejected at that time are kindly requested not to apply again.

We can imagine you're curious about our application procedure. It offers a rough outline of what you can expect during the application process, how we handle your personal data and how we deal with internal and external candidates.

We drafted this vacancy to find and hire our new colleague ourselves. Recruitment agencies are kindly requested to refrain from responding.

The rest is here:
Associate / Full Professor of Theoretical Biophysics and Machine Learning job with RADBOUD UNIVERSITY NIJMEGEN | 278686 - Times Higher Education (THE)


Heard on the Street 1/24/2022 – insideBIGDATA

Welcome to insideBIGDATA's Heard on the Street round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI and deep learning. Enjoy!

COVID-19: A Data Tsunami That Ushered in Unprecedented Opportunities for Businesses and Data Scientists. Commentary by Thomas Hazel, founder & CTO at ChaosSearch

From creating volatile data resources to negatively impacting forecasting models, there have been countless challenges the pandemic has caused for organizations that rely on data to inform business decisions. However, there is also an upside to the data tsunami that COVID-19 created. The movement to all-things-digital translated into a tsunami of log data streaming from these digital systems. All this data presented an incredible opportunity for companies to deeply understand their customers and then tailor customer and product experiences. However, they'd need the right tools and processes in place to avoid being overwhelmed by the volume of data. The impact spans all industries, from retail to insurance to education. Blackboard is a perfect example. The world-leading EdTech provider was initially challenged at the start of the pandemic with the surge of daily log volumes from students and school systems that moved online seemingly overnight. The company quickly realized they needed a way to efficiently analyze log data for real-time alerts and troubleshooting, as well as a method to access long-term data for compliance purposes. To accomplish this, Blackboard leverages its data lake to monitor cloud deployments, troubleshoot application issues, maximize uptime, and deliver on data integrity and governance for highly sensitive education data. This use case demonstrates just how important data has become to organizations that rely on digital infrastructure and how a strong data platform is a must to reduce the time, cost, and complexity of extracting insights from data. While the pandemic created this initial data tsunami, tech-driven organizations that have evolved to capitalize on its benefits, like Blackboard, have accepted that this wave of data is now a constant force that they will have to manage more effectively for the foreseeable future.

Cloud Tagging Best Practices. Commentary by Keith Neilson, Technical Evangelist at CloudSphere

While digital transformation has been on many organizations' priority lists for years, the Covid-19 pandemic applied more pressure and urgency to move this forward. Through their modernization efforts, companies have unfortunately wasted time and resources on unsuccessful data deployments, ultimately jeopardizing company security. For optimal cyber asset management, consider the following cloud tagging best practices: Take an algorithmic approach to tagging. While tags can represent simple attributes of an asset (like region, department, or owner), they can also assign policies to the asset. This way, assets can be effectively governed, even on a dynamic and elastic platform. Next, optimize tagging for automation and scalability. Proper tagging will allow for vigorous infrastructure provisioning for IT financial management, greater scalability and automated reporting for better security. Finally, be sure to implement consistent cloud tagging processes and parameters within your organization. Designate a representative to enforce certain tagging formulas, retroactively tag when IT personnel may have added assets or functions that they didn't think to tag, and reevaluate business outputs to ensure tags are effective. While many underestimate just how powerful cloud tagging can be, the companies embracing this practice will ultimately experience better data organization, security, governance and system performance.

Using AI to improve the supply chain. Commentary by Melisa Tokmak, GM of Document AI, Scale AI

As supply chain delays continue to threaten businesses at the beginning of 2022, AI can be a crucial tool for logistics companies to speed up their supply chain as the pandemic persists. Logistics and freight forwarding companies are required to process dozens of documents such as bills of lading, commercial invoices and arrival notices fast, and with the utmost accuracy, in order to report data to Customs, understand changing delivery timelines, and collect & analyze data about moving goods to paint a picture of global trade. For already overtaxed and paperwork-heavy systems, manual processing and human error are some of the most common points of failure, which exacerbate shipping delays and result in late cargo, delayed cash flow & hefty fines. As logistics companies have a wealth of information buried in the documents they process, updating databases with this information is necessary to make supply chains more predictable globally. Most companies spend valuable time analyzing inconsistent data or navigating OCR and template-based solutions, which aren't effective due to the high variability of data in these documents. Machine learning-based, end-to-end document processing solutions, such as Scale AI's Document AI, don't rely on templates and can automate this process; AI solutions allow logistics companies to leverage the latest industry research without changing their developer environment. This way, companies can focus on using their data to cater to customers and serve the entire logistics industry, rather than spending valuable time and resources on data-mining. ML-based solutions can extract the most valuable information accurately in seconds, accelerating internal operations and reducing the number of times containers are opened for checks, decreasing costs and shipping delays significantly. Using Scale's Document AI, freight forwarding leader Flexport achieved significant cost savings in operations and decreased the processing time of each document. Flexport's documents were formerly processed in over two days, but with Document AI are processed in less than 60 seconds with 95%+ accuracy, all without having to build and maintain a team of machine learning engineers and data scientists. As COVID has led to a breakdown of internal processes, AI-powered document processing solutions are helping build systems back up: optimizing operations to handle any logistic needs that come their way at such a crucial time.

IBM to Sell Watson Health. Paddy Padmanabhan, Founder and CEO of Damo Consulting

IBM's decision to sell the Watson Health assets is not an indictment of the promise of AI in healthcare. Our research indicates AI was one of the top technology investments for health systems in 2021. Sure, there are challenges such as data quality and bias in the application of AI in the healthcare context, but by and large there has been progress with AI in healthcare. The emergence of other players, notably Google with its Mayo partnership or Microsoft with its partnership with the healthcare industry consortium Truveta, is a strong indicator of progress.

Data Privacy Day 2022 Commentary. Commentary by Lewis Carr, Senior Director, Product Marketing at Actian

In 2022, expect to see all personal information and data sharing options get more granular in how we control them, both on our devices and in the cloud, specific to each company, school or government agency. We'll also start to get some visibility into, and control over, how our data is shared between organizations without us being involved. Companies and public sector organizations will begin to pivot away from the binary options (opt-in or opt-out) tied to a lengthy legal letter that no one will read, and will instead provide data management and cybersecurity platforms with granular permission to parts of your personal data, such as where it's stored, for how long, and under what circumstances it can be used. You can also expect new service companies to sprout up that will offer intermediary support to monitor and manage your data privacy across.

Data Privacy Day 2022 Commentary. Commentary by Rob Price, Principal Expert Solution Consultant at Snow Software

The adoption of cloud technology has been a critical component of how we approach privacy and data protection today. A common misconception is that if your data is offsite or cloud-based, it's not your problem, but that is not true, because the cloud is not a data management system. Two fundamental factors for data protection and security are the recovery point objective (how old can data be when you recover it) and the recovery time objective (how quickly can you recover the data). Every company's needs are different, but these two factors are important when planning for data loss.


Read more here:
Heard on the Street 1/24/2022 - insideBIGDATA


Collaboration with NTT Research to advance computational neurobiology – Harvard Office of Technology Development

January 24, 2022 - Neurobiologists at Harvard University have entered a joint research agreement with NTT Research, Inc., a division of NTT, to study animal neuro-responses with the hope of informing future artificial intelligence systems. The five-year research project, launched in the fall of 2021, enables researchers at the two organizations to collaboratively study how animals maintain behavioral flexibility, specifically in the task of navigation. Greater understanding of how this challenge is approached in biology may eventually enable the design of new computing machines with similar capabilities. The agreement was coordinated by the Harvard Office of Technology Development.

The principal investigator is Venkatesh Murthy, PhD, the Raymond Leo Erikson Life Sciences Professor of Molecular and Cellular Biology at Harvard and the Paul J. Finnegan Family Director of the Center for Brain Science. Murthy's counterpart at NTT Research for the joint project is Physics & Informatics (PHI) Lab Research Scientist Gautam Reddy, PhD, who was previously an Independent Post-Doctoral Fellow at Harvard's NSF-Simons Center for Mathematical and Statistical Analysis of Biology.

This joint research aims to better elucidate how animals maintain the ability to respond appropriately to a wide variety of complex real-world scenarios. The investigators expect the results from one aspect of the research to be a source of new, biologically inspired ideas for artificial reinforcement learning systems that rely on representation learning. Such ideas have played a major role in recent advances in artificial intelligence. Results from another aspect of the research should provide a quantitative understanding of how animals track trails, as well as identify the basic elements of general behavioral strategies that perform flexibly and reliably in the real world. Murthy's lab has a long track record in experimental and computational neurobiology. Expertise relevant to the joint research includes the ability to record from or image many individual neurons in the brain while an animal performs behavioral tasks. This technical expertise will enable the research team to understand what computations are performed by biological neural networks when an animal is navigating in a complex world.

Murthy and Reddy have previously worked together on understanding the computational principles behind olfaction. Their focus was on how the smell receptors in the nose respond to blends of odorous compounds. During his time at Harvard's NSF-Simons Center for Mathematical Biology, Reddy worked on the theory behind how animals track scent trails and on developing a computational framework to explain how evolution optimizes organisms.

"I am delighted to continue this line of inquiry with Dr. Reddy through the NTT Research PHI Lab," Murthy said. "The brain is an example of an extremely efficient computational device, and plenty of phenomena within it remain unexplored and unexplained. We believe the results of these investigations in neurobiology will reveal basic understandings and prove useful in the field of artificial intelligence."

"Efficient computation is at the heart of quantum computing and neuroscience. Inspired by neuroscience, advances in machine learning have recently begun to change how we process data," said NTT's PHI Lab Director Yoshihisa Yamamoto, PhD. "This joint research project could provide a rich source of animal-inspired algorithms that generalize across various research domains within NTT and inspire truly novel interdisciplinary ideas."

Adapted from a press release by NTT Research.

Here is the original post:
Collaboration with NTT Research to advance computational neurobiology - Harvard Office of Technology Development


AJM Book Talk: What Is Real? by Adam Becker – Lone Star Ball

I recently finished What is Real? by Adam Becker, and it is a book that I've not stopped thinking about after reading it. It deals with quantum physics and the nature of reality, a topic I'm fascinated with, but which I realize many people don't find interesting, so I get it if you close the browser or otherwise move on to something else after reading this sentence. But I have a tendency to want to write about things I've been thinking about a lot, and I have this here blog, and there's not much baseball going on, so...

What is Real? came out in 2018, when Becker was 34, and was written as a result of him getting a Sloan Foundation grant a couple of years earlier to research and write a book on the history of the foundations of quantum physics, with a particular emphasis on the continued dominance of the troubled Copenhagen Interpretation. The fact that the grant summary calls it "the troubled Copenhagen Interpretation" provides a pretty clear hint at the direction Becker is coming from, something I wasn't aware of when I was reading the book, but which became clear pretty quickly.

Becker has an interesting academic background: he got his B.A. from Cornell in Physics and in Philosophy in 2006, got a master's a year later from the University of Michigan in Physics, and then got a Ph.D. from UofM in Computational Cosmology in 2012. He's done a lot of writing and teaching, and is currently at the Lawrence Berkeley National Laboratory in a position he describes on his website as being "Science Writer and Communications Specialist," and describes himself as "Author and Astrophysicist."

Before I really dive into this, I want to talk a little bit about my dad (and if you're just interested in Becker's book and quantum physics you can skip the next several paragraphs and resume reading after the * * *). As many of you know, he's a veterinarian who has a clinic and, as of several years ago, a no-kill animal shelter in southwest Fort Worth. He's had his own veterinary clinic for almost 50 years.

He was also the first person in the family to go to college. He grew up in a working class family, as that term was understood in the early and middle parts of the 20th century, the oldest of three children. His father, my grandfather, worked for the railroad, was in the union, and had the type of nice, steady job that was much more commonplace back then than it is now, the type of job that is romanticized in some quarters and decried as wage slavery in others. My grandfather worked hard at a job that, as best as I can tell, he didn't particularly care for, but which he had to do in order to provide for his wife and children.

One of the things that my dad has always talked about, for as far back as I can remember, was that when he was a kid, when he was growing up, he looked at the life his father had, the life others in the neighborhood had, and said to himself, "I'm not going to live like this. I'm not going to work every day for someone else doing a job that I don't really want to do in order to just scrape by." He was adamant he was not going to live his life that way.

Unlike his eldest son, my father has always had tremendous drive and a great work ethic. He started working at an early age, and always had a job from his adolescence, oftentimes more than one. He got up early in the morning throughout his teenage years to throw papers. One of his jobs was working behind the counter at Swenson's Ice Cream; for whatever reason, that's one of the ones that really sticks in my mind.

And while he was doing this, he always got very good grades in school. Again, unlike his eldest son, he had the willpower and drive to pay attention, get his schoolwork done, and put in the effort and the work to get good grades. He did well enough in high school to get into Texas A&M University, which he (and my mom, when they got married after his freshman year) worked to put himself through, both undergrad and vet school.

All through school (elementary, middle and high school, then undergrad and graduate school), he says he only made one C. And that was in Physics. My niece, who is currently at A&M seeking to follow in his footsteps and go to vet school (she's currently an undergrad), was talking to us about that part of the family lore over Christmas, and how that's also the thing that she has the most problems with. Biology, both of them really enjoyed and understood and were good at. Chemistry was, for both of them, something they could handle, but wasn't a strong suit. Physics, though, was her biggest hurdle, just like with her grandfather.

I thought that was interesting because for me it is the exact opposite. I have never understood biology. I have never been good at it. I struggled with it in school, and still have a hard time grokking it. On the other hand, physics? THAT I get. I can process it. I was always good at it in school, both high school and college. It's something I'm fascinated by, and enjoy reading about.

As I was explaining that in our discussion, my dad turned to my niece and said, "Well of course he likes physics. He loves all that math. I don't want to deal with all that math. I want to deal with what's alive, with what's REAL."

* * *

You can draw a line, if you want, between biology and physics, with biology dealing with the real, physics with the abstract, and chemistry straddling the line, having a connection to both. During the Enlightenment, there wasn't so much of this schism; there were simply Natural Philosophers, exemplified by the Royal Society, who sought to understand nature, in whatever form. But as our understanding of nature, of the world, of the universe surrounding us increased, the breadth and depth of knowledge made it harder to keep up (Albert Einstein famously said that Johann Goethe, who died almost 200 years ago, was the last man in the world to know everything), and resulted in ever greater degrees of separation and specialization of the sciences.

But part of what I find so fascinating about physics is that, contrary to what my dad said, it is about what's real. Yes, I do like and grok math more than...whatever it is you need to like and grok to get biology, but I hit a wall at Calculus II, and am not going to pretend I have any sort of detailed knowledge or understanding of the level of math necessary to do high-level physics. But on a fundamental level, what makes it so gripping is that it is about understanding, or trying to understand, how Everything, with a capital E, works.

And with quantum physics, part of what makes it so fascinating is how bizarre and non-intuitive reality is, or at least how we perceive reality. Becker, examining the past century of work in quantum physics, ends up delving into the meta-issue of what it means to understand how Everything works.

As is inherent in any examination of the history of quantum physics, Niels Bohr is a major figure in the book, and while Becker says (in response to a negative review of What Is Real?) that he doesn't think he "paint[s] Bohr as a villain," he does come across to me as, at least, the antagonist. The breakthroughs Bohr and Werner Heisenberg made in the 1920s, in taking the findings and theories that Max Planck, Albert Einstein and others made in the first quarter of the 20th century and using them to build mathematical models that reflected the observational data, led to what is known as the Copenhagen Interpretation.*

* The term "Copenhagen Interpretation" was not used at the time; it appears to have been coined at some point in the 1950s, but its use became widespread. I'm using it to refer to the collection of principles that eventually came to be known as the Copenhagen Interpretation even when referring to events that occurred before the actual term was invented.

As Becker illustrates in his book, it would be hard to overstate the influence Bohr had over the development of quantum theory as it emerged from its early embryonic stages. In 1913 Bohr used Planck's quantum theory and Ernest Rutherford's model of the atom as a springboard to create what is now known as the Bohr model of the atom. This is the model of the atom which we commonly use even today to visualize an atom: a nucleus surrounded by concentric orbital shells which contain electrons, and which electrons can jump between (the "quantum jump"). The model was groundbreaking, and was one of the major contributions that led to Bohr receiving the Nobel Prize in 1922.

By 1925, as a Nobel Laureate and one of the leading lights in the still-nascent field of quantum theory, Bohr headed up the Institute for Theoretical Physics at the University of Copenhagen. That year, one of his students, Werner Heisenberg, published the paper that ultimately led to Heisenberg promulgating the revolutionary matrix mechanics formulation of quantum mechanics. Two years later, while working under Bohr, Heisenberg developed his Uncertainty Principle*. Paul Dirac and Erwin Schrödinger spent time at the Institute during this time as well, and the Institute quickly became Ground Zero for the development of quantum theory and quantum mechanics.

* The Uncertainty Principle provides that there is an inherent mathematical limit to how much we can know about both the position and the momentum of a particle: the more certainty we have about one, the less we have about the other. This is often interpreted as meaning that the act of measuring position or momentum changes the state of the particle (the Observer Effect), and that does happen, since the interaction of the photon used to measure changes the particle's position and/or momentum, and that was the original basis of Heisenberg's theory. However, the Uncertainty Principle exists independent of any external measurement; for example, atoms cooled to extremely low temperatures have very little uncertainty in momentum, resulting in their "smearing," their wavefunctions overlapping with other nearby atoms, and their occupying the same states, resulting in a Bose-Einstein Condensate.
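For reference, the standard position-momentum form of the relation this footnote paraphrases (a textbook formula, not something quoted from the book) is:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```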

As work developed in these fields, Becker appears to acknowledge, Bohr was less a theoretical physicist and more a guiding supervisor and philosopher. This is complicated by the fact that Bohr was one of those scientists who, as Becker notes repeatedly, struggled mightily to put his thoughts into words, resulting in his writing being famously opaque, his views vague and sometimes contradictory. The combination of Bohr being revered by the generation of quantum physicists working in the pre-WWII era as almost a gatekeeper (someone whose blessing was needed for one's work to have merit) and his being unable to express his views clearly resulted in Bohr seemingly transforming into an oracle over time.

Whatever weaknesses Bohr had in elucidating his own fundamental opinions, he was a successful advocate of the theories he and his students advanced. Albert Einstein, long painted in the second half of his career as a curmudgeon who wrongheadedly refused to accept quantum mechanics, challenged the inherently probabilistic nature of the universe described by the Copenhagen Interpretation ("God doesn't play dice")*, and challenged the Copenhagen Interpretation over the next decade, particularly its violation of causality and its reliance on the observer effect in wave function collapse.

* The actual quote, translated into English, is "I, at any rate, am convinced that [God] does not throw dice," from a letter written by Einstein to Max Born in 1926; the popular paraphrase is an unfair oversimplification of the subtle and complicated views Einstein held on the subject of quantum mechanics.

Einstein argued that, at a minimum, the Copenhagen Interpretation was incomplete, and argued for the existence of hidden variables or other phenomena that were driving the results the Copenhagen Interpretation was finding. Einstein's problem, however, was that, whatever the reason for it, the math behind the Copenhagen Interpretation worked, and Einstein couldn't give a definitive contrary explanation as to why. When John von Neumann produced a proof that (supposedly)* conclusively established that there could be no hidden variables, that appeared to put an end to the debate. Bohr won, Einstein lost, and the Copenhagen Interpretation was the Law of the Land.

* As it turns out, that proof was flawed, and a central theme of the second half of the book is an individual challenging the von Neumann proof, having found the flaw, and being ignored because, well, he's John von Neumann and you're just some rando. It took decades, but the flawed nature of the proof appears to have now been accepted.

Becker's deep dive into the historical underpinnings of both the Copenhagen Interpretation (which is, in essence, an umbrella term for various underlying views and principles generally embraced by those who studied under Bohr or his students, and who built upon the work derived therefrom) and its almost universal adoption in the twentieth century lays the groundwork for the second half of the book, in which he follows the efforts of a few isolated individuals who sought to challenge what had become quantum mechanical orthodoxy.

Becker emphasizes what can best be described as a philosophical split between the mainstream Copenhagen Interpretation and those who question it. That philosophical split is over the importance (relevance, even) of understanding why quantum mechanics works. While Bohr himself was not necessarily completely dismissive of the why, he never really elucidated a comprehensive and coherent explanation as to the whys, and over time, adherents of the Copenhagen Interpretation mostly, according to Becker, quit worrying or caring about the whys of it. Instead, the mindset that Becker lays out is one of "Shut up and calculate!" (a phrase coined by N. David Mermin, though often attributed to Richard Feynman): the math works, the theories accurately predict the results, and that's all that matters.

The most significant example of this problem is the collapse of the wave function, which is exemplified in the Two-Slit Problem*, a thought experiment (since experimentally verified) which showed, among other things, that under the rules of quantum mechanics, whether a single photon of light acts as a wave or a particle depends on whether it is being observed.

* I remember the first time I read about this, in a book called Schrödinger's Kittens and the Search for Reality, by John Gribbin. It blew my mind. I had to re-read it a couple of times to make sure I understood it. Niels Bohr famously said, "A person who wasn't outraged on first hearing about quantum theory didn't understand what had been said," and that describes my reaction in reading Gribbin's book for the first time.

While much of quantum physics is inherently unintuitive, the idea that a particle behaves differently depending on whether or not it is observed goes well beyond unintuitive, and even beyond anthropocentrism into the realm of solipsism. Bohr espoused the theory of complementarity (the idea that objects have complementary properties that cannot be measured simultaneously) as a way of getting around the subjective element of the observer effect.

This line of thought seems to hold that what an object does when it is not being measured is unknowable, and thus irrelevant. This leads to the philosophical split mentioned above, whereby the Copenhagen Interpretation gets characterized as not describing reality, but simply providing a mathematical framework that describes what is happening at the quantum level. Under this mindset, whether or not a photon is in reality a wave, a particle, sometimes one and sometimes the other, both, or something else entirely is irrelevant: since the math works, there's no reason to wonder whether or not it reflects a literal depiction of what is happening at that level.

Becker opines that disregarding what is real over the second half of the twentieth century, and the embrace of the Copenhagen Interpretation without regard to what the results and data say about the underlying nature of reality, is due in no small part to the Cold War and the extent to which university funding was tied to government grants and the ability to produce practical results. Becker spends a fair amount of time looking at the politics of the Western World, particularly the United States, in the post-war era, and how it impacted not just fields of study, but individuals. Becker writes at length about the travails of David Bohm, an iconoclastic physicist who challenged the Copenhagen Interpretation, but who also was hamstrung in his ability to teach or work in the United States due to his being a Communist. Bohm's quixotic life and career, and alternative interpretations, get quite a bit of coverage in What Is Real?

Becker tracks the efforts of the occasional renegade, leading to John Stewart Bell, who both identified the flaw in von Neumann's proof and established that the local hidden variable theory set forth in the Einstein-Podolsky-Rosen Paradox (part of the challenge to the Copenhagen Interpretation, and put out in 1935) violated quantum theory. The EPR Paradox noted that under quantum mechanics, there would exist the phenomenon of quantum entanglement: particles that are entangled in such a way that, when separated, measuring the spin state of one of them would immediately result in the other particle having the opposite spin state. This phenomenon, derided by Einstein as "spooky action at a distance," violated locality (and thus a fundamental part of causality). Einstein hypothesized that local hidden variables would have to be involved, thus preserving the notion of locality and establishing that the Copenhagen Interpretation was incomplete.
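The anti-correlated spins described here correspond to the textbook two-particle singlet state (standard notation, not something from the book): measuring one particle's spin along an axis immediately fixes the other particle's spin to the opposite value.

```latex
|\psi^{-}\rangle \;=\; \frac{1}{\sqrt{2}} \left( |{\uparrow}\rangle_{A}\,|{\downarrow}\rangle_{B} \;-\; |{\downarrow}\rangle_{A}\,|{\uparrow}\rangle_{B} \right)
```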

Bell ends up being the hero of What is Real?, as Becker follows how Bell's Theorem has led to the questioning of the Copenhagen Interpretation orthodoxy and the promulgation of alternative theories, notably the many worlds theory, which Becker describes as having gained significant support. Under the many worlds theory, rather than a collapse of the wave function triggered by an observation resulting in a particle or object assuming one of the possible forms it could take, with a likelihood based on the probabilistic nature of the wave function, every possible outcome available under the probabilistic model occurs, with each outcome resulting in the branching off of a new universe.

While Becker says he himself doesn't advocate for the many-worlds interpretation in What is Real?, it's hard to come away from Becker's book not feeling like that is his view. And there's appeal to the many worlds interpretation, not least because, it seems to me at least, it provides some actual explanation for what is occurring on the quantum level. Conversely, the Copenhagen Interpretation, as set forth by Becker, requires one either to assume everything is probabilistic and nothing is real until the act of observing (rife with philosophical questions) or to choose to just "shut up and calculate!", and, if one does choose to think about the whys, to accept the equivalent of Xena's "a wizard did it" explanation.

One of the problems that one has in grappling with this issue, particularly in the framework of the Copenhagen Interpretation/Not Copenhagen Interpretation duality Becker uses in his book, is that, as Becker acknowledges, there is not one specific Copenhagen Interpretation. It is not a specific set of laws and rules so much as it is a collection of guiding principles which have been used by mainstream physicists for most of the last century or so. Becker notes that such a loosey-goosey definition provides a degree of flexibility to its adherents, as two followers of the Copenhagen Interpretation may have diametrically opposed views as to a certain element of quantum theory, and yet still rightfully claim to be part of the Copenhagen Interpretation orthodoxy. That can make critiquing the Copenhagen Interpretation like nailing Jello to the wall, as a particular criticism can be deflected by saying, well, that doesn't mean the Copenhagen Interpretation is wrong, because here's someone who is an adherent who agrees with that criticism, and really, it's a theory that is bigger than just that one particular issue. Conversely, though, that big umbrella means that there are many more things that Becker can identify as being part of the Copenhagen Interpretation and attack, thereby eroding the credibility of the theory as a whole.

That being said, it seems like there's something apropos about the Copenhagen Interpretation, which has as one of its core tenets the fundamental indeterminability of things at the quantum level, having such a level of uncertainty as to what it really is.

Personally, I understand the desire to look at the underpinnings of what quantum mechanics describes. On a practical level, I get that it is enough to know that the math works, and not worry about the extent to which the math is simply a formalism, a symbolic approximation of things we don't (maybe can't) understand at the quantum level, versus an accurate description of reality. I don't have to know how an internal combustion engine works in order to drive a car.

But that doesn't mean that I don't want to know, that I don't want to understand, that fundamental question that is the title of Becker's book. And it's part of what made it such a compelling read for me, and a book that I'm likely going to re-read at some point in the future.

Read the original post:

AJM Book Talk: What Is Real? by Adam Becker - Lone Star Ball


Why Is Silicon Valley Still Waiting for the Next Big Thing? – The New York Times

In the fall of 2019, Google told the world it had reached quantum supremacy.

It was a significant scientific milestone that some compared to the first flight at Kitty Hawk. Harnessing the mysterious powers of quantum mechanics, Google had built a computer that needed only three minutes and 20 seconds to perform a calculation that normal computers couldn't complete in 10,000 years.

But more than two years after Google's announcement, the world is still waiting for a quantum computer that actually does something useful. And it will most likely wait much longer. The world is also waiting for self-driving cars, flying cars, advanced artificial intelligence and brain implants that will let you control your computing devices using nothing but your thoughts.

Silicon Valley's hype machine has long been accused of churning ahead of reality. But in recent years, the tech industry's critics have noticed that its biggest promises, the ideas that really could change the world, seem further and further on the horizon. The great wealth generated by the industry in recent years has generally been thanks to ideas, like the iPhone and mobile apps, that arrived years ago.

Have the big thinkers of tech lost their mojo?

The answer, those big thinkers are quick to respond, is absolutely not. But the projects they are tackling are far more difficult than building a new app or disrupting another aging industry. And if you look around, the tools that have helped you cope with almost two years of a pandemic (the home computers, the videoconferencing services and Wi-Fi, even the technology that aided researchers in the development of vaccines) have shown the industry hasn't exactly lost a step.

"Imagine the economic impact of the pandemic had there not been the infrastructure, the hardware and the software, that allowed so many white-collar workers to work from home and so many other parts of the economy to be conducted in a digitally mediated way," said Margaret O'Mara, a professor at the University of Washington who specializes in the history of Silicon Valley.

As for the next big thing, the big thinkers say, give it time. Take quantum computing. Jake Taylor, who oversaw quantum computing efforts for the White House and is now chief science officer at the quantum start-up Riverlane, said building a quantum computer might be the most difficult task ever undertaken. This is a machine that defies the physics of everyday life.

A quantum computer relies on the strange ways that some objects behave at the subatomic level or when exposed to extreme cold, like metal chilled to nearly 460 degrees Fahrenheit below zero, close to absolute zero. If scientists merely try to read information from these quantum systems, they tend to break.
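To make that fragility concrete, here is a minimal toy sketch in plain Python; it is an illustration of the general idea only, not a model of Google's hardware or of any real quantum programming library. Before a read-out, the toy qubit carries two weighted possibilities at once; the act of reading forces a single answer, and the original combined state is gone for good.

    import random

    class ToyQubit:
        """Caricature of one qubit prepared in an equal superposition of 0 and 1."""

        def __init__(self):
            # Equal superposition: amplitude 1/sqrt(2) for each outcome.
            self.amplitudes = {"0": 2 ** -0.5, "1": 2 ** -0.5}
            self.collapsed = None

        def measure(self):
            # Reading the qubit destroys the superposition: one outcome is
            # chosen with probability equal to the squared amplitude, and
            # every later read returns that same fixed value.
            if self.collapsed is None:
                outcomes = list(self.amplitudes)
                weights = [abs(a) ** 2 for a in self.amplitudes.values()]
                self.collapsed = random.choices(outcomes, weights=weights)[0]
            return self.collapsed

    q = ToyQubit()
    print(q.measure())  # "0" or "1", each with probability 0.5
    print(q.measure())  # the same value again; the superposition is unrecoverable

Real devices face a harsher version of the same problem: stray heat or vibration acts like an unwanted read-out, which is one reason the hardware is chilled to near absolute zero.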

While building a quantum computer, Dr. Taylor said, you are constantly working against the fundamental tendency of nature.

The most important tech advances of the past few decades (the microchip, the internet, the mouse-driven computer, the smartphone) were not defying physics. And they were allowed to gestate for years, even decades, inside government agencies and corporate research labs before ultimately reaching mass adoption.

"The age of mobile and cloud computing has created so many new business opportunities," Dr. O'Mara said. But now there are trickier problems.

Still, the loudest voices in Silicon Valley often discuss those trickier problems as if they were just another smartphone app. That can inflate expectations.

"People who aren't experts who understand the challenges may have been misled by the hype," said Raquel Urtasun, a University of Toronto professor who helped oversee the development of self-driving cars at Uber and is now chief executive of the self-driving start-up Waabi.

Technologies like self-driving cars and artificial intelligence do not face the same physical obstacles as quantum computing. But just as researchers do not yet know how to build a viable quantum computer, they do not yet know how to design a car that can safely drive itself in any situation or a machine that can do anything the human brain can do.

Even a technology like augmented reality, eyeglasses that can layer digital images onto what you see in the real world, will require years of additional research and engineering before it is perfected.

Andrew Bosworth, vice president at Meta, formerly Facebook, said that building these lightweight eyeglasses was akin to creating the first mouse-driven personal computers in the 1970s (the mouse itself was invented in 1964). Companies like Meta must design an entirely new way of using computers, before stuffing all its pieces into a tiny package.

Over the past two decades, companies like Facebook have built and deployed new technologies at a speed that never seemed possible before. But as Mr. Bosworth said, these were predominantly software technologies built solely with bits: pieces of digital information.

Building new kinds of hardware, working with physical atoms, is a far more difficult task. "As an industry, we have almost forgotten what this is like," Mr. Bosworth said, calling the creation of augmented reality glasses a "once-in-a-lifetime project."

Technologists like Mr. Bosworth believe they will eventually overcome those obstacles, and they are more open about how difficult it will be. But that's not always the case. And when an industry has seeped into every part of daily life, it can be hard to separate hand-waving from realism, especially when it is huge companies like Google and well-known personalities like Elon Musk drawing that attention.

Many in Silicon Valley believe that hand-waving is an important part of pushing technologies into the mainstream. The hype helps attract the money and the talent and the belief needed to build the technology.

"If the outcome is desirable and it is technically possible, then it's OK if we're off by three years or five years or whatever," said Aaron Levie, chief executive of the Silicon Valley company Box. You want entrepreneurs to be optimistic, to have a little bit of that Steve Jobs "reality-distortion field," which helped to persuade people to buy into his big ideas.

The hype is also a way for entrepreneurs to generate interest among the public. Even if new technologies can be built, there is no guarantee that people and businesses will want them and adopt them and pay for them. They need coaxing. And maybe more patience than most people inside and outside the tech industry will admit.

"When we hear about a new technology, it takes less than 10 minutes for our brains to imagine what it can do. We instantly compress all of the compounding infrastructure and innovation needed to get to that point," Mr. Levie said. "That is the cognitive dissonance we are dealing with."

Continued here:

Why Is Silicon Valley Still Waiting for the Next Big Thing? - The New York Times

Read More..

Schrödinger's Pedophilia: The Cat Is Out Of The Bag (Box) – Forbes

Erwin Schrödinger

According to a December 2021 report from the Irish Times, the Nobel Prize-winning Austrian physicist Erwin Schrödinger was a pedophile.

To refresh memory: Schrödinger made many astounding contributions to the burgeoning fields of quantum physics, electrodynamics, molecular biology, and color theory. For example, nearly 100 years ago, he gave the world a way to calculate the probable energy and position of electrons in space and time. For non-scientists, the thought experiment now called Schrödinger's Cat is perhaps his most fun gift to science. Imagine a cat living in a sealed box that is equipped with a contraption that has a 50% chance of killing it via a random, subatomic event. Quantum physics deals in probabilities rather than fixed realities. Given this, the question Schrödinger posed was whether two probable states could exist at once. Could the cat in the box be simultaneously alive and dead? Having asked, Schrödinger also pointed out that there is a measurement problem that precludes answering his question with certainty. No one can learn if the cat exists simultaneously in two states of being because, the moment that anyone opens the box and looks in, the cat becomes either alive or dead.
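In the standard textbook idealization (a sketch using the usual two-state notation, not Schrödinger's own wording), the sealed box is described by a superposition whose squared amplitudes give the 50/50 odds:

    \[
      \lvert \text{cat} \rangle
        = \tfrac{1}{\sqrt{2}} \lvert \text{alive} \rangle
        + \tfrac{1}{\sqrt{2}} \lvert \text{dead} \rangle,
      \qquad
      P(\text{alive}) = P(\text{dead}) = \left( \tfrac{1}{\sqrt{2}} \right)^{2} = \tfrac{1}{2}.
    \]

Opening the box corresponds to a measurement: afterward the state is either alive or dead, and the 50/50 combination is no longer there to inspect, which is the measurement problem in miniature.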

Perhaps we all should have known. The evidence given by the Irish Times has been staring us in the face for almost a decade. Citing specific sources, the Irish Times article details two stories.

Ithi Junger. British astrophysicist John Gribbin reported in his 2013 biography Erwin Schrödinger and the Quantum Revolution that, at the age of 39, Schrödinger became enamored of 14-year-old Ithi, whom he was tutoring in math. As well as the maths, the lessons included a fair amount of petting and cuddling [as Schrödinger stated in his diary] and Schrödinger soon convinced himself that he was in love with Ithi. There is no evidence that things went beyond petting and cuddling when Ithi was 14, but before he died in 1961 Schrödinger admitted that he'd impregnated her when she was 17. Her abortion left her sterile.

Barbara MacEntee. According to Schrödinger's biographer Walter Moore (writing in 2015 in Schrödinger: Life and Thought), Schrödinger kept a list in his diary of the women and girls he'd romanced. A 12-year-old named Barbara MacEntee was on it. Schrödinger approached her when he was 53 years old. Her astonished family asked a Catholic priest to intervene. As Moore's biography explained, the priest had a serious word with [Schrödinger], and muttering dark imprecations, [Schrödinger] desisted from further attentions to Barbara, although he listed her among the unrequited loves of his life.

Quoting directly from Schrödinger's diaries, Moore revealed that the physicist justified his attraction to girls by considering that, being a genius (which he believed no woman ever could be), he was naturally entitled. "It seems to be the usual thing that men of strong, genuine intellectuality are immensely attracted only by women who, forming the very beginning of the intellectual series, are as nearly connected to the preferred springs of nature as they themselves. Nothing intermediate will do, since no woman will ever approach nearer to genius by intellectual education than some unintellectuals do by birth so to speak."

After reading the Irish Times article, I found myself emotionally agape. Wanting, perhaps, to preserve for myself the glory of Schrödinger's scientific reputation, I tried to normalize his behavior. Searching out examples of pedophilia among other Western cultural heroes, I learned that:

Edgar Allan Poe married his 13-year-old cousin.

As reported in the Paris Review, after his wife died, Mark Twain dabbled with "angel-fish." (That's what he called schoolgirls.) There's no indisputable evidence of sexuality, but there was an awful lot of promiscuous cuddling.

Horatio Alger, who wrote rags-to-riches stories about young, disadvantaged boys, was briefly a Unitarian minister. He was expelled from the ministry for boy-directed pederasty.

Rock and roll stars with sexual interests in very young girls have included Jerry Lee Lewis, who was 22 when he married his 13-year-old cousin. (Rumor has it that she still believed in Santa Claus.) A 24-year-old Elvis Presley dated Priscilla when she was 14.

Pedophilia is surprisingly common. The psychiatric profession's Diagnostic and Statistical Manual V places its prevalence in the male population at 3%-5% and acknowledges that it is both highly resistant to treatment and less prevalent among women.

The DSM V has no listing for Pedophilia. Rather, it defines Pedophilic Disorder as a diagnosis assigned to adults (defined as age 16 and up) who have sexual desire for prepubescent children.

According to information published by Johns Hopkins All Children's Hospital, puberty is not a now-you-see-it-now-you-don't kinda thing. For girls, the first signs can come at around age 8. For some girls, the transition into biological adulthood happens quickly. Even if it doesn't, it generally resolves by around age 16. For boys, puberty generally starts about two years later.

Thinking about Schrödinger in the context of the DSM V definition of Pedophilic Disorder, I ran into one point of confusion. At age 14 and age 12, Ithi and Barbara may have reached puberty already. Does that absolve his behavior with them in any way?

By modern standards, legally it does not. In Ireland, the age of sexual consent is now 17. In the United States, each of the 50 states has its own law that defines the age of consent as 16, 17, or 18.

What's more, there are enormous differences between grown men and very young women in terms of social power and the sort of confidence necessary to make wise, safe, sexual choices.

This is all to say that pedophilia like that of Erwin Schrödinger is probably abhorrent any way you look at it.

In 2015, the Norwegian philosopher Ole Martin Moen of the University of Oslo wondered in the Nordic Journal of Applied Ethics about "The Ethics of Pedophilia." According to Moen, pedophilia itself is a morally neutral sexual preference. It is the actions harming children that are immoral.

"Being a pedophile is unfortunate for the pedophile himself, who will most likely not have a good sexual and romantic life...," he wrote.

"Pedophile as victim" seems a stretch even for someone like me who wants to preserve her image of Schrödinger as an intellectual hero. Moen, however, may have been serious about his pity-the-poor-pedophile idea. He suggested a way in which society at large might save pedophiles from themselves while saving children from pedophiles. It could legalize ways for pedophiles to satisfy their compulsions with victimless entertainments like kiddie porn fiction and computer-generated images.

At least the use of computer-generated images would obviate the need for the truly repulsive act of posing real-life children in sexual ways. As Moen pointed out, though, a question of practicality remains. Would making use of such fiction and images satisfy pedophiles' urges? Or might it instead encourage them to act out dangerously?

Moen cited research done in the 1990s that suggests that kiddie porn does not make pedophiles more prone to engage in real-world sex with children and that, indeed, it may give pedophiles a harmless outlet for their sexual urges. This idea seems to be buttressed by 1999 and 2011 research also cited by Moen; when Japan and the Czech Republic lifted their bans on kiddie porn, the rates of child rape dropped. From those two countries' criminal justice experiences, Moen concluded that "Granted our current knowledge, it therefore seems that texts and computer-generated graphics with pedophilic content may result in less adult-child sex."

Schrödinger's cat is both alive and dead until you open the box to look, whereupon it becomes either alive or dead.

As helpful as such texts and graphics may turn out to be, I have trouble imagining a more divisive issue to raise state by state in today's America than whether governments should allow, or perhaps even sponsor, kiddie porn.

Meanwhile, I'm still dealing with my shock about Erwin Schrödinger. For me right now, he exists in two states of being at once. In other words, Erwin Schrödinger has become Schrödinger's Cat. He is both a beacon of scientific light and a monster. Both/and, not yet either/or. That being said, some behavior is just too putrid to tolerate. As the revelations about his behavior continue to curdle inside of me, one of those views will take precedence. Very soon, I suspect, I will say, "He's dead to me."

More here:

Schrödinger's Pedophilia: The Cat Is Out Of The Bag (Box) - Forbes

Read More..