Opinion | ChatGPT and the Human Mind: How Do They Compare?

To the Editor:

Re "Noam Chomsky: The False Promise of ChatGPT," by Noam Chomsky, Ian Roberts and Jeffrey Watumull (Opinion guest essay, nytimes.com, March 8):

Dr. Chomsky and his co-authors are correct that A.I. is nothing like the human mind, which took millions of years to evolve using the resources of the whole earth. A.I. developed over a few decades using a minuscule fraction of the earth's riches.

The human brain is amazingly slow, inaccurate and forgetful. It is incapable of the quick, high-precision floating-point arithmetic that solves equations to many decimal places. Computers are millions of times faster, with essentially infallible memory, perfect attention and limitless patience. The computer was a product of the human mind, which is truly wonderful.

Contrary to the writers' assertions, there is no doubt that machines will eclipse and replace humans at science, math and engineering within this century. But future A.I. will exploit Bayesian algorithms rather than boring old deep learning like ChatGPT. (Bayesian methods require minimal training data, promise optimal accuracy and quantify uncertainty, capabilities that deep learning lacks.)

It is hard to imagine that computers would also eclipse humans in terms of evil.

Fred Daum
Carlisle, Mass.

To the Editor:

Noam Chomsky and his co-authors have explained from a linguistic perspective the unbridgeable chasm that separates A.I. and chatbots, remarkable products of language analysis and synthesis, from human intelligence and knowledge.

But there is a more fundamental difference than the ones mentioned. The intelligence that chatbots create is an abstraction of mind and knowledge, amputated from the primary human data of bodily feelings and emotions on the one hand, and from sensory-perceptual awareness of the external world on the other.

The only way technology can solve this problem would be to create hybrid humans with implanted robotic connections, a development I shudder to contemplate.

Michael Robbins
Amherst, Mass.
The writer is a psychoanalyst, a former professor of clinical psychiatry at Harvard University, and the author of "Consciousness, Language and Self."

To the Editor:

In their thoughtful and clarifying article on the new breed of A.I. marvels, Noam Chomsky and his co-authors conclude that we can only laugh or cry at their popularity.

On balance, I fear that tears are in order, followed rapidly by hard work to circumvent the potentially destructive powers of artificial intelligence. The West's lethal cocktail of judgmentalism, commodification and surveillance could all too conceivably lead to A.I. being employed primarily for the oppression of the individual.

Once that happens, we will be looking to Kafka, Bulgakov and Frost for lessons on how to say one thing but mean entirely another.

Fin Keegan
Newport, Ireland

To the Editor:

It's been less than six months since ChatGPT exploded into public awareness. It immediately became controversial. Some would outlaw it. Some embrace it. Others applaud.

ChatGPT is a top-notch new learning tool. It even has the potential to break writer's block. Why are schools pushing back? Some fear cheating, as though rectitude were more important than learning.

Consider this: Assign students to have ChatGPT write a paper. Then ask those students to critique the resulting essay by standards of logic, bias, scholarship, content, style and creative thinking. After that, ask the students to rewrite the paper to overcome the shortcomings that their critique has disclosed.

I can't think of a better way to teach better thinking, better writing and better research than by having human students critique a machine-written essay.

What are we afraid of? Let's have faith in our human species.

Jack Cumming
Carlsbad, Calif.

To the Editor:

Noam Chomsky and his co-authors are right on target. ChatGPT is fascinating, but the hype is way overblown.

My experiences in two areas of interest could not have been more different. In the data science arena, it performs very well when writing Python programs to my specifications, although the code requires some editing.

On the other hand, in my hobby area, history, it produces wildly inaccurate results but delivers them with great confidence. The essay's writers explain the reasons it does this.

Sorry, kids, I would not count on it to write term papers.

Roger Gates
Fort Worth

To the Editor:

Re "Wellesley Students Vote to Open Admissions to Transgender Men" (news article, March 15):

Wellesley students pressuring the college to admit trans men have the issue exactly backward. They fail to make the appropriate distinction between sex and culture.

Sex is a biological category generally assigned at birth (or at some point in utero). Its various components may occasionally be at odds with one another. Gender is a cultural category that reflects how a person lives a life, which may at times be at odds with that person's sex.

Women's colleges are cultural/educational institutions devoted to women. They commonly admit trans women, as well they should. It is not in line with that mission to admit trans men or even those preferring to escape traditional gender categorization altogether.

Judith Shapiro
Bryn Mawr, Pa.
The writer was the president of Barnard College from 1994 to 2008 and is emerita professor of anthropology at Barnard and Bryn Mawr College.

To the Editor:

"Excuse After Excuse: Black and Latino Developers Struggle to Expand" (Real Estate, March 5) points to lack of capital access as a key reason for the abysmal number of successful Black and Latino developers. This challenge is experienced by people of color across industries.

To fix this, we must reform lending's most consequential step: underwriting. Traditionally, underwriters look unfavorably on factors like smaller down payments and higher debt-to-income ratios that are more prevalent among nonwhite borrowers because of longstanding systemic racism.

There are fairer methods to determine an applicant's likelihood and ability to repay. Our Underwriting for Racial Justice working group includes lenders piloting different underwriting approaches, such as evaluating credit histories instead of using hard credit-score cutoffs. The result is high-performing, more racially diverse portfolios.

The financial industry has an opportunity to replace underwriting standards that perpetuate the crisis of representation in the development industry and beyond. We can spread more equitable practices to make real, systemic change.

Erin Kilmer Neel
Oakland, Calif.
The writer is executive director of Beneficial State Foundation, which seeks a more equitable banking system.

To the Editor:

"How Tech Tips the Scales on Gratuities" (Business, March 2) shines a bright light on a systemic issue reflecting how this country values its workers. Rather than use tech to guilt customers into tipping, we should pay all workers a living wage that's baked into the cost of goods and services, as it is in so many other nations.

Tom Salyers
Takoma Park, Md.
The writer is director of communications at the Center for Law and Social Policy.
