randombio.com | Science Dies in Unblogginess | Believe All Science | I Am the Science
Friday, March 10, 2023 | science commentary

Groupthink in science

ChatGPT is the ultimate in groupthink. It might just be the greatest thing for science ever


Anything the government touches, it destroys. Nowhere is that libertarian maxim more true than in academic science. Academic science is a euphemism for government-funded science. Academia's idea of supporting science is to put a Ph.D. in an empty room with a broken sink and a pile of broken equipment dating back to the Cold War and tell him to get some grants.

A typical academic researcher has more contact with the federal government than anyone else at the university, with the possible exception of the university bureaucrats. The system was bad when I left. Now that I've been forced back in, I've discovered it has deteriorated to the point where many, if not most, scientists spend zero—not almost zero, but literally zero—time doing science.

Non-conformist pawn
An image not generated by AI

Let me put that in bold, as we're supposed to do in a grant. Scientists no longer do science. At least, not in biomedicine. That infamous article at Nature mag, where the author clasped his hands beseeching the heavens for an explanation of why, oh why, scientists are no longer making breakthroughs, was typical of humans doing what they do best: ignoring reality.

The reason they're not making breakthroughs is simple: scientists are too busy to do science.

The government has expanded its inefficiency so much that scientists spend all of the time that remains after useless meetings writing grants. Any science that gets done is not done by the scientists, but by technicians and grad students paid from those grants.

Some scientists, it is true, actually check the students' work before signing their name to it. Those who don't, or those who accidentally hired a student who is good with Photoshop, eventually pay the price.

Consensus is groupthink

Last week I was on a grant review panel, where the government asks us to decide which grants are worth funding. We started with 175 grants. All but fifty or so were immediately thrown away. Of those fifty nearly identical grants, we had to select the ten or so we thought the government would be most likely to want to pay for. Those were the ones the group decided, by consensus—and yes, Virginia, there is consensus in science—were “exciting.” Exciting doesn't mean innovative and significant. It means that the project conforms to the current fad in science and maybe nudges it forward just a bit. There is such a thing as too exciting. Let's not get carried away.

Even Anthony Fauci, the demon bureaucrat of Covid, knows this. He recently said this on Fox News:

[T]he whole scientific community feels that you have to have some degree of being able to manipulate organisms. . . . I think the entire scientific community of virology and infectious disease would argue strongly that if you shut down all of that research, a lot of things that would be important for the health of the country would not be able to be done.

Fauci was trying to defend gain-of-function research, but he accidentally revealed the current paradigm in biomedicine: manipulating the genome of some animal. Every single grant we reviewed wanted to do this. The ones that didn't are now just scattered electrons floating in space. Researchers are clinging to that fad because they saw what happened to climatology and anthropology. They all know that if the government replaces it with something else, the next fad will be something worse.

Applicants who feel the process is unfair are right. One reviewer made so many false statements about one of the grants I was championing that I thought he must be confusing it with a different one. Even if the chairperson drops those comments from the final statement, they have already swayed the other reviewers. The applicant, whoever he or she is, will no doubt wonder what psychotropic drugs we were on.

The solution

Grant writing is the perfect chore for ChatGPT. Indeed, the researchers and the chatbot are both expert at producing gray, stereotyped, soulless prose describing meaningless experiments on transgenic animals that have no ability to actually get the disorder the researcher wants to treat, despite the vast number of mutations that have been introduced into them.

The reviewers also know the grant writers have no intention of doing all that great stuff they propose. Either they have done it already, which is the only way they could be so confident it's going to work, or by the time the grant gets funded they will have forgotten what they proposed to do or discovered that the whole idea was stupid. Well, no matter, no matter: we'll use the money to generate preliminary data for the next grant. That brings us to the main rule about grants:

The purpose of writing a grant is to get enough money to write the next grant.

This is not unique to science, of course. One time I donated fifteen bucks to some nonprofit organization. They used every cent of it to bombard me with requests to send them more. If we're foolish enough to join some website's “VIP” program, the same thing will happen, though nowadays it's all done by computers sending messages to other computers that nobody ever reads.

And therein lies the solution. Let the chatbot write our diversity statements, then let it write our grants. Let it fill out those Vertebrate Animal forms, the Select Agent forms, the Resource Sharing Plan forms, the Authentication of Key Resources forms, the Personnel Justification forms, the PHS398 forms, the SF-424 forms, the Biohazard forms, and whatever else I've forgotten, and send them in. Let another chatbot review them—a perfect task for it, as a grant review panel is essentially an exercise in groupthink—and create a third chatbot to wrangle with the bureaucrats about how much, if any, of the grant money gets distributed to the lab. And leave us the hell alone.

When ChatGPT turns government inefficiency into a black hole from which nothing can escape, the rest of us can go back to work in peace.


mar 10 2023, 7:13 am



