Thursday, September 15, 2005

Google censors, organic chemistry rules, and don't believe a word of it

Searching for the real Paris Hilton--Did you know that Google filters its searches so as not to offend the offendable? It’s an option most people don’t know exists, called SafeSearch, and you set it on your Google Preferences page. You can choose moderate filtering, which excludes most explicit images (presumably including Paris Hilton’s reason for fame--she apparently is not big on the missionary position); strict filtering (which guarantees you get nothing of interest); or turn it off and go with the flow (which, trust me, you don’t want to do). The problem with filtering is that no filtering system is perfect and most aren’t very good. You get false positives; it filters out things you really want because this is just a machine, after all. It is possible, for instance, to filter out breast cancer sites you might want because the damned program thinks they’re dirty. So, to the rescue we have Google UnSafeSearch for those of you who want to make sure you don’t miss anything important but don’t want gynecological adventures in your web searches. (It also lets you filter when kids are on the computer and still not miss what you want when you get the computer back.) It cleverly grabs the first 100 results of a filtered Google search and the first 100 results of an unfiltered search and compares the two. It then shows only those results which appeared solely in the unfiltered search, so you can see what the filter filtered. It’s here. We aim to please (and thank you, Jonathan. Again.)
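The compare-and-subtract trick is simple enough to sketch in a few lines of Python. (The function name and the toy result lists below are made up for illustration; the real tool actually queries Google twice and diffs the answers.)

```python
# Sketch of the UnSafeSearch idea: take the top results from a filtered
# search and from an unfiltered search, and show only the results that
# appear solely in the unfiltered list -- i.e., what SafeSearch hid.

def unsafe_only(filtered_results, unfiltered_results):
    """Return the unfiltered results that the filter removed, in rank order."""
    seen = set(filtered_results)
    return [url for url in unfiltered_results if url not in seen]

filtered = ["a.example", "b.example", "c.example"]
unfiltered = ["a.example", "x.example", "b.example", "y.example", "c.example"]

# The two results the filter suppressed:
print(unsafe_only(filtered, unfiltered))
```

A set makes the membership test fast, and iterating over the unfiltered list preserves Google's original ranking of the suppressed results.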

What’s with this organic chemistry?--Having described who the most cited clinical scientists in the world are, what about the rest of science? Which are the most cited papers in each country, across all the sciences? ISI Essential Science Indicators has an answer. In the U.S., the paper cited most often in the last 10 years was “Gapped BLAST and PSI-BLAST: A New Generation of Protein Database Search Programs,” published in Nucleic Acids Research (I have a copy by my bed-stand) by Stephen Altschul et al. at the National Institutes of Health. It has been cited--and is still being cited--13,478 times. I haven’t the faintest idea what it is about, but it smells of organic chemistry and I try to avoid such things. The country with the second most cited papers is England, and the winner there is "Specificity of Receptor Tyrosine Kinase Signaling: Transient Versus Sustained Extracellular Signal-Regulated Kinase Activation" by C. J. Marshall, in Cell in 1995. It has been cited 2,544 times, and I don’t understand that one either. Too bad the authors don’t get residuals.

Look, I had a grant. I had to say something--And then there is the possibility that most of those papers--indeed, most scientific papers--are wrong. A Greek-American physician, writing in the Public Library of Science, has proposed just that. You’ve seen it happen: coffee is bad for you; no, it’s good for you; wait a minute, it will take years off your life. Vitamin E will prevent cancer. No, vitamin E will not only not stop cancer, it can cause a heart attack. Make up your minds. John P. A. Ioannidis of the University of Ioannina School of Medicine and Tufts says the truth of a science paper depends on how many other papers on that subject have been published and on the “ratio of true to no relationships among the relationships probed,” although I’m not sure what that means. Here’s Ioannidis:
There is increasing concern that most current published research findings are false. The probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and, importantly, the ratio of true to no relationships among the relationships probed in each scientific field. In this framework, a research finding is less likely to be true when the studies conducted in a field are smaller; when effect sizes are smaller; when there is a greater number and lesser preselection of tested relationships; where there is greater flexibility in designs, definitions, outcomes, and analytical modes; when there is greater financial and other interest and prejudice; and when more teams are involved in a scientific field in chase of statistical significance. Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias.

What he says.
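For the curious, the "ratio of true to no relationships" business boils down to a formula Ioannidis gives for the chance that a claimed finding is actually true: PPV = (1 − β)R / (R − βR + α), where R is that ratio, α is the false-positive rate, and β the false-negative rate. A quick back-of-envelope sketch (the example values of R and β are my own illustrations, not his):

```python
# Ioannidis's positive predictive value (PPV): the probability that a
# claimed research finding is actually true.
#   R     = ratio of true relationships to no-relationships in the field
#   alpha = type I error rate (false positive), conventionally 0.05
#   beta  = type II error rate (false negative); statistical power = 1 - beta

def ppv(R, alpha=0.05, beta=0.2):
    return (1 - beta) * R / (R - beta * R + alpha)

# A field where 1 in 10 probed relationships is real: claims are mostly true.
print(ppv(0.1))
# An exploratory field where only 1 in 100 is real: most claims are false.
print(ppv(0.01))
```

The punchline is visible in the numbers: with plenty of real relationships to find, a positive result is probably genuine; go fishing in a field where true effects are rare, and most "discoveries" are noise.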

[The picture above is Piltdown Man, one of science's most famous frauds. He probably wouldn't have liked the missionary position either.]
