Printable handouts are culled from free authoritative sources such as UpToDate Patient Handouts, MedlinePlus, Mayo Clinic, Kidshealth.org, and InteliHealth. There are buttons for selecting Spanish language, seniors, low literacy, pediatrics, and large print. Additional languages may be added later.
An article in the August 23 edition of the Los Angeles Times offers some insight into physicians’ use of the Internet and Google in daily practice.
While noting that “86% of doctors say they regularly use the Internet on the job,” Dr. Rahul K. Parikh reports that it “sometimes feels like cheating on an exam” and that the results, even from Google Scholar, aren’t all they’re cracked up to be. He points out, for example, that Google Scholar pulls up old articles first because its algorithm gives greater weight to articles that are cited more often. I didn’t know that!
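To see why citation-weighted ranking favors old articles, here’s a minimal sketch. This is not Google Scholar’s actual formula (which is proprietary); the article titles and citation counts are invented. The point is simply that older papers have had more time to accumulate citations, so any ranking that rewards citation counts tends to push them to the top:

```python
# Illustrative only: a made-up citation-weighted ranking, not Google Scholar's.
def rank_by_citations(articles):
    """Sort articles so the most-cited come first."""
    return sorted(articles, key=lambda a: a["citations"], reverse=True)

articles = [
    {"title": "Classic study",      "year": 1995, "citations": 1200},
    {"title": "Recent follow-up",   "year": 2008, "citations": 150},
    {"title": "Brand-new trial",    "year": 2009, "citations": 4},
]

for a in rank_by_citations(articles):
    print(a["year"], a["title"])
# The 1995 article lands on top, even though it's the oldest.
```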
Read the full story: In Practice: Dr. Google has mixed results
I’ve been training librarians to use the various databases in the Michigan eLibrary for a number of years. Of course, every time I think I’ve got the “advanced” techniques down cold, the publishers come along and tweak the programming behind the scenes to make it easier for those using basic search techniques to get a better result. In the process, my “expert” knowledge gets tossed out the window without my being any the wiser.
A recent post in Laika’s MedLibLog demonstrates this principle with the Cochrane Library. When more is less: Truncation, Stemming and Pluralization in the Cochrane Library points out that Cochrane’s search feature automatically uses stemming, pluralization, and singularization, so searching tumor finds tumor, tumors, tumour, and tumours. The feature only works when searching without truncation, though, so using tumor* will find tumorectomy, etc., but not tumour and its variations.
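The behavior Laika describes can be sketched in a few lines. The variant table and function names below are my own invention for illustration, not the Cochrane Library’s code; they just show how automatic stemming/pluralization expands a bare term, while a truncated term falls back to plain prefix matching:

```python
# Hypothetical sketch of the Cochrane search behavior described above.
# Without truncation, the engine expands a term to its stemmed/plural/
# spelling variants; with truncation (an asterisk), it does a plain
# prefix match and the automatic expansion is switched off.

VARIANTS = {
    "tumor": {"tumor", "tumors", "tumour", "tumours"},
}

def expand(term):
    """No truncation: stemming and pluralization kick in."""
    return VARIANTS.get(term, {term})

def truncate(term_with_star, vocabulary):
    """Truncation: match the prefix literally, no variant expansion."""
    prefix = term_with_star.rstrip("*")
    return {word for word in vocabulary if word.startswith(prefix)}

vocab = {"tumor", "tumors", "tumour", "tumours", "tumorectomy"}

print(sorted(expand("tumor")))            # all four spelling/plural variants
print(sorted(truncate("tumor*", vocab)))  # tumorectomy, but no tumour(s)
```

So the paradox in the post’s title: adding the wildcard actually narrows the search for British spellings.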
The lesson for us expert searchers is, from time to time, to check the “help” documents to see if our techniques are still valid — particularly if we start getting odd results.
A Bing search box has turned up at the top of our browsers at work, and I’ve used it a few times; it’s OK. But I’m still a Google user just because I’m comfortable with it. So today’s NPR story, “Bing vs. Google: a weeklong experiment,” was really intriguing. National correspondent James Fallows used Bing … or rather, started to use Bing … to track facts for a news story. He found some things missing from Bing, and some missing from Google, and had the best results using www.bing-vs-google.com’s side-by-side display of the two. Try it out! (Remember Dogpile?)
Another recent Google story in Search Engine Land: Google’s Personalized Results: the “new normal” that deserves extraordinary attention. Earlier this month, Google announced in its blog that personalized search will now be available to signed-out users. While it’s possible to opt out — see the Google Blog story for details — most of the time, most search results will be “customized” to reflect previous searches. The Search Engine Land story explores the implications.
— Google’s Personalized Results story via Jessamyn West’s librarian.net: the nature of observing disturbs the observed
Biomedicine on Display points us to search-cube, a search engine powered by Google, Thumbshots.org, and Symmetri. Results are displayed as a 3D cube, weighted toward images. A search for “Michigan Health Sciences Library Association MHSLA” displays images from this and other blogs as well as the front pages of the MHSLA website and MHSLA blog. It’s not comprehensive — the information seems to come from the first few pages of a Google search — but it’s a creative way to look at results.
Science Roll brings another visual search display to our attention: Wikipedia Roll, a mashup that organizes results of a Wikipedia search into clusters of related information. For example, a search for “medical library” offers the text of the Wikipedia article on the topic, a cluster of key elements, a cluster of associations, and a cluster called “see also”, with links to the National Library of Medicine, Fred Kilgour, and the Canadian Health Sciences Library Association, among others. The clusters and results box can be moved around the screen, and clicking a link from the display (such as “National Library of Medicine”) performs a Wikipedia Roll search on that topic.
Cognition Technologies offers a free semantic search engine pulling results from Medline abstracts. They call it Semantic Medline, or Medline.Cognition; it has two URLs, http://www.semanticmedline.com/ and http://medline.cognition.com/ . The help page delineates the proper use of capitalization, quote marks, Boolean, proximity, pattern matching, wildcards, and required vs. optional search words for Cognition searching.
An interesting feature: a set of dropdown boxes appearing on the results page that allow the user to tweak the search by selecting a more appropriate meaning for a search term, where necessary. For example, the search term “pain” has three meanings: unpleasant physical feelings (the default choice); vexatious person/hassle/annoyance; and unpleasant emotional experience. A “use all” option is also available.
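As a rough illustration of how that dropdown might behave, here’s a sketch. The three sense labels for “pain” come from the description above; the data structure and function are hypothetical, not Cognition’s actual implementation:

```python
# Hypothetical sketch of Semantic Medline's word-sense dropdown.
# Sense labels are from the post; everything else is invented.

SENSES = {
    "pain": [
        "unpleasant physical feelings",        # default choice
        "vexatious person/hassle/annoyance",
        "unpleasant emotional experience",
    ],
}

def resolve(term, choice=None):
    """Return the senses to search: the user's pick, the default,
    or every sense when 'use all' is selected."""
    senses = SENSES.get(term, [term])
    if choice == "use all":
        return senses
    return [choice] if choice in senses else [senses[0]]

print(resolve("pain"))             # default: the physical sense
print(resolve("pain", "use all"))  # all three meanings
```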
My sample searches using Semantic Medline sometimes retrieved more results than a matching PubMed Boolean search and sometimes fewer. It appears that foreign-language articles are not included in Semantic Medline.
Thanks to David Rothman for the pointer.
Unbound MEDLINE (http://www.unboundmedicine.com/medline/ebm/) provides a clean, simple interface for filtered clinical query searches in PubMed data. Keyword and “advanced” searching, as well as “browse by topic,” are also available. While it looks great in a web browser, it was designed with the handheld device in mind. Unbound MEDLINE is a service of Unbound Medicine (http://www.unboundmedicine.com/), which “develops next-generation knowledge management systems in healthcare.”
Thanks for this tip to Exploring the Evidence Base (http://clinicallibrarian.wordpress.com/), a “bliki” from the Centre for Clinical Effectiveness in Melbourne.