Archive for February, 2010
In the United States, a signature is often colloquially called a “John Hancock,” after the first and most flamboyant signatory of the Declaration of Independence. According to legend, Hancock signed his name so large and clear that King George could read it without his glasses. Even though killjoys debunked this tall tale long ago, it is a lot more fun (and completely harmless) to just go on pretending that it is true.
Despite enjoying widespread name recognition, John Hancock has suffered the strange fate of remaining an anonymous historical figure to most Americans. He was not only one of colonial America’s most ardent revolutionaries and philanthropists; he also served nine terms as governor of Massachusetts and was president of the Continental Congress when the Declaration was signed. Born in Braintree, Massachusetts in 1737, he was orphaned young and adopted by a wealthy, childless merchant uncle. When his uncle died in 1763, John Hancock inherited what was said to be the greatest fortune in New England. He staked much of that fortune on the success of the revolution and risked being hanged for treason by signing the Declaration (and so boldly, to boot!).
Perhaps we know so little about John Hancock because his iconic signature says everything we really need (or want) to know about him: that he was bold, brave and thoroughly committed to the cause of an independent America. His signature set the standard for the rest of the Founding Fathers to follow.
In the twelfth month, which is the month of Adar, on its thirteenth day … on the day that the enemies of the Jews were expected to prevail over them, it was turned about: the Jews prevailed over their adversaries. – Esther 9:1
And they gained relief on the fourteenth, making it a day of feasting and gladness. – Esther 9:17
[Mordecai instructed them] to observe them as days of feasting and gladness, and sending delicacies to one another, and gifts to the poor. – Esther 9:22
Purim is a holiday that commemorates the deliverance of the Jewish people of the ancient Persian Empire from (super evil dictator) Haman’s plot to exterminate them. According to the Book of Esther, Haman had chosen the 13th of Adar as the day to kill the Jews (after casting lots to help him decide), but on that day the Jews successfully defended themselves against his vicious attack. Purim is therefore celebrated on the 14th of Adar (usually in March), the day the Jews rested, feasted and rejoiced.
In cities that were walled in the time of Joshua, Purim is celebrated on the 15th of the month, because the Book of Esther says that in Shushan (a walled city), deliverance from the massacre was not complete until the next day. The 15th is referred to as Shushan Purim. In leap years, when there are two months of Adar, Purim is celebrated in the second Adar, so that it always falls one month before Passover.
Purim is celebrated with a public reading of the Book of Esther. During the reading, it is customary to boo, hiss, stamp one’s feet and rattle noisemakers whenever (super evil dictator) Haman’s name is mentioned. There is also mutual gift-giving and a celebratory meal, replete with costumes and lots of wine. In sum, Purim is a blast, and many Jews describe it as their favorite holiday.
A giant hairy nevus, also known as a giant congenital nevus, is a dark-colored and often hairy patch of skin that is present at birth. These nevi are commonly found on the upper or lower back or the abdomen, but they may also appear on the arms, legs, palms, soles, and even the mucous membranes of the mouth. While doctors know that they result from a proliferation of benign melanocytes in the dermis, epidermis, or both, the etiology of the condition remains unknown.
Smaller in infants and children, the nevus continues to grow along with the child if it is not removed during infancy, and it usually measures more than 8 inches across by the time it stops growing. As a nevus matures, it often becomes markedly thicker and elevated. To make matters worse for the unfortunate bearer, prominent dark hairs usually sprout from the cursed eyesore at the onset of puberty. It may also develop variations in color, and the surface may become mottled with additional growths. In sum, a giant hairy nevus screams, “Look at me, World!” despite every effort on the part of the world to avoid doing so. The conventional horrors of adolescence (complete with acne, greasy hair, braces and mood swings) suddenly look like something to be grateful for.
Surgical excision is the standard of care for removing a giant hairy nevus; the earlier the surgery is performed (starting as early as 6 months of age), the better the results. Congenital nevi are not merely a source of psychological torment for those who suffer from the condition; they also carry a high premalignant potential for melanoma. After the excision of a giant hairy nevus (which sometimes requires several surgeries if it is HUGE), lasers and dermabrasion are often used to improve the appearance of the skin and help alleviate the inevitable scarring.
Skin grafting is also necessary in instances where the nevus is extremely deep and/or large. However, doctors caution that these techniques may remove only the visible portion of the nevus and may make it harder to detect skin cancer. Thus, an important part of follow-up treatment involves frequent examinations to check for signs of melanoma in the affected area(s).
Unsurprisingly, psychological counseling is also urged to help sufferers cope with the emotional impact of having a disfiguring disorder.
It is unsurprising that so many persistent myths, legends and lore surround the life of George Washington, the Grand Poobah of America’s Founding Fathers. Contrary to one of the most enduring myths about our first President, Washington never owned a set of wooden teeth. However, there is a grain of truth behind every myth, and Washington did in fact suffer from lifelong dental problems that forced him to rely on many sets of dentures during his lifetime.
Washington lost his teeth at a young age by today’s standards, due in no small part to the primitive oral hygiene practices of the day and a long history of debilitating illnesses he described at length in his journals: a nasty case of smallpox in 1751; a bout of violent pleurisy the following year; severe headaches and dysentery in 1755; and a wretched outbreak of now-exotic “breakbone fever” (dengue fever) in 1761. This is not to mention the intermittent attacks of malaria, flu and chronic rheumatic complaints that he also endured over the ensuing years.
Thus, violent toothaches, followed by the removal of the offending teeth, were a yearly occurrence for Washington until he finally had no teeth left. He wrote movingly about his battles with infected and abscessed teeth, inflamed gums, and the ill fit of his first few sets of dentures. In fact, his chronic dental problems are often partially credited with exacerbating his legendary bad temper.
The most remarkable thing about the enduring legend of Washington’s wooden teeth is that the truth is far stranger than the fiction: his favorite dentures (he owned two pairs), crafted by Dr. John Greenwood (the most prominent American dentist of the day), were actually carved from HIPPOPOTAMUS ivory and pure gold!
One pair was lost long ago, but the other set was donated to the University of Maryland Dental School. The school kindly loaned the dentures to the Smithsonian in 1976 for a bicentennial exhibit, only to be punished for its generosity when they were mysteriously stolen from the Smithsonian’s storage area. Sadly, they have never been recovered, and the culprit remains at large…
Even celebrities aren't immune to the scourge...
The bane of adult women everywhere, cellulite is an unsightly cosmetic condition that occurs when the skin of the thighs, buttocks, abdomen and/or pelvic region develops a dimpled and uneven appearance. It is the phantom that jeers at you during bathing suit season, stalks you in dressing rooms and drives women to avoid lights-on romps in bed. Even though it is a descriptive term rather than an actual “physical” condition, it is all too real to those afflicted with the blight of “cottage cheese” thighs.
Most women (even thin ones) start getting cellulite after puberty. However, most men (even fat ones) never develop it, because the connective tissue under men’s skin is crisscrossed like a net, which better restrains their fatty deposits. Women’s tissue bands are organized in vertical columns instead (why God why?), so fat is more likely to bulge irregularly. Moreover, thanks to estrogen, women have more fatty reserves to begin with.
Sadly, no “cure” currently exists for cellulite. This has presented the beauty industry with a perfect opportunity to make a lot of money off women’s anxiety about their looks (nothing new there). Hundreds of cellulite “treatments” abound, from contour-refining lotions to massage machines with laser light sources. Despite the dubious efficacy of these products, cellulite reduction devices generated more than $47 million in revenue in 2008, and industry insiders predict that figure will grow to $62 million by 2013.
However, dermatologists say that a lasting cellulite remedy would have to address the complex interplay between skin, fat, connective tissue and underlying muscle. In sum, ladies shouldn’t hold their breath for a “cure” to come along anytime soon.
Is this supposed to be a threat?
Prohibition in the United States, also known as “The Noble Experiment,” was the period of almost fourteen years during which the manufacture, sale and transportation of alcohol were made illegal, as mandated by the ratification of the Eighteenth Amendment to the Constitution. The Temperance movement blamed alcohol for many societal ills (such as crime, murder and prostitution), and its membership was overwhelmingly (and unsurprisingly) made up of women who were sick of their husbands coming home drunk when they themselves weren’t permitted to join in on any of the fun.
Hell hath no fury like a group of sober women scorned. Congress finally caved in to the substantial pressure from Temperance organizations, the Eighteenth Amendment was ratified in January 1919, and Prohibition officially went into effect one year later, on January 17, 1920, narrowly sparing New Year’s Day, the one day of the year on which the nation officially wakes up with a collective hangover.
Despite the lofty aims of “The Noble Experiment,” Prohibition had the opposite of its intended effect on the drinking habits of Americans; alcoholism rates soared in the 1920s, as did crime, thanks to the meteoric rise of bootlegging and organized crime. Frustrated by the public’s unexpectedly defiant reaction to the new laws, federal officials decided to resort to more extreme measures to curb illicit drinking.
In what became known as the “chemist’s war on Prohibition,” the feds ordered the “denaturing” of all industrial alcohols manufactured in the United States, which were regularly stolen by bootleggers and sold to unwitting and thirsty consumers. Denatured alcohol is ethanol that contains additives that render it poisonous and/or unpalatable, and thus undrinkable. When the bootleggers responded to this obstacle by hiring chemists to “renature” the industrial alcohol, the feds responded by ordering manufacturers to make their products deadlier.
By the mid-1920s, these new toxic formulas included such notorious poisons as kerosene, brucine (a plant alkaloid closely related to strychnine), gasoline, benzene, cadmium, iodine, zinc, mercury salts, nicotine, ether, formaldehyde, chloroform, camphor, carbolic acid, quinine, and acetone. The Treasury Department also demanded that more methyl alcohol be added (up to 10 percent of total product). It was this last that proved most deadly. Alas, while the feds managed to kill an estimated 10,000 people, they were unable to quell America’s passion for booze.
And to that, I propose a toast…
The placebo effect is the measurable, observable, or felt improvement in health or behavior that is not attributable to an administered medication or invasive treatment. The idea of the placebo effect originated with H.K. Beecher in the 1950s, when he reported that 35% of patients were satisfactorily relieved of their symptoms by a placebo alone.
Subsequent studies have claimed that the placebo effect is even more powerful than Beecher thought, with an estimated 50-60% of test subjects satisfactorily treated with placebos for certain conditions. However, a new study published in the Journal of the American Medical Association last month sent shock waves through the medical community when researchers reported that the benefits of antidepressants are hardly greater than what patients get from a placebo pill. In sum, their findings suggest that antidepressants are little better than expensive Tic Tacs.
However, many doctors are reluctant to blow the whistle on antidepressants, fearing that doing so will harm their patients’ outcomes. The placebo effect relies on a complex interplay between belief, expectation and attitude, and doctors (perhaps correctly) fear that telling their patients the truth will upset this equilibrium and that their depression will return. Yet antidepressants are NOT just placebo pills with futuristic names; they are extremely costly drugs with notable side effects, such as weight gain, liver damage and sexual dysfunction. Moreover, people who stop taking antidepressant medications abruptly often experience a host of withdrawal symptoms, including twitches, tremors, blurred vision, and nausea, as well as depression and anxiety.
This study raises some very uncomfortable ethical questions about these popular medications, especially given the estimated 27 million Americans who are currently prescribed antidepressants. Doctors may find themselves caught between staying quiet to preserve their patients’ positive outcomes and telling them the truth at the risk that they relapse into depression. Now that’s a hard pill to swallow…