Archive for March, 2010
The practice of castration has historically been employed to great effect in certain cultures of Europe, the Middle East, India, Africa and China, for punitive, religious and/or social reasons. One of the most famous victims of castration, the medieval French philosopher, scholar, teacher, and (later) monk Pierre Abélard, was forcibly castrated by the relatives of his lover, Héloïse, as punishment for having “seduced” her. While castration is no longer an accepted means of pursuing vigilante justice, a growing number of countries utilize surgical and chemical castration on convicted sex offenders in the hope of reducing recidivism rates. Unsurprisingly, the long-term efficacy and ethics of this treatment are hotly debated.
The states of California, Florida, Georgia, Montana, Oregon, Wisconsin, Louisiana and Iowa have authorized the use of chemical castration for sex offenders in an effort to prevent sexual assaults. Chemical castration is the administration of medication designed to reduce libido and sexual activity, in the hope of preventing rapists, child molesters and other sex offenders from repeating their crimes. Unlike surgical castration, in which the testes are removed and the production of testosterone ceases completely and permanently, chemical castration is not a form of sterilization and is generally considered reversible when the treatment is discontinued. In an effort to stave off criticism by human rights activists, officials were quick to point out that taking the medication is still entirely voluntary; the incentive, though, is that incarcerated sex offenders who agree to take the injection will serve much shorter prison sentences.
Interestingly, many feminists have taken issue with the implementation of this practice, arguing that rape is more about power than it is about sex. Thus, while chemical castration may lower a rapist’s libido, it does not necessarily eliminate the compulsion to exert force or dominance over another human being. While the jury is still out with respect to the long-term impact of chemical castration on recidivism rates, statistics strongly support the efficacy of physical castration in reducing the re-offense rate of sex offenders. In a landmark 1963 study, German researchers reported that physical castration resulted in a 20-year re-offense rate of less than 2.3%, versus 80% in the untreated control group, far lower than had been anticipated.
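To put the cited figures in perspective, a quick back-of-the-envelope calculation of relative risk is sketched below. The rates come straight from the study figures quoted above; the `relative_risk` helper is purely illustrative and not drawn from any source.

```python
# Back-of-the-envelope comparison of the reported 20-year re-offense rates:
# 2.3% for physically castrated offenders vs. 80% for the untreated controls.

def relative_risk(treated_rate: float, control_rate: float) -> float:
    """Ratio of the treated group's re-offense rate to the control group's."""
    return treated_rate / control_rate

treated = 0.023   # re-offense rate after physical castration (cited above)
control = 0.80    # re-offense rate in the untreated control group

rr = relative_risk(treated, control)
print(f"relative risk: {rr:.3f}")         # 0.029
print(f"risk reduction: {(1 - rr):.1%}")  # 97.1%
```

In other words, taken at face value, the cited figures imply a roughly 97% relative reduction in re-offense, which is why the study is so often described as "landmark."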
Research indicates that chemical castration can temporarily reduce sex drive, compulsive sexual fantasies, and the capacity for sexual arousal. Life-threatening side effects are rare, but many men demonstrate a greater propensity towards weight gain (despite exercise) and reduced bone density, which increases long-term risk of cardiovascular disease and osteoporosis. These men may also experience other “feminizing” effects, such as gynecomastia (breast growth) and a general loss of body hair and muscle tone. Interestingly, many sex offenders who have voluntarily submitted to chemical castration actually report being happy with its effects, claiming that it relieves them of the daily burden of fighting off their overwhelming sexual urges.
What Do You Think? »
For high school students who long to attend the college of their choosing, the SAT (formerly Scholastic Aptitude Test and Scholastic Assessment Test) is more than just another test; it is the gatekeeper of their future. Thus, it should come as no surprise that this $45, three-hour-and-forty-five-minute test has elicited heated controversy since it was first administered in 1926. The charge that the SAT is slanted in favor of privileged children—“a wealth test,” as Harvard law professor Lani Guinier calls it—has been ubiquitous. The large body of research on class and cultural bias in standardized testing generally has two main components: in some questions, white people are made to look superior to minorities; and in some questions, there is a presumption of knowledge that is more likely to be held by whites than minorities, giving white students a hidden advantage.
One of the most widely cited examples of this bias is the now-infamous “oarsman-regatta” analogy question, which asked the test taker to identify the pair of terms most analogous to the relationship between “runner” and “marathon.” The correct answer was “oarsman” and “regatta.” Critics of this question argue that the correct response assumes that test takers are familiar with crew, a sport that actually manages to be whiter and more elitist than squash, if that is even possible. Accordingly, 53% of white students answered the question correctly, while only 22% of black students did.
Educational Testing Service (ETS), the non-profit organization that creates questions for the SAT, GRE and a plethora of other standardized tests, responded to this criticism by developing a set of fairness guidelines (which are posted on their Web site). “Every question is reviewed by trained sensitivity reviewers to ensure that questions do not contain any materials that might contain racial, gender, geographic, or other obvious biases,” said Thomas Ewing, who directs media relations for the testing service.
However, even if we start from the premise that most cultural bias has been successfully eliminated from the test, these changes do little to address the real inequality underlying it: the meteoric rise of the extremely lucrative SAT coaching business over the past decade. These ubiquitous tutorials tout a series of special tips and strategies for acing the SAT, and the classes have become de rigueur for privileged students. Thus, even with heightened attention towards ameliorating “bias” on the actual SAT, the fact remains that the privileged will always find a way to use their money to maintain every advantage for their children. Frankly, I can’t say that I would do any differently if given the choice….
What Do You Think? »
Rote learning is a learning technique that fixes information in memory through repetition (memorization), and it has traditionally been the backbone of elementary school curricula throughout the world. In the United States, rote learning has been strongly criticized by some educators who believe that the process involves learning facts without developing a deeper understanding of them. These critics characterize rote learning as “out of style,” “ghastly boring” and “mindless.” They argue, for example, that memorizing vocabulary words is pointless if children do not know how to use them in conversation. On the other hand, proponents of rote learning maintain that it is a condition precedent in the learning process, one that establishes the foundation for the deeper understanding that comes with time. They also defend memorization as an absolute necessity in some areas, such as learning multiplication tables, state capitals, foreign languages and the steps in a complex process or equation.
The consensus today sides with the critics of rote learning, and new national curriculum standards have been refashioned to reflect the belief that instant recall is superfluous in the internet age. In fact, today’s technology has many insisting that school children “should no longer be forced to memorize facts and figures because such information is readily available on the internet.” However, it is not entirely clear that dismissing all forms of rote learning will in fact make students stronger in the long run; many an elementary teacher insists that the process of rote learning may well serve as a catalyst for the development of the brain as a whole, paving the way for higher order thinking down the road.
Despite having fallen out of favor in the United States, the rote learning system is still enthusiastically practiced around the world, particularly in Asian countries such as India, China and Japan. Notably, these nations are admired for their high test scores in mathematics and science in international comparisons. While it is admirable that teachers are now devoting more classroom time to developing higher order thinking skills, the all-or-nothing approach to curriculum reform that prevails in the U.S. education system leaves no room for assessing different scholastic subjects differently. The best approach for learning math and science, for example, as reflected in the much higher aptitude scores found in Asian countries, might be the old-fashioned repetition-and-drills approach of rote learning. Subjects that require more analytic thinking, such as the humanities, should eschew rote learning in favor of critical analysis.
At the end of the day, it is clear that both approaches to learning are relevant and, when truly examined, almost inseparable. Neither should be adopted as the absolute be-all and end-all approach to teaching America’s children.
What Do You Think? »
Little is written in the Synoptic Gospels about the life of St. Thomas the Apostle; nevertheless, his distinctive personality is clearer to us than that of some of the other disciples (with the exception of Judas, of course). He is often referred to as “Doubting Thomas,” a term that has come to describe someone who stubbornly refuses to believe something without direct, physical and personal evidence (also known as a skeptic).
According to the Gospel of John, Jesus appeared to a group of his disciples after he was resurrected, but Thomas was not in attendance for this special guest star appearance. The lucky disciples excitedly told him, “We have seen the Lord!” But he [Thomas] said to them, “Unless I see the nail marks in his hands and put my finger where the nails were, and put my hand into his side, I will not believe it.” Eight days later, Jesus appeared before His disciples again (this time Thomas was there). Though the doors were locked, Jesus came and stood among them and said, “Peace be with you!” Then he said to Thomas, “Put your finger here; see my hands. Reach out your hand and put it into my side. Stop doubting and believe.” Thomas said to him, “My Lord and my God!” Then Jesus told him, “Because you have seen me, you have believed; blessed are those who have not seen and yet have believed.”
What Do You Think? »
Is it mean to assume that they are members of a non-selective eating club?
Princeton University’s singular eating clubs first garnered national attention in 1920, when F. Scott Fitzgerald published his first (semi-autobiographical) novel, This Side of Paradise. These mysterious eating clubs have (perhaps unfairly) been instrumental in shaping Princeton’s reputation as the “Iviest of Ivies”: a leafy bastion of old money and privilege populated by elite New England WASPs and Southern gentry who stroll around campus with voices that sound like money, and nary a black person in sight. Of course, this is not an entirely fair characterization of the school, but neither is it an entirely unfair one.
Eating clubs are private institutions, a cross between dining halls and social houses, where the majority of Princeton upperclassmen eat their meals. These co-ed clubs are housed in stately mansions off campus, primarily along Prospect Avenue, also known as “The Street.” Princeton undergraduates currently have ten eating clubs to choose from. Five of these clubs (Cap and Gown Club, Princeton Tower Club, The Ivy Club, Tiger Inn, and University Cottage Club) bill themselves as “selective” and choose their new members through a process called “bicker.” The other five (Cloister Inn, Princeton Charter Club, Colonial Club, Quadrangle Club, and Terrace Club) are non-selective, and their members are chosen through a lottery system called “sign-in.” Despite the fact that a whopping 70% of Princeton upperclassmen are members of eating clubs, the Princeton administration maintains that the clubs are not officially affiliated with the University.
In his first novel, Fitzgerald offered a primer on the distinct character and social standing of the Princeton eating clubs. He pegged the exclusive Ivy Club as “detached and breathlessly aristocratic” and Tiger Inn as “broad-shouldered and athletic, vitalized by an honest elaboration of prep-school standards” — both descriptions that could apply today. Fitzgerald was a member of the University Cottage Club, “an impressive mélange of brilliant adventurers and well-dressed philanderers.” Almost a century later, most Princeton coeds would not quarrel with this assessment.
You are probably still wondering why the heck some students decided to form the first official eating club, aptly titled “Ivy,” in 1879. Well, for starters, the Princeton campus dining hall was a Boschian nightmare; the founders of “Ivy” banded together to hire a cook after being driven to despair by the abysmal culinary offerings and sporadic operations of the campus’s notoriously subpar dining hall. Shortly thereafter, a number of copycat eating clubs sprang up across Princeton’s campus, also seeking an escape from the inedible cuisine and jonesing for the opportunity to recreate the petty horrors of high school cafeteria cliques on a grand scale. Moreover, to the chagrin of many status-obsessed students, Princeton banned fraternities and secret societies from the middle of the nineteenth century until the 1980s. Thus, the eating clubs came to fulfill two powerful needs for the everyday Princeton student: sustenance of the body, and sustenance of the ego through belonging to something that not everyone can belong to.
In 2007, Meg Whitman, an alumna of Princeton and chief executive officer of eBay (as well as a former member of the “selective” Cap and Gown Club), donated a whopping $30 million towards a new residential complex on the campus, known collectively as “Whitman College.” These über-luxurious living quarters include duplex suites, semiprivate dining rooms, classrooms, a digital photo lab, performing arts theaters (complete with dressing rooms), a piano lounge (seriously?) and a movie room. Whitman’s deluxe digs are aimed at giving “independents” (upperclassmen who bravely eschew joining an eating club) a happy alternative to spending two years alone, unloved, and subsisting on ramen and bagel bites while living in the unappetizingly nicknamed “junior slums.”
Non-sheeple of Princeton, rejoice!
What Do You Think? »
During the hellish Second Sino-Japanese War (July 7, 1937 – September 9, 1945), an estimated 7,643 people died of the bubonic plague in outbreaks that exploded after the Imperial Japanese Army Air Service deliberately bombed parts of Hunan and Zhejiang provinces with plague-infected fleas. The fleas were specially raised by the imperial army’s Epidemic Prevention and Water Supply Unit, better known as the notorious Unit 731. Japanese doctors infected yellow rats with the plague and dropped them into flea-filled oil drums. Workers then loaded the weaponized fleas into ceramic shells designed to burst open over the population of these provinces. Japanese generals hoped that a nationwide plague epidemic would collapse China’s grain harvest and starve its army into surrender.
Bubonic plague, caused by the Gram-negative bacterium Yersinia pestis, is a flea-borne infection that enters the skin and ravages the lymphatic system, killing about two out of three infected patients within 2–6 days if left untreated. Scientists believe that this nasty bacterium was probably the cause of the Black Death, which killed more than a third of the European population (more than 25 million people) in the 1300s.
The horrors of Unit 731 came into sharp focus on Aug. 27, 2002, when Tokyo judge Koji Iwata issued a landmark decision on a class-action lawsuit brought by 180 Chinese victims of the 1940-41 plague, who sought damages for the biological horrors inflicted by Unit 731. Despite the flood of testimony from alleged victims and even some of its perpetrators, the Japanese government has long denied the allegations leveled at the much-loathed Unit. However, Hon. Iwata found that “The deployment of biological weapons was a strategic part of Japan’s war plans and was carried out under orders from the central army,” and that Unit 731 was responsible for planning and carrying out these atrocities. However, he stopped short of granting the aggrieved parties any compensation, ruling that there is no international law that enables individuals to sue for war damages.
Thanks but no thanks, Judge Iwata.
What Do You Think? »
The term “diseases of affluence” describes diseases linked to the increased prosperity of a given society, in contrast to “diseases of poverty” (such as AIDS, malaria and tuberculosis), which disproportionately afflict people who live in poorer nations. Unlike diseases of poverty, which are usually communicable via infection, poor sanitation/hygiene, and/or the lack of enforceable environmental health regulations, diseases of affluence are not contagious and are generally tied to unhealthy “lifestyle” habits, such as the over-consumption of high-fat, high-sugar food and a lack of regular physical activity. Experts have posited a number of theories regarding the possible causes of the rising prevalence of these diseases in wealthy countries, including:
An overall decrease in strenuous physical exercise, often because of the increased reliance on cars (haven’t you noticed that the more parking lots a given city or state has, the fatter the people are?);
Easy accessibility in society to large amounts of low-cost, heavily processed food;
More consumption of food generally, with much less physical exertion expended to counter the effects of all of these extra calories;
Higher consumption of meat and dairy products;
More foods which are processed, cooked, and commercially provided (rather than seasonal, fresh foods prepared locally at time of eating);
Increased leisure time;
Prolonged periods of inactivity;
Greater use of alcohol and tobacco; and
Longer life-spans (this especially holds true for cancer and heart disease).
Over the past twenty years, the obesity “epidemic,” one of the leading causes of preventable death worldwide, has become the most prevalent and high-profile disease of affluence in wealthy nations. Obesity poses an especially daunting public health problem in the United States, where an estimated 34% of adults over 20 are classified as obese (and a whopping 67% of adults over 20 qualify as overweight or obese). Moreover, obesity drastically increases the likelihood that an individual will develop another “affluent” disease, such as Type 2 diabetes, heart disease and certain types of cancer.
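For readers curious how “obese” and “overweight” are operationalized in statistics like those above: adult classifications of this kind are conventionally based on body mass index (BMI), weight in kilograms divided by height in meters squared, with standard WHO/CDC adult cut-offs at 25 (overweight) and 30 (obese). The post does not specify its own methodology, so the sketch below simply illustrates the conventional calculation:

```python
# Conventional adult BMI calculation and WHO/CDC category cut-offs.
# Illustrative only; not part of the original post's data.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(weight_kg: float, height_m: float) -> str:
    b = bmi(weight_kg, height_m)
    if b < 18.5:
        return "underweight"
    elif b < 25:
        return "normal"
    elif b < 30:
        return "overweight"   # counted in the 67% "overweight or obese" figure
    else:
        return "obese"        # counted in both the 34% and 67% figures

print(round(bmi(100, 1.80), 1))   # 30.9
print(bmi_category(100, 1.80))    # obese
```

Note that the 34% figure above is a subset of the 67% figure: everyone classified as obese also clears the overweight threshold.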
With another three years to go and hoping to find something remotely interesting to devote her energies to now that she lives in the gilded cage of the White House, Michelle Obama has recently announced her intention to spearhead a “very ambitious” program to combat childhood obesity. Called “Let’s Move,” her campaign has set the formidable goal of eliminating the problem of childhood obesity in one generation. Ms. Obama hopes to accomplish this by addressing what she calls the “four key pillars”: educating parents about nutrition and exercise; making cafeteria lunches healthier; ensuring that healthy food becomes more affordable; and focusing more attention on physical education in schools.
As laudable as the First Lady’s anti-obesity campaign seems on paper, it will prove ineffective if it merely parrots the other health promotion programs that have attempted to combat obesity by further fanning the flames of the anti-fat hysteria that has whipped the nation into an irrational frenzy. Alas, we are a nation of overweight people who despise our swelling ranks of fat people, perhaps fearing that one day we will share their “humiliating” fate. However, until America makes a real effort to confront its schizophrenic attitude towards body fat, we are doomed to remain enslaved by it.
What Do You Think? »