Aarøe Nissen is a 22-year-old math student at Aarhus University,
Denmark, with extraordinary memory abilities. He has competed in memory
sports for several years. He can recite the number Pi to more than
20,000 decimal points, recall thousands of names, faces and historical
dates and remember the order of a pack of cards.
In 1973 Mary Rowe,
while working for the President and Chancellor at MIT, coined the
notion of micro-inequities, which she defined as “apparently small
events which are often ephemeral and hard-to-prove, events which are
covert, often unintentional, frequently unrecognized by the perpetrator,
which occur wherever people are perceived to be ‘different.’” Examples
of micro-inequities include:
checking emails or texting during a face-to-face conversation
consistently mispronouncing a person's name
interrupting a person mid-sentence
making eye-contact only with males while talking to a group containing both males and females
taking more questions from men than women
confusing a person of a certain ethnicity with another person of the same ethnicity
Our perception of time varies greatly
depending on our age, mood, stress level and psychological health and
stability. Neurological and psychiatric disorders, such as Parkinson's disease,
attention deficit hyperactivity disorder and schizophrenia, can mess
with the brain's timekeeping mechanism and warp our estimation of time.
Patients suffering from these disorders are unable to properly
coordinate events in time. Patients over- or underestimate time
intervals ranging from several seconds to minutes.
How does this
happen? How does the brain manage to keep track of time and what goes
wrong in these disorders? Our senses (sight, hearing, smell,
taste and touch) use specialized sensory systems with task-specific
neurons to process sensory input. Yet there is no specific sensory
system for time. So how does our sense of time come about?
Kukla had an excellent post over at Leiter Reports a few days ago about
whether the tendency to pursue an MA before your PhD is a good thing or a
bad thing for philosophy as a profession. I think this is an important
question but I must admit that I found many of the comments (or
unintended implications of the comments) enormously puzzling (if not
straightforwardly offensive). At some point during the debate there
seemed to be somewhat of a consensus that if you pursue an MA after your
bachelor's degree before pursuing a PhD, then your chances of becoming a
great philosopher (as opposed to a good philosopher) are greatly
diminished. Some commentators did offer reasons for thinking this
(whereas others didn't), but I must admit that I didn't quite understand them.
I regret to inform you that Awesome Bigname Philosophy Journal cannot accept your paper for publication. After having googled the title of your paper, and failing that, lines from your abstract and paper, our referee discovered your identity. He found that you are a nobody from a lackluster university, without a tenured or tenure-track position but only a lowly [adjunct teacher, grad student, postdoc etc], and [a woman, black, non-English speaker etc] to boot. Therefore, after a perfunctory glance at your paper, the referee has decided that your paper is not of high enough quality to be published in ABPJ.
We pass on referees' comments in the hope that they may prove useful. We receive over n submissions each year, and must reject many very competent papers, especially those written by people at the bottom of the academic ladder. We hope that your work will find a home in another journal, though obviously one not as highly regarded as ABPJ.
In the supernatural thriller Memory,
written by Bennett Joshua Davlin, Dr. Taylor Briggs, who is the leading
expert on memory, examines a patient found nearly dead in the Amazon.
While checking on the patient, Taylor is accidentally exposed to a
psychedelic drug that unlocks memories of a killer that committed
murders many years before Taylor was born. The killer turns out to be
his ancestor. Taylor’s memories, despite being of events Taylor never
experienced, are very detailed. They contain the point-of-view of his
ancestor and the full visual scenario experienced by the killer.
Although the story is supernatural, it brings up an interesting
question: is it possible to inherit our ancestors’ memories? The answer is not black and
white; it depends on what we mean by ‘memory’. The plot is
far-fetched: there is no evidence or credible scientific theory
suggesting that we can inherit specific episodic memories of events that
our ancestors experienced. In other words, it’s highly unlikely that
you will suddenly remember your great-great-grandfather’s wedding day or
your great-great-grandmother’s struggle in childbirth.
We are conducting a study of color
discrimination and short-term color memory. I would be grateful if you
would participate in the study. You'll need to use the left and right
arrow keys to adjust the color of a square to match the color of a second
image. It will only
take about 5-10 minutes. Click on the link below to begin. www.synesthesiaresearch.com/study
more important, we would really like to encourage people - including
WHITE MEN - to apply for the site visit training. It is important that
we have allies involved because having mixed teams will be more
effective than just a group of women....who are feminists, besides!
Not long ago, the Times of London published an article examining "why everyone wants to be Danish." It covered Danish society (according to scientists, Danes are the world's happiest people), Danish fashion ("a leather trim here, a matelot stripe and an edgy trilby there") . . . Danish sperm donation (last year, more than five hundred British women were artificially inseminated in Denmark—an ad for one clinic read, "Congratulations, it's a Viking!"). A "How Danish Are You?" quiz asked, "You like your skies a) Blue b) slate grey c) slate grey with vultures circling the carrion of slaughtered youth."
have worked on your paper on this extraordinarily complicated and remarkably
interesting solution to the knowability paradox for four years. You finally got
it right. After polishing the piece, vetting the ideas at department colloquium
talks and workshops and incorporating feedback from colleagues, you submit to a
top-ten mainstream journal. Then you patiently, very patiently, wait a year and
a half and get a rejection. Hostile referee reports! They HATE the paper.
witnessed his son Charlie, who had childhood epilepsy, undergo
frightening seizures. The boy would convulse and lose consciousness.
Medications didn't help. As his seizures continued, his cognitive
abilities slowly deteriorated. Jim, who wasn't a medical doctor, decided
to start investigating alternative treatments. After days in the
library looking through books and medical journals, he found a book on childhood epilepsy written by Dr. John Freeman, the director of the Pediatric Epilepsy Center
at Johns Hopkins Hospital. The book described how a diet that mimics
the metabolism of starvation by cutting most dietary sources of
carbohydrates and proteins could in some cases cure drug-resistant epilepsy.
who once set several major league baseball records, suddenly could
barely keep his body upright during practice. He would fall while
running bases, stumble over curbs and mishandle fielding plays. His
wife, Eleanor, was concerned. Her husband held records for most
consecutive games played, 2,130 to be exact, and most career grand slams.
Though Lou said it was just a phase, Eleanor got on the phone with the
Mayo Clinic in Rochester, Minnesota. Charles William Mayo wanted them to
come right away. They arrived on June 13, 1939 and six days later on
Lou’s thirty-sixth birthday the doctors told Eleanor that her husband
suffered from amyotrophic lateral sclerosis (ALS). Lou Gehrig died less
than two years later.
"So many tangles in life are ultimately hopeless that
we have no appropriate sword other than laughter," said Gordon Allport, an
American psychologist and one of the founders of the study of personality.
Scientists have studied the effects of mirthful laughter, positive thinking and
optimism on feelings of self-worth, mood disorders and depression for decades.
In The Antidote:
Happiness for People Who Can't Stand Positive Thinking British author and Guardian feature writer Oliver
Burkeman takes issue with "the cult of optimism," the convention that
phony smiles, jovial laughter and positive thinking are a surefire path to
happiness. Positive thinking is the problem, not the solution, Burkeman teaches
us. He believes people have come to trust that a "Don't worry. Be
happy" attitude toward life is the only route to contentment. People seem
to be of the conviction that if you have negative thoughts and see your own
limits, you cannot be happy. So to be happy we must set out on a journey that
changes our mindset from negative and inhibited to enthusiastic, fervent and
animated. We are told to visualize our dreams and goals, eliminate the word
"impossible" from our vocabulary and put a big fabricated smile on
our physiognomy. All of that can actually lead to unhappiness, Burkeman says.
When I was a child one of my favorite books was about a pair of
identical twins who decided to switch clothes. They looked so much alike
that their parents had had to dress one in blue and the other in green.
The twin boys fooled their parents for a long, long time. An obedient
3-year-old, I was thrilled by their ingenuity and boldness.
While parents can usually tell the difference between their identical
twins, grandparents, teachers, neighbors and peers sometimes cannot. And for
good reason: identical twins very often look almost exactly alike. No
surprise there, since identical twins share all of their DNA.
Whether "Lucy in the Sky with Diamonds" was a product of the Beatles'
experimentation with psychedelic drugs is still a subject of great
debate among Beatles fans and music experts. But it was no secret that
the lyrics of many of the pop legend's famous tracks were inspired by
LSD, including "I Am the Walrus," "Tomorrow Never Knows," and "What's the
New Mary Jane." The Beatles' creating music during a hallucinogenic trip is
not a rare case of acid-driven creation, invention or discovery. The
double helix structure of DNA occurred to geneticist and neuroscientist Francis Crick
while he was tripping on the Lucy drug, and chemist Kary Mullis
hit on the idea behind Polymerase Chain Reaction (PCR), a now
widely-used technique for amplifying a single piece of DNA by a factor
of 100 billion, while cruising along the Pacific Coast Highway one night
in his car on LSD.
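As a rough sanity check on that factor of 100 billion: PCR roughly doubles the number of DNA copies each thermal cycle, so the amplification is exponential. A minimal back-of-the-envelope sketch (the specific cycle count is my illustration, not a figure from the text):

```python
# PCR roughly doubles the DNA copy count on each thermal cycle.
TARGET = 100_000_000_000  # the "factor of 100 billion" from the text

copies, cycles = 1, 0
while copies < TARGET:
    copies *= 2
    cycles += 1

print(cycles, copies)  # 37 cycles: 2**37 = 137,438,953,472, already past 100 billion
```

So a standard run of a few dozen cycles is enough to reach the amplification factor the text cites.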
I am currently finishing a paper on the semantic and logical properties
of 'seem'. As 'seem' is a subject-raising verb, we can treat 'it seems'
as a sentential operator. This raises the question of how this operator
behaves logically. Is it hyperintensional? Does it distribute over
conjunction? Over disjunction? Over conditionals? Does it commute with negation?
I think it's fairly obvious that 'it seems' is hyperintensional. It
seems to Lois Lane that Superman is not Clark Kent but it doesn't seem
to her that Superman is not Superman. The other questions are harder.
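For readers who like the questions regimented: writing S for 'it seems that' (my notation, a schematic sketch rather than the paper's own formalism), the issues can be put as follows:

```latex
% Hyperintensionality: necessarily equivalent contents need not seem alike.
\Box(p \leftrightarrow q) \not\models (Sp \leftrightarrow Sq)
% Distribution over conjunction (open question):
S(p \wedge q) \stackrel{?}{\models} Sp \wedge Sq
% Distribution over disjunction (open question):
S(p \vee q) \stackrel{?}{\models} Sp \vee Sq
```

The Lois Lane case instantiates the first line: 'Superman is Clark Kent' and 'Superman is Superman' are necessarily equivalent, yet only the former can seem false to Lois.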
The U.S. legal system gives preference to adult testimony in court
cases. In 2002 Thomas Junta stood trial for killing a man in a
Massachusetts hockey rink quarrel in 2000. Thomas's son, 12-year-old
Quinlan Junta, was a key defense witness for his father, but his testimony
did not convince the jury. Thomas was found guilty and sentenced to 6
to 10 years in state prison.
In his famous paper entitled "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information"
cognitive psychologist George A. Miller of Princeton University argued
that our working memory, our ability to hold information in our minds
for a few seconds, is limited to about seven items, plus or minus two. That's
fewer than the ten digits of an out-of-town American phone number. In light of this
you might wonder what to say about cases of people with extreme memory
abilities. Chao Lu
holds the Guinness world record in reciting Pi, a record dating back to
2005. Lu recalled 67,890 digits of Pi in 24 hours and 4 minutes with an
error at the 67,891st digit, saying it was a 5, when it was actually a
0. How is it possible to retrieve this quantity of information
accurately through working memory? Is it magic? After talking to several
people working in memory sports, we found out that it's not.
You are preparing for your upcoming exam, reading through thousands of pages. Suddenly you realize that you forgot to pay attention to what you actually read. You were reading along but your thoughts were elsewhere. "Good God," you think. "Hours of wasted time." You turn back the pages and start over. This time you make sure you pay close attention.
Recent research, to appear in the journal PNAS, suggests that you may be wasting even more time by doing that. You may not need attention to comprehend what you read or to do math. In fact, you may not even need conscious perception. The researchers, located at Hebrew University, used a technique known as Continuous Flash Suppression (CFS) to suppress conscious perception of stimuli in some 300 research participants for a short period of time. In CFS a series of rapidly changing images is presented to one eye, whereas a constant image is presented to the other. When using this technique, the constant image supposedly is not consciously perceived until after about 2 seconds.
At our lab in St. Louis we are working with several people with superhuman abilities, also known as “savant skills.” My research assistant Kristian Marlow and I are also currently finishing a book entitled The Superhuman Mind (under contract with an agency, see updates here). We are blogging about these cases almost daily over at Psychology Today. The following are four brief stories about some of the individuals we are working with.
Bryan Jackson recently wrote to Sally Haslanger to ask why there were no women on the list of metaphysicians on Wikipedia. Sally shared this very good question with a few philosophers who have been interested in similar questions. After some discussion via email Sally started revising the entry. I have no idea how to revise wiki entries or what the rules are for making revisions, but I strongly encourage wiki-tech-y people to make further improvements to the list. Below the fold you can see the old wiki entry, which Bryan was referring to, and the entry as of tonight.
The lore we are told, inspired by, say, Putnam (not a disinterested spectator) and more recently Huw Price, who thinks we delude ourselves, is roughly this: after the founders of analytical philosophy had successfully rid philosophy of its thirst for metaphysics, Quine, discerning a crack in Carnap's edifice, re-opened the door to our deposed Queen, μεταφυσική, in "On What There Is" (and "Two Dogmas"); with the door ajar and Alvin Goldman and Dan Dennett distracted by 'naturalizing' everything, Hilary Putnam developed a Quinean argument from the authority of science for the really real existence of numbers and, more significantly, David Lewis -- perhaps spurred on by some Antipodeans -- drove a truck through the opening by embracing modal realism.
We love linear stories [Carnap --> Quine --> Lewis], don't we, so even the descriptive metaphysics of Strawson's Individuals (1959) can't quite be squished into, shall we say, our conceptual scheme. Now consider the following paragraph written in 1930:
The pursuit of metaphysics as the study of generic characters of existence has been slowly regaining its professional adherents. Reaction to the unchecked flights of nineteenth century romantic speculation had well nigh banished metaphysics, once its central theme, as a legitimate subject matter for philosophy. But the problems which professional philosophers refused to consider became acutely pressing in the special sciences. It was to be expected that ere long comprehensive treatises on the nature of existence would appear, fashioned by philosophers who were sensitive to the advances of recent science as well as to the ancient tradition that philosophy is the systematic study of being. To the series of distinguished essays on metaphysics which contemporary philosophers have contributed, these volumes [by Whitehead--ES] are a notable addition.--Ernest Nagel (1930) "Alfred North Whitehead," republished in Sovereign Reason, p. 154.
Five old puzzles to brush up on your logic skills before the GREs:
(1) Yesterday I ran into my colleagues Eric Wiland and John Brunero, who were trying to sneak out to watch the baseball game. I asked Eric: "Is any of you a truth-teller?" Eric said something that sufficed for me to know the answer to my question. Are Eric and John liars or truth-tellers?
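A minimal brute-force check of puzzle (1), assuming the standard convention that truth-tellers assert only truths and liars only falsehoods (the function names are mine, for illustration). Only the answer "no" picks out a unique assignment, so that must be what Eric said:

```python
from itertools import product

def consistent(eric_truthful, john_truthful, answer_yes):
    # The question asked: "Is any of you a truth-teller?"
    truth = eric_truthful or john_truthful
    # A truth-teller's answer matches the truth; a liar's contradicts it.
    return answer_yes == truth if eric_truthful else answer_yes != truth

for answer_yes in (True, False):
    worlds = [(e, j) for e, j in product([True, False], repeat=2)
              if consistent(e, j, answer_yes)]
    print("yes" if answer_yes else "no", "->", worlds)
```

"Yes" leaves three consistent assignments, while "no" leaves exactly one: Eric is a liar and John is a truth-teller, which is why Eric's answer sufficed.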
Inspired by one of Mohan's older posts, I finally got around to redoing the Gender - Career Harvard IAT. I completed it twice. The first time, I sort of cheated. I really didn't want "female" to be associated with "family." I got the result I wanted (or close enough): "Your data suggest a slight association of Female with Career and Male with Family compared to Male with Career and Female with Family." But I really was kind of cheating. I thought carefully about the questions before answering. Well aware of the reasons behind my "good" results, I decided to complete the test again. This time I focused on doing it as fast as possible (as they actually do request in the instructions). The results now revealed my gender biases: "Your data suggest a slight association of Male with Career and Female with Family compared to Female with Career and Male with Family."
It's that time again: Undergrads finishing up will soon be sending
out applications to Ph.D. programs and grads finishing up are about to
head out on the job market. Though choices might be limited owing to
financial constraints, there likely will be some who will have the
opportunity to choose among several departments. As I advise students about this on a daily basis, and reputation is a crucial factor in the desirability of a department, I thought it would be worth mulling over what makes a department reputable.
In response to the claims of partisans of Xphi, there has been a tendency to deny that intuition matters much in philosophy (or even exists for philosophical purposes; recall the recent discussion by Jennifer Nagel on Williamson here, a topic that Catarina and Brit have explored recently here and here). But in my recent readings I found a lovely (critical!) account of the "rules of the game" in Maudlin:
The rules of the game in this sort of analytic project are relatively clear: any proposed analysis is tested against particular cases, usually imaginary, for which we have strong intuitions. The accuracy with which the judgments of the analysis match the deliverances of intuition then constitutes a measure of the adequacy of the analysis. Unfortunately, it is often the case that the question of how the intuitions are arrived at is left to the side: getting the analysis to deliver up the right results, by hook or by crook, is all that matters, and this, in turn, encourages ever more baroque constructions. But if we care about intuitions at all, we ought to care about the underlying mechanism that generates them...--Tim Maudlin, The Metaphysics Within Physics, 146-7.
Now Maudlin's book got published in 2007. So, it is certainly possible that in response to Xphi and more general
methodological self-reflection analytical philosophy is in a period of
transition. But the "rules of the game" were once very clear, and we should be skeptical of claims to the contrary.