Scientists are bipolar masochists in white coats. And we are okay with that.
This post will be somewhat atypical. No science-heavy references, no hard data, no hypotheses — nothing new. It’ll be just one general and optimistic story for the non-specialist, a story and opinion that celebrates neuroscience to the bone.
Imagine taking a scalpel and making an incision on an anesthetized patient’s head. As the blade glides down a shaved patch of gelatinous skin, the blood underneath begins to flow out and glistens a bright crimson. It quickly dries to a rust on the scalpel and gauze. Some yellow cubes of fat almost burst at the seams of the skin around them, and the white skull finally comes into sight. Flakes of bone fly around as you drill a hole open. As soon as the hole gives way, the pink-grey custard that is your brain appears. It is engulfed by a spider-web of purple bloody veins. Congratulations! Before you, finally, is the seat of everything that is You, and there is nothing You-er than your mind, than your brain. So, let’s study You.
Make no mistake, neuroscience is hard. Like, really hard. It’s chock-full of dizzying experiments — “Make neurons light-sensitive and shut off the dentate gyrus during contextual fear conditioning kthx.” These experiments are followed by the Olympics of data analysis — “What do you mean you did a one-tailed and not a two-tailed ANOVA?!” And, because it’s carried out by humans, it’s also infused with politics — “We can’t collaborate with him because he’s the competition, he has more money than us, and he stole our idea!”
It takes quite some time to learn how to frame questions as falsifiable, replicable, and predictive hypotheses. This means knowing how to tell when you’re wrong, how to reproduce a result, and how to anticipate the ramifications of that result. The only limitations, though, are patience and creativity (and funding). The world of neuroscience is simply a vast playground for experiments to test theories for contact with reality.
What’s more, studying neuroscience should shift your perspective on how the mind works, because when you’re neuroscientifically literate, the mind looks very different to you. It looks like — it is — the lump of brain tissue sitting directly behind your eyes. With some training, it begins to look more like a machine with interconnected joints and sockets, each moving along by pulling this and pushing that.
We learn quickly that brain cells effortlessly transform all sorts of information into chemical and electrical signals and patterns, into the language of the brain. It is this process that gives rise to something seemingly magical — a memory from your childhood or the feeling of wanting a cheeseburger or the peaks of an orgasm. It is this process that lets thing become thought.
Most importantly, we learn that thoughts are only physical processes. This is most obviously and tragically true when broken brain pieces give rise to broken thoughts. And so, when you’re neuroscientifically literate, the brain no longer looks like an ephemeral lump of meatloaf. Indeed, as physicist Richard Feynman reminds us, there’s a difference between the real world and what it looks like.
Neuroscience is also hard because it’s full of repetition. Neuroscience is also hard because it’s FULL of repetition. At some point, we all have to replicate a finding to make sure it wasn’t simply due to chance. This is how science self-corrects and builds on itself. We do this happily, begrudgingly, excitedly, angrily, hesitantly, tiredly — all of the feelings that, to anyone else, make us look like nothing short of bipolar masochists in white coats.
To a degree, that is just our nature. Seeing a hypothesis verified after years of work can feel like a heaping scoop of vanilla bean ice cream drenched in hot fudge multiplied by hitting the lottery. Accordingly, our mood becomes joyful and pleasant. Seeing an experiment fail can feel like an 18-wheeler just parked conveniently on your face, multiplied by the bubonic plague. Accordingly, our mood sucks.
By doing research, we come to know one aspect of nature so intimately well and in such fine detail that we feel like our intuitions have to be right about something. And if things don’t work out according to plan, which they won’t the first time around, then we troubleshoot the life out of the problem.
When the latter happens, we give meaning to the re in re-search. You search again, you self-inflict the entire process, because nature doesn’t give a fig about your stupid hypotheses; but, luckily, you’re smarter than a fig, so you’re in business. Indeed, this stupidity can be directly proportional to the quality of science you’re doing. You may even feel stupider when your science is better. Fittingly, science is an acquired taste and, eventually, you learn to become more like memory and time. You learn to persist.
Closer to home, a day in the lab gives you all sorts of colorful perspectives on how science unfolds. Desks are scattered with obscurely titled papers; computer screens have some program running with numbers streaming across them Matrix-style; culture dishes can be seen with colonies of awful growing on their surfaces; the smells of Clorox and coffee and mice become all too familiar. Google calendars overflow with planned experiments for the next few months, and each day adds another week’s worth of work. Some call it the Myth of Sisy-GRADUATING-phus.
Coming into lab means tip-toeing around the edge of what is known and unknown, hoping that 60+ hours of experiments each week, and never-ending reading, translate into a glimmer of never-before-seen ground. When this is the case, we take a step forward. Science has advanced. Huzzah!
Maybe a Western blot showed the presence of the protein we’ve spent months looking for. Maybe the electrode picked up a novel neural correlate of memory. Maybe the mouse no longer has a memory deficit. Maybe we even implanted a false memory in a subject Inception-style (!). Maybe the fMRI revealed the parts of the patient’s brain that were damaged and dysfunctional. Maybe, just maybe, the extraordinary claim we made in lab meeting stood up to extraordinary experimental evidence. These are the things that get our juices flowing.
My perspective is even more optimistic: science is cool because when it works, it just so happens to really work, and that’s exciting. What’s more, we’re living in an incredibly unique and privileged pocket of society if our daily headaches come from a failed experiment and not from a lack of clean drinking water. Still, for some, the excitement runs its course and ends in bitterness. Many become discouraged and frustrated, and jump at the opportunity to leave academia. They say hitting this wall is inevitable, especially in graduate school.
Needless to say, I politely
freaking disagree. Admittedly, if your interests take you somewhere outside the lab, then kudos because it’s important to have fun doing your job. As long as your career is centered on something that’s more important than you are, then fulfillment will happen naturally. And for many, neuroscience is that something. I spent my last summer doing five months of experiments that amounted to this: I learned how to run an experiment that failed very well. But that’s science. I love it because it’s not a job. It’s science.
It helps that our “job” description is pretty straightforward. We get paid to fillet the big picture of nature into pieces and make sense of what we see. Then, we get paid to talk a lot and share this sliver of knowledge with students and faculty alike. Did I mention we get paid to do this? Once we accept a tumultuous lifestyle, then things get really neat. By keeping all the moving parts in the brain untouched and by only tinkering with one variable, say, a gene or a cell-type or a protein, we can observe the overall effects of any manipulation. Any!
In the process, some results are scrapped, falsified, re-interpreted, refuted; others are replicated and supported. A blessed few even go on to achieve the holy-grail level of theory and law. That is the drama behind brain science, and it works freakishly well because data points are independent of the coffee we didn’t have this morning, of the friend who didn’t pick up the phone last night, and of our passion for science itself — in short, of our emotional contingencies. It fundamentally depends on measurable evidence only. Interpretation comes second.
In reality, and excuse my candidness, being trained in science actually means learning to have a really good bullshit detector — something of incalculable value, because it lets you pinpoint when both sides of an argument are wrong. It protects against neurobabble and tantalizing but meaningless statistics (“Improve your brain power by 30% with this new herb supplement!”). Many would say that it even forces God to become an ever-receding pocket of scientific ignorance, as astrophysicist Neil deGrasse Tyson reminds us. In the brain, thoughts go in and thoughts go out, but unlike Bill O’Reilly, around whom this detector goes off often, we’re actually working on understanding this process. Indeed, with this detector in hand, a very pretty picture has emerged thus far. Mind is as brain does!
I’ll end by addressing a common criticism. Doesn’t science deflate all the wonder out of the world? Isn’t neuroscience in particular the human condition’s ultimate buzz kill?
Along those lines, and with a poetic license, here’s one gloomy way to look at it in the spirit of John Keats. The cold touch of science forces charms to evaporate into a ghostly air of the unbelievable. The cold touch of science unweaves a mystical rainbow into a few wavelengths scattered from beads of water. The cold touch of science humanizes an angel by tearing off its holy wings and grounding its flesh in reality. The cold touch of science lets all of the occurrences in nature become part of the catalogue of common things, of things defined by rules and measured in lines, one unit of reality at a time.
Like Keats, the American poet Walt Whitman speaks for many as he laments those moments when science plucks truths from the tree of reality and renders them common things:
When I heard the learn’d astronomer;
When the proofs, the figures, were ranged in columns before me;
When I was shown the charts and the diagrams, to add, divide, and measure them;
When I, sitting, heard the astronomer, where he lectured with much applause in the lecture-room,
How soon, unaccountable, I became tired and sick;
Till rising and gliding out, I wander’d off by myself,
In the mystical moist night-air, and from time to time
Look’d up in perfect silence at the stars.
Contrary to Whitman and Keats, scientists look up in bewilderment not just from time to time, but all the time! We do in fact see and enjoy plenty of mystery. The only difference is that after basking in a steaming pile of our own ignorance, we then think, “What experiment can we do to unweave that rainbow?”
And what a piece of work is the brain! The 100 billion cells that are folded neatly into a 3 pound lump of undercooked custard still fascinate at least the 30,000+ attendees of the Society for Neuroscience’s annual meetings. That’s 30,000+ people simultaneously unweaving the brain one mind-strand at a time. The physicist Jeremy Bernstein perhaps summarized it best by saying that science as a whole fills him with “… a great sense of elation and a deepened admiration for what the human family, at its best, can accomplish.”
Modern neuroscience is one hell of an achievement worth celebrating to the bone, and I for one am happy to be a part of it. Physicist Brian Greene once said that the wonders of the cosmos transcend everything that divides us. Ditto for the wonders of the mind. A small miracle of neuroscience is that we are the only known species capable of doing it, and it has worked out wonderfully well so far — diagnoses have become more accurate; treatments have gotten better; diseases have been cured. For the first time, we can truly take up the challenge of the Russian playwright Anton Chekhov, who insisted, “Man will become better when you show him what he is like.” And so, the miracle of neuroscience is that it does just that.