Tuesday, February 5, 2019

Virtual Selection: Survival of the 🔥Littest 🔥

Five years. Damn.

To borrow a cliché, it’s been the best of times, it’s been the worst of times. I’ve soared and spiraled, toiled tirelessly and cavorted carelessly, climbed over enchanting peaks and sunk to odious abysses. A canorous crescendo, a dissonant decrescendo. Yet here I sit, five years later...

...and I have finally read The Selfish Gene.

Dawkins has glared menacingly at me from his slot on my bookshelf all this time, his gaze palpably fierce. I haven’t ignored it – how could I? – but I have neglected it, largely because of the book’s forty-three-year age. I surmised that, by now, it wouldn’t have much in the way of fresh insights into a field of deep interest to me: evolution.

I surmised poorly.

I’ll spare my dear reader details of the full synopsis, but the book did indeed update my paradigm of the evolutionary process, and I considered the implications. During this period of reflection, I listened to episode #145 of Sam Harris’s Making Sense podcast called “The Information War” with Renée DiResta (well worth a listen) which discussed, among many salient topics, the role of online platforms and advertisers in generating seductive content to attract clicks and hits from a targeted audience. It hit me that the machinery at play here is directly analogous to natural selection. Eerily so.

At the risk of patronizing you, my erudite reader, I am compelled to review how natural selection works before proceeding. DNA wants to replicate, and it does so using cells and organisms as vehicles. DNA replication is an accurate process. Like, an astonishingly accurate process. But mistakes do happen. Many of these errors prove deleterious to the organism…but sometimes a mutation pops up that offers a small survival advantage instead. This mutation makes the organism more likely to survive and reproduce its genetic code, mutation in tow. Thus the mutation gets passed on and dispersed into the gene pool until, over time, it spreads throughout the population. A critical feature of this mechanism is that a mutation only offers an advantage within a specific environment. For instance, a mutation for brown fur might help a mammal in a wooded region, but would hardly come in handy in a polar climate. Darwin framed his theory as “evolution by natural selection” because the nature of the environment ultimately draws the line between “good” and “bad” mutations. It does so indifferently, passively. Nature just is.

The internet, however, is a whole different beast.

Ads run the show with an obvious monetary incentive. Like DNA, they “replicate” (i.e. get clicks) by attaching themselves to websites that users reliably view. The sites and authors thereof, then, act as the organisms, vehicles for housing and transporting genes/ads and eventually propagating them. Our physical selves, the consumers of the media on which ads appear, perform the natural selection. We deem content worthy of perusal, in turn rewarding the websites for a job well done in captivating us. Websites experiment with “phenotypes” (catchy headlines, shorter paragraphs, more images, etc.), and learn which prove most successful. The most successful phenotypes propagate and spread.
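If you prefer mechanisms to metaphors, here's a toy sketch of the loop I'm describing – a minimal simulation, with every number and name invented by me purely for illustration (real platforms are incomparably messier): headline "phenotypes" get shown to readers, clicks act as fitness, the least-clicked variants die off, and the survivors reproduce with small mutations.

    import random

    random.seed(42)  # reproducible toy run

    def mutate(appeal):
        # A survivor spawns a copy of itself with a small random tweak.
        return max(0.0, appeal + random.gauss(0, 0.05))

    # Twenty bland headline "phenotypes"; appeal = chance a reader clicks.
    population = [0.1] * 20

    for generation in range(50):
        # Show each headline to 100 readers; its clicks are its fitness.
        clicks = [sum(random.random() < a for _ in range(100)) for a in population]
        # Selection: the top half by clicks survives...
        ranked = [a for _, a in sorted(zip(clicks, population), reverse=True)]
        survivors = ranked[:10]
        # ...and each survivor reproduces, mutation in tow.
        population = survivors + [mutate(a) for a in survivors]

    print(f"Average click appeal after 50 generations: "
          f"{sum(population) / len(population):.2f}")

Run it and the average appeal ratchets steadily upward. Nobody in the loop intends that outcome – not the readers, not the outlets – which is exactly the point: our clicks are the environment, and the environment draws the line between "good" and "bad" mutations.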

I posit that this is more or less how we’ve come to see the explosion of clickbait trash, sensationalized drivel, and shamelessly fabricated articles that pervade the internet. 

As the internet became faster, cheaper, and overall more accessible, it inevitably integrated into everyday society. Smartphones found homes in pockets and purses ubiquitously. Social media giants like Facebook, Twitter, and Instagram started monopolizing social spheres. Like the Arctic Monkeys song goes, “We moved it all online…” And so we have. Most of it, anyway.

But as our lives became slowly entangled in the web, it naturally followed that we acquired more and more of our news and information from the internet. This shifted the game entirely, from holistic, periodic bundles to individual, rapid-fire snippets. Single articles could rack up millions of hits, translating into grotesque ad revenue, without the excess fat of your typical newspaper. This is key. Instead of building a bulky organism like a newspaper – hundreds of cell types, tissue layers, organs, etc., all of which take up valuable time and energy – news started acting more like bacteria: single cells with extreme fecundity, which translates into exponentially more chances to “mutate” and “evolve.”

Articles can be produced continuously, fluidly, flooding your inbox, homepage, and Twitter and Facebook feeds with a steady stream of information. You then select which articles you wish to read, providing feedback to the websites as to which phenotypes caught your attention and which didn’t. They take note, the perspicacious motherfuckers, and endeavor to recapture your attention over and over and over…

And so yeah, when they do take note, they recreate what worked, and that becomes the new norm. They evolve by virtual selection – our selection. But so you might say, “Hey, Seth, I’m a savvy interneter who only reads high-quality journalism like HuffPo, so how does that explain the rise of clickbait and fake news?” To which I would say 1.) Look in a mirror and honestly ask yourself whether you’ve ever been successfully baited…because I sure as shit have (also, HuffPo…?), and 2.) You are not representative of America. Clickbait has arisen, thus something caused it to arise, the most likely culprit being the majority of internet users rewarding outlets for baiting them into clicks.

The brain loves dopamine. Or I suppose I should say your consciousness loves it when dopamine binds to certain titillating regions of your brain. It gives you pleasure. We live in a veritable deluge of dopaminergic stimuli. I can’t speak exactly to what effects this has had and continues to have on our behavior, but I would bet money on at least two: Increased distractibility (not a shorter attention span per se but an increased capacity to have that span interrupted) and increased novelty-seeking (meaning constantly looking for that dank-ass meme, that fresh new song, that next “OMFG” Netflix show). 

This doesn't mean I think our genetic code is in any danger from said deluge, but I do believe that our environment extensively sculpts our brains, thereby deeply shaping our behavior. Constant bombardment of our cortex with blaring music, fast-flitting commercials, spastic video games, infinite torrents of memes, ebullient superhero movies, seasons upon seasons of binge-able shows, and endless barrages of every manner of notification blowing up your phone – all of which produce measurable spikes of dopamine – simply must have a profound effect on how our brains operate. When was the last time you went more than an hour without checking your phone, swiping through memes, shuffling through songs? Hell, when was the last time you experienced just silence?

Circling back to journalism. Our environment supplies us with a steady diet of daily dopamine pings. Combine this diet with our brains' dispositions for 1.) Avoiding displeasure/discomfort, 2.) Being right (or believing they're right), and 3.) Conserving energy, and we get a recipe for clicking links that we expect to be pleasurable, comforting, and validating, and that require only a meager supply of brain power to ingest. We select shit like “Big Pharma Doesn't Want You to Know This One Fact” or “Watch as Dog Rescues Boy from Swamp” or “Why We Need to Impeach Trump Now,” which rewards that media phenotype and ensures its propagation.

News sites have to stay competitive for advertiser funding, and so an arms race ensues. Outlets carve into different niches (politics vs. sports vs. entertainment vs. whatever the fuck the Kardashians are), adopt different strategies (sensationalism, emotional activation, conspiracy, sexualization, that dude that scientists hate), and vie for perpetual existence in the digital marketplace. Over time, new phenotypic norms are established. The ante is constantly upped. Content increases its irresistibility. We can't help but click and share and click again. And so the cycle repeats. Virtual selection. Survival of the 🔥Littest🔥.

But so now this has major consequences. Newspapers flirt with extinction, unable to compete with the obscene profits from online media. Concomitantly, long-form journalism seems also to be waning, though it apparently has found a home on sites like The New York Times, The Atlantic, and, weirdly, BuzzFeed, among others. Thorough investigative reporting, the product of which is a flowing narrative assiduously yet artfully detailing the subtleties of a multifarious issue that has significant implications in local, national, or global society, simply costs too much relative to its payoff. Most people don’t read that shit. Most people, I argue, are fundamentally less capable of reading that shit. Myself very much included. It irks me when I scroll for what seems an appropriate length for an article, only to note my scroll bar a mere forty percent of the way. Vexed, I then read more quickly and by doing so absorb far less from the article than the author intended.

So what, Seth, who cares? Let me dank my memes in peace. Why does this matter?

I think it matters for several reasons. It’s likely a major contributor to the polarization and tribalistic feuding between political factions, as we reward warring outlets for more and more extreme rhetoric (though many other factors are at play here, to be sure). It diminishes our ability to appreciate complexity and nuance, or at least our ability to maintain focus long enough to properly ponder complexity and nuance. It disincentivizes critical thinking and reflection, since media has become a series of sugary snacks that we devour instead of a single, hearty meal that requires time to absorb. It nearly led to someone being shot in the whole Pizzagate ordeal. Overall, it perpetuates a cycle of rewarding journalists for providing us with less challenging, less intimate, less meaningful content.

But okay so alright, Bigshot, what’s the solution, then? The solution is simple, if gargantuan. 

We change the environment, shift our selection criteria. We start rewarding hard-hitting journalism again. We stop reading cheeky, buzzword-laden spittle. We stop shying away from nuance, from disagreement, from discomfort. We force ourselves to sit and read and reflect on what we just read. We call out the mendacious who attempt to misinform and mislead, and we punish them for it. We start eating meals again and invite friends and family to join us for dinner. Perhaps then we'll find ourselves more nourished, more whole, and perhaps we'll inspire others to do the same.

But perhaps not. Perhaps this isn't a big deal. Perhaps it's just a phase and we'll eventually return to the journalistic days of old. Or perhaps instead it will continue to progress this way until our news feeds resemble something from Idiocracy.

What are your thoughts? Am I off my rocker? Am I understating the severity of this phenomenon? Call me. Text me. Email me. Hug me. Smack me. Let's have a chat.

Anyway, it really is great to post again. I know this isn't my best work - certainly some cobwebs need dusting - but I'll continue to polish and refine as I develop this skill in future posts. Which I will write, dammit.

Until then, select wisely, dear readers, and thanks for, you know, reading.

Wednesday, February 5, 2014

Call Me Crazy, But Please Don't

Let's do a little thought experiment.

Say you have a friend, someone you met in college. A classmate. You hit it off. She's intelligent, funny, slightly artistic, down to earth, etc. Over the course of a few months your friendship grows, a bond forms. Then one sunny day in May you notice a dotted line carefully drawn on your friend's right arm, a few inches above the elbow. Naturally, you ask, "What's with the line?" 

Your friend's expression darkens. She looks around anxiously before approaching you and whispering, "That's where I want my arm to be cut off."

Whoa. What do you say to that? What do you say to a friend you've seen do calculus integration and heard insightfully interpret the film There Will Be Blood, but who also wants her arm chopped off at a specific location because it feels "intrusive"?

Do you think she's crazy?

This is a real disorder by the way, called apotemnophilia. It has been studied in depth by a budding hero of mine -- renowned neuroscientist/psychologist Dr. Vilayanur S. Ramachandran. Apotemnophilia is a result of some unusual "wiring" of neurons in one's brain: One part of our cortex contains the construct of our body image -- how we see ourselves if we were to close our eyes or turn the lights off. In people with apotemnophilia, the part of the construct pertaining to their unwanted limb is missing. When they picture their body, their right arm is not included in the picture. Therefore they consider the limb an "intruder" deserving of amputation. Knowing the basic neurology behind it, is it fair to call apotemnophiles crazy? I don't think so.

Ramachandran at work. Please note how badass he is.

Using the term "crazy" is a classic example of humans simplifying something they don't understand. We often say someone is "crazy" if they are generally abnormal. The abnormality could be something biological -- like apotemnophilia. "Crazy" people could also be those who perform behaviors or harbor beliefs we see as odd or outlandish -- people with visible and unique differences from the rest of humanity who aren't neurologically askew (think "Crazy Cat Lady" or conspiracy theorists). We also use it to describe someone who's excessively demonstrative or irritable or stubborn -- someone with an extreme personality, most likely due to strong environmental factors in their life (e.g. heavy stress, intense pressure, tragic circumstances). In any usage, "calling crazy" sidesteps a deeper question. It's a way to explain something that seems inexplicable. It's a heuristic. It's convenient.

And it's bogus.

Let's go back to our thought experiment. Your friend just divulged that she desperately wants to rid herself of her arm because she feels it doesn't belong there. I'll bet your first reaction would be like "Okay, this person is weird." I know I would be taken aback. That's just not something you hear every day. 

But after the shock subsided, we should ask why -- why does she have this intractable urge? In every other way she is normal. To me, that's not crazy. It's not normal either, but clearly some abnormal neurology is at play. She doesn't deserve to be labeled as crazy; she deserves an explanation of why she wants her arm chopped off. Most importantly, she deserves treatment for her illness.

Consider mentally handicapped people. Do we label these people as retarded? Do we make fun of them for how stupid they are? Maybe as kids, but we quickly grow out of it because, as adults, we realize how absurd and cruel it is. Mentally handicapped people can't help that they are mentally handicapped. Similarly, mentally ill people can't help that they are mentally ill. So why do we still call them crazy or weird? We should realize that these mentally ill people need help, just like someone with a physical illness needs help.


Depression is one of the most prominent of these illnesses -- it's the leading cause of disability in the U.S. for ages 15-44.

In the ER where I work, I have heard patients labeled as crazy. It's not meant to hurt or harm anyone. It's lighthearted, intending to brighten the often dark and stressful mood. It can be fun to talk about how "crazy" this patient is acting and to laugh about it. It might even be necessary as a way of coping with stressful patients: the staff spend so many cognitive resources struggling to treat a difficult patient that few are left over to consider why that patient is difficult to treat -- what might be causing the "craziness," or what circumstances led them to act the way they do. But every time I interact with those "crazy" patients, they seem anything but. Many of them are living in absolute turmoil. Others have addictions that bring out the worst in them. Still many more are completely normal, and when I treat them reasonably, they instantly reciprocate.

When you think about it, we're all a little crazy. I talk to myself incessantly. I argue with myself. I make funny faces in the mirror and laugh hysterically (in the literal sense of the word). I scream songs in the shower. Sometimes I make weird noises just because they sound novel, followed by more hysterical laughter. Other times I converse with characters in video games. I even hallucinated once as a kid (in fact, as much as 10% of the population has hallucinated). It's these quirks that contribute to my individuality. How could I judge anyone who displays similar oddities, biological or not?

In my opinion, no one is truly "crazy", and we should stop using the word to describe people. Crazy doesn't tell us what's really going on with someone. It does the opposite -- it neglects the underlying physiology or environmental factors that produce the "craziness". Crazy trivializes. Crazy makes one's biology and social environment the butt of a joke, which in turn discourages those with actual disorders from telling people about their symptoms.

"Crazy" people don't exist. Disordered people exist. Mentally ill people exist. Extremely stressed people exist. The more widespread this notion becomes; the more understanding we become of "crazy" people; the more we recognize the pressures of a person's environment; the more we encourage those with illness to discuss their abnormalities and the less we judge/label/stigmatize them, the better off humanity will be. Instead of being afraid of persecution and ridicule, those suffering from disorders can be accepted for who they are biologically. They can receive treatment earlier and live a more "normal" life. Wouldn't you want that if you suffered from mental illness or extreme stress?

My plea is this: Let's stop taking the easy way out and labeling people as crazy. If someone is acting particularly strange, there's probably a good reason why, a reason that's likely beyond his or her control. And who cares anyway? Diversity is the cornerstone of our success as a species. It's the cornerstone of any species' success. Instead of admonishing, trivializing, fearing, or dismissing this diversity -- instead of labeling it as weird or crazy -- we should embrace it. Our differences make us who we are. They separate me from you. They define us. And they should not be grounds for ridicule.

No one is crazy. We're just human.





This cat, on the other hand, is completely bonkers.



Patiently edited by Ken McGurran

Friday, December 13, 2013

To read, or not to read?

As most of you know, I have graduated from college and thus am no longer enrolled in any classes. Yet my thirst for knowledge has not been sated by any means. In fact, it has grown exponentially over the last few months. To quell this famished monster inside of me, I have started reading at an unprecedented rate.

Since August, I have read 18 books, 3 plays, 135 comic books, a handful of sonnets, and innumerable articles online, mainly concerning news in science, technology, economics, politics, video games, and music.

And there are still tons of books out there, yearning for me to turn their pages, flex their spines, and inhale their oddly pleasant aromas.

Reading incessantly has led me to make some interesting observations concerning changes in my behavior and abilities. But are my observations real and backed by research, or just anecdotal? Let's find out!

Observation #1: Articulation

The first change I noticed was an increase in my ability to string together fluid sentences in conversation and procure the "right" words more readily. It doesn't happen every time I open my mouth, but it certainly happens way more often than it used to. I actually have always been somewhat of a clumsy speaker, often stammering and tripping up on my words, unable to succinctly and accurately translate my thoughts into spoken words. Now I'm finding that, regularly, the words just flow out of me and everything I'm thinking seamlessly transitions to everything I'm saying (this effect is greatly facilitated by coffee). That's not to say I've ceased stammering -- would that I had!

(It's also important to note that the effect isn't as noticeable in my writing. I have always been better at putting my thoughts on paper than speaking them. Really, the only change has come in my slightly expanded vocabulary and an increased rapidity of my writing, which, ultimately, makes little difference.)

And the Research says...

The most comprehensive study I found said that reading -- or print exposure -- was highly correlated with vocabulary, cultural knowledge, spelling ability, and verbal fluency. In their landmark study, researchers Keith Stanovich and Anne Cunningham (1992) showed that someone who is exposed to a lot of print (i.e. reads often) is more likely to have a larger vocabulary, more cultural knowledge, better spelling, and a better ability to write and speak (i.e. is more articulate!) than someone who is not.

WARNING: Correlation is not the same as causation -- this study can't tell us that reading directly caused these increases. But there is clearly a relationship there.

Shakespeare was a singularly articulate person (that is, if he was indeed just one person).

Another study conducted by Stanovich (1995) found that "...exposure to print was a significant predictor of vocabulary and declarative knowledge even after differences in working memory, general ability, and educational level were controlled. These results support the theory of fluid-crystallized intelligence and suggest a more prominent role for exposure to print in theories of individual differences in knowledge acquisition and maintenance." In other words, the amount the subjects read was again strongly correlated with their vocabulary and their amount of "declarative knowledge" -- knowledge that can be declared, like facts and things of that nature.

Also, a meta-analysis of 20 studies showed that students learned about 15% of the unknown words they came into contact with while reading. So there's a bit of causality there.
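To put hypothetical numbers on that 15% figure (every number here except the 15% is invented by me, purely for illustration): say an avid reader gets through a million words a year -- roughly a dozen novels' worth -- and 2% of those words are unfamiliar. That's 20,000 unknown words encountered, and learning 15% of them works out to about 3,000 new words a year, absorbed without ever cracking a vocabulary workbook.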

But, to answer my question above: no, reading does not directly make someone more articulate. But it is strongly correlated with an expansive vocabulary, spelling ability, and heightened verbal fluency. Read more often, and you may find yourself naturally using bigger words and interjecting more facts into your everyday conversations.

Observation #2: Increased Empathy

Another thing I found pleasantly peculiar is that, when watching a character of some kind on TV or in the theater, I can now more easily understand their emotions, get inside their heads, feel what they feel, and think how they think. That's not to say I do this to an astonishing degree, but I frequently get flashes of emotional insight when watching...well, anything! From emotion-heavy films like 12 Years a Slave with powerful protagonists to narcissistic buffoons like Dennis Reynolds from It's Always Sunny in Philadelphia, it's almost easy to slip into their skin and experience what they're experiencing.

Dennis (portrayed masterfully by Glenn Howerton) is such a great character. Highly recommend Netflix-ing this show.

And the Research says...

Indeed, reading does increase empathy! But only literary fiction does the trick. A very recent study found that reading literary fiction temporarily expanded people's ability to feel and understand others' emotions. That means reading Twilight and 50 Shades is not going to cut it, whereas works like Lord of the Flies, The Great Gatsby, and The Catcher in the Rye (all fantastic books, by the way) will give you an increased ability to experience what someone else might be going through. Note that the effect was temporary, however, and it remains to be seen if reading loads of literary works over a long period of time will make someone permanently more empathetic. The study also implies that, more broadly, thinking about any work of art will produce this effect. So if you want to feel what others are going through, go to an art museum and/or read some Hemingway!

Observation #3: Ability to Abstract

When I used to read things, especially literature, my logical, straightforward brain would rarely (if ever) be able to see the "bigger picture" of the story. I was thematically inept, useless when it came to questions like "What was this story about?" Nowadays, more and more themes and meanings jump out at me while reading. After finishing Lord of the Flies, I instantly knew what it was about and what the broader message was (I won't spoil it because I think it's something you, Faithful Reader, should really read. Excellent piece of literature.) This ability also extends to most art, whether it be paintings, photography, video games, or film. My ability to think abstractly has definitely expanded since I started reading like a man possessed. It also has generally made me more creative, albeit in a pedestrian and sometimes silly way.

And the Research says...

Through the dozens of articles I perused, none answered this question definitively. However, I did learn that reading increases one's ability to think in general.

Just this past September, researchers finally showed that reading does help kids do better in school. Not only reading, but literacy activities such as writing and listening to words/books being read led to significant success in school, even in the subjects of science and mathematics. As one researcher pointed out, "We found that as the amount of reading increased, the students who weren't very good readers had more and more difficulty with the math and science items. Reading is crucial to success in school. It's the glue that's holding it together."

Another recent study by some researchers at Stanford had students read Jane Austen (renowned author of Pride and Prejudice, among other classics) while an fMRI machine imaged the blood flow in their brains (i.e. the researchers peered inside their heads to see which areas were receiving the most blood flow, hence which were most active, and observed the overall activity of their brains). What the researchers found was that, apparently, attentive reading "requires the coordination of multiple complex cognitive functions." Reading really gives your brain a workout.

In the process of writing this blog, my routine browsing of the internet led me to another study by MIT which showed that even the best schools don't improve students' abstract reasoning ability. My interpretation of this: if the brightest students -- who, it's reasonable to assume, were exposed to a lot of print in their early years and had tons of books at their disposal which helped them become exemplary students -- aren't better at abstract reasoning despite being great readers and great in school, then reading is probably not related to better abstract thinking.

My boy, Sir Isaac Newton, may have been the best abstract thinker of all time. Dude was truly brilliant.

But, one study did conclude that reading literary works led to an increased ability to think critically and analytically. Not quite abstract reasoning, but those types of thinking definitely help in subjects like math and science.

Beyond my own observations, I found an overwhelming pile of research that fleshed out the myriad benefits of reading. Here's a list of a few I thought were of particular note.

Other Benefits of Reading:

  • Reading reduces stress. The act of reading is inherently relaxing (most people sit in a leisurely position to do it). But reading for just 6 minutes can reduce stress levels by up to 68%, outperforming listening to music, going for a walk, and drinking some tea.
  • Reading is good for your health. Mental health aside, the more you read, the more likely you are to come across bits of information that will help you live a healthier lifestyle. 
  • Reading can help counteract the effects of aging. As we age, our mental capabilities decline. But remember, reading is a workout for your brain. Just as physical exercise can prolong the health and youth of your body, reading can prolong the youth and efficacy of your brain!
  • Reading improves your writing ability. This one's pretty obvious. The more writing you read, the better you'll be at writing something yourself.
  • Reading is free entertainment. No need for research on this. There are literally thousands of books at our local public library waiting to be read by you, for free!

An artist's rendering shows computer stations at the new BiblioTech bookless public library in Bexar County, Texas.
But maybe not for long? Texas just recently opened an all-digital public library. Is that the way of the future? 

Back to the first question posed in the title: To read, or not to read? Is this even a question? The benefits of reading are seemingly endless. Yet how many people regularly, actively, intently read? The average American watches 34 hours of television a week. What if the average American read literature 34 hours a week? What would that America look like?

Do yourself a huge favor and make a conscious effort to read more. I understand many of you are in school and are probably (hopefully!) reading textbooks and various materials daily, as well as writing papers on said materials. Instead of being bogged down by these tasks, the next time you open that textbook, think of all the good that getting through one chapter will do for your brain and health. Don't think of it as something you have to do; think of it as an opportunity to expand your vocab and become a more knowledgeable and capable thinker.

On top of textbooks, try and find time for some literary works. Reading these, as we've seen, will increase your ability to think critically, which will undoubtedly help you out come test time. You'll also become more empathetic, which is always a good thing in my opinion. And they are just plain enjoyable to read.

The message here is pretty plain: Make a habit of reading. It'll do your brain and body a whole lotta good.


          


Thanks for reading.




Edited by Ken McGurran


Sunday, November 17, 2013

Slacktivism: "Like this if you think disasters are bad!"

[Written Thursday, November 14]

Last Friday, Typhoon Haiyan commenced its path of devastation in the Eastern Philippines, savaging structures and homes with torrents of water and wind, literally tearing families apart, drowning thousands, displacing thousands more. This was a true nightmare.

And the nightmare is still ongoing.

Satellite image of the super typhoon. Monstrous.

To put the storm in perspective, Haiyan was 3.5 times stronger than Hurricane Katrina, the storm that ravaged the Gulf Coast, most famously New Orleans. Despite as many as 700,000 Filipinos heeding warnings of the impending disaster and evacuating their homes, over 2,000 people have been confirmed dead, and some officials believe as many as 10,000 have perished.

On top of the tragic body count, thousands upon thousands of people have lost their homes, livestock, possessions, crops -- their entire way of life. As one tearful resident of the town of Guiuan told CNN, "Everything, everything's gone."

One of the many horrifying scenes of Tacloban, the capital of the Philippine province of Leyte.

Sometimes, "everything" includes family. One man (among hundreds of others I'm sure) desperately searches  for six of his family members among the wreckage. He recounts his horror to ABS-CBN. "We all got separated from each other when the strong waves hit... We got separated. I couldn't even hold on to my child."

"I couldn't even hold on to my child." Can you even imagine?

The storm has spurred millions of people into action, doing their small part to help: donating, volunteering as operators, helping with searches, distributing supplies, etc. But it has also prompted millions more to passively tweet things like, "Thoughts and prayers going out to the victims in the Philippines!"

"Help! Food. Water." They clearly need more than our thoughts and prayers.

There's absolutely nothing wrong with this gesture. Actually, it's admirable. It shows that humans are caring, empathetic, and sensitive to death and destruction. But it's only a gesture. That's the problem.

I recently read about this study on 'slacktivism.' I encourage you to read it, but for those of you who will inevitably neglect to do so, I'll summarize. Or rather, Kirk Kristofferson, a PhD student at British Columbia's Sauder School of Business, will do it for me with this quote:

   "Our research shows that if people are able to declare support for a charity publicly in social media it can actually make them less likely to donate to the cause later on."

That's bad, bad news for charities. This implies that people who "like" the Red Cross on Facebook are actually less likely to donate to it because they publicly declared their support of the organization. I like to call this social hypocrisy: Saying you do something over social media and then proceeding to rarely, if ever, do that thing in your life. The motivation for this is simple. When someone posts something over social media, the public has access to those claims and they think, "Hey, that dude's a super nice guy." The poster feels good about himself because he displayed good intentions and revealed his admirable nature. What's more, he likely posted this in private, on his phone or tablet or computer. No one is there to see his words manifest as actions. Thus, feeling accomplished and free from any obligation, the poster goes on to never do what he said he would.

I would say most of us are guilty of this -- myself very much included. I've tweeted the tweets, retweeted them, "liked" the Facebook pages, and made all the admirable gestures and emotional sentiments we come to expect following a tragedy. But I never once donated a dime or volunteered an hour of my service to these tragedies.

                   

Not until the words I couldn't even hold on to my child started resonating and reverberating in my head, unable to be ignored, much less forgotten. So, for the first time in my life, I decided to act. I donated money to the Philippine Red Cross.

It felt pretty great.

This post hits on another prevalent theme in my life right now, something I'm learning about, practicing, getting better at, in my pursuit of becoming a physician: altruism. There's a lot to be said about this word and what it means, too much for this post alone.

The most important thing about altruism is that it has been critical to humankind. Hunter-gatherer societies had to be collaborative: they had to share food, work together to take down game, drag it back to camp, skin it, and cook it, and they had to stick together to fend off predators. Otherwise they wouldn't have survived, and I wouldn't be typing this right now, and you, Faithful Reader, would not be reading it. Humans thrived by being altruistic. It's literally in our DNA. So why have we gotten away from that principle? Why aren't we more altruistic?

One possible (and likely) answer to that latter question lies in the nature of our "altruism." I think we -- and by "we" I mean "me and my generation" -- are less traditionally altruistic. What I mean is that we as a whole don't act selflessly for the good of other people, but instead find other ways to be "altruistic." I would argue that the act of "liking" or creating an altruistic Facebook page/post or tweeting/retweeting an equivalent tweet rewards that person, gives them a certain satisfaction that in turn makes them less likely to actually act on their altruistic instinct. On top of that, clicking your mouse a couple of times is considerably easier and less financially and metabolically expensive than going out and helping those in need. It's like having your cake and eating it too: a gleeful disposition for feeling like you did something meaningful while expending little to no resources.

This is a problem. This is not true altruism. This is slacktivism.

I'm not pressuring you to follow my example and donate now. I'm not saying "look how good I am for giving money, everyone!" I'm not calling you out. I am only making a sincere request. Please don't be a slacktivist. We are all in this world together. We all come from the same ancestors. We occupy the same delicate earth. We could have just as easily been ravaged by Typhoon Haiyan or Hurricane Katrina as those humans who were. And I think we owe it to those unfortunate humans, those most distant relatives, to help them in their time of need. Not because it feels good, not because your religion compels you, certainly not because it will elevate your public status or moral standing, but because they are human, and so are you.






Edited by Ken McGurran

Friday, November 1, 2013

Why I Love Snapchat

Those of you who know me well are aware of my mild disdain for social media (which seems hypocritical since I have Facebook, Twitter, and a blog). This is particularly true for image-heavy fads like Vine, Instagram, and Pinterest. Initially, I found the idea of a "selfie" to be inherently vain and slightly abhorrent.

Then I downloaded Snapchat. Oh, the selfies that ensued...

You Faithful Readers must have heard of Snapchat by now. It's an app that allows you to take fleeting pics or videos, accompany them with a limited amount of text or crude digital paint, and send them to any number of your friends for a designated number of seconds (1-10). Once time runs out, however -- and herein lies the beauty -- that image or video, innocent or incriminating, is lost forever. Just gone. Period. (Unless your friend(s) displayed quick enough reflexes to capture a screenshot of said Snap.)

My reflexes failed not in capturing this precious Snap of my friend Joel.

This is truly an ingenious idea, and a fad I never thought I would succumb to. Nonetheless, the more Snaps I send, the more enthralled I become with that seductive app, for several reasons.

Reason #1: Silly, yet intimate.

I am not a silly person, instead tending toward introversion and introspection. That's not to say I'm never silly or don't like to have fun in the more traditional sense of the word, nor do I think that silliness is always inappropriate. In fact, I think acting silly from time to time is greatly beneficial to one's life.

We all know that laughter is the best medicine, but I also think that showing a silly side of yourself -- giving friends and family a glimpse of just how silly you can be at the expense of your pride or dignity -- demonstrates how much you value those friends and family. What I'm trying to say is that by acting silly through Snapchat, you're intimating to your friends/family that they are close enough within your social circle to see an embarrassing side of you, in turn causing them to value you more as a friend, making them more likely to reciprocate said silliness, and thus strengthening the social bond between you.

        
This is Drew. He's something of a Snapchat prodigy.

Through Snapchat, I've been able to show my friends my capacity for ridiculousness, something I normally don't display in public settings. In return, many of my friends Snap me back with equal (and often greater) silliness, causing me to feel closer to them. It's refreshing and fun, and since those images people send disappear forever afterward, there is no limit to how silly (or vile) Snaps can be...think of the implications! [Note that these points I'm making are based solely on my intuition/experience and are not (to my knowledge) backed by research.]

    
Certainly not my most ridiculous -- those snaps have thankfully been lost forever.

Reason #2: Face-to-pixelated-face communication.

Going along with the first reason, I feel that Snapchat is just an inherently more intimate way to communicate for the sole reason that it involves visual imagery. When you send a simple text to someone, emotion and nonverbal communication are "lost in transmission" so to speak. Even when you're on the phone, you still can't see the person you're talking to, creating a sense of distance and separation. This depersonalizes the conversation, thus resulting in a less intimate exchange. When you add even a low quality, ephemeral image to your text, the conversation becomes much more personal, enhancing the social connection between you and your friend.

Just look how much Brady misses his "boyz." So sincere and cute...

This might be Snapchat's greatest asset: the simulation of face-to-face communication. Snapchat allows users to simulate emotions (like excitement, sadness, cheerfulness, rage, etc.), make funny faces, or create videos that explicitly show what they are up to (instead of just texting back "Nothing much, what're you up to?"). All of this serves to enhance the intimacy of this digital conversation beyond what a text or phone call can offer. One shining example from my personal life attests to this: My old friend and neighbor now lives in Bismarck, and after he moved I hardly ever saw him and rarely talked to him. Now, with Snapchat, I "see" him almost every day and feel almost as close to him as I did back in the good ol' days.

Again, these are only observations made by me (and most likely many others) through extensive experience with the app and not backed by any research.

Reason #3: All fades, given time.

The third reason is undoubtedly what led to the explosion of Snapchat's popularity in the first place: Time limits. Having a set amount of time before that Snap fades into oblivion allows people to express themselves in ways texts or picture messages never could. People can communicate secret information without fear of being found out because once time's up, that savory piece of gossip or address to the exclusive party will be gone forever.

Furthermore, the time limit speaks to our more...sinful nature. I won't even begin to speculate on the number of lewd, incriminating, sexually charged Snaps that have been sent since the app's creation (none by me, I assure you), but I imagine it's an obscene figure. Nevertheless, couples can communicate more freely and explicitly than ever before, which no doubt fuels each other's sex drives and in turn leads them to be more sexually active, which is usually a good thing for relationships. On the other hand, I'm sure many screenshots have been taken of these revealing Snaps, promoting blackmail, exposing infidelity, and ultimately leading to the destruction of relationships. But I think the pros of the time limit outweigh the cons. In general, this sort of "anything goes" mantra that Snapchat boasts obliterates any limit to creative and experimental communication, which I believe is a good thing.

Overall, I think Snapchat deserves a fair shot from anyone with a smartphone, and I hope I did a good job of persuading you of its value. Continuing my recent theme of open-mindedness, I ask you all to give it a try and see what you think. You (probably) won't regret it!

Drew certainly hasn't.


Follow me on Twitter (@Elder_Bass) or find me on Facebook to stay updated on future publications.


Edited by Ken McGurran

Wednesday, October 16, 2013

Gravity, the non-Newtonian, film kind.

Before I start, note that this post will contain no spoilers for the film Gravity, at least nothing one couldn't glean from the trailers.

Also, please go see Gravity.

I remember the first time I saw a trailer for the film Gravity. I may as well have bought my ticket then, not just because of Clooney's beautiful almond complexion, but because of the space setting (for which I hold a strong fascination) and the seemingly incredible visuals. Then the end of the trailer revealed certain 3-dimensional intentions...like 80% of movies nowadays. Despite rolling my eyes, I wasn't dissuaded from the prospect of seeing it; I figured a 2D version would also be available.

Though I wasn't completely opposed to seeing this in 3D...

The weekend it came out I was in Minneapolis and didn't have a chance to see it. Consequently, other people saw it before me, namely a film buff friend of mine who holds a notoriously cynical and blunt worldview. Thus I was taken aback when he told me to see Gravity in 3D. Like literally taken aback.

Let's take a minute to flesh out everything I loathe about 3D "films."

1.) 3D movie tickets cost more money.

--My inherent frugality steers me clear of any 3D version of a desirable film. The added "immersion" doesn't justify an extra $2+ from my wallet.

2.) 3D might be the biggest media gimmick in recent memory, a clear scam, and an annoying fad.

--I don't think it's any secret that slapping 3D onto a movie is a sure way to sell more tickets; certainly studios know this, otherwise they wouldn't do it ALL THE EFFING TIME. It's a common misconception that 3D movies are expensive to make; adding 3D is actually pretty cheap relative to the increased revenues from doing so, only costing an average of 18% more in the budget. Adding 3D to a "film" is just an easy way for studios to increase revenue, because we lemmings buy into the "immersion" factor.

Exhibits A, B, and C: Jonas Brothers: The 3D Concert Experience. Just a straight money grab if I've ever seen one.

3.), 4.), and 5.) 3D does not add to the movie-going experience, and actually detracts from it by giving you a headache. Also, wearing the glasses tints the picture slightly, making the film artificially dark.

--This is definitely my most subjective, opinionated reason for hating 3D, even though it's backed by research. On average, moviegoers report that 3D does not enhance the film and can actually cause discomfort in the form of a headache -- though obviously some people will claim it does enhance the experience; otherwise, how would 3D films keep making money?

I'll stop there. 

The only time 3D impressed me was the first time I actually saw a film in 3D -- the premiere of Avatar. It was 2009, I was a freshman in college, and all my dorm-hall friends had succumbed to the hype surrounding James Cameron's magnum opus, a film literally 15 years in the making. Slightly jaded towards mainstream movements as I was (as we learned in the last Harry Potter-themed post), I begrudgingly went along with my friends to the film, and attended it with a clear bias against it (which I regret in hindsight, as I strive to become less and less biased...even though my bias came true and the film itself was terrible).

But, it honestly was the most visually striking film I had ever seen. What Cameron's script (utterly) lacked in imagination, Pandora's flora, fauna, and landscapes more than made up for. His colorful world came to life in 3D, seducing my optic nerves. Though reluctant to admit it to my friends immediately after the film, I had been immersed in Avatar.

This scene was particularly beautiful. I forget the context but will always remember the visuals.

Then every movie started using 3D and I actively avoided all of them.

Until Gravity. 

(Back to present day)

Shortly after hearing the recommendation for seeing Gravity in 3D, a couple friends and I shelled out the extra cash for some 3D tickets, donned our glasses, and were consequently captivated.

I'm no film expert, but I've seen my fair share, and I can tell you I don't think I've ever seen a film more perfectly paced in its action and intensity. Top-notch directing and some of the most gripping, white-knuckle scenes I have ever sweated through. Bullock had a rocky start, but (my dislike for her hesitates to admit this) she finished with a wonderfully strong, emotionally charged performance. Clooney was beautiful, as always. His acting was pretty solid too.

But what really made the film for me was the sense of depth added through the 3D. Think about it: space is a vacuum, and everything in orbit is effectively weightless (technically it's all in free fall, but close enough). Objects "float" in space, and when a force acts upon an object, said object will "float" indefinitely into the abyssal expanse of stars and planets. Cuarón used 3D to intimately convey this concept in an unprecedented way. Exploding debris flew past my eyes while the astronauts tumbled helplessly within frame, that indifferent and stolid blue sphere pervading the background all the while. These layered shots make the viewer feel so close to the action and at the same time give a sense of the infinite depth of space, heightening the "gravity" of each scene and thus eliciting even stronger emotions.

What's more, many pieces of the debris in the film have symbolic weight as they flutter passively around the wreckage. Peaceful shots focusing on these objects amid the shambling remains of shuttles and satellites evoke even more emotion when viewed in 3D.

My friend Ken said it better than I ever could: "The idea here is obviously that even high order cognitive processes (i.e. interpreting symbolism) can be enhanced by somewhat simplistic measures that target sensation. To be poetic, 3D helps to blend the visceral and the rational. More intense sensory experiences can lead to more intense perceptual/cognitive ones." More simply, he's saying that the use of 3D in this example enhances our sensory experience, which in turn enhances our brain's reaction to that sensory input, causing more neurons to fire and thus allowing us to have a more profound understanding of the emotions and symbolism within the film.

In summation: while viewing this film I obtained such an intimate feel for the vast expanse of space, and that feel added to the emotions of despair and helplessness the film continually (and masterfully) instills in the viewer. When I wasn't holding my breath, I kept saying to myself, This is how you use 3D.

Gravity turned out to be one of my favorite and most memorable moviegoing experiences, and as a result I have definitely become less cynical about the use of 3D in movies -- though not by much. Again, my message for this blog lands on open-mindedness: I flatly dispute that Gravity would have been as awesome in 2D as it was in 3D, and I almost let my "refined" taste in film prevent me from fully experiencing -- in my opinion -- Alfonso Cuarón's best film (and he's made some good ones).

Seriously, go see Gravity, but see it in 3D...

...it's so good.

Thanks for reading.


Edited by Ken McGurran

Wednesday, October 2, 2013

A Lengthy Life Lesson, Taught By Harry Potter



I have finally done it. I have read Harry Potter, J. K. Rowling's imaginative epic, in its entirety.

And it took me way too long to do.

I remember when the Harry Potter craze first reared its enormous head like it was last year. Coincidentally, that year, 2002, was one of the most influential and crucially developmental years of my life. *Initiate sequence: LIFE STORY*

2002 marked the first year I would attend public school. I had been home schooled before, but my mom decided she wanted to start teaching again, thus I was enrolled in sixth grade at good ol' Valley Middle School. As fall approached, emotions like trepidation and excitement swapped laps in my brain.

Public school meant having actual teachers with deadlines and homework and various classrooms scattered throughout the hallways of a vast school building. But more than that, public school meant trying to fit in. Even as an 11-year-old, I remember having an acute sense of self-awareness, and all sorts of anxieties popped into my head, the most pressing being How would kids view me as a scrawny home schooler? My only solace, as it happened, was also my only good friend at the time -- a fellow home schooler who would also be new to the school. Logan had far more charisma than me, though; he made friends like Nicolas Cage makes movies. I thought that by sticking with him I would be okay; we were in it together, and we would survive together.

But he and I had zero classes together that first year. I was thrust into a foreign (and potentially hostile) land, surrounded by strange faces that displayed strange mannerisms and spoke with a strange lexicon. Sure, we had things in common, like sports, Spongebob Squarepants, and Dragonball Z, but I hadn't the faintest idea how to conduct myself in a way that would appear relatable, normal.

I slowly learned some social norms that first semester, and even made a few good friends (some of whom I am still friends with today). As Christmas approached, plans for a seasonal field trip to the movie theater came with it. We had the option of seeing Santa Clause 2 -- the sequel to the raucously delightful The Santa Clause, starring none other than Home Improvement's Tim Allen -- or Harry Potter and the Chamber of Secrets.

If you've ever seen Tim Allen act, you'd know how hard of a decision that was for me.

Of course, I had heard of Harry by then, albeit by accident. My across-the-street neighbor, Trevor, was friends with another kid our age, Brett, and Trevor asked me one day as we were hanging out if he was alone in thinking that Brett bore a striking resemblance to Harry Potter with his glasses on. I gaped at him, confused and unsure how to respond. An awkward pause later, I simply asked, "Who's that?" with a furrowed brow. It was their turn to gape.

They told me all about the books and the movies and the general magic surrounding that unlikely hero.

I shrugged it off at the time. I wasn't much of a reader growing up, and I never really engrossed myself in anything other than football and Super Smash Brothers. Having grown up so close to my brother, who is three years older than me, I feel I aged with him, matured before many kids my age. This premature maturation (does that even make sense?) left me indifferent to juvenile fads, which is how I saw this Harry Potter character. All that talk of spells and wizards and monsters sounded like trivial, outlandish child's play (bear in mind I was unnaturally arrogant for an 11-year-old, which added to this sense of superiority). Looking back, I think I thought I was above Harry Potter, like a kid who no longer believed in Santa Claus (not the Tim Allen Santa Clause -- that I could totally believe in). I scoffed at those foolish enough to buy into such rubbish.

I also think part of that indifference had to do with my inherent tendency to avoid mainstream fads, not unlike a present-day hipster. If something was already huge, I didn't want to be a part of it. This extended to pop music (Britney Spears, Blink-182, etc.), Pokémon/Yu-Gi-Oh, even khaki shorts (I wore exclusively Jorts in middle school). I wanted to pioneer a fad, not jump in after everyone else knew about it, and, again, I think this was partially due to my pride.

So, unsurprisingly, I was one of the few students who chose to watch Tim Allen over Harry Potter. And I grew up remaining ignorant of Harry, Ron, Hermione, Neville, Loony Lovegood, The Dark Lord, and all the mystery, misadventures, and magic that followed them.

I wholeheartedly regret this ignorance.

In college I met a host of new people, many of whom gave me that same gaping look Brett and Trevor did all those years ago when I said I had neither watched nor read anything Harry Potter. After hundreds of recommendations, I started teasing the idea of delving into the novels at long last, but I was sidetracked by school, other novels (e.g. A Song of Ice and Fire, which I would recommend to anyone), and life.

Then, this past August, I decided to embark on a road trip to L.A. with my friend Aaron. The drive down would take 26 hours, and I anticipated I wouldn't have enough music or topics of conversation with which to sustain my sanity for such a long trip. Thus, I started looking for books I might listen to. After a short time thinking, my mind fell to the elusive Harry Potter series, something I knew would be an easy, engaging, and entertaining listen that would help while the hours away.

L.A. was pretty sweet btw.

At first, I listened to the books as a cynical adult (Aaron and I share a tendency for scrutiny and sarcastic humor, so his presence enabled this). We criticized, nitpicked, looked for inconsistencies and flaws, tweeted our misgivings, but despite this hardened approach I still felt a sense of enchantment hang around me every time the narrator illustrated the scenes and characters. There was just something about that world Rowling crafted that drew me in, seduced me off of my high horse and served me a warm and inviting drink that made me smile after every sip.

By the third book, I had stopped criticizing and instead allowed myself to jump fully into the magical universe. I became a kid again. I tore through the books, reading each more fervently and relentlessly than the last. Sure, they were flawed, leaving a wake of unanswered questions, but I was all in, man, head over heels, truly immersed. This was exactly the kind of series I would have enjoyed as a teen, I knew: action-packed, shrouded in mystery, riddled with riddles and puzzles, and accurately portraying the inexplicable complexity of teenage love.

And when I finished reading the last words, I felt that inevitable sense of gleeful remorse that accompanies the end, that troubling yet comfortable sense of finality one feels after completing any worthwhile book series, but the emotion was so much more palpable and poignant than usual. Regret washed over me, wave upon crashing wave. Why hadn't I read these as a kid? What would my life have been like had I done so? How might I have changed, whom might I have befriended, how many Potter Lego sets might I have purchased? I kept feeling like I had missed out on something truly special, something rare that only my generation could fully appreciate and experience to the greatest extent because it started with us and because it ended with us -- because it was written for us.

But I refused that gift; I shoved it aside in my pride. In doing so, in closing the gates of my mind to that world of magic, I think I robbed myself of a once-in-a-generation opportunity, and my childhood was less bright because of it.

This realization has taught me my lengthiest life lesson to date: Don't let pride prevent you from opportunity. Don't shut your mind to things that seem different. Don't not do something just because you don't think you'll like it. Because that thing, that opportunity, it might just change your life for the better.

I think our boy, Alby, said it best (not Albus, the other one, Albert).






"The mind that opens to a new idea never returns to its original size."
                                                       --Albert Einstein












Keep your mind open, everyone; you will undoubtedly be better off if you do.


Edited by Ken McGurran