Conversation: Choice & Feeling

On April 6, 2011 at 12:31 a.m., I posted the first draft of what will eventually become the fifth story in the “Making and Empathy” chapter of the book “Realizing Empathy: An Inquiry Into the Meaning of Making,” a story about my experience in the metal shop. This is an edited version of the conversation that followed, regrouped and rearranged for clarity and relevance. Click here for the previous installment, which talks about empathy and mastery.


an-lon: (Smiles) Happens to me all the time when drawing and editing, squinting at it and wondering what’s wrong, and 90% of the time, whatever’s wrong is completely orthogonal to all the directions I was previously searching.

slim: The feeling of hindsight obviousness intrigues me quite a bit. I remember being dumbfounded when my friend shared her story of how she overcame her bipolar disorder. She said she finally realized that she had the power to choose not to be depressed. She told me that it was so obvious in hindsight, that she couldn’t understand why she didn’t realize it before. But the reason I was dumbfounded was that I wasn’t depressed, yet I had never realized that, either. I can choose how to feel? That was a completely novel thought.

Since then, I’ve heard many people say things like “we always have a choice.” But I think it’s imprecise to say that we always “have” a choice. I’m sure it took them a lot of struggle to come to that realization. So what they mean is that we have to become aware of the choice. Or more precisely, we have to “develop” and “make” a choice that wasn’t available to us previously. That can take quite a bit of effort. It’s not just a matter of “snapping out of it.” Once you’re able to just snap out of it, you’ve already learned it.

an-lon: Ironically, Slim, you knew me during a period when I was genuinely depressed. When I attended the International School of Beijing (isb), I was really alone and struggling. Beijing was my first time living in a big city and I experienced culture shock and extreme loneliness.

I was functional — for where I was at the time, I was pretty convinced I’d just get yelled at if I admitted I needed help — but I remember sleeping 10 hours a day because I just didn’t want to wake up, and making a deal with myself that I’d allow myself to contemplate suicide if college wasn’t better. Don’t get me wrong, I wasn’t actively suicidal. It was just my way of mentally kicking the can down the road. I truly have no idea what it was like for your friend.

I think there are links between add and depression, but I don’t think I was ever truly chemically predisposed to depression in the way a bipolar person is. In my case, I was depressed first because I was trapped in a small town — before Beijing — then thrown into a big city — Beijing — with no coping skills.

College and D.C. introduced me to the world, and I was fine after that. But I do know from those high school years exactly what depression is. I had plenty of roller coaster ups and downs in my twenties, but nothing like depression. Nothing like that soul-sucking lethargy of my teens.

Unfortunately, I can’t say the same of the past few years. The allergies are a long story, but basically a year into my stay in L.A., I started experiencing mysterious symptoms: a sore throat that wouldn’t go away for two months and an overall lack of energy. It took many trips to various doctors to figure out what was going on. I’d do something that would help for a while, then get flattened by some new mystery ailment.

The infuriating thing was, it was never anything huge — I’d just be sick and tired all the time, because when you’re not breathing well, you’re not sleeping well, and when you’re not sleeping well, you’re not living well. After a while, this changed my identity, from an energetic, enthusiastic person to one who carefully rationed her energy.

This also made me realize that perhaps that enormous physical energy was all that had held depression at bay through those 18 years between high school and l.a. I kept the demons away by constantly chasing after new pursuits, which was great, but what I didn’t know was that if you take away the physical energy, the scaffolding that remains is a house of cards.

Thing is, during the healthy decade of my twenties, I’d taught myself to push through fatigue, frustration, and fear. Athletics are a good example of this; you learn to recognize when to push through pain and when to rest. You know the Nike slogan “Just do it”? Well… yeah. Just do it. And with computers, I’m sure I don’t need to explain how stubbornness pays off. Damn. I pushed hard in my twenties, but I scored a lot of victories, too.

The allergies-and-depression cycle of recent years is a bit hard to explain because I really can’t just blame the allergies. There was a breakup, job angst, and moving to a new apartment. But I’ve coped with all of the above before, and there were good things going on in my life, too. It was all incredibly frustrating because while I definitely recognized the symptoms of depression from that extended period in high school, I could not figure out why it was happening again and why I couldn’t just snap out of it.

As with that period in high school, I never stopped fighting. I never stopped going out and doing what I wanted to do. But I did cut back. There was always this triage of what I had energy for and what my priorities were. In my twenties, I just did it all. These past few years, I hit a point where I couldn’t — I had to make choices.

I’m still convinced that the only reason I snapped out of that depressive period — I can’t truly call it depression, but I felt like I was always close to the edge and could never quite get any distance from it — was that I finally got the allergies under control. Exercise and nutrition are a big part of it, but so were allergy shots and an immune system booster vaccine.

No silver bullets, but basically I feel like myself again after having had to walk through sludge the past three years. I’ve kind of forgotten how to run, but at least I know it’s possible again. (Smiles) I spent three years trying to choose not to be depressed, but the fog refused to lift until I finally got my physical health back.

Did I do it all wrong? Would therapy or medication have gotten me over it sooner? I just don’t know. And I perhaps never will. I’ve been playing these past six months entirely by ear. I do feel safe in the assumption that as long as I have my physical health, my mental health is also safe. But I no longer take it for granted. And I also realize that the madcap coping mechanism of my twenties — constantly sprinting — literally, when it came to ultimate frisbee — probably wouldn’t have lasted forever anyway.

One thing that tends to not work is trying to will yourself into being more organized/disciplined/attentive. That tends to be a recipe for failure, with all the voices in your head yelling at you for being such a lazy slob and a waste of space. What does work is finding clever ways to set things up such that it’s a downhill slide instead of an uphill battle — in essence, coming up with a system that makes the good behavior easy instead of difficult. It’s like the judo trick of using the other person’s momentum for a throw, rather than trying to absorb the force of their blow directly.

slim: Indeed. I also think the kind of support structure or environment you’re talking about is essential. Although, I would rather use words like “encouraged,” “supported,” or “amplified” to describe the qualities afforded by such an environment over “easy.” I think there is a significant difference between something being easy vs feeling at ease when you’re in relation to something.

Conversation: Ethics & Computers

On March 19, 2011 at 9:28 p.m., I posted the first draft of what will eventually become the Preface in the book “Realizing Empathy: An Inquiry Into the Meaning of Making.” While much has changed since then, I wanted to share with you this edited version of the conversation that followed, regrouped and rearranged for clarity and relevance. Click here for the previous installment, which talks about computers and acting.


joonkoo: I’m wondering if you should make a clearer definition of the user here. For example, is the user a computer programmer using the computer or just an ordinary John or Jane using the computer? I understand that knowing the exact mechanics or physiology of the computing system may tremendously expand the user’s perspectives, but I also imagine that there would be some considerable costs to learning those mechanisms. Would my mother, a middle-aged lady with few digital friends, ever want to know exactly how the processor and memory work for her to get less frustrated the next time she accesses an Internet browser to receive and view photos that I send?

david: Yes, but what either extremist position about users (ordinary John or Jane vs. super programmer) tends to ignore is the bell curve nature of the problem, which is very similar to my indictment of mainstreaming in u.s. public schools. That is, these need to be seen as somewhat unique user groups requiring distinct, differentiated approaches.

But even if you draw three divisions in the bell curve, which would split say 10/80/10, it is still an enormous design problem. People who use Photoshop are still in a discourse community with considerable depth beyond the average person. It’s even worse at the other end of the spectrum. And this is where I think Peter Lucas,9 founder of MAYA Design and resident genius, absolutely nails it, and my guess is that this is what Slim is getting at with his reference to “physics.”

What Peter says is that you must design for the “lizard brain” first, because it’s the only thing that is consistent across that entire bell curve. (Keep in mind, this is my perception of Pete’s message.) If you learn to do this well, the rest may take care of itself. But fail to get that right, and you either have very little chance, or you’ll be dragging a 200-ton freight train behind you the entire way. That is why our experience with modern technology, even the best of it, falls short.

It’s ironic because we’ve had the technology for it to be a solved problem for at least a decade, but very little work has directed all the physics and graphics innovation at solving the problem of making data into manipulable objects with “thingness,” much the way Bill Gates describes in “information at your fingertips.” It’s also very similar to the way the osi model10 falls out — meaning that designing for the lizard brain is like the physical layer, while designing for higher-order brain functions moves up the brain stem and can be accounted for in a layered-semantics kind of way.

But I think there’s an element missing here, which is that what you describe about a user’s experience with the computer crashing or slowing down is an entirely qualitative judgment. I don’t like computers that crash or slow down, either, but the experience is arguably the same or worse if I’m driving my car or riding my bicycle. I ran over a piece of metal on my bicycle commute yesterday and was left with a huge gash in my tire, a blowout, and a subsequent wheel lock when the metal piece hit the brake, a failure that could have easily sent my clipped-to-the-pedals self reeling into the river. But this is the experience of an unplanned and unforeseen mechanical failure. Could the bicycle be made to fail more gracefully? Certainly. But at what cost, with what trade-offs, and what marginal utility? Similarly, I had almost the same thing happen with my little Kia a few months ago in almost the same place, and I’d raise exactly the same questions. Kevlar tires, tpms, run-flats, oh sure, again at what cost and what compromise?

The design problem that the computer presents is no different, though I think what tends to happen here is that because computer science is taught from a very narrow perspective, focused on very quantitative problems, we tend to ignore the qualitative ones, and we do that at our users’ peril. There’s also a tendency, unlike other branches of engineering, to not have much rigor in terms of seeing the trade-offs and compromises in a holistic, systems-thinking kind of way.

I also want computers and software that fail gracefully, and are friendly and usable, but the path there is very long and very hard and is still beholden to the laws of physics, no matter how much we think we exist in a software world where none of the rules still apply and we can acquire all of these things at no cost to us (the designers) or them (the users).

slim: I’m not saying that the trouble with computers is worse than what we feel elsewhere. What I’m saying is that it’s time we consider the design of computers from the point of view of ethics, not just usability, functionality, or desirability. Why shouldn’t computer programmers and designers adopt the same kind of ethical stance that architects do, for example?

From what I have gathered taking classes in architecture, there’s a tremendous sense of ethics (not morals) and philosophy of life that goes into educating an architect. I never got any of that as a computer scientist — although, truth be told, whether it would have sunk into me at the ripe age of 18 is questionable. But that’s a whole other discussion.

Even in human-centered design, while we talk about designing for human users, we never get deep enough to the heart of what it means to be human. How can we be human-centered, when we don’t even know what it means to be a human? I’m less interested in the computer affording user-friendliness, usability, or graceful failures. That’s a very object-oriented way of looking at this issue. I’m less interested in objects and more interested in relationships. More specifically, I’m interested in finding out how our relationship to the computer can afford the quality of being immersed in an empathic conversation. The kind of quality that, as far as I can tell, makes us become aware of who we are as human beings.

I have nothing against the laws of physics. As a matter of fact, I think the computer should be designed to accept physics as it is. When designers pretend that the laws of physics don’t apply to computers, weird things are bound to happen.

I don’t think physical materials are there to make our lives more convenient or inconvenient. They just are. Yet because of our evolutionary history, there’s something embodied within us — and something we come to embody as we mature — that allows us to have an empathic conversation with them. I want the same qualities to be afforded in our interaction with computation.

david: Now we’re getting somewhere! So there are several interesting points I’ll make here. As to your first question regarding architects and computer designers, these comparisons usually fall down because of the chasm between consumer electronics and buildings, structures, etc. There are major differences attributable to elements such as rate of change and stability. Classic failures exist in that world, too, though not in the numbers we see with computers, but that’s probably a problem of sample size more than anything. To me, Frank Lloyd Wright’s cantilevers at Fallingwater are beautiful, but they’re not robust from an engineering standpoint. Hmm, where have I seen that before?

The problem with education that you describe is exactly what I was alluding to earlier with computer science’s focus on the quantitative, but I think this is a maturity issue. What I mean is that architecture is a very old discipline. Designing computers and software, not so much. That evolution would, in theory, happen in time, but this will take a long time. Imagine a world in which there are bachelor’s degrees in human factors and human-computer interaction (hci). Oh sure, there might be one or two now, but imagine a world where they are on the same plane as computer science (cs) degrees.

But in order for such large-scale changes to happen, there need to be economic incentives. That’s the biggest problem in the entire puzzle here because organizations have no economic incentive to make a radically “better” computer. They’re still making tons of money with “good enough.” I’m hopeful that the rise of mobile computing will give way to better design, as the competitive forces there are much stronger than in the pc business, just as the same was true for pcs over older mainframes and minis.

But what you seem to be getting at here is a philosophy of computing, just as you describe a philosophy of architecture. That is, not one architect, but an entire movement. This is like Sarah Susanka and the “not so big” movement.11 The conditions for that to exist in computing are not quite as clear to me as in architecture or lifestyle design. That’s possible also with computing, but again, the experience has to be so overwhelmingly great as to cause a parallel economic revolution.

I’d question whether the empathic feeling that you describe between two individuals is even possible with machines. I can’t remember whether this was touched on by Ray Kurzweil in The Age of Spiritual Machines12 or Don Norman in Emotional Design.13 I don’t know where empathy or compassion originates in the brain, but I’m pretty sure these are very high-order functions, and vary individually (i.e., the continuum from sociopath to the Dalai Lama). Indeed, many would say that empathy and compassion are something we must cultivate within ourselves.

Which brings me to another theme: dogs. Could it be that what you describe is what humans seek in dogs? Dogs are selfless, unconditionally loving, warm, whimsical, carefree — exactly the opposite of “weight of the world” that most adults must grapple with on a daily basis. If the computer could provide a dog-like antidote to adulthood, that would be great. Crazy hard. Which describes the saying, “Anything worth doing…” pretty well.

I suspect that Cynthia Breazeal’s work14 at mit may have some links. Also, David Creswell15 at cmu. He has a publication about transcending self-interest. I think the research questions du jour are these:

What are the determinants of a disposition for empathy in humans? Where is empathy encoded in the brain? Is parity an important part of empathy, or can empathy exist effectively without parity?

The latter would be a requirement for an empathic architectural style to succeed in computing since visiting an empathic requirement on the user would be tantamount to slavery. Until you know the answers to those questions, any attempt to get computers to behave as part of an empathic conversation would be difficult, if not impossible, because there is no other model for empathy but humans. Either that, or I’m horribly confused about the animal kingdom.

Keep up the good work. This is likely to turn into a hard slog if it hasn’t already.


9 Peter Lucas has shaped MAYA as the premier venue for human- and information-centric product design and research. He co-founded MAYA in 1989 to remove disciplinary boundaries that cause technology to be poorly suited to the needs of individuals and society. His research interests lie at the intersection of advanced technology and human capabilities. He is currently developing a distributed device architecture that is designed to scale to nearly unlimited size, depending primarily on market forces to maintain tractability and global coherence. (MAYA Design, “MAYA Design: Peter Lucas”)

10 Different communication requirements necessitate different network solutions, and these different network protocols can create significant problems of compatibility when networks are interconnected with one another. In order to overcome some of these interconnection problems, the open systems interconnection (OSI) was approved in 1983 as an international standard for communications architecture by the International Organization for Standardization (ISO) and the International Telegraph and Telephone Consultative Committee (CCITT). The OSI model consists of seven layers, each of which is selected to perform a well-defined function at a different level of abstraction. The bottom three layers provide for the timely and correct transfer of data, and the top four ensure that arriving data are recognizable and useful. While all seven layers are usually necessary at each user location, only the bottom three are normally employed at a network node, since nodes are concerned only with timely and correct data transfer from point to point. (Encyclopædia Britannica Online)

11 Through her Not So Big House presentations and book series, Sarah Susanka has argued that the sense of “home” people seek has almost nothing to do with quantity and everything to do with quality. She points out that we feel “at home” in our houses when where we live reflects who we are in our hearts. In her book and presentations about The Not So Big Life, she uses this same set of notions to explain that we can feel “at home” in our lives only when what we do reflects who we truly are. Susanka unveils a process for changing the way we live by fully inhabiting each moment of our lives, and by showing up completely in whatever it is we are doing. (Susanka Studios, 2013, “About Sarah”)

12 Ray Kurzweil is a renowned inventor and an international authority on artificial intelligence. In his book The Age of Spiritual Machines, he offers a framework for envisioning the twenty-first century — an age in which the marriage of human sensitivity and artificial intelligence fundamentally alters and improves the way we live. Kurzweil argues for a future where computers exceed the memory capacity and computational ability of the human brain by the year 2020 (with human-level capabilities not far behind), where we will be in relationships with automated personalities who will be our teachers, companions, and lovers, and where information is fed straight into our brains along direct neural pathways. (Amazon, 2000)

13 In Emotional Design, Don Norman articulates the profound influence of the feelings that objects evoke, from our willingness to spend thousands of dollars on Gucci bags and Rolex watches, to the impact of emotion on the everyday objects of tomorrow. (Amazon, 2005)

14 Cynthia Breazeal is an Associate Professor of Media Arts and Sciences at the Massachusetts Institute of Technology where she founded and directs the Personal Robots Group at the Media Lab. She is a pioneer of social robotics and human robot interaction. (Dr. Cynthia Breazeal, “Biography”)

15 Dr. David Creswell’s research focuses broadly on how the mind and brain influence our physical health and performance. Much of his work examines basic questions about stress and coping, and in understanding how these factors can be modulated through stress reduction interventions. (CMU Psychology Department, “J. David Creswell: CMU Psychology Department”)

Conversation: Acting & Computers

On March 1, 2011 at 10:14 p.m., I posted the first draft of what will eventually be split into the Prologue and the fourth story in the “Making and Empathy” chapter of the book “Realizing Empathy: An Inquiry Into the Meaning of Making.” The story was about my experience observing a friend act the role of Blanche in the play A Streetcar Named Desire. While much has changed since then, I wanted to share with you an edited version of the conversation that followed, regrouped and rearranged for clarity and relevance. Click here for the previous installment, which also includes the introduction of the interdisciplinary participants of the conversation.


david: I think of it this way: great actors are not really actors, they are “be-ers.” They don’t play the role, they manifest the person encoded in the role, almost to the brink of no return. It’s very dangerous territory and quite a few of them have wound up in mental institutions.

Role-playing implies expectations on reality. What’s great about great acting? The notion that our expectations are up-ended. If all the actor does is establish believability, they haven’t really succeeded, because at some point, they’ve got to go over the edge, else it would be a very boring presentation.

slim: Yes, your critique of believability not being the goal is significant. I am curious if that at all relates back to programming. We write code and expect it to produce the same results every time we run it. Not only that, but we also want others who read the code or install it on their computer to believe this to be true as well.

But the reality is that the circumstance in which the program runs changes. For example, the hardware running the code may have a different capacity for memory, the memory may be filled in different ways, the hard drive, power supply, and processor may all have different capacities, and there may be other software running at the same time. The reality is much messier.

Yet programming language designers just keep abstracting all that physical reality away, trying so hard to make it believable that the virtual machine is the real machine (e.g., Java).

david: In my opinion, nothing has done more to destroy computer science education in this country than Java.

I’d like to point out further, that what lies at the center of actors and musicians, generally, and great artists more broadly, is an ability to be present without expectations of the future or nostalgia for the past. What’s weird — and this gets into the metaphysics of quality à la Pirsig — is that, in my opinion, you can feel this presence, but there is no metric for it. That’s what makes us human.

The Eastern concept of duality rears its ugly head in this story on several occasions, and I would suggest that you might as well label it, and dive into it a little, though it’s a book unto itself. This concept resonates through a lot of what you are saying, meaning that it is another perspective on empathy. The perspective you are presenting is inherently dual, as opposed to moving toward a concept of singularity. Again, metaphysical.

slim: I hesitate to frame this as the Eastern duality. Maybe it’s a choice of words or my misinterpretation of what you mean, but I think of it as circularity as opposed to duality. Imagine a constant movement along a continuous domain. When you stop along the path and look from any given vantage point, you consider what you see to be the other — something outside of yourself. But as you move along that path, as you try to empathize, you eventually feel as if you are that other.

This is what actors do, but all they might have is a piece of script — which is just a bunch of words and some simple directions. So we have to figure out what the script actually means, from our own experiences. We have to first translate it, then interpret it. The same goes for playing from sheet music or learning how to dance. We can’t learn how to dance just by watching how the choreographer’s limbs move. We also have to find out where the invisible force is acting inside the body of the choreographer.

A friend of mine did a beautiful performance piece that speaks to this idea of meaning vs. form. She first filmed herself drawing a circle. Then she projected the film on a surface, and filmed herself again, but this time tracing her movement in the film. She would repeat this over and over, each time tracing the movements of the previous recording of herself tracing the previous recording. Each time, the shape of the circle gets more and more distorted. Eventually what gets drawn doesn’t resemble a circle at all.

In essence, the “why” of the movement gets lost, and all that is left is the superficial. To have been able to draw the circle, you would have had to understand why the first drawing was manifested the way it was. And once you have that understanding, you might not even draw a circle, but paint one instead. That is the kind of understanding that can only result from having tried to empathize.

jeff: Practically speaking, though, I think there are limits to empathy. For some things, you really either need a sliver of experience or some non-obvious knowledge that helps you imagine the other perspective.

slim: I think so, too. What experiences did you have in mind?

jeff: Like parenting. If you have a dog, you probably have developed a different level of patience than someone who has never tried to train a pet — or anything for that matter. Someone who has never been in a serious accident, catastrophe, combat, or other very dangerous situation probably has no idea what it means for “time to slow down” or “it happened so fast.”

A similar problem arises when people talk about spiritual or religious experiences. Some people may regard spirituality as “we are all connected somehow” or “there is some higher order in the universe.” My concept of this is “all things are connected (sort of),” but there’s nothing mystical about it, because we already know we live in the same universe. To actually feel that connection, to really empathize with it and think about everything that is happening all the time, is a different concept.

When dealing with Christianity, I have always been puzzled by the idea of “God speaks to us all.” Is that what is actually happening in experience or is that metaphorically true — as in God is the universe? To even get a grasp on that, I’ve found it never makes any sense to consider Christianity from my perspective but as a box unto itself. And I may also have to consider that it is impossible for me to understand simply because I am me and not having those experiences. Perhaps God only speaks to Christians, which would make a ton of sense. And then there is the complication that faith is belief in precisely what does not make sense.

slim: I think a significant part of what you’re talking about has to do with language. Depending on what words you use to describe your experience, it could conjure up different experiences in different people. And unless people are willing to establish a shared language in the context of the conversation, more often than not they are not having a meaningful conversation.

So when you say you don’t have the experience to know what Christian phrases mean, there’s also a chance that you do have the experience but you don’t use the same words used by the Christians to refer to it, which causes miscommunication, and misunderstanding.

jeff: Maybe. Although, I recently finished an excellent audiobook, Amusing Ourselves to Death by Neil Postman. He argued that the form of the media affects how we conceptualize the world so deeply that we are often not aware of how it changes us or how we are different from people in times past. Can we understand what it might be like to live in a society with no print? Or pre-television America, where people would pay money to hear authors read their books from lecterns, and where people would debate in a language that resembled printed prose rather than the plain-speak we use today?

Similarly, Facebook and Twitter afford relatively short updates and lend themselves to trivialities because it’s become so easy and considered non-imposing to spit out snarky one-liners to friends without considering their context (because it is unavailable). And on a blog, when someone is writing a really long reply, they can’t tell whether they’ve jumped too many topics and have lost their readers completely, because the others won’t see the post until after it’s been posted. So perhaps the rise of writing in society due to the Internet can lend itself to an egotistical style of communication, by the very nature of what the medium is.

slim: Well, I don’t find writing to be a particularly egotistical style of communication, but that of course depends on what you mean by “egotistical” and what you mean by “writing.”

I think it’s the space — and I don’t just mean physical or even virtual space, I mean the feeling of space, or the relationship between and among the participants of an interaction — in which the writing is presented that can make it egotistical or not.

Do you really think the nature of the medium affords an egotistical style of communication? The reason I ask is that I’ve had in-depth, thought-provoking discussions about a variety of topics that stemmed from just a status update on Facebook. So I’m not yet convinced that the nature of the medium absolutely dictates an egotistical style of communication.

Or maybe what you mean is that it isn’t designed with the goal of facilitating a non-egotistical style of communication, and so it’s likely that many people default to something that takes less effort, which is the “egotistical style”? Am I understanding you?

an-lon: A quick note about Internet communities. The type of negative behavior Jeff described — picking fights and baiting and snark — reminds me a lot of people in their cars on the freeway. It’s as if you’re in a bubble and the usual rules don’t apply. I don’t doubt that some of the asshole drivers out there would be perfectly civil to each other in real life, where feedback is instantaneous and actions don’t go without consequences. Such is the power of anonymity.

That said, the Internet doesn’t have to be that way. In a different thread, I described how one very early Internet community evolved from the fan site of an author who was way ahead of her time: Torey Hayden.1 One thing she had to police on her bulletin boards was language. People were absolutely not allowed to write like Internet chimps.

The reason was that it was an international board, and she insisted that native-English speakers use proper grammar, punctuation, and capitalization in order to make it easier for the non-native English speakers to understand. Obviously, the non-native English speakers were just asked to do their best. The point wasn’t to punish people for poor English grammar per se, it was to punish lazy and avoidable misuse of the English language.

I really think the language rule made a huge difference not just in what people said, but in how they thought. It reminded them that they were holding a conversation, and that there were people on the other end who might carry with them a vastly different set of cultural assumptions and values.

The other notable feature of the message board was that it was predominantly female in an era when that was still fairly rare. The result was an extremely active and close-knit community that debated and joked about everything under the sun.

People did use avatars and screen names, and were anonymous in that sense, but in general there wasn’t the kind of mindless hit-and-run you see in, say, the comments section of a New York Times article about politics. I only ever lurked, but for regulars it had a level of addictiveness decades before Facebook. Rather sadly, that’s where the author recently migrated her site.

Her message board had been a significant time commitment for her to maintain, as it was pretty much the force of her personality and the ground rules she established that kept the board civilized. Eventually, she decided that the technology that had been cutting-edge when she created the board was hitting obsolescence, and that Facebook was an easier way to interact with her fans and keep the same conversations going.

I think what I’m saying is that there’s a bit of a founder effect2 to Internet communities. If the pioneers are assholes, everyone thinks they have a right to be an asshole. If there’s a precedent for civility, newcomers can learn to be civil too.

And there’s also no inherent reason for Facebook to be as shallow as it often is. The only reason I’m even here is that when Slim started posting substantive status updates to Facebook, I started writing substantive replies.

slim: Jeff, I think you’re saying that when left to our default vices, the way in which Twitter, Facebook, and other social media sites have been designed can direct us toward a certain kind of communication. Some of the reasons why include the fact that it makes the content seem context-free, which leads to misunderstandings, people making assumptions, passing judgments, or being downright malicious for the fun of it, as opposed to contemplating the meaning behind the content or asking questions in order to further understand and empathize. Please correct me if I am misunderstanding.

jeff: All I’m saying is that the medium does affect how we think. Compared with speech, writing allows reflection and revision, which makes it easier to achieve coherence. I’m referring to writing as most people encounter it: online arguments in essentially public forums where people don’t know each other. A person needs to make some assumptions about the person they are trying to convince — or, more likely, put down. You also can’t confirm your assumptions as you can in person. Most people who argue online don’t follow the argumentative Principle of Charity.3 It’s much harder to be careful and empathic than to be abusive. Being abusive can also be fun.

And by written communication, I was also referring to short updates like Twitter and Facebook. Those are almost inherently egotistical, not necessarily bad or harmful, but in the sense that the communication has to start with a motive within. Things appear context-free and then you get inappropriate snarkiness.

an-lon: To get back to your story in the acting class, though, isn’t this the human condition in a nutshell? When listening, I seek to be transparent. When projecting, I seek to be saturated. But the “I” remains.

slim: I resonate with those pairings. It directly maps to the pairing I have in mind, which is humility and courage. Can you say more? I would love to hear what you have to say about them.

an-lon: Well, exasperatingly, this was always a visual image first, words second, and an analytical dissection last. The poem below4 is what planted the image in my head.

If thou couldst empty all thyself of self,
Like to a shell dishabited,
Then might He find thee on the Ocean shelf,
And say—“This is not dead,”—
And fill thee with Himself instead.

But thou art all replete with very thou,
And hast such shrewd activity,
That, when He comes, He says—“This is enow
Unto itself—’Twere better let it be:
It is so small and full, there is no room for Me.”

I am not a religious person, and perhaps not spiritual so much as simply omnivorous, but I had an odd sense from the minute Anson introduced himself that the theologian’s viewpoint was important. Perhaps because there are concepts here that can be expressed no other way, except in the language of the sacred and divine? Certainly, the theme of humility comes into play with this poem.

Anyway, the words “replete with very thou” have been part of my internal monologue since forever — whenever I realize I’m getting in my own way of understanding someone else’s viewpoint.

As for being saturated in order to project, exaggeration is the lifeblood of animation. The illusion of life is precisely that — an illusion. Whether the action in question is a walk cycle or a line of dialogue, you can’t just copy what happens in real life. You have to find the essence of what it is, amplify that, and filter out the rest.

Same with drawing caricatures. It’s not enough to simply give a guy a big nose; you really have to find the essence of someone’s facial features and amp that up.

The image in my head was going into Photoshop and cranking up the color saturation of an image, but the metaphor it represents is the exaggeration that is one of the pillars of character animation.

slim: I’m intrigued by what you said about exaggeration and animation.

Is there a degree of exaggeration that is appropriate? In other words, could it be over-done? Where is this need for “amplification” coming from and where is it going? Is the kind of exaggeration you’re talking about related to generating interest in the eye of the viewer? Or is it functional (i.e., if you don’t exaggerate it doesn’t look real)? Or all of the above? Is this really about saturation or contrast?

I know too little about animation to have any insight into this.

an-lon: Exaggeration is one of the 12 Principles of Animation,5 as developed by Disney during their golden age. If you’re curious, I’d highly recommend a look at the first chapter of The Illusion of Life by Frank Thomas and Ollie Johnston. This is pretty much the Bible for anyone studying animation today, but it’s gorgeously illustrated and extremely readable for a general audience as well.

Here’s the intro paragraph to the “Exaggeration” section:

There was some confusion among the animators when Walt first asked for more realism and then criticized the result because it was not exaggerated enough. In Walt’s mind, there was probably no difference. 

He believed in going to the heart of anything and developing the essence of what he found. If a character was to be sad, make him sadder; bright, make him brighter; worried, more worried; wild, make him wilder. 

Some of the artists had felt that “exaggeration” meant a more distorted drawing, or an action so violent it was disturbing. They found they had missed the point. When Walt asked for realism, he wanted a caricature of realism.

In answer to your specific question of whether it can be overdone: it’s surprisingly difficult to overdo the exaggeration within a drawing, if it’s going in the right direction. If the exaggeration is going in a random direction, it looks gross and distorted almost immediately, but if it’s going toward rather than away from the heart of the action, you can get away with a really surprising amount of distortion before it falls apart.

A lot of times, we’re given the advice to “push” the pose till it breaks and then back off, rather than inching incrementally toward that imaginary breaking point.

And “is it functional (i.e., if you don’t exaggerate it doesn’t look real)?” Yes, absolutely. Rotoscoping (tracing) live action reference frame by frame almost always comes out looking strangely dead. It takes a human eye to amplify the important parts and tone down the unimportant parts, even when the goal is to be completely unobtrusive about it.

Exaggeration is in pretty much every frame of any animated movie, 2D or 3D. The 12 principles are all so fundamental, they’re in every shot. Sometimes it’s subtle, as it has to be with the handsome prince or the beautiful princess, and sometimes it’s wildly exaggerated, as with the crazy animal sidekicks, but it really is the lifeblood of animation.

When I was first talking about saturation and contrast, it was just at the level of metaphor. What I’m talking about now, you can see in the roughest of pencil tests without any color.


1 Torey is the author of three novels, eight non-fiction books about her experiences working with troubled children and two children’s books. In a writing career that has spanned more than three decades, her books have been worldwide best-sellers, translated into more than 35 languages and appearing as films, stage productions, an opera, and even Kabuki theatre. (Hayden, “The Official Torey Hayden Website”)

“The Official Torey Hayden Website,” Torey Hayden, accessed January 19, 2013,

2 In genetics, the Founder Principle is a principle whereby a daughter population or migrant population may differ in genetic composition from its parent population because the founders of the daughter population were not a representative sample of the parent population. For example, if only blue-eyed inhabitants of a town whose residents included brown-eyed people decided to found a new town, their descendants would all be blue-eyed. (Encyclopædia Britannica Online)

Encyclopædia Britannica Online, s. v. “Founder Principle,” accessed December 29, 2012,

3 The Principle of Charity is a methodological presumption made in seeking to understand a point of view whereby we seek to understand that view in its strongest, most persuasive form before subjecting the view to evaluation. While suspending our own beliefs, we seek a sympathetic understanding of the new idea or ideas. We assume for the moment the new ideas are true even though our initial reaction is to disagree; we seek to tolerate ambiguity for the larger aim of understanding ideas which might prove useful and helpful. Emphasis is placed on seeking to understand rather than on seeking contradictions or difficulties. We seek to understand the ideas in their most persuasive form and actively attempt to resolve contradictions. If more than one view is presented, we choose the one that appears the most cogent. (Oriental Philosophy, “The Principle of Charity”)

“The Principle of Charity,” accessed January 19, 2013,

4 The poem is called “Indwelling” by T. E. Brown. (Brown, “Indwelling”)

“Indwelling,” accessed December 28, 2012,

5 The 12 Basic Principles of Animation: squash and stretch, anticipation, staging, straight ahead action and pose to pose, follow through and overlapping action, slow in and slow out, arcs, secondary action, timing, exaggeration, solid drawing, and appeal. (Thomas, 1981, 47–69)