Tuesday, September 29, 2020

Everyone Using The Internet Is Affected By This Bias

Of all the biases, I'd put confirmation bias as a top contender for the most deleterious effect on critical thinking.

Confirmation bias is the tendency to interpret or seek out information that supports a particular point of view.

So why is confirmation bias bad for critical thinking? Critical thinking is the ability to reserve judgment until objective data has been obtained and reviewed. It prioritizes data that has been obtained without bias, without emotional impact, and without agenda. However, confirmation bias is inherently rooted in emotional wants and desires. When we seek out information to support a particular point of view, we are doing it because we want that point of view to be confirmed, not because we are necessarily interested in actual truth.

OK, so confirmation bias is kind of the opposite of critical thinking, but how does it impact our ability to think critically? The short version: confirmation bias urges us to shortcut rational thinking and replace it with an emotionally satisfying result. Critical thinking is like a muscle - if you don't use it, it goes away. So the more you shortcut it, the more it atrophies.

Here is an example - say I am scrolling through social media, and I see that someone on my feed has posted some political take that conforms to my world view. I think to myself 'Yes! This is exactly what I think of the situation!' and then repost it, saying something like 'The idiots on the other side of the aisle just don't get it!' But - have I actually challenged my view at all with such actions? Have I questioned myself and why I believe what I believe with my response? Nope. What I have done is see my world view validated, which further entrenches it, so the next time something comes along that might otherwise cause me to think about the whys of what I believe, I have another arrow in my quiver to shoot at it.

Social media isn't the only forum on the internet where confirmation bias exists, though. As Neil deGrasse Tyson says - "Search engines are the epitome of confirmation bias". Let's say I am discussing a topic with someone, and there is some disagreement as to the final conclusion. How about a rather benign topic - weight loss and calorie counting. A quick search of 'Is calorie counting effective for weight loss?' turns up four of the top seven results saying it does work, and the remaining three claiming it doesn't. Regardless of which side of the discussion I start on, there is ample evidence to back up my claim. In a situation where I am having a discussion with someone and there is some back and forth happening, I am already emotionally invested in my argument, so of course I am not going to highlight anything that doesn't support what I am trying to convey. I might think I have made a good argument, but I haven't done any real thinking for myself.

The truth is, it's impossible to rid oneself of confirmation bias - it's just too powerful. Simply being aware that it exists is the first step to mitigating it, but there is also the conscious effort of asking yourself 'why?' when you see something that you initially perceive as evidence. Why am I willing to believe it, and why should I (or shouldn't I) believe it? If it can't be gotten rid of, we can at least try to be more cognizant of when it appears, and when it is more harmful than helpful. And staying off Facebook. That helps, too.




Tuesday, September 22, 2020

Common Psychological Barriers That Hold People Back

Have you ever started a project, only to get stuck at a certain part because things weren't turning out quite the way you originally intended?

Or have you started a project and got to a part where things stalled because the next step seemed so minor, so easy to just do at some point, that you never got around to actually doing it?

Or been working on a project, got to a decision point, and things stalled because you couldn't decide on a direction to go?

A while ago, I heard someone mention that the three P's that hold people back are perfectionism, procrastination, and paralysis from analysis. When I heard this, I was struck by both the simplicity and the truth of that saying. I've definitely found myself being afflicted by each of those phenomena.

Perfectionism
It seems there is this kind of social pressure that when we put something out, it has to not only be 100% right, but it also has to be polished. I think part of this comes from our own view of things, where we see other people completing projects and looking successful, and it gives us this impression that they have it all together. Maybe that is true for a very select few people, but for the most part - people are their own harshest critic.
This blog is a good example, for me personally. I'd been wrestling for a couple of years with how to go about jotting down all these interesting (to me) thoughts. I went through multiple iterations of approaches, but kept running into obstacles, or started losing motivation because it wasn't happening smoothly. Eventually, my coach told me to write posts that were 500 words (no more) in length, and not to worry about them being 'right'. That was like the dam bursting open. Suddenly it was a whole lot easier to just pick a topic, write a short bit about it, and take on the attitude that this is all just practice. It's not perfect, but that is OK, because that is not the point.

Procrastination
I find that there are two types of procrastination - the kind that keeps people from starting something, and the kind that crops up in longer, more complex projects. The first is obviously resolved just by getting started, but the second is more subtle. In larger, more complex projects, some tasks have little to no visible evidence of forward progress. It's really easy to put that project aside and do something sexier or flashier on another project that does have a visible result. Meanwhile, the first project stalls, details get forgotten, and it can easily become an albatross just because no one wanted to do the 30 minutes of grunt work needed to complete the task, or to bring things to a solid checkpoint of progress where it's easy to pick back up.

Paralysis From Analysis
Sometimes, you are presented with multiple options, and there's no real obvious best course of action. Or, you have multiple options, and one is significantly more challenging, and you have to weigh effort vs end result.
For example, earlier this year, my wife and I decided to redo our raised garden beds. I initially wanted to use composite boards, because I didn't want to replace boards in 10 years due to rot. However, once I looked into it, several drawbacks came up, such as availability in the sizes I wanted, sourcing, and the general logistics of acquisition. I spent a few weeks hemming and hawing over what to do, before I realized it was September, and that if I didn't act soon, I'd miss out on being able to get them in this year. So I made my decision that day to forgo composite, and haven't looked back. I just needed a nudge to get out of a paralysis situation. Sometimes, a decision just has to be made.

Sometimes we are held back by external forces, but I find that the three P's above (and their potential variants) represent the vast majority of reasons why we may fail to progress in our endeavors.

Monday, September 14, 2020

What Does it Mean to be an Expert?

Nowadays, it seems like everyone tosses around the term 'expert' without really thinking about what it means. People specialize in something for a few months, and call themselves an expert. People pass a test, and call themselves an expert. People often cite Malcolm Gladwell's 10,000-hour claim, using that measurable value to define what makes an expert (ignoring the fact that Gladwell himself says the popular interpretation of his 10,000-hour rule is incorrect). A lot of the time, it's gotten to the point where person A knows just a little more than person B about something, and person A is considered an expert.

So, then, what does it mean to be an expert? Bear with me while I lay a little bit of groundwork.

I was recently listening to an interview with the late psychologist Anders Ericsson, and in his research on experts and expertise, he found that the top performers in a variety of fields got to be top performers because of deliberate practice. On its face, this is not necessarily surprising, but there is a subtle point in there that often escapes notice, and that is that experts are made, not born. In other words, innate IQ or ability often has no bearing on whether someone becomes an expert. I say often because, admittedly, there are some pursuits where genetics plays a part - being 6' tall is rather short for a basketball player, so players of that height are few and far between because the sport inherently selects for taller players, and height is genetic.

So, if innate IQ or ability has little bearing on success, and deliberate practice is how one gets good at something, what does it mean to be an expert? The takeaway I got from Dr. Ericsson was that an expert is someone who is so knowledgeable about the subject matter that they are able to create a mental representation of the current system, know what is possible, know what is not possible, and know how things will change with certain inputs (or stimuli).

Let's look at a few examples. We all know that being good at chess is a matter of 'looking ahead {some number} of moves'. When looking at a chess board, the novice player sees pieces, and goes through a progression: "OK, if I move the rook from here to here, then my opponent will likely counter with this other move, and then I'll move my bishop, etc., etc." That is one potential input into the system. Going through that progression for every piece is virtually impossible, and every step away from the initial input (moving the rook) widens the cone of uncertainty even more, just like a hurricane forecast map. Keeping track of all that is impossible for a human, and usually best left to a computer. That is novice- and amateur-level skill.
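
To put a rough number on how quickly that cone of uncertainty widens, here's a back-of-the-envelope sketch (my own, not something from the research mentioned above). It assumes an average of roughly 35 legal moves per chess position, which is a commonly cited estimate:

    # Rough illustration of why brute-force look-ahead gets out of hand quickly.
    # Assumption: ~35 legal moves per position on average (a commonly cited estimate).
    BRANCHING_FACTOR = 35

    for depth in range(1, 6):
        positions = BRANCHING_FACTOR ** depth
        print(f"Looking ahead {depth} move(s): roughly {positions:,} positions")

    # By 5 moves ahead, that's already over 52 million positions to keep track of.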

However, the master chess player sees the positions of the pieces on the board, and through experience, knows where the lines of force are, knows where their pieces are in positions of strength, knows where the opponent is weak, and so forth. They are playing at an entirely different level where they don't just know the position of the pieces - they have seen and studied these positions enough to know how to tip, or keep, the odds in their favor. It's less about tactics with individual pieces, and more about the holistic view of the game board, with all the pieces.

A similar thing happens with the professional basketball player. With a quick glance, they know where the players are positioned relative to one another, know each player's strengths and weaknesses, and know how to move the ball to help tip the odds in their favor.

The thing I like about this definition is that it is not based on some arbitrary number or external validation, such as passing a test. We use those external validations as a way to convey to other people that we possess a certain amount of knowledge - which has its uses - but I think it's important to recognize that is just knowledge, not expertise. Just like I can tell you how the pieces on a chess board move, but that certainly does not make me an expert in chess.


Tuesday, September 8, 2020

Increasing Proficiency Through Deliberate Practice

Most of us have heard the adage: How do you get to Carnegie Hall? Practice, practice, practice.

The sentiment is simple: to get good at something, one needs to practice.

It's not that this idea is wrong (it's not) - but as I've listened to more interviews with people who study performance, and read books like Malcolm Gladwell's "Outliers", it's clear that not all methods of practice are equal.

Let's look at an example of gaining a skill that most people are familiar with - learning a new language.

The traditional method of teaching a new language (to someone who already speaks a native tongue) involves a lot of rote memorization. Memorizing new nouns and verbs, followed by memorizing forms of conjugation. Then, some writing or reading. After a year of Spanish in high school, all I can remember at this point is 'Me llamo es Ryan, y tu?' - and I'm not even sure that is correct.

The point is - memorization isn't a great way to get good at something.

Now, contrast this with how a toddler learns to speak. Do parents teach their kids how to speak by drilling new words, and teaching the forms of verb conjugation? No. Kids learn because everyone around them is speaking the language, the same books are being read to them over and over, and eventually the pattern recognition abilities of the child's brain pick up that every time a picture of this big yellow truck appears, the words 'dump truck' are uttered.

This is obviously the basis for practicing via immersion. And many people do cite this as being an excellent way to learn a language - it's worked for every human being that has ever spoken, so it has quite the track record of success.

However, psychology and neuroscience have further defined a method of practice that can yield faster results, and is in some ways more applicable to other skills. That method of practice is called Deliberate Practice.

Deliberate Practice involves the following:
  • Establishing the performance metric to be assessed
  • Identifying objectives just beyond the current level of ability
  • Engaging in exercises specifically designed to reach the new level of performance
  • Ongoing corrective feedback
  • Successive refinement over time through repetition

So, what's the difference between immersion and deliberate practice? For starters, immersion doesn't necessarily take into account your existing skill level, so objectives just beyond your current ability can be few and far between. If I wanted to learn Spanish, I could go to Spain and speak only Spanish, and I could become fluent (enough) in a few months - but I'd probably find that for the first few weeks, I'd be lonely because I couldn't hold a conversation, I'd be limited in my food options because I wouldn't know what anything is, and it would just be a really taxing and exhausting exercise.

Switching gears to another example - if I wanted to get good at chess, I could play against the computer at a hard level, but I'd get my rear end wiped across the board in only a few moves every time, such that just lasting 10 moves would be a dramatic improvement.

So, what do we instinctively do in that chess scenario? We set the difficulty to something that challenges us, but doesn't overwhelm us. Something where we win maybe half the time, and we are able to see different strategies and tactics play out. If we do it right, it's a much more effective, satisfying, and faster way to learn.
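
As a loose illustration of that 'win about half the time' idea, here is a minimal sketch (my own, not anything from the research) of what an adaptive difficulty loop might look like. The play_game function and the rating numbers are purely hypothetical stand-ins:

    import random

    def play_game(my_skill: float, difficulty: float) -> bool:
        """Hypothetical stand-in for one game; returns True on a win.
        Win probability drops as the difficulty exceeds my skill."""
        win_probability = 1 / (1 + 10 ** ((difficulty - my_skill) / 400))  # Elo-style estimate
        return random.random() < win_probability

    def practice_session(my_skill: float, games: int = 50) -> float:
        """Nudge the difficulty so wins hover around 50% - the 'just beyond
        current ability' zone that deliberate practice aims for."""
        difficulty = my_skill  # start at roughly your own level
        for _ in range(games):
            if play_game(my_skill, difficulty):
                difficulty += 25   # won: raise the bar slightly
            else:
                difficulty -= 25   # lost: ease off so it stays challenging, not crushing
        return difficulty

    if __name__ == "__main__":
        print(f"Settled difficulty: {practice_session(my_skill=1200):.0f}")

The specifics don't matter much; the point is that the feedback loop keeps the challenge pinned just past your current ability instead of far beyond it.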

Going back to the language scenario - think again about how toddlers learn to speak, and what is actually going on there. As parents, we know that the child doesn't have the vocabulary we do, so we don't expect them to say complicated words like 'excavator' right away. We start them off with simple, single-syllable words, like 'dump truck' (which, given they are also learning phonetics, is just as likely to come out as 'dumb f*ck' - but I digress). We lower the bar to be just beyond their ability, and over time we raise that bar as their skills improve. We actually provide them with all the ingredients of deliberate practice, and we do it rather intuitively.

Yet, somehow, as adults we tend to forget the elements of deliberate practice when we try to learn new skills ourselves. We may try the full-on immersion method (ever started a job and been told it's 'sink or swim, buddy'?), but we don't go through the mental exercise of determining our current skill level, what the metrics for improvement are, and how we are going to get there. Then, when we plateau after the first major bout of improvement (because going from zero skill to some skill is rather easy), we get frustrated at the failure and give up.

It's not that as adults we can't learn new skills - it's that a lot of the time, we take an approach that does not set us up for the greatest chance of success, and when we have other demands on our time, it's easy to drop the pursuit of the new skill.

The other thing that I find interesting about deliberate practice is that there are a lot of parallels with a topic I've posted about previously, which is Objectives & Key Results. The OKR model may have been created to increase productivity within an organization, but it's a model that works at the individual level as well.
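
To make that parallel concrete, here's a purely illustrative example of an individual-level OKR built around deliberate practice (my own numbers, not anything from the original OKR post), using the Spanish scenario from above:
  • Objective: Hold a comfortable 15-minute conversation in Spanish by the end of the quarter
  • Key Result 1: Complete three 30-minute tutoring sessions per week, each targeting one grammar concept just past my current level
  • Key Result 2: Get corrective feedback on every session and log the recurring mistakes
  • Key Result 3: Stretch my longest conversation from 2 minutes to 15 minutes without switching to English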