But…not everyone will be a programmer, part 2

Yesterday’s post talked about how the factual knowledge gained from studying computer science is useful in other careers and in day-to-day life, but another idea worth touching on is all of the transferable skills that come with it.

Programming is all about being empowered to create something, not just to memorize and regurgitate. Programming is setting an objective or choosing a task, imagining how to achieve it, and then breaking it down into steps small enough to implement. Learning to program pushes you to be resourceful: when you reach the end of your own knowledge, you have the tools to acquire more and then to make it your own by applying it to a project.  It’s about being able to check your own answers, see that your solution isn’t quite right, and then keep trying new ones until you get it. It’s the open-endedness of the humanities combined with math’s requirements of specificity and clarity. In that, it may not be just a liberal art, but perhaps even the liberal art.

Individually, programmers have to develop strategic thinking, thoroughness, self-reliance, and executive function. And that doesn’t even begin to touch on the interpersonal skills to be learned from working on a large project with a team! Can anyone who works with students (or employees, for that matter) honestly say they don’t wish those skills were better developed?

 

But…not everybody is going to become a programmer.

Oh. So history matters because we’ll all be historians? Kids need chemistry because all of them will be chemists? High school French is important because we’ll all become ambassadors to France?

If you’re saying that as a hint at a larger argument that maybe, probably, most of us aren’t entirely sure why high schools have the graduation requirements they do, then yes, by all means, let’s have that conversation.

But for this subject, I can point to concrete reasons. Jobs are part of it, yes, but another part is that even people who don’t program professionally may be well served by being able to build themselves a website or application. And the bigger picture is that computation is happening in every field, and even if you’re not the one doing it, you’re well served by being able to understand what’s being computed and why. What does it mean to say that we’re focused on critical thinking when we’re missing a giant swath of the background knowledge to think critically about?

And that’s not to mention day-to-day life: In the face of recent credit card information theft, can you really argue that raising the average level of American tech savvy would be a waste of time, or that we wouldn’t do well to know just a little bit more about keeping ourselves and our identities safe online? By continuing to regard programming as a superpower rather than a basic skill, we put ourselves at the mercy of those who choose to use those powers for evil. That seems like reason enough to me.

—————-

Update: When I wrote the above paragraph, I hadn’t yet seen this: Americans are, in fact, more afraid of being hacked than of being murdered. Granted, that’s based on self-reported frequency of fearing something rather than intensity of fear, and I imagine having information stolen crosses most people’s minds any time they use a debit card, or whenever they walk into Target, Home Depot, or any other recently breached business, whereas most of us have fewer day-to-day reminders that homicides happen. Still, it seems telling. Keep yourself safe!

But…isn’t programming, like, hard?

Yesterday’s post was about the potential of computer science in schools as real-life job preparation. There are, of course, a number of reasons to argue with that idea, some more valid than others. Over the next few days, we’ll touch on a couple of them.

One of them goes like this: “Programming is, like, what the super duper smart guys in movies do! Isn’t that way too hard for kids?!”

In a word, nope.

We’ll talk at length later about this misrepresentation of computing in entertainment and culture more broadly (I’m looking at you, Apple Geniuses) and how it perpetuates both tech fear and stereotypes. For today, though, let’s just take a second to talk about what writing code is: it’s giving a computer a set of directions to complete a task. It can be done in any of a number of programming languages, which are just that: languages. And language is a thing our brains are naturally wired for.
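
To make that concrete, here’s a minimal sketch of what “a set of directions to complete a task” can look like, written in Python purely for illustration; the task and the names are my own invention, not anyone’s curriculum:

    # A tiny "recipe" for the computer: directions to complete one small task.
    # The (made-up) task: greet each student on a class roster.
    roster = ["Ada", "Grace", "Alan"]   # the names we want to greet

    for name in roster:                 # go through the roster one name at a time
        print("Good morning, " + name + "!")   # repeat the same direction for each

Each line is just an instruction the computer follows in order, much closer to a recipe card than to the walls of green symbols the movies love.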

Sure, many programming languages have more formal structures and rigid grammars than most human languages, and we have to write new ones consciously rather than letting them naturally evolve over time. At their core, though, they’re still languages, and kids in several countries (including Vietnam, Estonia, and now the UK) are picking them right up. I get as excited as anybody about kids in the news who write apps and Minecraft mods and whatnot – and I don’t mean to sell short anyone who’s motivated enough to seek out resources and dedicated enough to stick with a project – but I’m leery of seeing them heralded as prodigies.

Before fellow linguists jump down my throat, though, let me be clear about what I’m not saying here. We’ve convinced ourselves for a long time that kids are somehow magically still better language learners than adults, which may or may not be true. A huge difference between child and adult learners, though, when it comes to computers, is the fear I mentioned before. In learning, especially language learning, we talk about the affective filter, more or less the idea that if you’re scared or stressed out or anxious, it’s way harder to learn things. Kids are better learners of programming languages not because their brains are better wired but because they haven’t learned yet that they should panic about trying.

I’d really, really like to teach the Mommy and Me of programming, more or less: workshops wherein a parent/preliterate pair create an interactive storybook or holiday card together. My suspicion is that parents who would never seek out that kind of thing for themselves, or who would be scared that their questions were too dumb, would give it a try if it were pitched as being for their children, and then they’d be able to see firsthand that it is something kids can learn. And, of course, there’s the added bonus of demonstrating to parents that screentime can be social time with their kids rather than isolating them…but that’s another story for another day.
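
For what it’s worth, the “holiday card” half of that workshop could be as small as something like this; it’s a hedged sketch in Python (a real parent-and-kid session would more likely use a blocks-based tool like Scratch), and every name and prompt in it is made up for illustration:

    # A bare-bones interactive holiday card: the grown-up types, the kid dictates.
    name = input("Who is this card for? ")
    holiday = input("What are we celebrating? ")
    message = input("What should the card say? ")

    print("*" * 30)                      # a simple decorative border
    print("Happy " + holiday + ", " + name + "!")
    print(message)
    print("*" * 30)

If a pair can get that far in an afternoon, the “programming is too hard for kids” objection starts to look pretty thin.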