Lately I've been a little involved in both (learning and teaching, that is), and wanted to share some thoughts on the subject.
Do you like science fiction? I used to like it (when I was younger, so much younger than today). Asimov, Bradbury and such. It's especially fun when the story mixes past, future and present; reality and imagination. What fascinates me most in these stories is how the author makes deep and painful observations about reality by moving the characters into a completely unrealistic setting - the future comes in handy, because it is the hardest to imagine. It's just amazing how, by disguising things beyond recognition, the author breaks the thought conventions and emotional associations built into our brains, and in doing so lets us understand reality beyond what we could before. John Lennon said "nothing you can see that isn't shown" - right, but you can imagine what you don't see, and that's how you "know the unknown" a little bit better.
There's this recurring theme in sci-fi stories "smart man builds machine, machine becomes smarter than man, machine rules man". Is it a real threat? If so, how can we prevent it? Should we stop building smart machines? Is high-tech going to destroy mankind? I don't think so. For one, if I did think so, I would quit and become an organic farmer, and I haven't done so... so far. But the question does bother me. And the answer is, I think, that if we want to continue building smarter machines, we absolutely need to keep building smarter people, or we'll end up like in those sci-fi stories.
The main purpose of education is often perceived as passing on what we already know to the next generation, so they don't waste time rediscovering it. But that is only secondary. The real goal should be passing on the ability to explore the unknown and solve what previous generations haven't solved. Because if students can think at least as well as their teachers, rediscovering something will be just a small detour for them on the road to Knowledge. But without a developed mind they have no legs to walk that road. And it's a steep road up, so when they stop, they (inadvertently perhaps, but) inevitably slip to the bottom. So how do we nourish the ability to think?
It's not what we know, it is how we learned it.
When asking ourselves how to teach, we should first turn to introspection - how did we learn? If the way we've been taught made us discover all those great things that we are so anxious to pass on, why don't we teach what we've been taught? Sure, we will throw in a bit or two of what we've discovered, but basically, why don't we build from the same foundations? They say it about parenting - if you like the way you were raised, you'll likely be a good parent, because you'll repeat what you saw. However, if you look at what's going on in the education system, at almost every level, you see the curriculum constantly changing, "updating", "modernizing" etc. "It was true then, it's wrong now. We don't need it." Why? Because we have the technology? Big mistake. The ancient Greeks, Hebrews and Egyptians, not having the technology, weren't even a tiny bit stupider than us. Without the wheel there would be no Internet - speaking of which, who invented the Net? But no, we don't need old men, old books and such, religion, history... we have TV commercials to teach us how to live, we have the technology. And since technology costs money, suppliers of technology don't want us thinking independently, because, Google forbid, we may decide we don't need it! In computer science, this is the nightmare sci-fi writers were warning us about, and we should at least try to prevent it from happening.
We can't go forward without understanding the past. Students can't possibly understand Java and Object Orientation before they understand procedural programming, functions, math, logic. Then, when (and if) we show them objects, let's show how they came about, and not a popular imitation. If we have a great book and curriculum that generations of computer scientists and engineers grew up on, why are we throwing it away? Let Java, Python and their patrons wait. They will lay their heavy paws on the students in just a few years anyway and turn them into Dilberts, Wallies and Alices, converting large XML files to long stack traces. Let's give freedom of thought and curiosity a chance to grow just a little bit in students' minds, so at least some of it can survive corporate development.
It's not what a technology does, it is how and why it works.
We're constantly and obsessively looking for solutions to our problems. We barely stop to analyze them, until the solution itself becomes our biggest problem. I watched this QCon presentation a while back, and it was déjà vu in many senses. I have encountered problems like that, and I even solved them in a somewhat similar way. I may be wrong, but what I get from the presentation on the technical level is that sometimes Object Orientation as we know it (C#, Java), with all the patterns and practices and such, does not solve our problem. The problem in the presentation reminds me of the expression problem - the data and the operation set both need to evolve, so how do we express the relationships? The proposed solution (although I may be getting it wrong) is to have an interface per operation; the implementation of the interface, using a reified generic type parameter, stores the type of object it applies to; at start-up something wires together data types and implementations; and then it all becomes a big happy family of multi-methods, with the operation implementation chosen via dynamic dispatch on the data type. As an old saying goes, "when the problem is hard enough, you will find yourself re-inventing Lisp to solve it". It's interesting how Udi describes arriving at this design - a team of people had been struggling with the problem for years, until an "old programmer" came to the project retrospective meeting and said "make your roles explicit". Udi took the guy aside and made him explain. Then Udi (and his team, I suppose) implemented it, and now reported the success at QCon. Who was that "old geezer"? What was he? Udi didn't say; in the presentation he is portrayed as a little green troll. Anyway, why do I care who that mysterious character was? Because I didn't come to computer science for some wise Merlin to tell me where the Holy Grail is - I want to be that Merlin! Unfortunately, I am probably not smart enough, but someone else is. That's why I think we need more wanna-be Merlins; wanna-be Kings we already have plenty.
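To make that concrete for myself, here is a rough Java sketch of the shape I think such a design takes. All the names (Validator, ValidatorRegistry, Order, Refund) are mine and purely illustrative - this is my reading of the idea, not the design from the presentation:

    import java.util.HashMap;
    import java.util.Map;

    // One interface per operation ("role"); each implementation records the
    // data type it applies to, which effectively reifies the type parameter.
    interface Validator<T> {
        Class<T> appliesTo();
        void validate(T entity);
    }

    // Illustrative data types.
    class Order  { double total; }
    class Refund { double amount; }

    class OrderValidator implements Validator<Order> {
        public Class<Order> appliesTo() { return Order.class; }
        public void validate(Order o) { /* order-specific rules */ }
    }

    class RefundValidator implements Validator<Refund> {
        public Class<Refund> appliesTo() { return Refund.class; }
        public void validate(Refund r) { /* refund-specific rules */ }
    }

    // At start-up something wires data types to implementations; at run time
    // the implementation is chosen by dispatching on the entity's class -
    // in effect a poor man's multi-method.
    class ValidatorRegistry {
        private final Map<Class<?>, Validator<?>> byType =
                new HashMap<Class<?>, Validator<?>>();

        <T> void register(Validator<T> v) { byType.put(v.appliesTo(), v); }

        @SuppressWarnings("unchecked")
        <T> void validate(T entity) {
            Validator<T> v = (Validator<T>) byType.get(entity.getClass());
            if (v != null) v.validate(entity);
        }
    }

A start-up routine would call register(new OrderValidator()) and so on, and from then on validate(entity) picks the right implementation by the entity's runtime type - which is exactly the multi-method flavour described above.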
Since I became interested in functional programming, for example, my Java coding style has changed significantly. I thought I knew how to use Generics; only after a taste of OCaml and Benjamin Pierce's writings did I realize how little I knew about types. I thought I knew how to write object-oriented programs; only after playing with Smalltalk did I realize what object-oriented really means. I may not be able to use Haskell for the day job, but the concepts of immutability, closures, function composition and laziness are helpful no matter what language I use. Let's look at one example - the (in)famous problem of object-oriented design: does square extend rectangle, or in other words, does circle extend ellipse? In both cases the latter has 2 distinct properties (edge sizes or focal points) and the former has only 1. Bob Martin discusses the problem in his article "Design Principles and Design Patterns". Does he offer a solution? No - "design by contract" and Eiffel and some hand-waving. On the other hand, understanding types helps, because then I can "design for subsumption": I ask myself - if Ellipse is a type that describes all ellipses in the world, do they all have 2 distinct focal points? No. Then maybe I shouldn't have methods to get and set them. Maybe my API should try to follow the definition of ellipse more precisely. That surely helps to design good APIs. Furthermore, magically, once we make the shapes immutable, the problem almost goes away. If Rectangle has a method Rectangle transform(x,y) that produces a new Rectangle with the given sizes, then Square can inherit it with no problem - it would simply produce a Rectangle, not a Square, when x != y. The same trick would work for Ellipse. After all, shapes are mathematical definitions, why should they be mutable?! See, a bit of "functional" thinking solved the problem. And the moral - understanding the classic foundations of computer science is necessary for programmers.
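Here is a minimal Java sketch of the immutable version; the class and method names are just for illustration:

    // Immutable shapes: transform produces a new object instead of mutating.
    class Rectangle {
        final double width, height;
        Rectangle(double width, double height) {
            this.width = width;
            this.height = height;
        }

        // A Square asked to become 3x5 simply yields a Rectangle;
        // no invariant of either class can ever be broken.
        Rectangle transform(double x, double y) {
            return (x == y) ? new Square(x) : new Rectangle(x, y);
        }
    }

    class Square extends Rectangle {
        Square(double side) { super(side, side); }
        // Nothing to override: the inherited transform is already correct.
    }

Because nothing changes after construction, a Square can safely be substituted wherever a Rectangle is expected - which is exactly what the substitution principle in Bob Martin's article asks for.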
Reality distracts from clarity of thought.
If there is one more thing we can learn from these sci-fi stories, it's that detaching things from reality may actually increase our ability to grasp them. However, education in recent years insists on "examples", or worse - "realistic examples", or even worse, "examples of what is used in the industry". I am not saying the above is worthless or unnecessary, but it should not be overrated. At some point the education ministry decided to teach elementary school math with actual objects - sticks and such - rather than teaching kids the abstract idea of numbers, and it was a disaster. Math, in general, cannot be taught by following "real world" intuition. Nor can logic. Physics has evolved way beyond the relatively "intuitive" mechanics. So why are so many educational institutions chasing "real life" technologies, at the expense of the classics, and ignoring the "too innovative to be popular"? I know why, of course - money, pressure from the industry, pressure from students who want real jobs after they graduate. But resisting that pressure is absolutely necessary, for the sake of future generations, to save our civilization! Luckily we still have some universities in Northern Europe :-)
I noticed an interesting phenomenon with students - when they are "fresh" and don't carry the baggage of "field experience" (C, Java, curly braces etc.), it is easier for them to take a "different" point of view and to understand more abstract ideas. Generics is one example: teaching generics to experienced Java programmers is extremely hard, but with undergraduates it is, surprisingly, much easier. So why don't we teach Haskell for types, Smalltalk for objects, and maybe C for low-level stuff? Then when they meet Java, or whatever language they are likely to encounter in the industry, it will be a piece of cake to learn. Furthermore, some of them will be able to design the next Java!
6 comments:
Hi Yardena,
Great piece. My degree is in electronics and telecommunications. Whilst I have a keen interest in computer science, I've never liked the look of CS courses or degrees much. They all seem, well, far too vocational.
I was prompted to read through the 1968 NATO conference report the other day. You know, the one where the term "Software Engineering" was coined:
http://homepages.cs.ncl.ac.uk/brian.randell/NATO/
The thing that struck me was that Dijkstra and the rest weren't very sure about anything. It's interesting seeing them grappling with the same questions we have today on how to apply an 'industrial' approach to the craft of creating software. It's amazing how their postulations were later cast in stone as "best practice" and taught by rote by our venerable education system.
So we have ended up with a myriad of job titles: Software Engineer, Architect, Analyst etc., and associated "education" based on nothing more than hopeful speculation :)
I'm happy with my Electronics degree; it has served me well. Although I learned very little about computer science (the little I did learn turned out to be amongst the most important bits), it did teach me to differentiate between fundamental mathematical and physical concepts and transient telecoms standards and tools. And above all, it taught me how to question and think things through for myself from first principles. It was this questioning that led me to Smalltalk.
I'm reading a book called Artful Making by Rob Austin and Lee Devin. It's a good read; I recommend it. It argues that "industrial making" is often less well suited to "knowledge workers", where we rely on the workers' creativity and ability to innovate. Here, just like in the theatre, "artful making" is viewed as more appropriate.
So it looks like Dijkstra et al. were barking up the wrong tree all along. Yet we have spent 40 years creating a neatly shrink-wrapped and "industrialised" software development industry, with suitably dumbed-down software development vocations and tools.
The funny thing is that at the NATO conference many of the participants said as much, suggesting that perhaps software development isn't industrial at all, and that it is more akin to a craft :)
We seem to decide prematurely about most things.
Paul.
Hi Paul,
Thanks for the comment.
Funny - I also came to computer science through electronics, but that was still in school. I'll follow your links, it sounds very interesting.
I too believe there is more art in programming than anything else, but I often think - maybe it's just me.
Yardena.
"I think we need more wanna-be Merlins; wanna-be Kings we already have plenty."
Amen! Beautiful!
By the way, I've come to believe that we should teach things in the same order in which our species learned them. (Learn Euclidean geometry before calculus, and so on.) Our problem in CS is that our ordering is all mixed up. Each generation seems to reinvent the same wheel, and master a little less of what our Merlins of old already knew, or at least grappled with.
Our pace, and our quest for ever more clever acronyms to describe what we're doing seem to consume all our energy. There's precious little time for reflection.
Recently Corey Haines had a nice post about how important it is to know our history. (http://programmingtour.blogspot.com/2009/05/road-thoughts-history.html) It's interesting that this theme is on people's minds lately.
Hi Morgan,
Thanks. Yes, we are dismissing the past thoughtlessly, and banging our heads against the walls that our predecessors already discovered and sometimes knew how to demolish or avoid.
When I teach, I put pictures of people into presentations, to put a name and a face to important principles. I'm really glad to see other people find it important.
Yardena.
P.S. Hey, there is loads of interesting Scala stuff on your blog! :-)
I guess it's all a matter of priorities. A good establishment should teach how to learn, not try to teach every possible language out there. I also feel that we don't learn from our past experience; too many good ideas are left forgotten.