The Armour Goes in Unexpected Places

It would have been in most respects a normal day for an online distance education teacher in the early nineties. I settled into my spot in the studio and made sure everything was working. First the mikes—all OK. Next the Telewriter: I picked up the pen, wrote on the screen and then remotely loaded the first ‘slide’ for the day’s lesson. Again, everything was fine. As always, the first thing to do would be to greet the students by name and just chat for a few minutes. Besides ensuring that the audio and graphics capabilities were working, this had the much more important function of getting the students to open up, to come out of their schools, defined as they were by the walls of the classroom, and enter into the online one, defined only by who was present that day.

Today I had a new student. I was a bit surprised as it was several months into the school year. I asked her name but she did not reply. Eventually another student at that school answered for her, telling me her name and letting me know she was shy.

Over the next few weeks I did my best to get my new student—let’s call her Angela—involved, but all to no avail. She would not respond when asked a question and would never write on the electronic whiteboard when asked to contribute to the day’s work. Her first written assignment consisted mostly of blank sheets, and so I decided it was time to contact the school. I called the principal and learned the awful truth.

———-

In a previous job, around 14 years ago, my designation was Program Implementation Specialist, and one of my initial tasks was to put together a team of online teachers who would lead the changeover from the distance education system used in my province since 1988—the one described in part above, and which is described in more detail here if you are interested. Together, the Program Development Specialist and I devised a recruitment strategy built around an online application system that would produce a short-list of candidates. Those candidates would then be interviewed by a panel of three and subjected to a reference check. All components were scored, the scores were used to rank the candidates, and the successful ones would then be seconded.

This system was used by my colleagues and me for seven years and provided me with significant experience in selecting those who would be well suited to online learning. Through constant use I came to anticipate the response to one particular question, as it tended to give an almost instant measure of whether or not the interviewee was a suitable candidate. The question? “What would be your response if you noticed that a particular student was not doing well in the course? That is, if you noticed that a student was not engaged, not submitting work on time or doing work that was of sub-par quality?” Typical answers included: putting on extra classes, creating tutorials, providing “worksheets” and maybe even involving disciplinary measures. None of those, however, was the one I sought. I wanted something else.

———-

Oftentimes the truth, or the best course of action, is not the one that seems obvious. Take my own academic discipline—physics—for example. There’s nothing commonsensical about the majority of what is typically found in the high school physics curriculum, despite the protestations of inexperienced (or just plain ignorant) instructors who claim they can “make it easy.” Newton’s first law (objects tend to remain at rest or in constant motion unless acted on by an unbalanced force) is about as counter-intuitive as it gets. Objects remain at rest—no they don’t! Just YOU try sliding a book across a floor; it comes to a stop in no time! No! Newton’s first law is the product of sheer genius; a fantastic off-the-charts insight made by a most unusual individual. Seeing, or maybe creating, ‘friction’ as a new construct, one that merely presents itself as just another unbalanced force: pure brilliance!

Physics is not something that is easily absorbed; it is something that is only understood through a skillfully-constructed instructional framework: one that brings students right up against their existing understanding of the world, clearly points out its deficiencies, ensures that the students acknowledge those deficiencies, and then carefully rebuilds the worldview in a different way. Not simple at all, and certainly not something that happens in a day.

And so it goes with everything. To do better work you have to work hard to get beyond the obvious and, as just pointed out, this involves going up against your “comfort zone” and then breaking through it with a whole new worldview. It means breaking with common sense.

———-

Allied Bomber Command faced just such a situation in World War II.

Let me digress for a moment here. I am not one given to glorifying war. While I acknowledge that it is a reality and something that often cannot be avoided, I also want to point out that there is generally no “right” and “wrong” side but instead two opposing groups who have found themselves with no alternative but to act with extreme aggression. It is a reality. Ordinary people like you and me never wish to find ourselves in it but, alas, from time to time it happens and we are faced with no choice but to do what we must. Under the extreme conditions faced by the various sides often comes the need to dig down deep and to seize any and every opportunity that might affect the balance of power. Frequently, then, wartime becomes a time of extreme innovation borne of necessity. I wish to consider one case here because it is illustrative of a point I wish to make, and not for any other reason.

Bombers, with their heavy deadly loads, are slow lumbering beasts and, as such, are easy targets for fighters who desperately seek to prevent them from achieving their missions. In WW2 many that set out did not return but were instead shot down by the fighter planes they encountered along the way. Those that returned were typically bullet riddled but still able to limp back to base for repair and refitting.

One of the responses to this loss of planes was to install armour that would protect the aircraft from the projectiles from the fighters. Armour, though, is heavy and reduces the load capacity and thus the military effectiveness of the aircraft. The solution, therefore, is to place the armour only where it is absolutely necessary. Bomber command subsequently engaged in a constant, careful study of its in-service aircraft. Each time an aircraft would return from a mission it would be inspected and the location of bullet holes obtained in that flight would be recorded. Typical returning aircraft resembled the drawing below. Notice where the bullet holes are; namely on the wings, tail and fuselage. Based on that it would make sense to place the armour there since, after all, that’s where the hits were occurring, right?

A Lancaster Bomber after a run. The red dots indicate the position of bullet holes.


Wrong. The reasoning is unsound; fundamentally flawed, in fact.

Fortunately so, too, thought the Allied Bomber Command, thanks to the insight of mathematician Abraham Wald. He assumed that the bullets were not specifically aimed at any one part of the aircraft. Aerial combat was much too chaotic an activity to allow for precision aiming. Fighter pilots instead aimed in the general direction of the aircraft and hoped that the bullets and cannon shells would have some negative effect. One would expect, therefore, that in an ideal situation the placement of bullet holes would be more-or-less uniform.

The placement wasn’t uniform, of course, as you already noticed from the image. Wald, however, went one step further by reasoning—correctly—that hits to vulnerable areas would result in downed aircraft, ones that would not make it back. Since the sample used in the study consisted only of aircraft that made it back, it is logical to conclude that those aircraft tended NOT to have hits to the vulnerable areas.

Take another look at the diagram. Where are there very few bullet holes? The engine and forward cockpit. Of course! A relatively small number of hits to the engines would render them inoperable. Likewise, hits to the cockpit could result in casualties to the flight crew. In either case the plane would be lost.

Simply put: instead of looking for where the bullets were, you should look for where they were not. Those are the parts that need armour, not the bullet-riddled parts.
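Wald’s reasoning is easy to demonstrate with a little simulation. The sketch below is my own toy illustration, not anything Bomber Command actually ran: it assumes eight hits per mission, landing uniformly at random over five sections of the aircraft, and (purely as a modelling assumption) that any hit to the engine or cockpit downs the plane.

```python
import random

random.seed(42)

SECTIONS = ["wings", "tail", "fuselage", "engine", "cockpit"]
FATAL = {"engine", "cockpit"}  # modelling assumption: one hit here downs the plane

def fly_mission(n_hits=8):
    """Sections struck by n_hits bullets aimed uniformly at random."""
    return [random.choice(SECTIONS) for _ in range(n_hits)]

all_hits = {s: 0 for s in SECTIONS}       # what an all-seeing observer would count
returned_hits = {s: 0 for s in SECTIONS}  # what ground crews actually get to count

for _ in range(10_000):
    hits = fly_mission()
    for s in hits:
        all_hits[s] += 1
    if not any(s in FATAL for s in hits):  # only survivors are inspected at base
        for s in hits:
            returned_hits[s] += 1

print(f"{'section':10}{'all aircraft':>14}{'survivors only':>16}")
for s in SECTIONS:
    print(f"{s:10}{all_hits[s]:>14}{returned_hits[s]:>16}")
```

Even though the hits are spread evenly over the whole aircraft, the survivors show holes only in the wings, tail and fuselage: exactly the pattern in the diagram, and exactly why the un-holed sections are the ones that need the armour.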

———-

So what does this have to do with eLearning? It turns out that in my previous career a significant part of my efforts was dedicated to improving the quality of our instructional efforts. I approached this in various ways: reading about things done differently elsewhere, researching new devices and attendant methods, conferring with teachers and interviewing successful students. These tended, at first, to be my main starting points. Over time, though, I slowly moved away from all of them somewhat.

It started in a somewhat unexpected fashion. Each year I would address all of the intermediate-secondary student teachers at Memorial University in order to explain to them how the province’s distance education program worked. As part of the presentation I would ask those in the audience who had received part of their high school program through distance education to identify themselves, and would ask them to offer their perspectives on the experience.

Of course, in all honesty, I was, in part, “selling” the program. I was part of that same system and certainly took great pride in it and in my contribution to it. While I was making it look like I was seeking an unbiased assessment I know—now—that in the initial stages I was really seeking affirmation; an ‘independent’ external source that validated the program as being worthwhile.

To my great surprise that’s not exactly what I got. Yes, many of the students were quite positive about the experience they’d had in the distance education program, but not all of them were. Numerous students indicated that they’d not found it great or that they much preferred the more traditional face-to-face approach.

The first few times this happened I responded by downplaying the responses, assuming that they were just the voices of a disgruntled few who had not enjoyed success, probably through their own efforts or, more accurately, lack thereof. In time, though, I came around. Rather than dismissing those voices or, worse, glossing over what they’d said, I began showing active interest in their points of view. I would not just let their comments sit unacknowledged and unchallenged. Instead, I slowly came around to a practice whereby I would probe deeper whenever I got the somewhat negative responses, attempting to determine exactly what had led to them.

It was enlightening, to say the least. Space does not permit a detailed exposition of what I found but, in general, here are a few of the items that were frequently encountered:

  • The choice to enrol in a particular course, which also happened to be a distance education offering, was not made by the student but, rather, by the parents or, even more frequently, the school administrator or the school district office.
  • The instructor had not made a concerted effort to reach out to the student but seemed, rather, to either just teach to nobody in particular, seldom involving anyone in the class, or, instead, appeared to play favourites.
  • Technical issues had resulted in significant ‘down time.’

Now, lest you get the impression that this post is a mean-spirited barb at my former employer, let me assure you that nothing could be further from the truth. The pride I felt, and continue to feel, in that program is built on more than just emotion. It is, rather, something that is rooted in significant evidence of its overall efficacy. The numbers don’t lie, and they indicate that the students tend to do well. Just not all of them.

My point, rather, is that in the later part of my career I found much more value in finding out why students did not find success than I did in identifying the factors associated with success.

Like Wald, I found it useful to consider the planes that did not return.

———-

As for that telling response to the question, “What would you do if a student is not having success in your course?”

The desired response: “I would find out what was wrong.” That’s a lesson I earned through long and often painful experience.

Never mind the extra classes, the tutorials and the varied approaches, just figure out why the student is not doing well and do what can be done.

———-

But there’s still ‘Angela,’ the student I found in my class, the one who unexpectedly dropped in and who was not finding any success. Yes, I did seek to get to the bottom of it all.

And I did.

I learned that she had just returned to her home community after living away for several years. Her mom was a single parent who had found a new boyfriend, so she’d moved away to be with him, taking her daughter with her. It became an abusive relationship and one night, in a drunken rage, the boyfriend murdered Angela’s mom while Angela was there in the apartment. She’d returned to her home community and been placed in foster care, and that’s why she’d been dropped unexpectedly into my grade eleven physics class.

I tried as best I could to make things work for Angela. Unfortunately I did not succeed. I did not end up giving her a passing grade and she was not in my online physics class the following year. I do not know how she fared in life after that but do think of her often, especially when I need a good dose of humility. Sometimes, even with hard work, skill and insight you still cannot get the success you hope for. Yes, you generally do, with effort and teamwork, but not always.

Angela did not have a good experience in my Physics class. It continues to be a humbling truth.



Small Schools Rank Higher: It’s Built-In; Do the Math (& a twist)

From time to time you will see institutions ranked according to various criteria. Generally this is done with the intention of demonstrating how well each is performing. It’s not unusual to see this done with schools and here’s a claim that is often made, and substantiated by the numbers:

Small Schools Tend to Lead the Ranks

This is consistent. For years I saw it in my own region, reflected in the annual report card issued by the Halifax-based Atlantic Institute for Market Studies (AIMS). Year after year, small schools led the provincial rankings. As a professional whose entire career was devoted to the betterment of small rural schools I wanted to be able to brag about this, to puff out my chest and say, “Look, I told you that small schools were better for our children. It’s obvious that the extra care and attention they get on an individual basis, as well as the better socialization that comes from everyone knowing everyone else, is making a positive difference.”

I never did that, of course. It’s not because I don’t believe in small schools–I truly do. My silence on the matter was, to some degree, due to the fact that at the time the studies were published I was a non-executive member of the Department of Education and, as such, was not authorized to speak on its behalf. That was not the real reason, though.

No, I knew that a far more powerful force was afoot; something that affected the results much more than did good teaching, a supportive (small) ecosystem or the presence of many brilliant bay women and men.

Although all of those are positive factors.

No, the most powerful effect was something else, something related to straightforward mathematical behaviour, and if you’ll spare a few minutes of your attention I will explain.

Simply Put: Small schools have an advantage in these rankings that is due only to the fact that they are small.

And there is an unexpected twist too, one not often mentioned in the discussion of the reports.

Here goes!

Let’s simplify the situation and assume that the rankings are based on the outcome of one test only. Furthermore, let’s say that the result of that test, for any given student, is completely random; that is, any student who writes it will get a random grade between 0 and 100. In other words let’s act like there’s really no difference between the students at all of the schools. The small ones will still come out on top.

Let’s see what this would mean for ten small schools (we’ll arbitrarily name them sml01 to sml10), each one having only fifteen students in grade twelve and writing the test on which our report is based. The results for all of the students are tabulated below. In reality the table was produced using a random number generator in Microsoft Excel. You don’t need to read the table in detail. It’s just here so you know I’m not making the whole thing up!

School sml01 sml02 sml03 sml04 sml05 sml06 sml07 sml08 sml09 sml10
Score 15.0 14.0 45.0 55.0 53.0 70.0 55.0 100.0 79.0 56.0
Score 6.0 34.0 94.0 75.0 64.0 75.0 59.0 75.0 73.0 52.0
Score 15.0 83.0 65.0 30.0 84.0 46.0 64.0 10.0 35.0 20.0
Score 64.0 16.0 80.0 77.0 10.0 55.0 85.0 32.0 91.0 97.0
Score 29.0 28.0 96.0 98.0 6.0 67.0 51.0 74.0 69.0 9.0
Score 43.0 29.0 49.0 79.0 17.0 64.0 54.0 11.0 32.0 91.0
Score 31.0 49.0 62.0 33.0 0.0 92.0 35.0 59.0 91.0 45.0
Score 51.0 21.0 98.0 75.0 47.0 57.0 32.0 32.0 25.0 58.0
Score 45.0 27.0 18.0 6.0 24.0 31.0 84.0 5.0 89.0 2.0
Score 62.0 62.0 38.0 84.0 16.0 23.0 39.0 84.0 36.0 17.0
Score 9.0 0.0 6.0 67.0 53.0 99.0 54.0 23.0 97.0 15.0
Score 38.0 34.0 21.0 70.0 58.0 40.0 37.0 21.0 56.0 50.0
Score 21.0 55.0 51.0 97.0 92.0 40.0 48.0 100.0 76.0 4.0
Score 17.0 50.0 43.0 38.0 6.0 1.0 72.0 77.0 16.0 25.0
Score 35.0 69.0 83.0 34.0 75.0 16.0 46.0 51.0 77.0 33.0
Average 32.1 38.1 56.6 61.2 40.3 51.7 54.3 50.3 62.8 38.3

Table 1: School results for ten small schools.

That’s a huge pile of numbers and we are only interested in the results for the schools, so let’s just redo that table showing only the schools and their averages.

School Average
sml01 32.1
sml02 38.1
sml03 56.6
sml04 61.2
sml05 40.3
sml06 51.7
sml07 54.3
sml08 50.3
sml09 62.8
sml10 38.3
Table 2: Small School Results

Notice that the results show a fair bit of variability. They cluster around an average of 50, but some schools had averages in the thirties while others were up around sixty.

Now let’s do it all over again, but this time let’s see what would happen in larger schools (named big01 to big10). For the small schools we assumed there were only 15 students per grade level but for the larger ones let’s assume that there are in fact 120 students in grade 12 writing the test.

The rather long table is below just so you know I’m not pulling the numbers out of my head. As was the case with the small school simulation it was done using a random number generator in Microsoft Excel and just pasted directly into WordPress. Scroll to the bottom of the table :-)

School big01 big02 big03 big04 big05 big06 big07 big08 big09 big10
Score 67.0 100.0 82.0 25.0 27.0 100.0 10.0 69.0 22.0 96.0
Score 42.0 92.0 63.0 16.0 42.0 12.0 69.0 94.0 66.0 60.0
Score 100.0 27.0 42.0 59.0 88.0 79.0 83.0 49.0 27.0 96.0
Score 44.0 29.0 46.0 63.0 28.0 69.0 31.0 19.0 18.0 59.0
Score 16.0 33.0 66.0 81.0 11.0 21.0 76.0 67.0 70.0 85.0
Score 95.0 31.0 11.0 2.0 80.0 15.0 21.0 78.0 91.0 33.0
Score 32.0 32.0 38.0 34.0 44.0 61.0 55.0 0.0 89.0 64.0
Score 82.0 6.0 96.0 57.0 8.0 52.0 64.0 55.0 62.0 72.0
Score 100.0 48.0 45.0 10.0 49.0 93.0 89.0 72.0 87.0 72.0
Score 88.0 91.0 33.0 0.0 36.0 11.0 76.0 10.0 78.0 55.0
Score 54.0 95.0 41.0 68.0 92.0 75.0 54.0 75.0 20.0 52.0
Score 44.0 79.0 88.0 69.0 82.0 9.0 31.0 74.0 1.0 78.0
Score 71.0 36.0 2.0 51.0 58.0 2.0 17.0 68.0 29.0 36.0
Score 93.0 47.0 89.0 91.0 25.0 47.0 85.0 96.0 63.0 23.0
Score 9.0 79.0 33.0 41.0 68.0 19.0 74.0 81.0 57.0 47.0
Score 77.0 84.0 28.0 44.0 2.0 54.0 37.0 48.0 25.0 54.0
Score 20.0 74.0 33.0 57.0 15.0 65.0 85.0 59.0 21.0 80.0
Score 56.0 98.0 27.0 68.0 45.0 75.0 58.0 71.0 92.0 58.0
Score 7.0 70.0 83.0 74.0 26.0 52.0 71.0 40.0 75.0 87.0
Score 80.0 32.0 65.0 7.0 54.0 62.0 68.0 7.0 87.0 88.0
Score 65.0 12.0 68.0 22.0 5.0 26.0 36.0 92.0 79.0 40.0
Score 87.0 89.0 51.0 70.0 96.0 98.0 56.0 13.0 10.0 51.0
Score 52.0 71.0 13.0 86.0 88.0 54.0 11.0 20.0 26.0 18.0
Score 69.0 57.0 11.0 36.0 39.0 5.0 38.0 56.0 82.0 40.0
Score 95.0 54.0 54.0 77.0 52.0 74.0 100.0 82.0 35.0 7.0
Score 49.0 80.0 24.0 42.0 11.0 82.0 70.0 18.0 30.0 19.0
Score 46.0 26.0 3.0 56.0 54.0 50.0 2.0 9.0 26.0 47.0
Score 58.0 57.0 98.0 62.0 65.0 50.0 7.0 94.0 9.0 43.0
Score 86.0 86.0 32.0 81.0 63.0 49.0 60.0 61.0 93.0 5.0
Score 9.0 54.0 74.0 65.0 27.0 38.0 42.0 30.0 42.0 99.0
Score 41.0 37.0 30.0 70.0 77.0 86.0 58.0 48.0 53.0 99.0
Score 23.0 82.0 9.0 73.0 9.0 9.0 86.0 27.0 57.0 50.0
Score 52.0 97.0 91.0 90.0 58.0 11.0 56.0 16.0 53.0 89.0
Score 54.0 84.0 46.0 0.0 26.0 55.0 36.0 94.0 89.0 46.0
Score 94.0 75.0 32.0 16.0 77.0 9.0 87.0 21.0 58.0 59.0
Score 77.0 27.0 93.0 65.0 61.0 23.0 53.0 60.0 29.0 23.0
Score 41.0 26.0 34.0 21.0 24.0 57.0 34.0 78.0 99.0 90.0
Score 73.0 67.0 83.0 54.0 99.0 63.0 24.0 65.0 75.0 37.0
Score 55.0 76.0 30.0 85.0 92.0 57.0 31.0 69.0 82.0 43.0
Score 12.0 38.0 53.0 56.0 40.0 67.0 3.0 50.0 86.0 90.0
Score 48.0 89.0 86.0 77.0 80.0 83.0 92.0 38.0 67.0 0.0
Score 59.0 81.0 65.0 0.0 47.0 24.0 57.0 18.0 27.0 90.0
Score 32.0 72.0 46.0 54.0 92.0 54.0 41.0 99.0 0.0 87.0
Score 5.0 55.0 0.0 78.0 13.0 83.0 60.0 68.0 68.0 86.0
Score 0.0 0.0 91.0 66.0 38.0 22.0 2.0 82.0 32.0 12.0
Score 19.0 74.0 40.0 54.0 93.0 37.0 68.0 75.0 57.0 35.0
Score 13.0 81.0 36.0 39.0 50.0 3.0 44.0 19.0 100.0 16.0
Score 36.0 95.0 4.0 100.0 60.0 89.0 47.0 99.0 70.0 43.0
Score 29.0 46.0 12.0 92.0 35.0 28.0 17.0 74.0 38.0 85.0
Score 49.0 84.0 35.0 70.0 36.0 12.0 32.0 43.0 81.0 39.0
Score 87.0 32.0 89.0 71.0 11.0 0.0 93.0 51.0 10.0 39.0
Score 43.0 27.0 12.0 9.0 81.0 78.0 52.0 99.0 82.0 86.0
Score 51.0 41.0 50.0 73.0 83.0 65.0 51.0 44.0 89.0 5.0
Score 21.0 56.0 89.0 6.0 47.0 41.0 57.0 17.0 72.0 53.0
Score 12.0 39.0 51.0 18.0 96.0 75.0 23.0 39.0 75.0 39.0
Score 0.0 48.0 11.0 51.0 61.0 22.0 39.0 35.0 88.0 75.0
Score 33.0 53.0 23.0 68.0 88.0 69.0 48.0 40.0 19.0 100.0
Score 31.0 30.0 82.0 31.0 13.0 55.0 89.0 94.0 40.0 60.0
Score 90.0 5.0 19.0 26.0 68.0 60.0 77.0 63.0 51.0 6.0
Score 41.0 65.0 72.0 76.0 91.0 11.0 71.0 37.0 68.0 53.0
Score 24.0 80.0 70.0 73.0 61.0 4.0 79.0 59.0 37.0 73.0
Score 11.0 24.0 72.0 48.0 64.0 28.0 38.0 79.0 66.0 22.0
Score 22.0 13.0 14.0 83.0 2.0 21.0 95.0 100.0 55.0 55.0
Score 50.0 97.0 59.0 85.0 15.0 82.0 77.0 31.0 21.0 92.0
Score 81.0 9.0 45.0 56.0 16.0 55.0 66.0 69.0 79.0 78.0
Score 36.0 74.0 68.0 7.0 36.0 42.0 5.0 76.0 41.0 76.0
Score 30.0 35.0 68.0 59.0 92.0 50.0 9.0 50.0 98.0 97.0
Score 30.0 31.0 2.0 1.0 62.0 64.0 82.0 88.0 84.0 53.0
Score 4.0 46.0 55.0 54.0 61.0 42.0 81.0 77.0 25.0 27.0
Score 32.0 51.0 79.0 58.0 2.0 33.0 66.0 92.0 20.0 68.0
Score 70.0 76.0 52.0 24.0 2.0 21.0 6.0 98.0 63.0 37.0
Score 54.0 68.0 91.0 56.0 58.0 32.0 41.0 74.0 64.0 45.0
Score 37.0 48.0 29.0 42.0 4.0 93.0 10.0 29.0 97.0 40.0
Score 14.0 47.0 46.0 83.0 80.0 52.0 42.0 54.0 33.0 29.0
Score 15.0 2.0 100.0 12.0 9.0 84.0 52.0 53.0 53.0 6.0
Score 8.0 23.0 35.0 63.0 78.0 34.0 30.0 75.0 14.0 54.0
Score 16.0 90.0 13.0 80.0 32.0 29.0 99.0 21.0 34.0 80.0
Score 99.0 48.0 47.0 5.0 71.0 88.0 77.0 68.0 50.0 2.0
Score 45.0 15.0 18.0 38.0 49.0 8.0 90.0 13.0 71.0 33.0
Score 42.0 50.0 86.0 80.0 79.0 53.0 21.0 81.0 53.0 36.0
Score 16.0 14.0 51.0 14.0 19.0 97.0 50.0 49.0 8.0 2.0
Score 34.0 85.0 55.0 54.0 49.0 63.0 1.0 58.0 73.0 13.0
Score 8.0 98.0 9.0 7.0 70.0 78.0 41.0 18.0 94.0 74.0
Score 59.0 43.0 31.0 30.0 97.0 85.0 64.0 94.0 3.0 91.0
Score 29.0 34.0 6.0 17.0 43.0 78.0 67.0 17.0 50.0 34.0
Score 80.0 12.0 98.0 24.0 84.0 25.0 96.0 76.0 16.0 67.0
Score 87.0 89.0 11.0 86.0 5.0 39.0 83.0 98.0 27.0 13.0
Score 62.0 73.0 69.0 91.0 47.0 52.0 91.0 57.0 87.0 39.0
Score 22.0 64.0 86.0 64.0 10.0 88.0 6.0 62.0 91.0 26.0
Score 28.0 74.0 88.0 19.0 45.0 97.0 94.0 3.0 75.0 30.0
Score 27.0 11.0 11.0 55.0 39.0 30.0 39.0 54.0 99.0 86.0
Score 94.0 85.0 60.0 1.0 42.0 23.0 57.0 97.0 58.0 24.0
Score 78.0 7.0 30.0 94.0 26.0 75.0 100.0 11.0 99.0 11.0
Score 94.0 12.0 81.0 50.0 49.0 36.0 68.0 95.0 67.0 33.0
Score 5.0 9.0 39.0 23.0 31.0 29.0 23.0 22.0 57.0 46.0
Score 67.0 55.0 98.0 81.0 80.0 72.0 31.0 53.0 80.0 95.0
Score 68.0 57.0 17.0 34.0 26.0 38.0 46.0 55.0 74.0 24.0
Score 8.0 39.0 82.0 34.0 65.0 74.0 34.0 39.0 62.0 19.0
Score 83.0 97.0 14.0 84.0 71.0 66.0 62.0 13.0 8.0 82.0
Score 83.0 78.0 39.0 45.0 15.0 70.0 63.0 65.0 75.0 68.0
Score 8.0 32.0 75.0 8.0 53.0 67.0 22.0 4.0 34.0 46.0
Score 91.0 38.0 48.0 85.0 11.0 93.0 96.0 17.0 80.0 13.0
Score 46.0 90.0 14.0 41.0 0.0 40.0 97.0 74.0 0.0 25.0
Score 26.0 14.0 85.0 92.0 29.0 63.0 77.0 94.0 80.0 8.0
Score 99.0 60.0 48.0 94.0 23.0 37.0 74.0 57.0 2.0 96.0
Score 51.0 99.0 89.0 67.0 69.0 5.0 91.0 6.0 97.0 97.0
Score 41.0 21.0 55.0 63.0 68.0 55.0 1.0 60.0 11.0 54.0
Score 35.0 18.0 65.0 78.0 96.0 79.0 3.0 22.0 80.0 44.0
Score 64.0 62.0 37.0 12.0 81.0 71.0 50.0 29.0 33.0 82.0
Score 27.0 24.0 4.0 29.0 86.0 36.0 11.0 47.0 77.0 5.0
Score 40.0 14.0 44.0 95.0 78.0 90.0 42.0 41.0 99.0 83.0
Score 10.0 93.0 37.0 48.0 87.0 27.0 79.0 15.0 94.0 57.0
Score 92.0 100.0 24.0 81.0 61.0 93.0 68.0 8.0 54.0 46.0
Score 55.0 79.0 17.0 88.0 96.0 83.0 88.0 99.0 49.0 21.0
Score 88.0 46.0 20.0 17.0 74.0 76.0 12.0 53.0 89.0 22.0
Score 85.0 42.0 26.0 26.0 87.0 69.0 49.0 68.0 57.0 49.0
Score 59.0 31.0 21.0 74.0 56.0 3.0 94.0 95.0 26.0 18.0
Score 100.0 75.0 59.0 39.0 64.0 54.0 7.0 30.0 34.0 38.0
Score 58.0 100.0 60.0 96.0 100.0 70.0 71.0 19.0 58.0 7.0
Score 13.0 66.0 94.0 82.0 92.0 87.0 51.0 51.0 80.0 13.0
Score 36.0 3.0 32.0 29.0 83.0 95.0 20.0 42.0 53.0 95.0
Average 48.7 53.9 48.2 51.7 52.1 51.8 52.8 53.9 56.0 50.4
Table 3: School results for ten big schools.

As before let’s just look at the averages for each school.

School Average
big01 48.7
big02 53.9
big03 48.2
big04 51.7
big05 52.1
big06 51.8
big07 52.8
big08 53.9
big09 56.0
big10 50.4
Table 4: Big School Results

Notice that, like Table 2, the results are clustered around an average of about 50. Notice, though, that the numbers are not spread out nearly as much.

Let’s put the two tables side-by-side for a better look.

School Average School Average
sml01 32.1 big01 48.7
sml02 38.1 big02 53.9
sml03 56.6 big03 48.2
sml04 61.2 big04 51.7
sml05 40.3 big05 52.1
sml06 51.7 big06 51.8
sml07 54.3 big07 52.8
sml08 50.3 big08 53.9
sml09 62.8 big09 56.0
sml10 38.3 big10 50.4
Table 5: Averages for both small and big schools

The thing to notice is that the big schools show much less variability. In small schools, individual students who do very well or very poorly (we call them outliers) tend to have a large effect on the average. In larger schools, the increased number of results tends to “smooth out” the results; to make them less variable.

This is something that is well-known in mathematics. It even has a name: the Law of Large Numbers. Simply put, the larger the sample, the more tightly repeated measurements tend to cluster about the expected result.
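You can watch the law at work with a short simulation. This is just my own compact stand-in for the Excel exercise above: each ‘school’ gets the average of some number of uniformly random grades between 0 and 100, and we compare how widely those averages scatter for 15-student schools versus 120-student schools.

```python
import random
import statistics

random.seed(1)

def school_average(n_students):
    """Average of n_students grades drawn uniformly from 0 to 100."""
    return statistics.mean(random.uniform(0, 100) for _ in range(n_students))

# Simulate 1,000 schools of each size.
small = [school_average(15) for _ in range(1000)]
big = [school_average(120) for _ in range(1000)]

print(f"15 students:  mean of averages {statistics.mean(small):4.1f}, "
      f"spread (std. dev.) {statistics.stdev(small):4.1f}")
print(f"120 students: mean of averages {statistics.mean(big):4.1f}, "
      f"spread (std. dev.) {statistics.stdev(big):4.1f}")
```

Both sizes centre on 50, but the small-school averages scatter roughly √(120/15) ≈ 2.8 times more widely, which is precisely the pattern you can see in Tables 2 and 4.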

Now, this is where things get interesting. Recall that this is all about the fact that small schools get a built-in advantage due only to the fact that they are small. Let’s see what it looks like when all twenty schools are ranked from highest to lowest.

School Average
sml09 62.8
sml04 61.2
sml03 56.6
big09 56.0
sml07 54.3
big08 53.9
big02 53.9
big07 52.8
big05 52.1
big06 51.8
big04 51.7
sml06 51.7
big10 50.4
sml08 50.3
big01 48.7
big03 48.2
sml05 40.3
sml10 38.3
sml02 38.1
sml01 32.1
Table 6: All twenty schools ranked from highest to lowest.

Did you see what happened? The top schools were all small schools. They reached the top due to nothing other than small-sample variability, the flip side of the law of large numbers, working in their favour. Random variability (two or three bright students, or the absence of two or three weaker students) had a profoundly positive effect on the school average.

Recall also that I mentioned there would be a twist. Notice that while the highest-ranking institutions were drawn from the pool of small schools, so, too, were the lowest-ranking ones, and for the same reason: the presence of a few weaker students or the absence of a few strong ones.

So, based on this little experiment it’s plain to see that when ranked this way, small schools will tend to come out on top simply because they are small and therefore far less subject to the smoothing effect of the law of large numbers.
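And you don’t have to take a single run of the experiment’s word for it. The sketch below (my own illustration, using the same made-up school sizes as the tables above) repeats the ten-small-versus-ten-big ranking a thousand times and counts how often a small school lands in first place, and in last.

```python
import random
import statistics

random.seed(7)

def school_average(n_students):
    """Average of n_students grades drawn uniformly from 0 to 100."""
    return statistics.mean(random.uniform(0, 100) for _ in range(n_students))

def one_ranking():
    """One run: does a small school take the top spot? The bottom spot?"""
    smalls = [school_average(15) for _ in range(10)]
    bigs = [school_average(120) for _ in range(10)]
    return max(smalls) > max(bigs), min(smalls) < min(bigs)

trials = 1000
top_small = bottom_small = 0
for _ in range(trials):
    top, bottom = one_ranking()
    top_small += top
    bottom_small += bottom

print(f"a small school ranked first in {100 * top_small / trials:.0f}% of trials")
print(f"a small school ranked last in  {100 * bottom_small / trials:.0f}% of trials")
```

In my runs a small school takes both the top and the bottom spot in the large majority of trials, even though every student’s grade was drawn from the identical distribution.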

As for the small schools at the bottom: that happens too, and it is likely that these are rarely mentioned due to selection bias on the part of whoever wishes to weave the numbers into a narrative that suits their own political ends. One wonders, though, how many small schools have been closed or otherwise penalized for nothing other than being the unfortunate victims of chance.

Closing Note: this is in no way intended to cast AIMS in any negative light. To the best of my knowledge neither they, nor the various Departments of Education, nor the various school districts ever tried to spin the reports into any grandiose claims regarding big and small schools. The false claims I have heard have generally been made by private individuals, each with their own axes to grind.

As for my own conclusion: ranking systems, regardless of the context–whether it be health care, law enforcement, customer care or, as is the case here, school-based student achievement–serve a useful purpose, but be wary of the law of large numbers before making any sweeping generalizations.



Asking Better Questions: Ends and Means in eLearning

In a previous post I considered the possibility that much of what is presented as “Innovation” is anything but that. With access to some fairly new and attractive or otherwise popular products and armed with even a slight grasp of how to operate them it’s relatively easy to create an appearance of innovation. Simply put, if you can get your hands on some new gear, in even a short while you can present quite a convincing front.

Worse again, it is equally easy to generate what passes for proof; to an untrained eye you can make it look like your so-called innovation is creating some real differences. Any of these strategies can give you reams of what looks like convincing evidence:

  • Deliberately pick enthusiastic students or teachers and pile on the anecdotes that endorse the desired point of view. People who rely on system-one (more or less intuitive) reasoning are easily swayed by stories so it won’t be hard to capitalize on that to get some people talking about how innovative the project is.
  • Stage the project in a relatively well-off school or class and then compare the results from this highly-biased “treatment” group to the population in general. Very few will dig deep enough to see that the superior achievement results predated, and were independent of, the treatment.
  • Rely on manufacturer or vendor supplied “research” when crafting reports, proposals and press releases.
  • Bluff; just preface your claims with clauses like “decades of research shows…” and leave it at that. You might be surprised to see how few—if anyone—will call you out. Besides it will be relatively easy to portray those that do as kooks or curmudgeons.

That said, you could instead opt to take the more difficult path and strive for some real gains.

Notwithstanding the cynical tone of the opening of this post, it needs to be emphasized that emerging technologies should be welcomed, albeit guardedly, in all places of learning. I’ve come by this knowledge the hard way, with ample first-hand experience of doing it both the right AND the wrong way–but I have generally benefited from the experience in either case. It can be summed up succinctly: it’s best when you develop and refine an appropriate match between the technologies and the desired learning outcomes. This means, in particular, starting with the right sort of question:

  • Bad Question: How can (insert gadget name here) be used in the (insert subject name) classroom?
  • Better Question: What combination of equipment and methodology will foster better achievement in (insert subject name/outcome area)?

Notice the difference? Instead of placing the focus on the tools, place it on the learning.

right-questions-01

You might say that, in the end, the two are the same. Yes, in both cases the goal is to do a better job. Take a closer look, though. Notice that the bad question is, in fact, all wrong. First, by selecting a particular device it sets serious limits on what can be done. This can even lead to the selection of inferior methods. Consider this: a teacher notices that some good physics simulations are available for tablets and asks, “how can I use tablets in the classroom?” With the best of intentions the approach is changed, replacing hands-on activities with simulations. Now, while simulations are an excellent way to introduce topics, especially ones that cannot be explored cheaply or safely, it makes little sense, when you think about it, to replace hands-on activities involving motion, sound, electricity and light with simulations in which the only physical interaction is sliding a finger along a glass screen! After all, physics is all about interacting with the physical world! How ironic! If, instead, the right question had been asked, no doubt the simulations would still have been used, but their use would have been balanced with follow-up real-world interactions.

Second, the selection of a particular device sets in place a condition in which demonstrable improvements are expected. That’s nice, but what if it’s the case that the new technology is in fact inferior? You might suggest, “no problem, the report will show this.” Think about it, though, and be careful to layer in some human nature.

Consider again the previous case involving tablets and physics. Suppose that the unit of study was about current electricity and the tablets were used to explore the topic through simulations in which students constructed virtual circuits involving batteries, resistors, lamps, switches and meters to measure voltage and current instead of doing the same with the real thing. At the end of the unit the evaluation would be based on what could be measured, either online or using pencil and paper, and NOT on actually constructing the circuits.

How likely would it be that students would be able to do the same with real circuits? Not likely.

How likely is it that they would do about the same on a test? Very likely.

What’s the difference? In which class would it be more likely that you would find someone who could help you wire your basement? If, on the other hand, the right question had been asked then, in all likelihood, the simulations would still have been utilized, but their use would have been balanced and blended with hands-on activities too.
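To make the contrast concrete, here is a minimal sketch (the values and function names are invented for illustration, not taken from any particular simulation product) of the arithmetic a virtual DC-circuit exercise boils down to: Ohm’s law applied to a battery and two series resistors. A student can get every one of these numbers right on screen without ever touching a wire.

```python
# A virtual circuit reduced to its arithmetic: a 9 V battery driving
# two resistors in series (hypothetical values for illustration).

def series_resistance(resistors):
    """Total resistance, in ohms, of resistors connected in series."""
    return sum(resistors)

def circuit_current(voltage, resistors):
    """Current in amperes from Ohm's law: I = V / R_total."""
    return voltage / series_resistance(resistors)

def voltage_drop(current, resistance):
    """Voltage across a single resistor: V = I * R."""
    return current * resistance

battery = 9.0                # volts
resistors = [100.0, 200.0]   # ohms

current = circuit_current(battery, resistors)          # 0.03 A
drops = [voltage_drop(current, r) for r in resistors]  # 3 V and 6 V

print(f"current = {current * 1000:.0f} mA")
print(f"drops   = {drops} V")
```

The follow-up real-world activity is simply to build the same circuit and confirm those readings on an actual meter, which is where the physics lives.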

Focusing instead on the learning will have two likely outcomes:

  • You will likely not get famous, as it’s clearly about learning and not about you.
  • The project will show modest but useful results.

Whenever embarking on any effort to improve results in education it’s important to bear in mind one simple truth: you are not starting from scratch. The “traditional” methods that self-nominated reformers (most of whom have only limited classroom experience, other than the imagined kind) so love to mock are in fact reasonably effective. The huge majority of people—those who have not been the beneficiaries of the reformers’ enlightened practice but who have managed to thrive nonetheless—bear testament to that. Existing methods are, perhaps, not as good as they could be, but they are effective nonetheless. Reformers should bear in mind that the traditional methods they so disdain have several important advantages over proposed new ones. First, they are well understood since, in all likelihood, existing practitioners not only use them now but were themselves taught using them. More importantly, traditional methods have been refined through extensive classroom use. Proposed methods, by contrast, are not well understood; they are raw and untested.

Far too often reformers boldly charge into classrooms armed with little more than vague ideas, shiny new equipment and an unhealthy combination of ignorance and arrogance. Students, parents, colleagues and administrators generally tolerate the ensuing activity since (a) it probably doesn’t interfere with them too much and (b) there is always the chance that some good might come of it. The proponent will usually get a little something—a write up in a journal, perhaps a trip to a conference, maybe even an award—but in the end the students will likely be left no better off and the effect on general classroom practice will be negligible.

It does not need to be that way, though. If, instead, the proponents asked the right question, one that focused on making some real improvement in student learning, then wins would be had all around. That is, better teaching and learning would result and, who knows, maybe the innovator’s career would get a boost anyway.


15 Comments

Managing the Distractions

I came across something like this “unhelpful high school teacher” meme the other day and it got me thinking about the distracted landscape our students occupy.

meme-unhelpful-teach-01

All too often the opinions you encounter on the web, and in other parts of everyday life, are one-sided: normally the work of someone with an axe to grind, someone wishing to present just one side of a rather complicated issue. This is no exception. There are very valid reasons why educators have to be skeptical about the unrestricted use of electronic devices such as laptops and tablets in class.

In my previous job my office was located on campus at a fairly large university. It gave ample opportunity to view the electronic habits of typical students and was a never-ending source of amazement—both the good and the bad kinds.

One incident in particular stands out. I wished to confer briefly with a colleague who was, at the time, teaching a large class (around 160+ senior education students) in one of two large lecture theatres located in the basement of the building we both inhabited. I decided to just head over to the class and chat with him before it started. Unfortunately, as is often the case, I was briefly distracted, and, by the time I arrived at the door the class had already started. Out of curiosity I looked in. My vantage point was from the centre back and, as the lecture theatre slopes toward the front, I had an excellent view of exactly what the students were doing.

Almost all of them had either a laptop or a tablet open and active. What was interesting was that the majority of the students were not just taking notes on the machines but also had a web browser open. Well over half would periodically switch from the note-taking application (typically a word processor) to the browser. The browsers showed the usual suspects, of course (Facebook, Twitter and other social media), but a surprising number of students were also shopping online during class time. I’d estimate that somewhere between 10 and 20 of the roughly 160 students were doing this! Only a minority—I’d estimate around 20 to 25%—seemed to be totally focused on the lecture, at least as evidenced by keeping the notes application open throughout the five minutes or so I was watching.

meme-buzz-woody-01

I recall the moment quite well as it was one of those times when something became quite clear to me; a time that has sparked a considerable number of subsequent informal observations. Right then and there I decided to take a look at the other lecture theatre as well. This one had a second-semester calculus class going on and, unlike the former, was one in which electronic devices were not well suited to taking notes (unless, of course, you had a touch screen or a stylus such as a Wacom device with which you could enter handwriting—after all, typing calculus notes is not something anyone can do on the fly!). Guess what? Same thing! Once again I saw a sea of laptops and tablets. Not quite so many, of course—I’d estimate around 50% of the students had them open, as opposed to over 90% in the education class. Once again, though, the screens were dominated not just by social media but also by online shopping!

Just a thought—maybe someone should run their own set of observations and verify this. At any rate, this short anecdote lends a bit (yes, I know “piling on the anecdotes” is a very flawed form of research) of credibility to the notion that we all have of how distracted our students really are.

meme-spock

Which brings us to the point: as educators it is in our best interests, and those of our students, to find effective ways of managing the many distractions that electronic gadgets bring to our classrooms. While it is certainly true that electronic devices hold incredible promise for all aspects of education, it must also be acknowledged that they are equally effective at pulling students away from the tasks that should be at hand. The same conduit that brings research, information and activities right to the students’ fingertips is equally adept at bringing in off-topic interactions, irrelevant information and other distractions—particularly games that have nothing to do with learning.

Blocking unrelated content is a strategy that will never work. Go ahead and block Facebook at the Wi-Fi router; the students will hardly be slowed at all. Some will switch to getting it through their phones, which you cannot block. Others will switch to a different social media platform—new ones pop up almost weekly—and still others will just connect through a proxy server, circumventing the router and firewall rules entirely. It’s a losing game of cat and mouse.

Blocking the use of electronic devices is equally counterproductive. First of all, it drags instruction back to the 19th century—and we cannot afford to do that. More importantly, though, the whole practice of “blocking” or “banning” is anathema to the whole idea of schools as places of learning.

So what, then? What is the magic bullet? As is nearly always the case, there is no one simple solution. There are, however, general strategies that can be applied and that will be found effective. Here are some suggestions:

  • Make personal contact with the students: when students turn to the web browser they are turning away from you, the instructor. The less personal you are to the students, the more they will do this.
  • Communicate your values clearly: typically around 80% of people will respect your wishes, so make sure they know what your wishes are. Make it clear to the students that you do value the use of electronic equipment but that they must also make the best use of their class time; to do this they should minimize distractions and, in particular, save the social networking and shopping for some other time. Of the remaining 20%, around three-quarters can be convinced to follow along too, especially if you move around the room and make it apparent that you are checking whether students are engaged. The small remainder—around 5% of the total—will do what they please regardless of what you do; this group may be regarded as beyond the point of diminishing returns, so long as they do not distract others with their off-topic pursuits.
  • Find ways to leverage the potentially-distracting technology: you can always find ways to put the devices to some good use. Examples include: (1) get the students to install “clicker” applications and build “instant response” activities into your classes; (2) provide electronic versions of partial notes (sometimes referred to as “gap notes”) that the students can complete online if they have annotation software; (3) make effective use of simulations in class time where appropriate; (4) use appropriate application software for in-class activities.
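The percentages in the “communicate your values” suggestion above hang together arithmetically. A quick back-of-envelope check (the figures are this post’s rough estimates, not research findings):

```python
# Rough compliance arithmetic from the estimates above.
comply_outright = 0.80                         # respect your stated wishes
convincible = 0.75 * (1 - comply_outright)     # three-quarters of the rest
remainder = 1 - comply_outright - convincible  # do as they please anyway

print(f"{comply_outright:.0%} comply outright")
print(f"{convincible:.0%} can be brought along with visible follow-up")
print(f"{remainder:.0%} are beyond the point of diminishing returns")
```

That is, 80% plus another 15% gets you to roughly 95% of the class, leaving the 5% mentioned above.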

meme-what-if-told-u


12 Comments

Four Forms of Innovation

The word innovation is one that is tossed around so much that it’s lost much of its impact. In some ways it’s like “awesome,” isn’t it? Once, awesome meant something that literally took your breath away. These days it’s just a tired expression of assent; something deemed awesome is more likely just socially acceptable. Similarly, in a world where corporate press releases are ground out in volumes that rival unit sales, neither “innovation” nor “innovative” catches the reader’s attention much.

Add to that the point, already made, that scant few resources exist, whether people or money, to engage in the various activities one might immediately recognize as innovative. Besides, in today’s busy, distracted world it’s often hard to spot innovation when it does occur.

That’s not to say it does not exist—it’s just generally buried under mounds of impressive-looking but essentially shallow efforts. A recent journey to the Unemployed Philosopher’s blog reminded me that most of the important work happens far away from fanfare. Day after day, professionals of all kinds, including educators, toil away developing the small but significant things that make practice just a bit better. It is a shame, really. Much of the attention goes to things that appear significant but really are not once you take the time to peer beneath the surface; stuff designed to grab attention and maybe further some goal, just not the goals one would associate with positive change for all. Sure, it may look and sound great, but in the end you’re often left with the professional equivalent of election promises. The real innovations often lie elsewhere, buried among the many other details that take up our days. They do, nonetheless, exist and can be seen, if you look hard enough, in one of these four forms.

1. Structured Engineering: The kinds of planned changes that take place in a more-or-less orderly fashion. You have identified a problem to be solved and planned a solution that involves more-or-less standardized equipment and procedures; you then implement and test that solution.

For example, suppose you develop an online visual art course. You will carry out a procedure roughly like this:

  • review the curriculum guide and outline the general instructional strategies, including the method by which they will be developed or acquired;
  • assemble the development and implementation team; formulate the overall plan;
  • select and assemble a system of effective tools and methods by which you will carry out the plan;
  • field test the course and revise as necessary.

Pros:

  • Good fit between need and response.
  • Robust system once implemented.

Cons:

  • Significant up-front cost.
  • Often significant resistance to system-wide change and adaptation.
  • Possibility of large scale failure if wrong choices are made.

2. Structured Deepening: This involves extending an existing system in a purposeful way. As an example, perhaps you chose to modify the aforementioned system by which you are teaching visual art so that you can now teach music online too.

Pros:

  • Significantly less costly than starting from scratch.
  • Less likelihood of large-scale failure.

Cons:

  • Less than optimal fit between need and response since you are modifying an existing system rather than building one to meet specifications.

3. Radically novel: Every so often completely new approaches are developed. It can be argued that before “Star Trek: The Next Generation” nobody thought very seriously about multipurpose digital tablets such as Apple’s iPad or Google’s Nexus Tablet. Now, however, these multipurpose devices are changing the way people interact with the Internet, with audio and video and, most importantly, with one another.

Pros:

  • Often based on new devices; carries a shiny, new “wow” sense of interest.

Cons:

  • Teaching and Learning sometimes becomes a secondary activity;
  • New devices often lack institutional tech support and have a short lifespan.

4. Entirely new bodies of knowledge and practice: Radically new devices lead, in turn, to entirely new ways of doing things. Consider English Language Arts. In the pre-digital age the focus was on reading, writing, listening and speaking. Now, with so many modes by which we can communicate an additional focus—Representing—is becoming very important. The mobile devices, mentioned above, are also changing the way we interact. Who knows what’s coming!

Pros:

  • Generally a good fit for those who have had the benefit of the events that led to the new development.
  • Often well-suited to the time and place in which they occur; “products of their times.”

Cons:

  • Often adopted by evangelists who assume (incorrectly) that the new way is the best way for all.

Through it all, though, it remains as important as ever to maintain a focus on teaching and learning. While the new devices and methods are exciting, if the end result is not a strategically significant improvement in an identified area of concern in education, most notably increased achievement or cost savings, then the innovation is pointless.


10 Comments

Structured Integration vs. Cost Savings

How many times do you see “cost saving” being touted as a reason for increased use of educational technology, and most especially distance education? Time and again you will see the adoption of new technology being explained away as cost savings. All you can really do, most of the time, is roll your eyes as you know, beyond doubt, that one of two things will happen. Either (1-not bad) the new technology will wind up costing somewhat more than budgeted—owing to the training costs and other unanticipated costs associated with the adoption and integration process or (2-BAD) it will eventually be abandoned and left to lie, mostly unused, right next to all of the other money wasters that have been purchased through the years.

This does not need to be the case. Properly done, new technologies can be more effective and cheaper; just not that much cheaper. Look around at the cellphones, fuel-injected engines, “green” heating systems and such that have made our lives that much better. The same can happen in our classrooms too but we need to take a much longer view of what comprises cost saving and just plain get over the fool’s quest for that elusive magic bullet.

Cost saving should NOT be the slashing of departmental budgets and subsequent placement of course notes online just so deficits can be handled in the short term. (Although, admittedly, here in the real world that does have to happen from time to time, regardless of how high-minded we would like to be.) That helps nobody, as the result will only be a degradation of services, followed by a corresponding loss in enrolment. Cost saving might be better framed as the deliberate employment of suitable technologies so that, over time, better outcomes can be achieved at lower cost.

Examples include:

  • Joining classes at separate campuses or schools using videoconference or, even better, a combination of videoconferencing and web conferencing such that smaller student cohorts can be aggregated. In those instances, though, care must be taken such that the host site or the instructor site does not become the “main” site with the remote ones getting the scraps from the educational table.
  • The replacement of non-interactive lectures with series of multimedia-based presentations, preferably with interactive components, such as embedded quizzes or simulations.
  • The gradual replacement of some media types with others but only after a piloting process which (a) shows the worth of the new technology and (b) refines the methodology before full deployment. For example, it may be feasible to replace the printed materials used in a course with online versions, perhaps multimedia or eBooks.

How often has it happened—a new device and its associated procedures show up unannounced? Perhaps it’s a new set of Chromebooks, maybe it’s clickers, a handheld computer algebra system or a shiny new computer numerical control (CNC) machine for the shop class. Whatever. In it comes, and with it comes a feeling that you are expected, all of a sudden, to just change everything.

Before proceeding too far it needs to be said that the expectation that you need to change right away is often imagined. It’s been my experience that those responsible for high-level decisions tend to have a healthy sense of what everyone is up against. After all, the funding that permits that sort of upgrade itself takes years to put together. The problem is that the expectations that led to the upgrade are often not well understood by those who are expected to implement the change; there’s often a disconnect. Nonetheless, those on the front lines tend to be confronted by a somewhat intimidating set of equipment and feel a corresponding sense of stress on account of what they know needs to happen.

Of course that is just a bit silly. Change does not happen that way. Yes, we are all intelligent and capable of change but none of us is foolish enough to react to every new thing that comes our way, whether invited or not. The change and integration process happens in stages. Assuming that the technology is not another blind alley (and they do happen) it usually plays out something like this:

  1. Familiarization: You have to learn how the equipment works at the most basic level. What’s it for? What do the controls/menus do? What options do you have? In situations like this it’s good to have access to an expert. A demo followed by hands-on activities can be quite useful at this stage.
  2. Utilization: You have to become comfortable with using it. It’s not enough to know what each component does; you have to become adept in its use. Nobody wants to make clumsy or false moves in front of an audience, so you need time to practice. If, for example, the device in question is a handheld computer algebra device, then use it for your own purposes for a semester or so before even attempting to build lessons around it. If it’s an interactive whiteboard (IWB), then you need to take some time to engage in unstructured use—play—with the device in a non-threatening environment. Just close the classroom door and fly solo or, better still, gather a small posse of like-minded colleagues and have a collaborative session.
  3. Integration: Bit by bit you make the use of the technology a part of the natural routine. While you can bring it in all at once, it’s much less stressful to layer its use in here and there. If, for example, the device is an IWB, instead of ditching your existing lesson plans, try instead to catch the low-hanging fruit; that is, redo some of the lessons that lend themselves best to an IWB approach. If it works well, try another, and so on.
  4. Reorientation: In time you may find that the “new” equipment and associated methodology becomes your standard approach. That set of Chromebooks that you used to despise may, in time, become a treasured addition to your classroom; perhaps even indispensable. This will not happen overnight, and the stages are likely measured best in semesters, maybe even years.
  5. Evolution: With new standards come new horizons. You may find unexpected applications of the once-unfamiliar technology. Perhaps you even spot yet another—and for now unfamiliar—set of methodologies that bears promise.

Of course equipment will still arrive unexpectedly and instructors will, to some extent, have to sort it out as best they can. The best advice is to realize that regardless of what else happens the integration process will come in stages, so act accordingly.


4 Comments

Learning Resources: Where’s the REAL Commons?

Have you ever wondered why, over thirty years after personal computers became affordable, and over twenty years after the widespread adoption of the Internet, digital technologies have still not reached their full potential? There are some generally good reasons why this is so and it’s not primarily due to resistance to change. Let’s examine a typical course.

The Setting

Consider a reasonably popular and somewhat universal course of study: First-year University Physics. This course is taken by students primarily interested in pursuing careers for which a knowledge of the discipline is a necessity—all jokes aside, it is mainly a course people take because they need to. It is a gateway to careers in oil and gas, mining, engineering and aviation; yes, the very few who wish to become physicists also take it, but they must be considered a minority. The students who sign up are typified by a wide range of interest and ability. Some have studied physics in high school and come to the course with a solid background—that is, well-developed laboratory and inquiry skills, mathematics skills and a decent grasp of the fundamentals. Others have next to no experience in the area, a weak grasp of mathematics and, sadly, an interest level that leans more in the direction of “Mom/Dad wanted me to do this” than “This is cool.” The majority, as you would expect, are clustered between these extremes: they have some background and experience as well as enough motivation to show up for class and put at least some effort into the various required activities (being attentive during class, performing the lab work and making their way through the written assignments as best they can). Overall, to a few the course is a pain to be tolerated; to another few it is a total joy, the essence of their existence; but to the majority it is a rite of passage—a series of tasks to be done with care but not necessarily with the burning love and passion felt by the instructors and other members of the faculty. Simply put, the audience is reasonably competent and serious, but by no means a young version of the faculty.

The content of the course is a wide-scale survey of the discipline as a whole. To the extent that it can, it tries to provide an overview of the various areas into which the discipline has stepped. Over two semesters—two courses, actually—it includes:

  • A non-matrix approach to statics (forces at rest).
  • A non-calculus approach to mechanics, including potential and kinetic energy, impulse and momentum, as well as Newton’s Laws of motion (and maybe Universal Gravitation) with a particular focus on the second law.
  • Static electricity including the concepts of fields (but without the use of field equations), charge and electric potential.
  • Current electricity including Ohm’s Law and Kirchhoff’s rules, but with a focus on DC circuits and a serious limitation in terms of complexity—the circuit analysis rarely involves the use of simultaneous equations.
  • An introduction to waves, including basic coverage of sound and light. Wave phenomena such as the Doppler effect, diffraction and interference are introduced with a minimum of mathematics.
  • Perhaps: An introduction to special relativity and quantum mechanics, fluid mechanics and geometric optics.
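The restriction noted for current electricity, Kirchhoff’s rules without simultaneous equations, amounts to analysing only circuits that collapse by series and parallel combination. A short sketch of that level of analysis (the resistor and battery values are invented for the example):

```python
# Series-parallel reduction: the level of DC circuit analysis the
# course stops at (no simultaneous mesh or node equations needed).

def series(*resistances):
    """Equivalent resistance of resistors in series: R = R1 + R2 + ..."""
    return sum(resistances)

def parallel(*resistances):
    """Equivalent resistance of resistors in parallel: 1/R = 1/R1 + 1/R2 + ..."""
    return 1.0 / sum(1.0 / r for r in resistances)

# A 6-ohm resistor in series with a pair of parallel 4-ohm resistors:
r_total = series(6.0, parallel(4.0, 4.0))   # 6 + 2 = 8 ohms

# Ohm's law then gives the battery current for a 12 V supply:
current = 12.0 / r_total                    # 1.5 A

print(f"R_total = {r_total} ohms, I = {current} A")
```

Anything that cannot be reduced this way (a bridge circuit, say) needs the full simultaneous-equation treatment, which is exactly what the course avoids.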

The course endeavours to serve as a bridge in many ways. It tries as best it can to be accessible to students who do not have a previous background while, at the same time, not boring those who do. It tries to impart a fair degree of disciplined thinking while encouraging further study. All in all, managing the course is quite a balancing act.

But here’s the thing: like most (but not all) scientific disciplines it is reasonably universal. That is, the background required by students does not tend to vary much by geography. Unlike, say, history which is impossible to separate from the local culture, first-year physics can be assumed to be more or less the same just about anywhere.

The Issue

This brings us to the big issue: even though there is the potential for a large audience, there does not exist a high-quality, integrated set of digital teaching and learning resources for the course. There are, rather, collections of good efforts that must be assembled and then put to use at each institution, each doing as best it can despite limited human resources and budgets. All things considered, this is a great loss.

The same is just as true in other subject areas including Pre-Calculus and Introductory Calculus, Chemistry, Biology and Earth Science, along with possibly Psychology.

Now, before this gets too far let me hasten to explain why this discussion is dwelling on just STEM. It is solely because those disciplines are reasonably global in nature, that is, there is more-or-less worldwide uniformity on what is taught and how it is taught. This is simply not the case for other first year courses such as English (or whatever you wish to call the study that centres on the most popular language in the region), any of the fine arts, liberal arts or social studies. In all of those disciplines the local context matters far too much for anyone to get very serious about talking about a global approach to learning resources. But let us leave that for another time and just return to the ones for which it is the case.

So what is the extent of available learning resources for STEM? Here’s a partial list.

  • Commercially available print-based resources, including textbooks and self-study guides. These tend to cost around $200 each and are generally of good quality: logically organized, well-illustrated, complete and correct (containing modes of thinking in keeping with the established canon). For the motivated student who reads well they serve as excellent and complete resources. For those less motivated they often lie unused, as evidenced by the many so-called “used” (ironically) books out there with unblemished spines.
  • Instructors’ notes and personal websites. Once something you could only get if you could afford the photocopying fee, these are becoming increasingly accessible thanks to scanners, word processors and, most importantly, electronic Learning Management Systems. The quality varies widely, owing to the lack of the formal peer-review processes that typify other areas of academic life, but at least in my experience it leans toward “very good” more often than “lackluster.” Notes tend to be short-form representations, lacking the commentary and elaboration available in books. They also tend to be more to the point and, unlike the texts, do tend to be carefully read by students.
  • Communitarian resources such as Wikipedia. Over the past decade these have significantly improved both in terms of scope and quality. For any given topic that one would find in a first-year STEM course the entries tend to be complete and useful. There is no guarantee, though, that the depth of treatment is the same as is expected in the course. Instructor guidance is definitely a must if Wikipedia is used as a source.
  • Other web-based resources. A significant number of piecemeal efforts exist. These do an excellent job on portions of a course but do not try to be a single point of contact. A good example is the University of Colorado’s PhET site, which has developed a huge array of Java-based science simulations. Taken one by one, any of the PhET resources does an excellent job of exploring its topic, but it is left to the instructor to decide which ones to use, how to use them and how to link them with the rest of the course resources.

So a wide variety of useful resources does exist. What, then, is the big deal?

A Simple Vision

Let’s think for a moment what it could be like online when a student accesses the course.

The course home provides an overview of what’s in the course along with a summary of progress to date. This includes a list of tasks completed, along with appropriate achievement indicators (grades, etc.), upcoming events and deadlines as well as uncompleted tasks, along with suggested resources and activities. It’s worth noting that just about any online Learning Management System (LMS) such as Desire2Learn, Blackboard or Moodle can do this right now.

For any given course organizer (whether it be lesson, topic or learning outcome, for example) course resources are provided in a variety of formats including:

 

  • Print Materials, and preferably in a format that lends itself well to display using either paper or an electronic reader such as an eBook reader or tablet device.
  • Multimedia presentations—that is, an electronic version of an in-class presentation, complete with visuals and audio—that could be created with software such as Adobe Captivate.
  • Interactive simulations (where applicable) in which students could investigate topics of study. These should be similar to the ones already available from PhET but with the added value of guidance on what you are looking for; in effect, each simulation would have a built-in lesson plan. In some, but not all, cases (investigating DC circuits, for example) these could replace activities generally done in the lab.
  • Laboratory resources in the form of videos, analysis software and handouts that would be used in conjunction with lab activities. Students would still be expected to go to the lab, but because these resources would replace the lab manuals and demonstrations from the front of the room, students would have more autonomy, meaning that at any given time various activities could be managed at once in the same location.

 

For any given course organizer the course would also host a variety of assessment/evaluation tools including these:

 

  • Traditional written assignments. These could be printed out, completed with pencil and paper, scanned, and then uploaded to the assignment drop-box for that item, where they would be graded, probably by a TA.
  • Online assignments, similar to the above but with the submissions and solutions done online. This is similar in form to the existing open-source LON-CAPA system currently used worldwide, but with two important additions: (1) integration with the LMS rather than operating stand-alone, and (2) provision for viewing full solutions, not just answers.
  • Interactive, simulation-based assessments. Instead of working with pencil and paper, the student would perform actual tasks online and be assessed on them. For example, the student could use an interface to work through an exercise traditionally done on paper, or could use a drag-and-drop interface to assemble, test and analyze a circuit. These tasks could be built, for example, by tweaking existing Java-based simulations or from scratch using the simulation features in software such as Adobe Captivate.

 

Overall, you may notice that none of the items mentioned is too far-fetched. The vision could have been “the development of a completely immersive online lab-based learning environment for physics,” but it was not, owing to the extremely prohibitive cost (probably in excess of $50M).

The course assemblage described has a much more modest cost, probably in the vicinity of $1M or so, with the majority going to the programming effort of getting the interactive pieces up to sufficient quality. While it is unlikely that any single institution could be expected to foot this sort of development bill, when you consider that, as already mentioned, this course would have worldwide appeal, it is rather amazing that it does not already exist.

The Barrier

Think about the numbers for a moment. Consider doing the course in English only, thus limiting it primarily to English-speaking countries (of course we really want this done in all popular languages, but let’s look at a limited, simple case here, just to make a point). This would give a potential market of millions of students. Currently those students are expected to purchase either new (at around $200/copy) or used (at around $100/copy) traditional textbooks for the course. What if, instead, this money, which at a conservative count would be around $50M per year (assuming that only half the students purchase the text and most of them buy used), were invested in the development of online resources? If the figures given are correct, the development costs would be recouped in such a short time as to be insignificant!
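The back-of-envelope arithmetic above can be made concrete. Here is a small sketch; note that the enrolment figure, the purchase split and the used/new proportions are illustrative assumptions in the spirit of the text, not market data.

```python
# Rough estimate of annual textbook spending for one first-year course,
# using illustrative assumptions (not real market data).

students = 1_000_000      # assumed annual enrolment, English-speaking market
buy_fraction = 0.5        # assume only half the students purchase a text
used_fraction = 0.8       # assume most purchasers buy used
price_new = 200           # dollars per new copy
price_used = 100          # dollars per used copy

buyers = students * buy_fraction
annual_spend = buyers * (used_fraction * price_used +
                         (1 - used_fraction) * price_new)
print(f"Estimated annual spend: ${annual_spend:,.0f}")
# 500,000 * (0.8*100 + 0.2*200) = $60,000,000 -- the same order of
# magnitude as the $50M figure in the text.

development_cost = 1_000_000   # rough development cost cited above
payback_years = development_cost / annual_spend
print(f"Payback period: {payback_years:.3f} years")
```

Even if the enrolment assumption is off by an order of magnitude, the payback period remains a small fraction of a single year, which is the point being made.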

This, of course, makes no sense. Commercial publishers are not stupid enough to pass up such a lucrative cash cow, so why has this not been done? I would suggest it is the sum of three interacting causes.

Educational institutions are unable or unwilling to fund the development of high-quality course content. It costs money, lots of it, and in these times, when all institutions face increasing pressure to keep costs down, any requests for additional funding are unlikely to be met with anything other than skepticism. Developing course content requires (1) time for the subject matter expert, likely an already over-burdened instructor; (2) time for an instructional designer; and (3) time for the various multimedia/programming professionals who assemble the content into the forms mentioned in previous posts. Generally there is little or no money available to put the IDs and multimedia specialists on the projects, and requests from the instructor for release time are met with the response, “we are already paying you your salary and we assume that the development of class-related materials is included in that already.” Simply put, the administration does not have the extra funds to pay the people, and the instructors do not have the extra time to prepare the content that would be needed to take it to the next level.

Educational institutions do not cooperate to share the development burden. It has already been suggested that, while individual institutions are likely unable to fund the development of high-quality materials, collectively the human and monetary resources exist, since, at least for the courses mentioned, most institutions are in effect teaching much the same courses. If, instead of each institution doing its own thing, they cooperated and jointly developed the materials, the cost apportioned to each would be very little.

The fact is, though, that this is easier said than done. A joint venture requires (1) an overall plan, (2) formal coordination and management of the project and (3) buy-in. The fiercely competitive atmosphere between institutions, coupled with the absence of a formal unifying body, makes this hard to accomplish, especially when you realize that this whole topic is nowhere near the top of most educational administrators’ lists of priorities. Still, it is a shame, as the Internet has already demonstrated how well suited it is to cooperative development, as evidenced by successes such as Mozilla and, in a more communitarian vein, Wikipedia.

Commercial educational publishers are unable to implement an effective business model. Ask just about any administrator who holds the educational purse strings this question: “Why don’t you allocate money towards the funding of teaching and learning resources?” Chances are this will be the response: “Because that is the job of the educational publishers. We can’t afford to do it ourselves, but they can because they can access a much larger market.”

Fine; after all, why waste taxpayers’ money when there’s a much better way?

Now go ask any executive with any of the major publishers the same question. Chances are, this will be the response: “Because we cannot recoup the development costs. Not only are institutions unwilling to pay the license fee, even though it is significantly less than they used to pay for textbooks, but, worse, our experience has been that people will always find a way to obtain and use our materials, regardless of copyright. We just can’t win, no matter what we do.”

Put the three together and you get the situation we face today. Despite the huge potential that Internet-based resources hold for improved teaching and learning in first-year courses, much—not all, mind you, but still the majority—of that potential remains untapped, with little sign of any widespread, sustained effort to do much of anything about it.

Suggested Solutions

This is not to suggest that things are the way they are for good reasons and that the best we can do is learn to live with the status quo. While revolutionary change is unlikely in the short term, significant benefits can still be realized from some straightforward actions. If sustained, the items below are likely to go a long way towards realizing the dream of much better use of digital technologies in service of teaching and learning. Here are three items which, taken together, could well result in widespread improvements.

Existing and prospective faculty need to continue to work toward positive change. Large-scale changes take time. Not only do materials need to be developed but, more importantly, the two book-ending sets of activities need to be done right: (1) the preliminary work of understanding the current problems and planning appropriate responses needs to be done well, and (2) the follow-up activities of fine-tuning introduced measures and modifying them in light of unexpected contingencies cannot be forgotten. Where the general consensus is that things are just fine as they are, the result is not mere stagnation but, even worse, gradual decline. Things break. Things change. If there is no response, what’s broken remains broken, and ongoing change carries reality further and further from the classroom. If, on the other hand, the general consensus is “we need to make things better,” then meaningful improvements—that is, the ones forged from REAL need—will slowly be realized in a spirit of collegiality.

Leaders (deans, directors, college presidents and perhaps government officials) should push for more inter-agency cooperation. While there are few formal opportunities for collaboration, the academic world is rife with opportunities for informal exchange: presentations, conferences and the like. If, on those occasions, the topic were brought around to the idea of cooperatively developing teaching and learning resources, in time the interest would build and, along with it, the ways and means of getting it done. People who acknowledge a need tend to see opportunities for finding the means to solve the problem—a positive offshoot of the generally unhelpful confirmation bias. In short, talk about it and a way generally tends to be found, one that everyone can live with.

Publishers need to work more closely with institutions. Despite the fact that they work closely with some faculty members—after all, most current texts are authored by faculty—publishers tend not to have good two-way relationships with the learning institutions their whole business model is built upon. Generally the only formal relationship is through the bookstore, and the general attitude is one of client service: the university states a requirement, generally in the form of a syllabus, and then evaluates the available resources. This activity is generally muddied somewhat by publishers’ efforts to sharpen their competitive edge, whether through the provision of free goodies or through the haranguing of the dean or individual committee members. Often the relationship shakes out something like this: faculty view the publishers as greedy & grasping, and publishers view the institutions as needy & cold-hearted. All in all, not a great atmosphere in which anyone can be expected to thrive.

It does not need to be that way. Institutions have great need for improved resources, especially as students gravitate more and more toward the Internet and away from print-based materials. Likewise, the publishers face an ever-diminishing pool of revenue as more of their old-style core business of feeding the thirst for basic knowledge is met through existing resources such as instructors’ own websites and Wikipedia. What’s needed, then, is a more sincere and productive dialog in which publishers gain a better understanding of how to meet current needs while universities find better ways to ensure that the publishers’ financial expectations are met.

Overall, the situation is far from desperate. Despite the many shrill cries of doom and gloom, our modern educational institutions are by no means in a sorry state; far from it. First-year students do as well as they always have—in many cases even better. Enrolments, overall, tend to be strong, and the product—students who achieve and thrive—tends to be good, as evidenced by the continued relative success that all still seem to enjoy.

That said, the situation, as always, can be improved. The great potential the Internet holds for education is far from being realized. An overall attitude conducive to positive improvement, coupled with a willingness to strive collectively for the small gains that, taken together, will amount to great strides, is just what we all need.

