Small Schools Rank Higher: It’s Built-In; Do the Math (& a twist)

From time to time you will see institutions ranked according to various criteria, generally with the intention of demonstrating how well each is performing. It's not unusual to see this done with schools, and here's a claim that is often made, and substantiated by the numbers:

Small Schools Tend to Lead the Ranks

This is consistent. For years I saw it in my own region, reflected in the annual report card issued by the Halifax-based Atlantic Institute for Market Studies (AIMS). Year after year, small schools led the provincial rankings. As a professional whose entire career was devoted to the betterment of small rural schools, I wanted to be able to brag about this, to puff out my chest and say, "Look, I told you that small schools were better for our children. It's obvious that the extra care and attention they get on an individual basis, as well as the better socialization that comes from everyone knowing everyone else, is making a positive difference."

I never did that, of course. It's not because I don't believe in small schools–I truly do. My silence was, to some degree, due to the fact that at the time the studies were published I was a non-executive member of the Department of Education and, as such, was not authorized to speak on its behalf. That was not the real reason, though.

No, I knew that a far more powerful force was afoot; something that affected the results far more than good teaching, a supportive (small) ecosystem, or the presence of many brilliant bay women and men.

All of those are positive factors, mind you.

No, the most powerful effect was something else, something related to straightforward mathematical behaviour, and if you'll spare a few minutes of your attention I will explain.

Simply Put: Small schools have an advantage in these rankings that is due only to the fact that they are small.

And there is an unexpected twist too, one not often mentioned in the discussion of the reports.

Here goes!

Let’s simplify the situation and assume that the rankings are based on the outcome of one test only. Furthermore, let’s say that the result of that test, for any given student, is completely random; that is, any student who writes it will get a random grade between 0 and 100. In other words, let’s act as if there’s really no difference between the students at any of the schools. The small ones will still come out on top.

Let’s see what this would mean for ten small schools (we’ll arbitrarily name them sml01 to sml10), each one having only fifteen students in grade twelve writing the test on which our report is based. The results for all of the students are tabulated below. The table was produced using a random number generator in Microsoft Excel. You don’t need to read it in detail; it’s just here so you know I’m not making the whole thing up!

School sml01 sml02 sml03 sml04 sml05 sml06 sml07 sml08 sml09 sml10
Score 15.0 14.0 45.0 55.0 53.0 70.0 55.0 100.0 79.0 56.0
Score 6.0 34.0 94.0 75.0 64.0 75.0 59.0 75.0 73.0 52.0
Score 15.0 83.0 65.0 30.0 84.0 46.0 64.0 10.0 35.0 20.0
Score 64.0 16.0 80.0 77.0 10.0 55.0 85.0 32.0 91.0 97.0
Score 29.0 28.0 96.0 98.0 6.0 67.0 51.0 74.0 69.0 9.0
Score 43.0 29.0 49.0 79.0 17.0 64.0 54.0 11.0 32.0 91.0
Score 31.0 49.0 62.0 33.0 0.0 92.0 35.0 59.0 91.0 45.0
Score 51.0 21.0 98.0 75.0 47.0 57.0 32.0 32.0 25.0 58.0
Score 45.0 27.0 18.0 6.0 24.0 31.0 84.0 5.0 89.0 2.0
Score 62.0 62.0 38.0 84.0 16.0 23.0 39.0 84.0 36.0 17.0
Score 9.0 0.0 6.0 67.0 53.0 99.0 54.0 23.0 97.0 15.0
Score 38.0 34.0 21.0 70.0 58.0 40.0 37.0 21.0 56.0 50.0
Score 21.0 55.0 51.0 97.0 92.0 40.0 48.0 100.0 76.0 4.0
Score 17.0 50.0 43.0 38.0 6.0 1.0 72.0 77.0 16.0 25.0
Score 35.0 69.0 83.0 34.0 75.0 16.0 46.0 51.0 77.0 33.0
Average 32.1 38.1 56.6 61.2 40.3 51.7 54.3 50.3 62.8 38.3

Table 1: School results for ten small schools.
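If you would rather see the experiment as code than as a spreadsheet, here is a minimal sketch of the same simulation. This is my own translation into Python (the original was done with Excel's random number generator), so the exact numbers will differ with every run, but the pattern will not.

```python
import random

random.seed(1)  # any seed will do; the pattern persists regardless

def simulate_school_averages(num_schools, students_per_school):
    """Return one average per school; each student gets a random 0-100 grade."""
    averages = []
    for _ in range(num_schools):
        scores = [random.uniform(0, 100) for _ in range(students_per_school)]
        averages.append(sum(scores) / students_per_school)
    return averages

# Ten small schools, fifteen students each (the big-school experiment later
# in the post is the same call with students_per_school=120).
for i, avg in enumerate(simulate_school_averages(10, 15), start=1):
    print(f"sml{i:02d}  {avg:5.1f}")
```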

That’s a huge pile of numbers, and we only care about each school’s overall result, so let’s redo the table showing just the schools and their averages.

School Average
sml01 32.1
sml02 38.1
sml03 56.6
sml04 61.2
sml05 40.3
sml06 51.7
sml07 54.3
sml08 50.3
sml09 62.8
sml10 38.3
Table 2: Small School Results

Notice that the results show a fair bit of variability. They cluster around an average of 50, but some schools have averages in the thirties while others are up around 60.

Now let’s do it all over again, but this time let’s see what would happen in larger schools (named big01 to big10). For the small schools we assumed there were only 15 students per grade level but for the larger ones let’s assume that there are in fact 120 students in grade 12 writing the test.

The rather long table is below just so you know I’m not pulling the numbers out of my head. As was the case with the small school simulation, it was produced using a random number generator in Microsoft Excel and pasted directly into WordPress. Scroll to the bottom of the table :-)

School big01 big02 big03 big04 big05 big06 big07 big08 big09 big10
Score 67.0 100.0 82.0 25.0 27.0 100.0 10.0 69.0 22.0 96.0
Score 42.0 92.0 63.0 16.0 42.0 12.0 69.0 94.0 66.0 60.0
Score 100.0 27.0 42.0 59.0 88.0 79.0 83.0 49.0 27.0 96.0
Score 44.0 29.0 46.0 63.0 28.0 69.0 31.0 19.0 18.0 59.0
Score 16.0 33.0 66.0 81.0 11.0 21.0 76.0 67.0 70.0 85.0
Score 95.0 31.0 11.0 2.0 80.0 15.0 21.0 78.0 91.0 33.0
Score 32.0 32.0 38.0 34.0 44.0 61.0 55.0 0.0 89.0 64.0
Score 82.0 6.0 96.0 57.0 8.0 52.0 64.0 55.0 62.0 72.0
Score 100.0 48.0 45.0 10.0 49.0 93.0 89.0 72.0 87.0 72.0
Score 88.0 91.0 33.0 0.0 36.0 11.0 76.0 10.0 78.0 55.0
Score 54.0 95.0 41.0 68.0 92.0 75.0 54.0 75.0 20.0 52.0
Score 44.0 79.0 88.0 69.0 82.0 9.0 31.0 74.0 1.0 78.0
Score 71.0 36.0 2.0 51.0 58.0 2.0 17.0 68.0 29.0 36.0
Score 93.0 47.0 89.0 91.0 25.0 47.0 85.0 96.0 63.0 23.0
Score 9.0 79.0 33.0 41.0 68.0 19.0 74.0 81.0 57.0 47.0
Score 77.0 84.0 28.0 44.0 2.0 54.0 37.0 48.0 25.0 54.0
Score 20.0 74.0 33.0 57.0 15.0 65.0 85.0 59.0 21.0 80.0
Score 56.0 98.0 27.0 68.0 45.0 75.0 58.0 71.0 92.0 58.0
Score 7.0 70.0 83.0 74.0 26.0 52.0 71.0 40.0 75.0 87.0
Score 80.0 32.0 65.0 7.0 54.0 62.0 68.0 7.0 87.0 88.0
Score 65.0 12.0 68.0 22.0 5.0 26.0 36.0 92.0 79.0 40.0
Score 87.0 89.0 51.0 70.0 96.0 98.0 56.0 13.0 10.0 51.0
Score 52.0 71.0 13.0 86.0 88.0 54.0 11.0 20.0 26.0 18.0
Score 69.0 57.0 11.0 36.0 39.0 5.0 38.0 56.0 82.0 40.0
Score 95.0 54.0 54.0 77.0 52.0 74.0 100.0 82.0 35.0 7.0
Score 49.0 80.0 24.0 42.0 11.0 82.0 70.0 18.0 30.0 19.0
Score 46.0 26.0 3.0 56.0 54.0 50.0 2.0 9.0 26.0 47.0
Score 58.0 57.0 98.0 62.0 65.0 50.0 7.0 94.0 9.0 43.0
Score 86.0 86.0 32.0 81.0 63.0 49.0 60.0 61.0 93.0 5.0
Score 9.0 54.0 74.0 65.0 27.0 38.0 42.0 30.0 42.0 99.0
Score 41.0 37.0 30.0 70.0 77.0 86.0 58.0 48.0 53.0 99.0
Score 23.0 82.0 9.0 73.0 9.0 9.0 86.0 27.0 57.0 50.0
Score 52.0 97.0 91.0 90.0 58.0 11.0 56.0 16.0 53.0 89.0
Score 54.0 84.0 46.0 0.0 26.0 55.0 36.0 94.0 89.0 46.0
Score 94.0 75.0 32.0 16.0 77.0 9.0 87.0 21.0 58.0 59.0
Score 77.0 27.0 93.0 65.0 61.0 23.0 53.0 60.0 29.0 23.0
Score 41.0 26.0 34.0 21.0 24.0 57.0 34.0 78.0 99.0 90.0
Score 73.0 67.0 83.0 54.0 99.0 63.0 24.0 65.0 75.0 37.0
Score 55.0 76.0 30.0 85.0 92.0 57.0 31.0 69.0 82.0 43.0
Score 12.0 38.0 53.0 56.0 40.0 67.0 3.0 50.0 86.0 90.0
Score 48.0 89.0 86.0 77.0 80.0 83.0 92.0 38.0 67.0 0.0
Score 59.0 81.0 65.0 0.0 47.0 24.0 57.0 18.0 27.0 90.0
Score 32.0 72.0 46.0 54.0 92.0 54.0 41.0 99.0 0.0 87.0
Score 5.0 55.0 0.0 78.0 13.0 83.0 60.0 68.0 68.0 86.0
Score 0.0 0.0 91.0 66.0 38.0 22.0 2.0 82.0 32.0 12.0
Score 19.0 74.0 40.0 54.0 93.0 37.0 68.0 75.0 57.0 35.0
Score 13.0 81.0 36.0 39.0 50.0 3.0 44.0 19.0 100.0 16.0
Score 36.0 95.0 4.0 100.0 60.0 89.0 47.0 99.0 70.0 43.0
Score 29.0 46.0 12.0 92.0 35.0 28.0 17.0 74.0 38.0 85.0
Score 49.0 84.0 35.0 70.0 36.0 12.0 32.0 43.0 81.0 39.0
Score 87.0 32.0 89.0 71.0 11.0 0.0 93.0 51.0 10.0 39.0
Score 43.0 27.0 12.0 9.0 81.0 78.0 52.0 99.0 82.0 86.0
Score 51.0 41.0 50.0 73.0 83.0 65.0 51.0 44.0 89.0 5.0
Score 21.0 56.0 89.0 6.0 47.0 41.0 57.0 17.0 72.0 53.0
Score 12.0 39.0 51.0 18.0 96.0 75.0 23.0 39.0 75.0 39.0
Score 0.0 48.0 11.0 51.0 61.0 22.0 39.0 35.0 88.0 75.0
Score 33.0 53.0 23.0 68.0 88.0 69.0 48.0 40.0 19.0 100.0
Score 31.0 30.0 82.0 31.0 13.0 55.0 89.0 94.0 40.0 60.0
Score 90.0 5.0 19.0 26.0 68.0 60.0 77.0 63.0 51.0 6.0
Score 41.0 65.0 72.0 76.0 91.0 11.0 71.0 37.0 68.0 53.0
Score 24.0 80.0 70.0 73.0 61.0 4.0 79.0 59.0 37.0 73.0
Score 11.0 24.0 72.0 48.0 64.0 28.0 38.0 79.0 66.0 22.0
Score 22.0 13.0 14.0 83.0 2.0 21.0 95.0 100.0 55.0 55.0
Score 50.0 97.0 59.0 85.0 15.0 82.0 77.0 31.0 21.0 92.0
Score 81.0 9.0 45.0 56.0 16.0 55.0 66.0 69.0 79.0 78.0
Score 36.0 74.0 68.0 7.0 36.0 42.0 5.0 76.0 41.0 76.0
Score 30.0 35.0 68.0 59.0 92.0 50.0 9.0 50.0 98.0 97.0
Score 30.0 31.0 2.0 1.0 62.0 64.0 82.0 88.0 84.0 53.0
Score 4.0 46.0 55.0 54.0 61.0 42.0 81.0 77.0 25.0 27.0
Score 32.0 51.0 79.0 58.0 2.0 33.0 66.0 92.0 20.0 68.0
Score 70.0 76.0 52.0 24.0 2.0 21.0 6.0 98.0 63.0 37.0
Score 54.0 68.0 91.0 56.0 58.0 32.0 41.0 74.0 64.0 45.0
Score 37.0 48.0 29.0 42.0 4.0 93.0 10.0 29.0 97.0 40.0
Score 14.0 47.0 46.0 83.0 80.0 52.0 42.0 54.0 33.0 29.0
Score 15.0 2.0 100.0 12.0 9.0 84.0 52.0 53.0 53.0 6.0
Score 8.0 23.0 35.0 63.0 78.0 34.0 30.0 75.0 14.0 54.0
Score 16.0 90.0 13.0 80.0 32.0 29.0 99.0 21.0 34.0 80.0
Score 99.0 48.0 47.0 5.0 71.0 88.0 77.0 68.0 50.0 2.0
Score 45.0 15.0 18.0 38.0 49.0 8.0 90.0 13.0 71.0 33.0
Score 42.0 50.0 86.0 80.0 79.0 53.0 21.0 81.0 53.0 36.0
Score 16.0 14.0 51.0 14.0 19.0 97.0 50.0 49.0 8.0 2.0
Score 34.0 85.0 55.0 54.0 49.0 63.0 1.0 58.0 73.0 13.0
Score 8.0 98.0 9.0 7.0 70.0 78.0 41.0 18.0 94.0 74.0
Score 59.0 43.0 31.0 30.0 97.0 85.0 64.0 94.0 3.0 91.0
Score 29.0 34.0 6.0 17.0 43.0 78.0 67.0 17.0 50.0 34.0
Score 80.0 12.0 98.0 24.0 84.0 25.0 96.0 76.0 16.0 67.0
Score 87.0 89.0 11.0 86.0 5.0 39.0 83.0 98.0 27.0 13.0
Score 62.0 73.0 69.0 91.0 47.0 52.0 91.0 57.0 87.0 39.0
Score 22.0 64.0 86.0 64.0 10.0 88.0 6.0 62.0 91.0 26.0
Score 28.0 74.0 88.0 19.0 45.0 97.0 94.0 3.0 75.0 30.0
Score 27.0 11.0 11.0 55.0 39.0 30.0 39.0 54.0 99.0 86.0
Score 94.0 85.0 60.0 1.0 42.0 23.0 57.0 97.0 58.0 24.0
Score 78.0 7.0 30.0 94.0 26.0 75.0 100.0 11.0 99.0 11.0
Score 94.0 12.0 81.0 50.0 49.0 36.0 68.0 95.0 67.0 33.0
Score 5.0 9.0 39.0 23.0 31.0 29.0 23.0 22.0 57.0 46.0
Score 67.0 55.0 98.0 81.0 80.0 72.0 31.0 53.0 80.0 95.0
Score 68.0 57.0 17.0 34.0 26.0 38.0 46.0 55.0 74.0 24.0
Score 8.0 39.0 82.0 34.0 65.0 74.0 34.0 39.0 62.0 19.0
Score 83.0 97.0 14.0 84.0 71.0 66.0 62.0 13.0 8.0 82.0
Score 83.0 78.0 39.0 45.0 15.0 70.0 63.0 65.0 75.0 68.0
Score 8.0 32.0 75.0 8.0 53.0 67.0 22.0 4.0 34.0 46.0
Score 91.0 38.0 48.0 85.0 11.0 93.0 96.0 17.0 80.0 13.0
Score 46.0 90.0 14.0 41.0 0.0 40.0 97.0 74.0 0.0 25.0
Score 26.0 14.0 85.0 92.0 29.0 63.0 77.0 94.0 80.0 8.0
Score 99.0 60.0 48.0 94.0 23.0 37.0 74.0 57.0 2.0 96.0
Score 51.0 99.0 89.0 67.0 69.0 5.0 91.0 6.0 97.0 97.0
Score 41.0 21.0 55.0 63.0 68.0 55.0 1.0 60.0 11.0 54.0
Score 35.0 18.0 65.0 78.0 96.0 79.0 3.0 22.0 80.0 44.0
Score 64.0 62.0 37.0 12.0 81.0 71.0 50.0 29.0 33.0 82.0
Score 27.0 24.0 4.0 29.0 86.0 36.0 11.0 47.0 77.0 5.0
Score 40.0 14.0 44.0 95.0 78.0 90.0 42.0 41.0 99.0 83.0
Score 10.0 93.0 37.0 48.0 87.0 27.0 79.0 15.0 94.0 57.0
Score 92.0 100.0 24.0 81.0 61.0 93.0 68.0 8.0 54.0 46.0
Score 55.0 79.0 17.0 88.0 96.0 83.0 88.0 99.0 49.0 21.0
Score 88.0 46.0 20.0 17.0 74.0 76.0 12.0 53.0 89.0 22.0
Score 85.0 42.0 26.0 26.0 87.0 69.0 49.0 68.0 57.0 49.0
Score 59.0 31.0 21.0 74.0 56.0 3.0 94.0 95.0 26.0 18.0
Score 100.0 75.0 59.0 39.0 64.0 54.0 7.0 30.0 34.0 38.0
Score 58.0 100.0 60.0 96.0 100.0 70.0 71.0 19.0 58.0 7.0
Score 13.0 66.0 94.0 82.0 92.0 87.0 51.0 51.0 80.0 13.0
Score 36.0 3.0 32.0 29.0 83.0 95.0 20.0 42.0 53.0 95.0
Average 48.7 53.9 48.2 51.7 52.1 51.8 52.8 53.9 56.0 50.4
Table 3: School results for ten big schools.

As before, let’s just look at the averages for each school.

School Average
big01 48.7
big02 53.9
big03 48.2
big04 51.7
big05 52.1
big06 51.8
big07 52.8
big08 53.9
big09 56.0
big10 50.4
Table 4: Big School Results

Notice that, like Table 2, the results are clustered around an average of about 50. Notice, though, that the numbers are not spread nearly as much.

Let’s put the two tables side by side for a better look.

School Average School Average
sml01 32.1 big01 48.7
sml02 38.1 big02 53.9
sml03 56.6 big03 48.2
sml04 61.2 big04 51.7
sml05 40.3 big05 52.1
sml06 51.7 big06 51.8
sml07 54.3 big07 52.8
sml08 50.3 big08 53.9
sml09 62.8 big09 56.0
sml10 38.3 big10 50.4
Table 5: Averages for both small and big schools

The thing to notice is that the big schools show much less variability. In small schools, individual students who do very well or very poorly (we call them outliers) tend to have a large effect on the average. In larger schools, the greater number of scores tends to “smooth out” the average; to make it less variable.

This is something that is well known in mathematics. It even has a name: the Law of Large Numbers. Simply put, the larger the sample, the more tightly repeated results tend to cluster about the expected value.
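For the curious, the “smoothing” can be quantified. Under the same assumption as above (grades drawn uniformly between 0 and 100), a single grade has a standard deviation of roughly 29, and the average of n such grades has a standard deviation of roughly 29 divided by the square root of n. A quick sketch of that arithmetic:

```python
import math

# Standard deviation of a single score drawn uniformly from 0-100: about 28.9.
sigma_student = 100 / math.sqrt(12)

for n in (15, 120):
    # Spread of a school average shrinks with the square root of class size.
    sigma_average = sigma_student / math.sqrt(n)
    print(f"{n:3d} students per school: averages typically within about +/- {sigma_average:.1f} of 50")
```

That works out to roughly plus-or-minus 7.5 for a fifteen-student school versus roughly plus-or-minus 2.6 for a 120-student school, which is consistent with the spreads seen in Tables 2 and 4.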

Now, this is where things get interesting. Recall that this is all about small schools getting a built-in advantage due only to the fact that they are small. Let’s see what it looks like when all twenty schools are ranked from highest to lowest.

School Average
sml09 62.8
sml04 61.2
sml03 56.6
big09 56.0
sml07 54.3
big08 53.9
big02 53.9
big07 52.8
big05 52.1
big06 51.8
big04 51.7
sml06 51.7
big10 50.4
sml08 50.3
big01 48.7
big03 48.2
sml05 40.3
sml10 38.3
sml02 38.1
sml01 32.1
Table 6: All twenty schools ranked from highest to lowest.

Did you see what happened? The top schools were all small schools. They reached the top due to nothing other than small-sample variability working in their favour. Random variation (two or three bright students, or the absence of two or three weaker ones) had a profoundly positive effect on the school average.

Recall also that I mentioned there would be a twist. Notice that while the highest-ranking institutions were drawn from the pool of small schools, so, too, were the lowest-ranking ones, and for the same reason; namely, the presence of a few weaker students or the absence of a few strong ones.

So, based on this little experiment it’s plain to see that when ranked this way, small schools will tend to come out on top simply because they are small: their averages are not smoothed out by the law of large numbers, so chance can push them well above (or well below) the overall mean.
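You don’t have to take a single run’s word for it. Here is a sketch, under the same everybody-is-identical assumption, that repeats the whole twenty-school experiment many times and counts how often a small school lands at the very top and at the very bottom of the ranking. If size made no difference, each figure would come out near 50% (ten of the twenty schools are small); in runs of this sketch, both come out far higher.

```python
import random

def school_average(num_students):
    """Average of num_students random grades, each uniform on 0-100."""
    return sum(random.uniform(0, 100) for _ in range(num_students)) / num_students

trials = 10_000
small_first = small_last = 0

for _ in range(trials):
    schools = [("small", school_average(15)) for _ in range(10)] + \
              [("big", school_average(120)) for _ in range(10)]
    schools.sort(key=lambda school: school[1], reverse=True)
    small_first += schools[0][0] == "small"    # small school tops the ranking
    small_last += schools[-1][0] == "small"    # small school sits at the bottom

print(f"A small school ranked first in {100 * small_first / trials:.0f}% of trials")
print(f"A small school ranked last  in {100 * small_last / trials:.0f}% of trials")
```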

As for the small schools at the bottom, that happens too, and it’s likely that these are rarely mentioned because of selection bias on the part of whoever wishes to weave the numbers into a narrative that suits their own political ends. One wonders, though, how many small schools have been closed or otherwise penalized for nothing other than being the unfortunate victims of chance.

Closing Note: this is in no way intended to cast AIMS in a negative light. To the best of my knowledge neither they, nor the various Departments of Education, nor the various school districts ever tried to spin the reports into grandiose claims regarding big and small schools. The false claims I have heard have generally been made by private individuals, each with their own axes to grind.

As for my own conclusion: ranking systems, regardless of the context (health care, law enforcement, customer care or, as is the case here, school-based student achievement), serve a useful purpose, but be mindful of the law of large numbers (and of small sample sizes) before making any sweeping generalizations.


Asking Better Questions: Ends and Means in eLearning

In a previous post I considered the possibility that much of what is presented as “innovation” is anything but. With access to some fairly new, attractive or otherwise popular products, and armed with even a slight grasp of how to operate them, it’s relatively easy to create an appearance of innovation. Simply put, if you can get your hands on some new gear, in even a short while you can present quite a convincing front.

Worse again, it is equally easy to generate what passes for proof; to an untrained eye you can make it look like your so-called innovation is creating some real differences. Any of these strategies can give you reams of what looks like convincing evidence:

  • Deliberately pick enthusiastic students or teachers and pile on the anecdotes that endorse the desired point of view. People who rely on system-one (more or less intuitive) reasoning are easily swayed by stories so it won’t be hard to capitalize on that to get some people talking about how innovative the project is.
  • Stage the project in a relatively well-off school or class and then compare the results from this highly-biased “treatment” group to the population in general. Very few will dig deep enough to see that the superior achievement results predated, and were independent of, the treatment.
  • Rely on manufacturer or vendor supplied “research” when crafting reports, proposals and press releases.
  • Bluff; just preface your claims with clauses like “decades of research shows…” and leave it at that. You might be surprised to see how few people, if any, will call you out. Besides, it will be relatively easy to portray those who do as kooks or curmudgeons.

That said, you could instead opt to take the more difficult path and strive for some real gains.

Notwithstanding the cynical tone of the opening of this post, it needs to be emphasized that emerging technologies should be welcomed, albeit guardedly, in all places of learning. I’ve come by this knowledge the hard way, with ample first-hand experience of doing it both the right AND the wrong way–but generally having benefited from the experience in either case. The lesson can be summed up succinctly: it’s best when you develop and refine an appropriate match between the technologies and the desired learning outcomes. This means, in particular, starting with the right sort of question:

  • Bad Question: How can (insert gadget name here) be used in the (insert subject name) classroom?
  • Better Question: What combination of equipment and methodology will foster better achievement in (insert subject name/outcome area)?

Notice the difference? Instead of placing the focus on the tools, place it on the learning.


You might say that, in the end, the two are the same. Yes, in both cases the goal is to do a better job. Take a closer look, though, and notice that the bad question is, in fact, all wrong. First, by selecting a particular device it sets serious limits on what can be done. This can even lead to the selection of inferior methods. Consider this: a teacher notices that some good simulations are available, wants to see if physics achievement can be improved through the use of tablets in class, and asks, “How can I use tablets in the classroom?” With the best of intentions the approach is changed, replacing hands-on activities with simulations. Now, while simulations are an excellent way to introduce topics, especially ones that cannot be explored cheaply or safely, it makes little sense, when you think about it, to replace hands-on activities involving motion, sound, electricity and light with simulations in which the only physical interaction is sliding a finger along a glass screen! After all, physics is all about interacting with the physical world! How ironic! If, instead, the right question had been asked, no doubt the simulations would still have been used, but their use would have been balanced with follow-up real-world interactions.

Second, the selection of a particular device sets in place a condition in which demonstrable improvements are expected. That’s nice, but what if it’s the case that the new technology is in fact inferior? You might suggest, “no problem, the report will show this.” Think about it, though, and be careful to layer in some human nature.

Consider again the previous case involving tablets and physics. Suppose that the unit of study was about current electricity and the tablets were used to explore the topic through simulations in which students constructed virtual circuits involving batteries, resistors, lamps, switches and meters to measure voltage and current instead of doing the same with the real thing. At the end of the unit the evaluation would be based on what could be measured, either online or using pencil and paper, and NOT on actually constructing the circuits.

How likely would it be that students would be able to do the same with real circuits? Not likely.

How likely is it that they would do about the same on a test? Very likely.

What’s the difference? In which class would it be more likely that you would find someone who could help you wire your basement? If, on the other hand, the right question had been asked, then, in all likelihood, the simulations would still have been used, but their use would have been balanced and blended with hands-on activities too.

Focusing instead on the learning will have two likely outcomes:

  • You will likely not get famous, as it’s clearly about learning and not about you.
  • The project will show modest but useful results.

Whenever embarking on any effort to improve results in education it’s important to bear in mind one simple truth: you are not starting from scratch. The “traditional” methods that self-nominated reformers (most of whom have only limited classroom experience, other than the imagined stuff) so love to mock are in fact reasonably effective. The huge majority of people, those who have not been the beneficiaries of this enlightened practice but who have managed to thrive nonetheless, bear testament to that. Existing methods are, perhaps, not as good as they could be, but they are effective nonetheless. Reformers should bear in mind that the traditional methods they so disdain have several important advantages over proposed new ones. First, they are understood: in all likelihood, existing practitioners not only use them now but were probably taught using them. More importantly, though, traditional methods have been refined through extensive classroom use. Proposed methods, by contrast, are not well understood, raw and untested.

Far too often reformers boldly charge into classrooms armed with little more than vague ideas, shiny new equipment and an unhealthy combination of ignorance and arrogance. Students, parents, colleagues and administrators generally tolerate the ensuing activity since (a) it probably doesn’t interfere with them too much and (b) there is always the chance that some good might come of it. The proponent will usually get a little something—a write up in a journal, perhaps a trip to a conference, maybe even an award—but in the end the students will likely be left no better off and the effect on general classroom practice will be negligible.

It does not need to be that way, though. If, instead, the proponents asked the right question, one that focused on making some real improvement in student learning, then wins would be had all around. That is, better teaching and learning would result and, who knows, maybe the innovator’s career would get a boost anyway.


Managing the Distractions

I came across something like this “unhelpful high school teacher” meme the other day and it got me thinking about the distracted landscape our students occupy.

[Image: “unhelpful high school teacher” meme]

All too often the opinions you encounter on the web and in other parts of everyday life are one-sided; normally the work of someone with an axe to grind, someone wishing to present just one side of a rather complicated issue, and this is no exception. There are very valid reasons why educators have to be skeptical about the unrestricted use of electronic devices such as laptops and tablets in class.

In my previous job my office was located on campus at a fairly large university. It gave ample opportunity to view the electronic habits of typical students and was a never-ending source of amazement—both the good and the bad kinds.

One incident in particular stands out. I wished to confer briefly with a colleague who was, at the time, teaching a large class (around 160+ senior education students) in one of two large lecture theatres located in the basement of the building we both inhabited. I decided to just head over to the class and chat with him before it started. Unfortunately, as is often the case, I was briefly distracted, and, by the time I arrived at the door the class had already started. Out of curiosity I looked in. My vantage point was from the centre back and, as the lecture theatre slopes toward the front, I had an excellent view of exactly what the students were doing.

Almost all of them had either a laptop or a tablet device open and active. What was interesting was the fact that the majority of the students were not just taking notes on the machines but also had a web browser open. Well over half of the students would periodically switch from the note-taking application (typically a word processor) to the browser. The browsers had the usual suspects, of course (Facebook, Twitter and other social media applications), but a surprising number of students were also shopping online during class time. I’d estimate that somewhere between 10 and 20 of the approximately 150 students were doing this! Only a minority–I’d estimate around 20 to 25%–seemed to be totally focused on the lecture, at least as evidenced by their keeping the notes application open throughout the five minutes or so I was watching.


I recall the moment quite well as it was one of those times when something became quite clear to me; a time that has sparked a considerable number of subsequent informal observations. Right then and there I decided to also take a look at the other lecture theatre. This one had a second-semester calculus class going on and, unlike the former, was one in which electronic devices were not that well suited to taking notes (unless, of course, you had a touch screen or a stylus such as a Wacom device with which you could render handwriting; after all, typing calculus notes is not something anyone can do on the fly!). Guess what? Same thing! Once again I saw a sea of laptops and tablets. Not quite so many, of course–I’d estimate around 50% of the students had them open, as opposed to over 90% in the education class. Once again, though, the screens were dominated by not just social media but also online shopping!

Just a thought—maybe someone should run their own set of observations and verify this. At any rate, this short anecdote lends a bit of credibility (yes, I know “piling on the anecdotes” is a very flawed form of research) to the notion we all have of just how distracted our students really are.


Which brings us to the point: as educators it is in our best interests, and those of our students, to find effective ways of managing the many distractions that electronic gadgets bring to our classrooms. While it is certainly true that electronic devices hold incredible promise for all aspects of education, it must also be acknowledged that they are equally effective at pulling students away from the tasks that should be at hand. The same conduit that brings research, information and activities right to the students’ foregrounds is equally adept at bringing in off-topic interactions, irrelevant information and other distractions, particularly games that have nothing to do with learning.

Blocking unrelated content is a strategy that will never work. Go ahead and block Facebook at the Wi-Fi router. The students will hardly be slowed at all. Some will switch to getting it through their phones, which you cannot block. Others will switch to a different social media platform–new ones pop up almost weekly–and still others will just connect through a proxy server, which circumvents the router and firewall rules entirely. It’s a losing game of cat and mouse.

Blocking the use of electronic devices is equally counterproductive. First of all, it drags instruction back to the 19th century—and we cannot afford to do that. More importantly, though, the whole practice of “blocking” or “banning” is anathema to the idea of schools as places of learning.

So what, then? What is the magic bullet? As expected, because it’s nearly always the case, there is no one simple solution. There are, however, general strategies that can be applied and that will be found effective. Here are some suggestions:

  • Make personal contact with the students: When students turn to the web browser they are turning away from you, the instructor. The less personal you are to the students, the more they will do this.
  • Communicate your values clearly: Typically around 80% of people will respect your wishes, so make sure they know what your wishes are. Make it clear to the students that you do value the use of electronic equipment but that they must also make the best use of their class time. To do this they should minimize distractions and, in particular, save the social networking and shopping for some other time. It’s also worth noting that of the remaining 20%, around three-quarters can be convinced to follow along too, especially if you move around the room to make it apparent that you are checking to see if students are engaged. The small remainder–around 5% of the total–will do what they please regardless of what you do, and you would be well advised to regard this group as beyond the point of diminishing returns, so long as they do not distract others with their off-topic pursuits.
  • Find ways to leverage the potentially distracting technology: You can always find ways to put the devices to some good use. Examples include: (1) getting the students to install “clicker” applications and building “instant response” activities into your classes; (2) providing electronic versions of partial notes (sometimes referred to as “gap notes”) that the students can complete online if they have annotation software; (3) making effective use of simulations in class time where appropriate; (4) using appropriate application software for in-class activities.


Four Forms of Innovation

The word innovation is tossed around so much that it has lost much of its impact. In some ways it’s like “awesome,” isn’t it? Once, awesome meant something that literally took your breath away. These days it’s just a tired expression of assent; something deemed awesome is more likely just socially acceptable. Similarly, in a world where corporate press releases are ground out in volumes that rival unit sales, neither “innovation” nor “innovative” catches the reader’s attention much.

Add to that the point, already made, that scant resources exist, whether in the form of people or money, to engage in the various activities that one might immediately recognize as innovative. Besides, in today’s busy, distracted world it’s often hard to spot innovation when it does occur.

That’s not to say it does not exist—it’s just generally buried under mounds of impressive-looking but essentially shallow efforts. A recent journey to the Unemployed Philosopher’s blog reminded me that most of the important work happens far away from fanfare. Day after day, professionals of all kinds, including educators, toil away developing the small but significant things that make practice just a bit better. It is a shame, really. Much of the attention is given to things that appear significant but are really not once you take the time to peer beneath the surface; stuff designed to grab attention and maybe further some goal, just not the goals one would associate with positive change for all. Sure, it may look and sound great but, in the end, you’re often left with the professional equivalent of election promises. The real innovations often lie elsewhere, buried among the many other details that take up our days. They do, nonetheless, exist and can be seen, if you look hard enough, in one of these four forms.

1. Structured Engineering: The kinds of planned changes that take place in a more-or-less orderly fashion. You have identified a problem to be solved and planned a solution that involves more-or-less standardized equipment and procedures, and you will then implement and test that solution.

For example, suppose you develop an online visual art course. You will carry out a procedure roughly like this:

  • review the curriculum guide and outline the general instructional strategies, including the method by which they will be developed or acquired;
  • assemble the development and implementation team; formulate the overall plan;
  • select and assemble a system of effective tools and methods by which you will carry out the plan;
  • field test the course and revise as necessary.

Pros:

  • Good fit between need and response.
  • Robust system once implemented.

Cons:

  • Significant up-front cost.
  • Often significant resistance to system-wide change and adaptation.
  • Possibility of large scale failure if wrong choices are made.

2. Structured Deepening: This involves extending an existing system in a purposeful way. As an example, perhaps you choose to modify the aforementioned system by which you are teaching visual art so that you can now teach music online too.

Pros:

  • Significantly less costly than starting from scratch.
  • Less likelihood of large-scale failure.

Cons:

  • Less than optimal fit between need and response since you are modifying an existing system rather than building one to meet specifications.

3. Radically Novel: Every so often completely new approaches are developed. It can be argued that before “Star Trek: The Next Generation” nobody thought very seriously about the use of multipurpose digital tablets such as Apple’s iPad or Google’s Nexus tablet. Now, however, these multipurpose devices are changing the way people interact with the Internet, with audio and video and, most importantly, with one another.

Pros:

  • Often based on new devices; carries a shiny, new “wow” sense of interest;

Cons:

  • Teaching and Learning sometimes becomes a secondary activity;
  • New devices often lack institutional tech support and have a short lifespan.

4. Entirely New Bodies of Knowledge and Practice: Radically new devices lead, in turn, to entirely new ways of doing things. Consider English Language Arts. In the pre-digital age the focus was on reading, writing, listening and speaking. Now, with so many modes by which we can communicate, an additional focus, representing, is becoming very important. The mobile devices mentioned above are also changing the way we interact. Who knows what’s coming!

Pros:

  • Generally a good fit for those who have had the benefit of the events that led to the new development.
  • Often well-suited to the time and place in which they occur; “products of their times.”

Cons:

  • Often adopted by evangelists who assume (incorrectly) that the new way is the best way for all.

Through it all, though, it remains as important as ever to maintain a focus on teaching and learning. While the new devices and methods are exciting, if the end result is not a strategically significant improvement in an identified area of concern in education, most notably increased achievement or cost savings, then the innovation is pointless.


Structured Integration vs. Cost Savings

How many times do you see “cost saving” being touted as a reason for increased use of educational technology, and most especially distance education? Time and again you will see the adoption of new technology being explained away as cost savings. All you can really do, most of the time, is roll your eyes as you know, beyond doubt, that one of two things will happen. Either (1-not bad) the new technology will wind up costing somewhat more than budgeted—owing to the training costs and other unanticipated costs associated with the adoption and integration process or (2-BAD) it will eventually be abandoned and left to lie, mostly unused, right next to all of the other money wasters that have been purchased through the years.

This does not need to be the case. Properly done, new technologies can be more effective and cheaper; just not that much cheaper. Look around at the cellphones, fuel-injected engines, “green” heating systems and such that have made our lives that much better. The same can happen in our classrooms too, but we need to take a much longer view of what constitutes cost saving and just plain get over the fool’s quest for that elusive magic bullet.

Cost saving should NOT mean the slashing of departmental budgets and the subsequent placement of course notes online just so deficits can be handled in the short term. (Although, admittedly, here in the real world that does have to happen from time to time regardless of how high-minded we would like to be.) That helps nobody, as the result will only be a degradation of services, followed by a corresponding loss in enrolment. Cost saving might be better framed as the deliberate employment of suitable technologies so that, over time, better outcomes can be achieved at lower cost.

Examples include:

  • Joining classes at separate campuses or schools using videoconferencing or, even better, a combination of videoconferencing and web conferencing, such that smaller student cohorts can be aggregated. In those instances, though, care must be taken that the host site or the instructor’s site does not become the “main” site, with the remote ones getting the scraps from the educational table.
  • The replacement of non-interactive lectures with series of multimedia-based presentations, preferably with interactive components, such as embedded quizzes or simulations.
  • The gradual replacement of some media types with others but only after a piloting process which (a) shows the worth of the new technology and (b) refines the methodology before full deployment. For example, it may be feasible to replace the printed materials used in a course with online versions, perhaps multimedia or eBooks.

How often has it happened—a new device and its associated procedures show up unannounced? Perhaps it’s a new set of Chromebooks, maybe it’s clickers, a handheld computer algebra system or a new, shiny, computer numerical control (CNC) machine for the shop class. Whatever. In it comes, and with it comes a feeling that you are expected, all of a sudden, to just change everything.

Before proceeding too far it needs to be said that the expectation that you need to change right away is often imagined. It’s been my experience that those responsible for high-level decisions do tend to have a healthy sense of what everyone is up against. After all, the funding that permits that sort of upgrade itself takes years to put together. The problem is that the expectations that led to the upgrade are often not well understood by those who are expected to implement the change; there’s often a disconnect. Nonetheless, those on the front lines tend to be confronted by a somewhat intimidating set of equipment and feel a corresponding sense of stress on account of what they know needs to happen.

Of course that is just a bit silly. Change does not happen that way. Yes, we are all intelligent and capable of change but none of us is foolish enough to react to every new thing that comes our way, whether invited or not. The change and integration process happens in stages. Assuming that the technology is not another blind alley (and they do happen) it usually plays out something like this:

  1. Familiarization: You have to learn how the equipment works at the most basic level. What’s it for? What do the controls/menus do? What options do you have? In situations like this it’s good to have access to an expert. A demo followed by hands-on activities can be quite useful at this stage.
  2. Utilization: You have to become comfortable with using it. It’s not enough to know what each component does; you have to become adept in its use. Nobody wants to make clumsy or false moves in front of an audience, so you need time to practice. If, for example, the device in question is a handheld computer algebra device, then use it for your own purposes for a semester or so before even attempting to build lessons around it. If it’s an IWB (interactive whiteboard), then you need to take some time to engage in unstructured use—play—with the device in a non-threatening environment. Just close the classroom door and fly solo or, better still, gather a small posse of like-minded colleagues and have a collaborative session.
  3. Integration: Bit by bit you make the use of the technology a part of the natural routine. While you can bring it in all at once, it’s much less stressful to layer its use in here and there. If, for example, the device is an IWB, instead of ditching your existing lesson plans, try instead to catch the low-hanging fruit; that is, redo some of the lessons that lend themselves best to an IWB approach. If it works well, try another, and so on.
  4. Reorientation: In time you may find that the “new” equipment and associated methodology becomes your standard approach. That set of Chromebooks that you used to despise may, in time, become a treasured addition to your classroom; perhaps even indispensable. This will not happen overnight and the stages are likely measured best in semesters, maybe even years.
  5. Evolution: With new standards come new horizons. You may find unexpected applications of the once-unfamiliar technology. Perhaps you even spot yet another—and for now unfamiliar—set of methodologies that bears promise.

Of course equipment will still arrive unexpectedly and instructors will, to some extent, have to sort it out as best they can. The best advice is to realize that regardless of what else happens the integration process will come in stages, so act accordingly.


Learning Resources: Where’s the REAL Commons?

Have you ever wondered why, over thirty years after personal computers became affordable, and over twenty years after the widespread adoption of the Internet, digital technologies have still not reached their full potential? There are some generally good reasons why this is so and it’s not primarily due to resistance to change. Let’s examine a typical course.

The Setting

Consider a reasonably popular and somewhat universal course of study: First-year University Physics. This course is taken by students primarily interested in pursuing careers for which a knowledge of that discipline is a necessity—all jokes aside, it is mainly a course people take because they need to. It is a gateway to a career in oil & gas, mining, engineering, aviation and, yes, the very few who wish to become physicists also take it, but they must be considered a minority. The students who sign up are typified by a wide range of interest and ability. Some of them have studied physics in high school and come to the course with a solid background—that is, well-developed laboratory/inquiry skills, mathematics skills and a decent grasp of the fundamentals. Still others have next to no experience in the area, a weak grasp of mathematics and, sadly, an interest level that leans more in the direction of “Mom/Dad wanted me to do this” than “This is cool.” The majority, as you would expect, find themselves clustered closer to the middle of these extremes; that is, they have some background and experience as well as enough motivation to make them show up for class and put at least some effort into the various required activities (being attentive during class, performing the lab work and making their way through the written assignments as best they can). Overall, to a few the course is a pain to be tolerated, to another few it is a total joy, the essence of their existence, but to the majority it is a rite of passage; a series of tasks to be done with care but not necessarily with the burning love and passion felt by the instructors and other members of the faculty. Simply put, the audience is reasonably competent and serious but by no means a young version of the faculty.

The content of the course is a wide-scale survey of the discipline as a whole. To the extent that it can, it tries to provide an overview of the various areas into which the discipline has stepped. Over two semesters–two courses actually–it includes:

  • A non-matrix approach to statics (forces in equilibrium).
  • A non-calculus approach to mechanics, including potential and kinetic energy, impulse and momentum as well as Newton’s Laws of motion (and maybe Universal Gravitation) with a particular focus on the 2nd.
  • Static electricity, including the concepts of fields (but without the use of field equations), charge and electric potential.
  • Current electricity, including Ohm’s Law and Kirchhoff’s rules, but with a focus on DC circuits and a serious limitation in terms of complexity—the circuit analysis rarely involves the use of simultaneous equations.
  • An introduction to waves, including basic coverage of sound and light. Wave phenomena such as the Doppler effect, diffraction and interference are introduced with a minimum of mathematics.
  • Perhaps: An introduction to special relativity and quantum mechanics, fluid mechanics and geometric optics.

The course endeavours to serve as a bridge in many ways. It tries as best it can to be accessible to students who do not have a previous background while, at the same time, not boring those who do. It tries to impart a fair degree of disciplined thinking while also encouraging further study. All in all, managing the course is quite a balancing act.

But here’s the thing: like most (but not all) scientific disciplines it is reasonably universal. That is, the background required by students does not tend to vary much by geography. Unlike, say, history, which is impossible to separate from the local culture, first-year physics can be assumed to be more or less the same just about anywhere.

The Issue

This brings us to the big issue: even though there is the potential for a large audience, there does not exist a high-quality, integrated set of digital teaching and learning resources for that course. There are, rather, collections of good efforts that must be assembled and then put to use at each institution, each doing the best it can despite limited human resources and budgets. All things considered, this is a great loss.

The same is just as true in other subject areas including Pre-Calculus and Introductory Calculus, Chemistry, Biology and Earth Science, along with possibly Psychology.

Now, before this gets too far let me hasten to explain why this discussion is dwelling on just STEM. It is solely because those disciplines are reasonably global in nature, that is, there is more-or-less worldwide uniformity on what is taught and how it is taught. This is simply not the case for other first year courses such as English (or whatever you wish to call the study that centres on the most popular language in the region), any of the fine arts, liberal arts or social studies. In all of those disciplines the local context matters far too much for anyone to get very serious about talking about a global approach to learning resources. But let us leave that for another time and just return to the ones for which it is the case.

So what is the extent of available learning resources for STEM? Here’s a partial list.

  • Commercially available print-based resources, including textbooks and self-study guides. These tend to cost in the range of $200 each and are generally of good quality. They are logically organized, well illustrated, complete and correct (they contain modes of thinking in keeping with the established canon). For the motivated student who reads well they serve as excellent and complete resources. For those less motivated they often lie unused, as evidenced by the many so-called “used” (irony, yes) books out there with unblemished spines.
  • Instructors’ notes and personal websites. Once something you could only get if you could afford the photocopying fee, these are becoming increasingly accessible thanks to scanners, word processors and, most importantly, electronic Learning Management Systems. The quality varies widely, owing to the lack of formal peer-review processes that typifies other areas of academic life, but at least in my experience it leans toward “very good” more often than “lackluster.” Notes tend to be short-form representations, lacking the commentary and elaboration available in books. They also tend to be more to the point and, unlike the texts, do tend to be carefully read by students.
  • Communitarian resources such as Wikipedia. Over the past decade these have significantly improved both in terms of scope and quality. For any given topic that one would find in a first-year STEM course the entries tend to be complete and useful. There is no guarantee, though, that the depth of treatment is the same as is expected in the course. Instructor guidance is definitely a must if Wikipedia is used as a source.
  • Other web-based resources. A significant number of piecemeal efforts exist. These do an excellent job on portions of a course but do not try to be a single point of contact. A good example of this is the University of Colorado’s PhET site, which has developed a huge array of Java-based science simulations. Taken one by one, any of the PhET resources does an excellent job of exploring the topic it is intended to, but it is left to the instructor to decide which ones to use, how to use them and how to link them in with the rest of the course resources.

So a wide variety of useful resources does exist. What, then, is the big deal?

A Simple Vision

Let’s think for a moment what it could be like online when a student accesses the course.

The course home provides an overview of what’s in the course along with a summary of progress to date. This includes a list of tasks completed, along with appropriate achievement indicators (grades, etc.), upcoming events and deadlines as well as uncompleted tasks, along with suggested resources and activities. It’s worth noting that just about any online Learning Management System (LMS) such as Desire2Learn, Blackboard or Moodle can do this right now.

For any given course organizer (whether it be lesson, topic or learning outcome, for example) course resources are provided in a variety of formats including:

 

  • Print Materials, and preferably in a format that lends itself well to display using either paper or an electronic reader such as an eBook reader or tablet device.
  • Multimedia presentations—that is, an electronic version of an in-class presentation, complete with visuals and audio—that could be created with software such as Adobe Captivate.
  • Interactive simulations (where applicable) in which students could investigate topics of study. These should be similar to the ones already available from PhET but with the added value of having guidance on what you are looking for; the simulation has a built-in lesson plan. In some, but not all, cases (investigating DC circuits, for example) these could replace activities generally done in the lab.
  • Laboratory resources in the form of videos, analysis software and handouts that would be used in conjunction with lab activities. Students would still be expected to go to the lab, but because these resources would replace the lab manuals and the demos from the front of the room, students would have more autonomy, meaning that at any given time various activities could be managed at once in the same location.

 

For any given course organizer the course would also host a variety of assessment/evaluation tools including these:

 

  • Traditional written assignments. These could be printed off, completed with pencil and paper, scanned, and then uploaded to the assignment drop-box for that item, where they would be graded, probably by a TA.
  • Online assignments, similar to the above but with the submissions and solutions done online. This is similar in form to the existing open-source LON-CAPA system currently used worldwide, but with several important additions: (1) integration with the LMS instead of standing alone; (2) provision for viewing full solutions, not just answers.
  • Interactive, simulation-based assessments. Instead of just working with pencil and paper, the student would perform actual tasks online and be assessed on them. For example, the student could use an interface to work through an exercise traditionally done with paper and pencil, or could use a drag-and-drop interface to assemble, test and analyze a circuit. These tasks could be built, for example, by tweaking existing Java-based simulations or from scratch using the simulation features in software such as Adobe Captivate.

 

Overall, you may notice that none of the items mentioned are too far-fetched. While this could have been listed as “the development of a completely immersive online lab-based learning environment for physics,” it was not, owing to the extremely prohibitive cost (probably in excess of $50M).

The course assemblage mentioned has a much more modest cost, probably in the vicinity of $1M or so, with the majority of it going to the programming effort of getting the interactive pieces up to a sufficient quality. While it is unlikely that any single institution could be expected to foot this sort of development bill, when you consider the fact, already mentioned, that this course is one that would have worldwide appeal, it is rather amazing that it does not already exist.

The Barrier

Think about the numbers for a moment. Consider just doing the course in English and thus limiting it primarily to English-speaking countries (of course we really want this done in all popular languages, but let’s look at a limited, simple case here, just to make a point). This would potentially give a market in which millions of students would wish to access it. Currently those students are expected to purchase either new (at around $200/copy) or used (at around $100/copy) traditional textbooks for the course. What if this money, which at a conservative count would be around $50M per year (assuming that only half the students purchase the text and most of them buy used), were instead invested in the development of online resources? If the figures given are correct, the development costs would be recouped in such a short while as to be insignificant!
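For the back-of-the-envelope minded, here is that arithmetic spelled out. The one million students per year is my own illustrative assumption (the paragraph above says only “millions”); the prices and the half-buy-used behaviour come from the figures already quoted.

```python
# Rough sketch of the textbook-spend argument above.
students_per_year = 1_000_000   # hypothetical figure; the post says only "millions"
fraction_buying = 0.5           # assume only half the students buy the text at all
used_price = 100                # and most of those buy used, at roughly $100/copy

annual_textbook_spend = students_per_year * fraction_buying * used_price
development_cost = 1_000_000    # the ~$1M development estimate from the previous section

print(f"Annual textbook spend: ${annual_textbook_spend / 1e6:.0f}M")          # about $50M
print(f"Years to recoup development: {development_cost / annual_textbook_spend:.2f}")
```

Even if my student count is off by a factor of ten in either direction, the development cost would still be recouped within a year or two.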

This, of course, makes no sense. Commercial publishers are not stupid enough to pass up such a lucrative cash cow so why has this not been done? I would suggest it is the sum of three interacting causes.

Educational institutions are unable or unwilling to fund the development of high-quality course content. It costs money—lots of it, and in these times, when all institutions are facing increasing pressure to keep costs down, any requests for additional funding are unlikely to be met with anything other than skepticism. To develop course content requires (1) time for the subject matter expert—likely an already over-burdened instructor; (2) time for an instructional designer; as well as (3) various multimedia/programming professionals who assemble the content into the various types mentioned in previous posts. Generally there is little or no money available to put the IDs and multimedia specialists on the projects, and requests from the instructor for release time are met with the response, “We are already paying you your salary and we assume that the development of class-related materials is included in that already.” Simply put, the administration does not have the extra funds to pay the people and the instructors do not have the extra time to prepare the content that would be needed to take it to the next level.

Educational institutions do not cooperate to share the development burden. It has already been suggested that, while individual institutions are likely unable to fund the development of high-quality materials on their own, collectively the human and monetary resources exist; at least for the courses mentioned, most institutions are, in effect, teaching much the same thing. If, instead of each institution doing its own thing, they cooperated and jointly developed the materials, the cost to each would amount to very little.

The fact is, though, that this is one of those things that is easier said than done. To pursue a joint venture there must be (1) an overall plan, (2) formal coordination and management of the project and (3) buy-in. The fiercely competitive atmosphere that exists between institutions, coupled with the absence of a formal unifying body, means that it’s hard to get this done, especially when you realize that this whole topic is nowhere near the top of most educational administrators’ lists of priorities. Still, it is a shame, as the Internet has already demonstrated how well it is suited to cooperative development, as evidenced by successful projects such as Mozilla as well as more communitarian efforts such as Wikipedia.

Commercial educational publishers are unable to implement an effective business model. Ask just about any administrator who holds the educational purse strings this question: “Why don’t you allocate money towards the funding of teaching and learning resources?” Chances are this will be the response: “Because that is the job of the educational publishers. We can’t afford to do it ourselves but they can, because they can access a much larger market.”

Fine; after all, why waste taxpayers’ money when there’s a much better way?

Now go ask any executive with any of the major publishers the same question. Chances are, this will be the response: “Because we cannot recoup the development costs. Not only are institutions unwilling to pay the license fee, even though it is significantly less than they used to pay for textbooks, but, worse, our experience has been that people will always find a way to obtain and use our materials, regardless of copyright. We just can’t win, no matter what we do.”

Put the three together and you get the situation we face today. Despite the huge potential that Internet-based resources hold for improved teaching and learning in first-year courses, much—not all, mind you, but still the majority—of that potential remains untapped, with little sign of any widespread, sustained effort to do much of anything about it.

Suggested Solutions

This is not to suggest that the appropriate response is to accept that things are the way they are for good reasons and that the best we can do is learn to live with the status quo. While revolutionary change is unlikely in the short term, significant benefits can still be realized from some straightforward actions. If sustained, the items below are likely to go a long way towards realizing the dream of much more effective use of digital technologies in service to teaching and learning. Here are three items which, taken together, hold every possibility of resulting in widespread improvements.

Existing and prospective faculty need to continue to work toward positive change. Large-scale changes take time. Not only do materials need to be developed but, more importantly, the two book-ending sets of activities need to be done right: (1) the preliminary work of understanding the current problems and planning appropriate responses needs to be done well, and likewise (2) the follow-up work of fine-tuning introduced measures and modifying them in light of unexpected contingencies cannot be forgotten. In situations where the general consensus is that things are just fine as they are, the general response is not just stagnation but, even worse, gradual decline. Things break. Things change. If there is no response, what’s broken remains broken and the classroom drifts further and further from reality. If, on the other hand, the general consensus is “We need to make things better,” then meaningful improvements—that is, the ones forged from REAL need—will slowly be realized in a spirit of collegiality.

Leaders (Deans, Directors, College Presidents and perhaps government officials) should push for more inter-agency cooperation. While there are few formal opportunities for collaboration, the academic world is rife with opportunities for informal exchanges: presentations, conferences and the like. If, on those occasions, the topic were brought around to the whole idea of cooperatively developing teaching and learning resources, in time the interest would build and, along with it, the ways and means of getting it done. People who acknowledge a need tend to see the opportunities for finding the means by which to solve the problem—a positive off-shoot of the generally unhelpful confirmation bias. In short, talk about it, and a way generally tends to be found for getting it done in a manner that everyone can live with.

Publishers need to work more closely with institutions. Despite the fact that they work closely with some faculty members—after all, most current texts are authored by faculty—publishers tend not to have good two-way relationships with the learning institutions their whole business model is built upon. Generally the only formal relationship is through the bookstore, and the general attitude is one of client service: the university states a requirement, generally in the form of a syllabus, and then evaluates the available resources. This activity is generally muddied somewhat by the publishers’ efforts to sharpen their competitive edge, either through the provision of free goodies or through the haranguing of the dean or the individual committee members. Often the relationship shakes out something like this: faculty view the publishers as greedy & grasping, and publishers view the institutions as needy & cold-hearted. All in all, not a great atmosphere in which anyone can be expected to thrive.

It does not need to be that way. Institutions have great need for improved resources, especially as students gravitate more and more toward the Internet and away from print-based materials. Likewise, the publishers are faced with an ever-diminishing pool of revenue as the old-style core business of simply feeding the thirst for basic knowledge is increasingly met through existing resources such as instructors’ own websites and Wikipedia. What’s needed, then, is a more sincere and productive dialog in which the publishers gain a better understanding of how to meet current needs while universities find better ways to ensure that the publishers’ financial expectations are met.

Overall, the situation is far from desperate. Despite the many shrill cries of doom and gloom our modern educational institutions are by no means in a sorry state; far from it. First year students do as well as they always have—in many cases even better. Enrolments, overall, tend to be strong, and the product—students who achieve and thrive—tends to be good, as evidenced by the continued relative success that all still seem to enjoy.

That said, the situation, as always, can be improved. The great potential that the Internet holds for education is far from being realized. An overall attitude that is conducive to positive improvement, coupled with a willingness to strive collectively, achieving the small gains that, taken together, will result in great strides, is just what we all need.


Theoretical Case: Designing & Developing a Typical Course for Online Delivery

Background

Suppose that you wish to put an advanced mathematics class online. Let’s stay away from the very common ones such as first-year Math, or any of the sciences; they have issues that will be dealt with later on. Let’s suppose that your faculty has developed a course in solving ordinary differential equations (ODEs for short). This is a course that needs to be taken by math, physics and chemistry majors as well as by engineers, generally in the second or third year of the program. The course is somewhat universal, but not entirely; somewhat popular, but not wildly so. This makes it in most regards a typical course; a good case study.

The course design is straightforward. Students will be presented with 12 methods by which to solve differential equations. The 12 methods will comprise 12 lessons. Each lesson will consist of these components:

  • Presentation of the theory behind the method.
  • Three worked examples, in increasing order of complexity.
  • Exercises for the student, which shall be submitted for grading.

In addition to the lessons, there will be three exams covering the course as a whole.

The student will receive a grade out of a possible 100 points. Each of the three exams shall be graded out of 30 points and each lesson assignment shall be graded out of 10 points. The final grade will be the sum of the average of the twelve lesson assignment grades and the total of the three exam grades.
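
As a sanity check on the scheme, here is a minimal sketch of that grade computation in Python; the sample scores are invented purely for illustration.

    # Final grade = average of the twelve lesson assignments (each out of 10)
    #             + total of the three exams (each out of 30), for a possible 100 points.
    def final_grade(assignment_scores, exam_scores):
        assert len(assignment_scores) == 12 and all(0 <= s <= 10 for s in assignment_scores)
        assert len(exam_scores) == 3 and all(0 <= s <= 30 for s in exam_scores)
        return sum(assignment_scores) / len(assignment_scores) + sum(exam_scores)

    # Hypothetical student: strong assignments, middling exams.
    print(final_grade([8, 9, 7, 10, 8, 9, 6, 10, 9, 8, 7, 9], [22, 25, 19]))  # -> 74.33...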

Let’s assume that the course has been run in a face-to-face mode for many years but is now to be run as an online course. A previous attempt, which consisted of class notes posted online and an evaluation based on two 50-point exams taken at several regional centres, did not work out. The students did not access the class notes frequently and said that it was next to impossible to get answers through the course email system. They also noted that it was extremely inconvenient driving to the regional centres to write the exams.

Let’s redesign it. We don’t have to redesign the curriculum; the twelve methods for solving ODEs remain the basis of the course.

Evaluation

Start with the evaluation methodology, since this will have an impact on how the rest of the course is delivered. We know how important it is for students to complete work assignments, so we have allocated some grade points to them. It must also be convenient for students to get feedback on them. For simplicity’s sake, then, we will construct, for each lesson, a five-question assignment. To ensure that no two students get exactly the same assignment, we will prepare three versions of each of the five questions. For each student, question 1 will be randomly chosen from its 3 available versions, question 2 from its 3 available versions, and so on. The questions will be presented as multiple choice, each with 10 possible answers. The students will get three attempts at each question. After either a correct answer or the third unsuccessful attempt, the full solution will be displayed. This continues until all five questions are done. In this way, the student gets a reasonable chance at getting the answer themselves but, if necessary, they will still see the full solution.
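
A minimal sketch of that assignment logic, again in Python; the question bank and names here are illustrative placeholders and are not tied to any particular LMS.

    import random

    # Five question slots, each with three interchangeable versions (placeholder content).
    question_bank = {slot: [f"Q{slot} version {v}" for v in (1, 2, 3)] for slot in range(1, 6)}

    def build_assignment(bank):
        """Randomly pick one of the three versions for each of the five question slots."""
        return {slot: random.choice(versions) for slot, versions in bank.items()}

    def attempt_question(get_answer, correct_answer, max_attempts=3):
        """Allow up to three attempts; the worked solution is shown after success or the third miss."""
        for _ in range(max_attempts):
            if get_answer() == correct_answer:
                return True    # correct: display the solution and move on
        return False           # out of attempts: display the solution anyway and move on

    print(build_assignment(question_bank))   # e.g. {1: 'Q1 version 2', 2: 'Q2 version 1', ...}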

Of course any student could just “game” this and ask others for help. This may happen but, in the end, it is the student who will lose out, since the development of facility with the solution methods is contingent on trying the practice exercises. To keep possible cheating from heavily skewing the overall grades, we are limiting the weight to ten percent of the total—enough that people should take it seriously but not enough to render the scheme invalid should cheating occur.

A different tack will be taken for the exams. These will no longer need to be taken at a regional centre because we will subscribe to one of several available online exam proctoring services. To take the exam, the student logs in from their local PC and its webcam is turned on to pan the room and ensure that only the student is taking the exam. The screen is then “locked down” to display only the exam, and the student takes the exam, in view of the camera, using pencil and paper. When finished, the student scans the exam as a PDF file using an ordinary scanner. This file is then placed in the exam drop box that is also on the locked-down screen. With this done, the screen is released.

The instructor will then either print off the exam as usual, or open it onscreen in Adobe Acrobat and mark it using a Wacom pen. The marked-up exam is then placed back in the exam drop box (rescanned first, if it was marked on paper).

Content Design and Preparation

Recall that a previous effort based on placing class notes online had not worked out. This is to be expected for several reasons:

  • Instructors’ own class notes tend to be somewhat cryptic. They are the distilled version of the instruction, generally minus the many prompts and explanations that are given live. The instructor has crafted these to be part of the delivery system, not all of it.
  • Notes are often idiosyncratic, based on one particular view and often with unspoken assumptions that are not at all evident to the outside reader.
  • At best, mathematics is hard work to read, so most students tend to procrastinate and not read texts or notes unless forced to.
  • Instruction goes better when students are challenged and encouraged to predict what should happen next. This is a large part of what makes live classes so effective when done well. You cannot do this effectively through notes.

We could videotape the instructor. In fact this is routinely done on university campuses everywhere through “lecture capture” technology. Let’s be frank, though: it amounts to boring, badly produced TV. Instructors are not paid performers and, as such, make frequent missteps and often have distracting habits (excessive pacing about, saying “ah” frequently, and so on). While this is perfectly acceptable in a live classroom, for recorded media it falls far short.

You could, of course, train an actor to deliver the course but, practically speaking, given the nature of the subject, the budget is just not there.

We shall settle on a cost-effective compromise. We will begin with the course notes. Since the course has been offered live for many years, we know we have access to a perfectly valid set. They are hand-written, so we will enlist a senior math student, nominated by the math department, to redo them as PowerPoint slides. An instructional designer (ID) will work with the draft slides to clean them up somewhat. In particular, an effort will be made to make them far less busy and to display onscreen only what is necessary.

A live class, based on the notes, is then videotaped. The same math student then transcribes the class lecture and the ID goes through the transcript to clean it up. Only that which is necessary remains. We are then left with a script that matches the PowerPoints, slide by slide.

The course instructor is then enlisted to read the script in a sound booth. This leaves us with a clean vocal track for each slide.

The PowerPoint slides are loaded into Adobe Captivate. The audio track for each slide is then layered in. The result is then produced as HTML5 and SWF, which can be viewed on a desktop, notebook or mobile device.

For each lesson, then, five multimedia files are produced:

  • An audiovisual presentation of the theory that ends with three multiple-choice questions to check understanding.
  • Two audiovisual presentations, one for each of the first two worked examples.
  • Two interactive audiovisual presentations. These will be like the first two, but at each step the student will be asked what should happen next and will need to choose correctly before proceeding (a minimal sketch of this gating follows the list).
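
Here is the promised sketch of that gating behaviour, in Python; the prompts, choices and the choose() stand-in are invented placeholders, not actual course content.

    # Walk the student through a worked example one step at a time.
    # Progression is gated: the correct next step must be chosen before moving on.
    def run_interactive_example(steps, choose):
        """steps: list of (prompt, choices, correct_index); choose: returns the index the student picked."""
        for prompt, choices, correct_index in steps:
            while choose(prompt, choices) != correct_index:
                print("Not quite; try again.")          # stay on this step until correct
            print(f"Correct: {choices[correct_index]}")

    # Placeholder steps loosely modelled on a separable-equations example.
    demo_steps = [
        ("What should we do first?", ["Separate the variables", "Differentiate both sides"], 0),
        ("And next?", ["Integrate both sides", "Take the limit"], 0),
    ]
    run_interactive_example(demo_steps, choose=lambda prompt, choices: 0)  # stand-in for student input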

Course Delivery

All of this is loaded into an LMS such as Desire2Learn. Students can log in at any time. The LMS will track and document their progress. In theory the course can be run on an as-needed basis but we will offer ours on a schedule. Why? So we can assign an instructor who can maintain the course pace, offer extra insight and respond to student questions.

So what does this mean for the instructor? Does it mean that we can build a system in which instructors are no longer necessary?

Let’s get real, shall we…

First, let’s not forget for a second that learning is as much a social activity as it is an intellectual one. Most (yes, not all, but still most) students want to feel as if they are a part of something; that their actions are noticed, even rewarded. If we leave the class instructorless it will not work; it’s like leaving a ship “captainless.” Sure, it will float, but it will get nowhere. In time, some students may finish, but most will not, eventually choosing to just bail out.

The course will have an instructor. The duties will be these:

  • Respond promptly to student questions.
  • Post periodically to ensure that the pace is maintained.
  • Provide feedback in the form of grades and comments.
  • Continue to improve the course content: develop better examples, provide more examples for students who need them, redo the existing examples using different language and different prompts, convert some of the presentation examples to interactive ones, and update the assessment sets. The list is endless.

Conclusion

There. One case sort of closed. Not perfect, but then again not meant to be. It was, rather, meant to be serviceable and affordable. As such this was by no means the only way in which it could have been done. Alternatives include:

  • Making parts of the assignments written work that is scanned and submitted, like the exams.
  • Making parts of the exams objective, using multiple-choice items, if valid items can be found (frankly, I can’t really see that being the case for this course).
  • Using produced video instead of the method described.
  • Writing simulations in which the students interactively solve the equations. Mind you, this would be a major project and a significant cost item but maybe a worthwhile one if the budget permitted.
  • Adding some “gamified” elements to reward success or the completion of extra exercises or to enable group completion of items.
  • Adding a live tutorial component using synchronous tools such as Blackboard Collaborate.

With the last bullet stated, it should also be noted that there’s really nothing stopping the math department from making a complete switch from the lecture hall to a Blackboard Collaborate environment. Instead of going to the lecture theatre, students and instructors would simply log in to Blackboard Collaborate and the instructor would do what (s)he has always done, as would the students.

All of this kind of makes you wonder why this is not already the case, doesn’t it? Let’s address that. Here are a few reasons:

  • Existing methods work very well and faculties do not have the resources to make wholesale shifts in short periods of time.
  • Not all faculty and students wish to do this. Not only is “live” instruction something many, many students and instructors thrive on, but the converse is also true: for those same individuals, the quiet confines of the office or home are anathema to effective learning.
  • Audiovisual presentations can place a distance between the student and instructor, making both reluctant to interact with one another, even when absolutely necessary.

That said, think of the advantages: students get more freedom regarding when they take classes. They also get to redo the examples as necessary. Instructional quality is assured through a deliberate process. Finally, instructors are freed from the “routine” instruction tasks and have more time to deal with individual issues and, maybe, even a bit more time for research.
