To the Student Teachers

G                                                   D
Best practice, Curriculum guides, Cooperative learning, Think, Pair, Share
Am                                     C
Differentiated Instruction,  Bloom’s Taxonomy
G                                                     D
Flipped Classroom, Manipulatives, Formative, Summative
Am                                  C
Scaffolding, Rubric,  Accountability

G                              D
To be teachers we aspire
Am                                 C
always yearning for the place of learning
G                                              D
but we don’t know if we’ll get hired
Am
and to be forthright
      C
it has us all uptight

Multiple Intelligence, Professional Development,
Certification, Short Attention Span
Critical Thinking, At Risk Students,
Lesson, Unit, and Assessment Plans

Authentic Assessment, Blended Learning,
Comprehension, Methodology
In Loco Parentis, Methods Courses,
Professional Learning, No Zero Policy

G                                   D
Nobody Excluded, Schools Act, Literacy,
Am                               C
Program of Study, Busing Schedule
G                          D       N.C
Pedagogy, IEPs, SCOs, ESL, UDL…
N.C
…Bunch of other terms as well
N.C
that now in your teacher brains do dwell

(to the tune of “We Didn’t Start the Fire,” by Billy Joel)


Reconsidering Programming in Schools as a Mandatory Course

Earlier this week I heard a piece on the CBC St. John’s morning show regarding the assertion that computer programming is something that should be taught in our public school system. Later on I read much the same article off the CBC NL website. The spokesperson for Code NL asserted that existing public school courses within our province were “a joke” and noted that in giving people the training in computer skills we could move away from our reliance on natural resources.

While, at least on the surface, this sounds reasonable, the reality is much more complex.

First, some of the assertions are inaccurate. The spokesperson stated that existing courses were “a joke” and, besides sounding rather condescending to teachers and students alike, without data to back it up this assertion must be considered only a personal opinion. The belief that the courses are only offered “in the Metro” area is also false. They are, for example, available in Gander, Clarenville, Corner Brook, Stephenville, Goose Bay, as well as in a host of smaller communities both locally and through CDLI. There is academic life “beyond the overpass.”

But there are more important things that should be stated in reply to the story.

Chief among these is the fact that schools do not exist for the sole purpose of preparing people for the world of work. While that is certainly ONE of the aims of public education it is important to also realize that the full picture is much broader. Schools exist because we wish to have individuals with the attitudes, skills and knowledge necessary to lead happy, worthwhile lives—at home, at work and within the community at large. Yes, of course we need people who contribute to the economy—after all, bills, both public and private, have to be paid and for that we must all do our part: earn money and pay our taxes. That said, it’s important to remember that as a society what we really need are people who lead good, personally meaningful lives, and who also live out their duties to the community.

Added to this is the reality that we live in a diverse, vibrant society. Young people come to school with varying interests, abilities and values. Sure, we are all citizens of a single community, a single province and, at least at first glance, it makes some degree of sense that everyone should have an intricate knowledge of those little electronic gadgets that so dominate our lives. But just think about our already busy schools and consider the value of additional mandatory, publicly funded courses in:

– Plumbing, because running water and sewer are vital parts of our public infrastructure;

– Carpentry, because shelter is important, especially in our nasty cold environment;

– Cooking, because we all have to eat on a regular basis;

– Embalming, because we’re all going to need it.

Of course not! That’s silly in the extreme. Schools cannot be expected to do everything and, besides, one of the benefits to living in a large diverse society is that we have the critical mass needed to ensure that levels of expertise exist, to the necessary extent, across any given community.

We don’t all have to be able to do everything.

So, too, with programming: It’s a vital part of our economy and its effects within our personal lives are too broad to even summarize. Still, we don’t all need to be programmers to appreciate the technology or to use it effectively.

There’s something else: it’s naïve to assume that taking a course or two, in school, in programming, is something that will prepare a young person for a career in that field. Programmers do much more than just write code. Sure, that’s a vital part of the enterprise and, besides, it’s fun to write code bits and have computers do clever things. That said, the fact is that only a few of the students who would be forced to take that mandatory course (or courses) would see the value in it and, thus, put in the required effort. The result would likely be a halfhearted thing leading to jaded teachers and students; in sum a waste of time and money.

The reality is that computer science is not something that can be sparked and ignited like your backyard barbecue. It is, rather, a complex skill that takes many years of personal investment of both time and effort. Besides knowing the basics of a given programming language, the programming professional also understands logic and structure. Most importantly the programmer sees it all within a complex, disciplined problem-solving framework, something that only happens in an environment specifically dedicated to doing just that—namely a computer science academic unit or a well-run enterprise devoted to that pursuit…

…and specifically NOT a public school that is already over-burdened with unrealistic expectations from its governing agencies and from the public at large.

Still, the sentiment is a valid one, albeit a bit misdirected. Instead of trying to create yet another course, along with its attendant monetary costs (and they will be steep; computer hardware and software, along with the required training, form a bottomless black hole into which one pours money) perhaps those interested in promoting the cause of programming should do what others with similar interests have done, and continue to do: forego advocacy in favour of outreach.

Instead of publicly shaming governments and schools for not teaching the material, work alongside the various partners: government, districts, the university and the NLTA.

Instead of asking them to do what you feel is important, offer free workshops for students and teachers. Visit schools and participate in professional development activities. Focus on integrating some of the skills and knowledge within the existing educational framework. Add vitality rather than simply grafting something else onto an already overburdened structure.


Electric Vehicles in St. John’s–Let’s be Rational

There’s been considerable debate regarding a recent RFP to purchase a pair of electric vehicles as part of the St. John’s city fleet. As expressed by councillor Dave Lane, who spearheaded the project, the plan was to make the new vehicles available to the parking enforcement unit, treat the whole thing as a pilot project, and see where things went.

Initially this drew considerable, polarized interest, with those in favour noting that much could be learned from the acquisition and use of the vehicles and that, besides, the cost of operating them should be considerably lower. Those against typically viewed the whole thing as not worth the bother; playing with newfangled toys. The debate was brought to a head, though, by a letter written to the council by former mayor Andy Wells in which he slammed the plan as a waste of taxpayers’ money, concluding that electric vehicles are, in general, “‘driveway jewelry’ for the eco-affluent, who benefit from public subsidy to indulge their guilt about living in a fossil-fuel-dependent society.”

Then the fight started, the polar opposites moved yet further apart and both sanity as well as reason, it seems, exited the building.

Still, though, it’s impossible to ignore the fact that, year by year, electric vehicles (EVs) are becoming more and more prevalent. Sales, while not exactly meeting the growth targets guessed at 4 years ago (they were expected to triple each year), have still shown decent growth. What’s more, most of the major manufacturers are in on the industry. Presently, models are available from GM, Toyota, Nissan, Fiat, Daimler, Mitsubishi and of course Tesla, to name just a few.

With a little time on my hands I investigated the costs associated with the requested purchase for the city. I asked just one question: does it make financial sense? In other words, does the expected reduction in operating cost translate into a lower overall cost of ownership? Not to spoil the rest of the post, but the short answer is “no.” Still, it’s worth reading on if you have the time and interest.

Let’s look at the cost for just one car and let’s leave the cost of the charging station out of it altogether since the car doesn’t really need a dedicated charging station as such; all it needs is access to a 110 V or a 220 V (preferred) standard outlet. Since the EV is to be used just around the city, all of the costs should be based on that type of driving.

Now—what car to choose? While no doubt some users would love to cruise around in something very nice such as the luxurious and trendy Tesla S,  we have to be more pragmatic here and choose something better suited to the job at hand. It’s for parking enforcement and won’t be carrying significant cargo. It’s also bought on the taxpayers’ dime so it therefore needs to be inexpensive. For the sake of argument let’s choose the Nissan Leaf.

Nissan Leaf (Wikipedia)

Let’s stick with the base model. According to Nissan’s website it can be had here for $33,788.00. Not exactly cheap but the expectation is that what we lose up-front we’ll gain back in the long term with lower operating costs.

Let’s figure them out. Let’s start with charging the battery. According to Nissan, the battery capacity is 24 kWh so you might assume that it will therefore take that much electricity to charge it. You need to take into account, though, the simple reality that no process is 100% efficient. You may notice that when batteries are charging and discharging they warm up. This means that some of the energy is being wasted as heat. You can expect, therefore, to need more than 24 kWh.

Based on some data found here, I am assuming 85% efficiency. This means that to fully charge the battery you must supply (24 / 0.85) or about 28 kWh of energy. At NL Power’s current rate of $0.1178/kWh, a full charge will therefore cost about $3.30.

Next you need to determine how far a charge will get you. Nissan’s stated figure is 135 km for a full charge. The US EPA, however, lists a much more conservative value of 117 km. Given that batteries function less efficiently in cold temperatures, it makes sense to question the validity of even this figure. Fortunately, some low temperature data are available. The website fleetcarma.com has listed some empirically derived figures of how the range per charge varies with temperature so let’s use those.

We also need to use temperatures that are realistic for this setting. Average temperatures by season can be found here. Let’s assume that the vehicles will be driven 20,000 km/year and, further, let’s assume equal distances in each season. Since the temperatures will be different in each season let’s take that into account. The table below lists the anticipated costs for driving 5000 km in each season, as well as the yearly total.

Season | Temperature (°C) | Range (km) | Cost per charge | Cost for 5000 km
Winter |  1               | 70         | $3.30           | $236
Spring |  8               | 75         | $3.30           | $220
Summer | 19               | 77         | $3.30           | $214
Fall   | 13               | 77         | $3.30           | $214
Yearly total: $884

Table 1: Yearly charging cost for Leaf, assuming 20,000 km
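If you want to check the arithmetic yourself, here is a minimal sketch in Python that reproduces Table 1 from the assumptions above (the seasonal ranges are the values read from fleetcarma.com, and the 28 kWh figure is rounded the same way as in the text):

```python
# Minimal sketch reproducing Table 1 (assumed figures from the post, not an official calculator).
BATTERY_KWH = 24          # Leaf battery capacity
CHARGE_EFFICIENCY = 0.85  # assumed charging efficiency
RATE_PER_KWH = 0.1178     # NL Power rate, $/kWh
KM_PER_SEASON = 5000

energy_per_charge = round(BATTERY_KWH / CHARGE_EFFICIENCY)   # ~28 kWh, rounded as in the text
cost_per_charge = energy_per_charge * RATE_PER_KWH           # ~$3.30

# (season, average temperature in C, assumed range per charge in km)
seasons = [("Winter", 1, 70), ("Spring", 8, 75), ("Summer", 19, 77), ("Fall", 13, 77)]

yearly_total = 0.0
for name, temp, range_km in seasons:
    season_cost = (KM_PER_SEASON / range_km) * cost_per_charge
    yearly_total += season_cost
    print(f"{name}: {temp} C, {range_km} km/charge, ${season_cost:.0f} for {KM_PER_SEASON} km")
print(f"Yearly total: ${yearly_total:.0f}")   # roughly $884
```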

Now let’s compare the EV to something reasonable. Since we started with a small Nissan EV let’s compare it to a small conventional Nissan, the Versa. Once again, let’s stick with the base model but equip it with an automatic transmission to make it more functionally equivalent to the Leaf. According to Nissan’s website that vehicle should come in at $17,165.00.

Nissan Versa Note (Wikipedia)

Now we need to find the cost of fuel for 20,000 km. Based on US EPA figures the Nissan Versa is rated at 7.6 L/100 km in city driving so you can expect to use 1520 litres to drive the 20,000 km in a year. The variability in gas pricing makes it impossible to provide a definitive single cost so upper and lower figures will be used instead.

According to Gas Buddy the yearly low was $0.95/l and the high was $1.44/l. This then gives two yearly fuel costs.

                 | Based on $0.95/L (“low fuel”) | Based on $1.44/L (“high fuel”)
Yearly fuel cost | $1444                         | $2189

Table 2: Fuel costs for Versa, assuming 20,000 km

Now the maintenance. Data on this are available in US$ from autoblog.com and are presented in the table below (converted to Canadian dollars). Interestingly enough the figures are roughly the same and could realistically have been omitted from the calculation.

Vehicle                 | Leaf     | Versa
Repairs and Maintenance | $4844.20 | $4857.94

Table 3: Maintenance and repair

So, finally, let’s look at the total five year cost for the two vehicles. Insurance and licensing will be the same for both so we can omit them.

Item                   | Leaf       | Versa (low fuel) | Versa (high fuel)
Purchase Cost          | $33,788.00 | $17,165.00       | $17,165.00
Fuel                   | $4,420.00  | $7,220.00        | $10,945.00
Repair and Maintenance | $4,844.20  | $4,857.94        | $4,857.94
Total                  | $43,052.20 | $29,242.94       | $32,967.94

Table 4: Five year cost of ownership, based on purchase with no resale.
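For completeness, here is a minimal Python sketch of the Table 4 totals, using the purchase prices, the yearly charging cost from Table 1, the EPA fuel figure, the Gas Buddy prices and the autoblog maintenance estimates quoted above (small rounding differences aside):

```python
# Minimal sketch of the five-year cost comparison in Table 4 (inputs taken from the post).
YEARS = 5
KM_PER_YEAR = 20000

leaf_purchase, versa_purchase = 33788.00, 17165.00
leaf_yearly_charging = 884.00                             # from Table 1
litres_per_year = 7.6 / 100 * KM_PER_YEAR                 # EPA city rating: 7.6 L/100 km -> 1520 L
maintenance = {"Leaf": 4844.20, "Versa": 4857.94}         # five-year autoblog estimates, CAD

leaf_total = leaf_purchase + leaf_yearly_charging * YEARS + maintenance["Leaf"]

def versa_total(price_per_litre):
    fuel = litres_per_year * price_per_litre * YEARS
    return versa_purchase + fuel + maintenance["Versa"]

print(f"Leaf:              ${leaf_total:,.2f}")           # ~$43,052
print(f"Versa (low fuel):  ${versa_total(0.95):,.2f}")    # ~$29,243
print(f"Versa (high fuel): ${versa_total(1.44):,.2f}")    # ~$32,967 (Table 4 rounds yearly fuel to $2189 first)
```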

Clearly, presented this way, there’s no contest. Based on a straight-up purchase the Leaf will cost roughly $10,000 to $14,000 extra to own over the five-year period.

Now, you may be crying foul: “Wait a minute, you don’t ditch the car after 5 years. The Leaf will be worth more at the end of that period so this is not a fair comparison.” Fine. Let’s factor in depreciation. Once again the estimates came from autoblog.

              | Leaf       | Versa
Original Cost | $33,788.00 | $17,165.00
Depreciation  | $21,317.32 | $9,115.47
Resale Value  | $12,471.00 | $8,049.53

Table 5: Expected depreciation and resale values

Let’s just redo the total cost from Table 4 above, but use depreciation instead of purchase cost.

Item                   | Leaf       | Versa (low fuel) | Versa (high fuel)
Depreciation           | $21,317.32 | $9,115.47        | $9,115.47
Fuel                   | $4,420.00  | $7,220.00        | $10,945.00
Repair and Maintenance | $4,844.20  | $4,857.94        | $4,857.94
Total                  | $30,581.52 | $21,193.41       | $24,918.41

Table 6: Five year cost of ownership, based on purchase with resale

The Leaf is still considerably more expensive, even when you consider a worst case scenario for gasoline.

From a strictly cost-based perspective, then, it does not make sense to procure and use the EVs if the assumptions used are valid.

That’s not really the end of the story, though, is it? It’s not my intention to be negative here, just reasonable, and since the main argument put forward was based on cost it needed to be pointed out that it was likely invalid. That said, there are far more compelling reasons that may still make the plan a good idea. Consider these:

First, you need to consider the overall environmental impact of EVs. They have the potential of being much cleaner and more environmentally friendly. Assuming that the batteries are correctly recycled and re-purposed (and there’s cause for some optimism in that area; see here), the real environmental issue is the source of the electricity. Right now in NL, unfortunately, that’s just a bad joke as the electricity is as non-green as it gets, coming, as it does, from a dirty thermal generating plant. Later, though, when the feed is switched over to the hydro-based Muskrat Falls plant, that will be an entirely different matter; much greener. Simply put, right now electric cars are just contributing to the pollution coming from the Holyrood plant, but that will change in a few years—right about the time those vehicles are ready to come off the road, as it turns out.

Second, you need to consider the value in foresight and planning—something often badly absent from the NL milieu. (As an aside, a good friend often half-jokes that the NL Government’s idea of long-term planning is, “what’s for supper?” His words, not mine.) Based on the best available data it does seem likely that EVs will become more and more prevalent as time goes on. To what extent? I would suggest it is impossible to ascertain that right now. It’s still worth considering. As such, not only the city but also the province needs to devote a reasonable amount of time and effort to gathering pertinent empirical data regarding use costs, reliability, safety and infrastructure needs. In that light, the proposed plan, if altered and fleshed out appropriately as a rigorous pilot project, and not just a vague idea, can easily be seen to have significant merit.

So maybe the best advice to those involved should be this: plan it all out a bit better, look again at the timelines and goals, and maybe see if partnership assistance is available from the province. Don’t just mess around driving from meter to meter and asking the traffic enforcement officials, “how’s it going with the new EVs?” from time to time. No, devise a proper plan and implement it. Log everything: kilometres driven, charging times, energy transferred in each charge, maintenance and repairs—everything. Put it up there where we can all see it and benefit from it. In that way, properly implemented, the project does have the possibility of yielding information that can be used by consumers and governments alike.


The Armour Goes in Unexpected Places

It would have been in most respects a normal day for an online distance education teacher in the early nineties. I settled in to my spot in the studio and made sure everything was working. First the mikes—all OK. Next the Telewriter: I picked up the pen and wrote on the screen and then remotely loaded the first ‘slide’ for the day’s lesson. Again everything was fine. As always, the first thing to do would be to greet the students by name and just chat for a few minutes. Besides ensuring that the audio and graphics capabilities were working, it had the much more important function of getting the students to open up, to come out of their schools, defined as they were by the walls of the classroom, and enter into the online one, defined only by who was present that day.

Today I had a new student. I was a bit surprised as it was several months into the school year. I asked her name but she did not reply. Eventually another student at that school answered for her, telling me her name and letting me know she was shy.

Over the next few weeks I did my best to get my new student—let’s call her Angela—involved, but all to no avail. She would not respond when asked a question and would not ever write on the electronic whiteboard when asked to contribute to the day’s work. Her first written work assignment consisted mostly of blank sheets, so I decided it was time to contact the school. I called the principal and then learned the awful truth.

———-

In a previous job, around 14 years ago, my designation was Program Implementation Specialist and one of my initial tasks was to put together a team of online teachers who would lead the changeover from the distance education system used in my province since 1988—the one described in part above, and described in more detail here if you are interested. Together, the Program Development Specialist and I devised a recruitment strategy that involved an online application system that would be used to produce a short-list of candidates. Those candidates would then be interviewed by a panel of three and would be subject to a reference check. All components were scored and the scores were used to rank the potential candidates, who would then be seconded.

This system was used by me and my colleagues for seven years and provided me with significant experience in selecting those who would be well suited to online teaching. Through constant use I came to anticipate the response to one particular question as it tended to give an almost instant measure of whether the interviewee was or was not a suitable candidate. The question? “What would be your response if you noticed that a particular student was not doing well in the course? That is, if you noticed that a student was not engaged, not submitting work on time or doing work that was of sub-par quality?” Typical answers included: putting on extra classes, creating tutorials, providing “worksheets” and maybe even involving disciplinary measures. None of those, however, was the one I sought. I wanted something else.

———-

Oftentimes the truth or the best course of action is not the one that seems obvious. Take my own academic discipline—physics—for example. There’s nothing commonsensical about the majority of what is typically found in the high school physics curriculum, despite the protestations of inexperienced (or just plain ignorant) instructors who claim they can “make it easy.” Newton’s first law (objects tend to remain at rest or in constant motion unless acted on by an unbalanced force) is about as counter-intuitive as it gets. Objects remain in constant motion—no they don’t! Just YOU try sliding a book across a floor; it comes to a stop in no time! No! Newton’s first law is the product of sheer genius; a fantastic off-the-charts insight made by a most unusual individual. Seeing, or maybe creating, ‘friction’ as a new construct, one that merely presents itself as another unbalanced force—pure brilliance!

Physics is not something that is easily absorbed; it is something that is only understood through a skillfully constructed instructional framework, one that brings students right up against their existing understanding of the world, clearly points out the deficiencies, ensures that the students acknowledge those deficiencies, and then carefully rebuilds the worldview in a different way. Not simple at all and certainly not something that happens in a day.

And so it goes with everything. To do better work you have to work hard to get beyond the obvious and, as just pointed out, this involves going up against your “comfort zone” then breaking through it with a whole new worldview. This involves breaking common sense.

———-

Allied Bomber Command faced just such a situation in World War II.

Let me digress for a moment here. I am not one given to glorifying war. While I acknowledge that it is a reality and something that often cannot be avoided, I also want to point out that there is generally no “right” and “wrong” side but instead two opposing groups who have found themselves with no alternative but to act with extreme aggression. It is a reality. Ordinary people like you and I never wish to find ourselves in it but, alas, from time to time it happens and we are faced with no choice but to do what we must. Under the extreme conditions faced by the various sides often comes the need to dig down deep and to utilize any and every opportunity that affects the balance of power. Frequently, then, wartime becomes a time of extreme innovation borne of necessity. I wish to consider one case here because it is illustrative of a point I wish to make, and not for any other reason.

Bombers, with their heavy deadly loads, are slow lumbering beasts and, as such, are easy targets for fighters who desperately seek to prevent them from achieving their missions. In WW2 many that set out did not return but were instead shot down by the fighter planes they encountered along the way. Those that returned were typically bullet riddled but still able to limp back to base for repair and refitting.

One of the responses to this loss of planes was to install armour that would protect the aircraft from the fighters’ projectiles. Armour, though, is heavy and reduces the load capacity and thus the military effectiveness of the aircraft. The solution, therefore, is to place the armour only where it is absolutely necessary. Bomber command subsequently engaged in a constant, careful study of its in-service aircraft. Each time an aircraft returned from a mission it would be inspected and the locations of the bullet holes it had received on that flight would be recorded. Typical returning aircraft resembled the drawing below. Notice where the bullet holes are; namely on the wings, tail and fuselage. Based on that it would make sense to place the armour there since, after all, that’s where the hits were occurring, right?

A Lancaster Bomber after a run. The red dots indicate the position of bullet holes.

Wrong. The reasoning is unsound; fundamentally flawed, in fact.

Fortunately, so, too, thought Allied Bomber Command, thanks to the insight of mathematician Abraham Wald. He assumed that the bullets were not specifically aimed at any one part of the aircraft. Aerial gunnery was much too chaotic an activity to allow for precision aiming. Fighter pilots instead aimed in the general direction of the aircraft and hoped that the bullets and cannon shells would have some negative effect. One would expect, therefore, that in an ideal situation the placement of bullet holes would be more-or-less uniform.

The placement wasn’t uniform, of course, as you already noticed from the image. Wald, however, went one step further by reasoning—correctly—that hits to vulnerable areas would result in downed aircraft, ones that would not make it back. Since the sample used in the study consisted of aircraft that made it back, it would be logical to conclude that they tended NOT to have hits to the vulnerable areas.

Take another look at the diagram. Where are there very few bullet holes? The engine and forward cockpit. Of course! A relatively small number of hits to the engines would render them inoperable. Likewise, hits to the cockpit could result in casualties to the flight crew. In either case the plane would be lost.

Simply put, instead of looking for where the bullets were you should look for where they were not. Those are the parts that need armour, and not the bullet-riddled parts.
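The effect is easy to demonstrate with a small simulation. The sketch below (Python, with made-up hit counts and lethality figures, purely for illustration and not historical data) spreads hits uniformly over a few zones, brings down most planes hit in the engine or cockpit, and then tallies the hits recorded on the survivors—the vulnerable zones come out looking nearly untouched, just as Wald reasoned.

```python
import random

# Purely illustrative numbers -- not historical data.
ZONES = ["fuselage", "wings", "tail", "engine", "cockpit"]
LETHALITY = {"fuselage": 0.05, "wings": 0.05, "tail": 0.05, "engine": 0.6, "cockpit": 0.5}

random.seed(1)
hits_on_survivors = {zone: 0 for zone in ZONES}
survivors = 0

for _ in range(10000):                                    # 10,000 sorties
    hits = [random.choice(ZONES) for _ in range(random.randint(1, 6))]
    shot_down = any(random.random() < LETHALITY[zone] for zone in hits)
    if not shot_down:
        survivors += 1
        for zone in hits:
            hits_on_survivors[zone] += 1

print(f"{survivors} aircraft returned")
for zone in ZONES:
    print(f"{zone:8s}: {hits_on_survivors[zone]} recorded hits")
# Even though the hits were spread evenly, the returning aircraft show very few
# engine or cockpit hits -- those planes mostly never made it back.
```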

———-

So what does this have to do with eLearning? It turns out that in my previous career a significant part of my effort was dedicated to improving the quality of our instruction. I approached this in various ways: reading about things done differently elsewhere, researching new devices and attendant methods, conferring with teachers and interviewing successful students. These tended, at first, to be my main starting points. Over time, though, I slowly moved away from all of them somewhat.

It started in a somewhat unexpected fashion. Each year I would address all of the intermediate-secondary student teachers at Memorial University in order to explain to them how the province’s distance education program worked. As part of the presentation I would ask those in the audience who had received part of their high school program through distance education to identify themselves, and would ask them to offer up their perspectives on the experience.

Of course, in all honesty, I was, in part, “selling” the program. I was part of that same system and certainly took great pride in it and in my contribution to it. While I was making it look like I was seeking an unbiased assessment I know—now—that in the initial stages I was really seeking affirmation; an ‘independent’ external source that validated the program as being worthwhile.

To my great surprise that’s not exactly what I got. Yes, many of the students were quite positive about the experience they’d had in the distance education program, but not all of them were. Numerous students indicated that they’d not found it great or that they much preferred the more traditional face-to-face approach.

The first few times this happened I responded by downplaying the responses, merely assuming that they were just the voices of the disgruntled few who had not enjoyed success, probably through their own efforts or, more accurately, lack thereof. In time, though, I came around. Rather than dismissing those voices or, worse, glossing over what they’d said, I began showing active interest in their points of view. I would not just let their comments sit unacknowledged and unchallenged. Instead, I slowly came around to a practice whereby I would probe deeper whenever I got the somewhat negative responses, attempting to determine just exactly what had led to the experiences they described.

It was enlightening, to say the least. Space does not permit a detailed exposition of what I found but, in general, here are a few items that were frequently encountered:

  • The choice to enrol in a particular course, which also happened to be a distance education offering, was not made by the student but, rather, by the parents or, even more frequently, the school administrator or the school district office.
  • The instructor had not made a concerted effort to reach out to the student but seemed, rather, either to teach to nobody in particular, seldom involving anyone in the class, or to play favourites.
  • Technical issues had resulted in significant ‘down time.’

Now, lest you get the impression that this post is a mean-spirited barb at my former employer, let me assure you that nothing could be further from the truth. The pride I felt, and continue to feel, in that program is built on more than just emotion. It is, rather, something that is rooted in significant evidence that indicates its overall efficacy. The numbers don’t lie and they indicate that the students tend to do well. Just not all of them.

My point, rather, is that in the later part of my career I found much more use in finding out why students did not find success than I did in identifying those factors that were associated with success.

Like Wald, I found it useful to consider the planes that did not return.

———-

As for that telling response to the question, “What would you do if a student is not having success in your course?”

The desired response: “I would find out what was wrong.” That’s a lesson I earned through long and often painful experience.

Never mind the extra classes, the tutorials and the varied approaches, just figure out why the student is not doing well and do what can be done.

———-

But there’s still ‘Angela,’ the student I found in my class, the one who unexpectedly dropped in and who was not finding any success. Yes, I did seek to get to the bottom of it all.

And I did.

I learned that she had just returned to her home community, after living away for several years. Her mom was a single parent but had found a new boyfriend so she’d moved away to be with him, taking her daughter with her. It became an abusive relationship and one night, in a drunken rage, the boyfriend had murdered Angela’s mom while she was present there in the apartment. She’d returned to her home community and was placed in foster care and that’s why she’d been dropped unexpectedly in my grade eleven physics class.

I tried as best I could to make things work for Angela. Unfortunately I did not succeed. I did not end up giving her a passing grade and she was not in my online physics class the following year. I do not know how she fared in life after that but do think of her often, especially when I need a good dose of humility. Sometimes, even with hard work, skill and insight you still cannot get the success you hope for. Yes, you generally do, with effort and teamwork, but not always.

Angela did not have a good experience in my Physics class. It continues to be a humbling truth.


Small Schools Rank Higher: It’s Built-In; Do the Math (& a twist)

From time to time you will see institutions ranked according to various criteria. Generally this is done with the intention of demonstrating how well each is performing. It’s not unusual to see this done with schools and here’s a claim that is often made, and substantiated by the numbers:

Small Schools Tend to Lead the Ranks

This is consistent. For years I saw it in my own region, reflected in the annual report card issued by the Halifax-based Atlantic Institute for Market Studies (AIMS). Year after year, small schools led the provincial rankings. As a professional whose entire career was devoted to the betterment of small rural schools I wanted to be able to brag about this, to puff out my chest and say, “look, I told you that small schools were better for our children. It’s obvious that the extra care and attention they get on an individual basis, as well as the better socialization caused by the fact that everyone knows everyone else, is making a positive difference.”

I never did that, of course. It’s not because I don’t believe in small schools–I truly do. My silence on the matter was, to some degree due to the fact that at the time the studies were published I was a non-executive member of the Department of Education. As such I was not authorized to speak on its behalf. That was not the real reason though.

No, I knew that a far more powerful force was afoot; something that affected the results much more than good teaching, a supportive (small) ecosystem or the presence of many brilliant bay women and men ever could.

Although all of those are positive factors.

No, the most powerful effect was something else, something related to straightforward mathematical behaviour, and if you’ll spare a few minutes of your attention I will explain.

Simply Put: Small schools have an advantage in these rankings that is due only to the fact that they are small.

And there is an unexpected twist too, one not often mentioned in the discussion of the reports.

Here goes!

Let’s simplify the situation and assume that the rankings are based on the outcome of one test only. Furthermore, let’s say that the result of that test, for any given student, is completely random; that is, any student who writes it will get a random grade between 0 and 100. In other words let’s act like there’s really no difference between the students at all of the schools. The small ones will still come out on top.

Let’s see what this would mean for ten small schools (we’ll arbitrarily name them sml01 to sml10), each one having only fifteen students in grade twelve and writing the test on which our report is based. The results for all of the students are tabulated below. In reality the table was produced using a random number generator in Microsoft Excel. You don’t need to read the table in detail. It’s just here so you know I’m not making the whole thing up!

School sml01 sml02 sml03 sml04 sml05 sml06 sml07 sml08 sml09 sml10
Score 15.0 14.0 45.0 55.0 53.0 70.0 55.0 100.0 79.0 56.0
Score 6.0 34.0 94.0 75.0 64.0 75.0 59.0 75.0 73.0 52.0
Score 15.0 83.0 65.0 30.0 84.0 46.0 64.0 10.0 35.0 20.0
Score 64.0 16.0 80.0 77.0 10.0 55.0 85.0 32.0 91.0 97.0
Score 29.0 28.0 96.0 98.0 6.0 67.0 51.0 74.0 69.0 9.0
Score 43.0 29.0 49.0 79.0 17.0 64.0 54.0 11.0 32.0 91.0
Score 31.0 49.0 62.0 33.0 0.0 92.0 35.0 59.0 91.0 45.0
Score 51.0 21.0 98.0 75.0 47.0 57.0 32.0 32.0 25.0 58.0
Score 45.0 27.0 18.0 6.0 24.0 31.0 84.0 5.0 89.0 2.0
Score 62.0 62.0 38.0 84.0 16.0 23.0 39.0 84.0 36.0 17.0
Score 9.0 0.0 6.0 67.0 53.0 99.0 54.0 23.0 97.0 15.0
Score 38.0 34.0 21.0 70.0 58.0 40.0 37.0 21.0 56.0 50.0
Score 21.0 55.0 51.0 97.0 92.0 40.0 48.0 100.0 76.0 4.0
Score 17.0 50.0 43.0 38.0 6.0 1.0 72.0 77.0 16.0 25.0
Score 35.0 69.0 83.0 34.0 75.0 16.0 46.0 51.0 77.0 33.0
Average 32.1 38.1 56.6 61.2 40.3 51.7 54.3 50.3 62.8 38.3

Table 1: School results for ten small schools.
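(If you would rather not take the Excel output on faith, here is a minimal sketch in Python that does the same thing: every mark is a uniformly random integer between 0 and 100, so any differences between school averages are pure chance.)

```python
import random

# Minimal sketch of the simulation: random marks for ten small schools of 15 students each.
random.seed(0)

def school_average(n_students):
    """Average of n_students marks drawn uniformly at random from 0 to 100."""
    marks = [random.randint(0, 100) for _ in range(n_students)]
    return sum(marks) / len(marks)

small_schools = {f"sml{i:02d}": school_average(15) for i in range(1, 11)}
for name, avg in small_schools.items():
    print(f"{name}: {avg:.1f}")   # a different seed gives different, but similarly spread-out, averages
```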

That’s a huge pile of numbers and we are only interested in the results for the schools, so let’s just redo that table showing only the schools and the averages.

School Average
sml01 32.1
sml02 38.1
sml03 56.6
sml04 61.2
sml05 40.3
sml06 51.7
sml07 54.3
sml08 50.3
sml09 62.8
sml10 38.3
Table 2: Small School Results

Notice that the results show a fair bit of variability. They cluster around an average of 50, but some schools had averages in the thirties while others were up around 60.

Now let’s do it all over again, but this time let’s see what would happen in larger schools (named big01 to big10). For the small schools we assumed there were only 15 students per grade level but for the larger ones let’s assume that there are in fact 120 students in grade 12 writing the test.

The rather long table is below just so you know I’m not pulling the numbers out of my head. As was the case with the small school simulation it was done using a random number generator in Microsoft Excel and just pasted directly into WordPress. Scroll to the bottom of the table :-)

School big01 big02 big03 big04 big05 big06 big07 big08 big09 big10
Score 67.0 100.0 82.0 25.0 27.0 100.0 10.0 69.0 22.0 96.0
Score 42.0 92.0 63.0 16.0 42.0 12.0 69.0 94.0 66.0 60.0
Score 100.0 27.0 42.0 59.0 88.0 79.0 83.0 49.0 27.0 96.0
Score 44.0 29.0 46.0 63.0 28.0 69.0 31.0 19.0 18.0 59.0
Score 16.0 33.0 66.0 81.0 11.0 21.0 76.0 67.0 70.0 85.0
Score 95.0 31.0 11.0 2.0 80.0 15.0 21.0 78.0 91.0 33.0
Score 32.0 32.0 38.0 34.0 44.0 61.0 55.0 0.0 89.0 64.0
Score 82.0 6.0 96.0 57.0 8.0 52.0 64.0 55.0 62.0 72.0
Score 100.0 48.0 45.0 10.0 49.0 93.0 89.0 72.0 87.0 72.0
Score 88.0 91.0 33.0 0.0 36.0 11.0 76.0 10.0 78.0 55.0
Score 54.0 95.0 41.0 68.0 92.0 75.0 54.0 75.0 20.0 52.0
Score 44.0 79.0 88.0 69.0 82.0 9.0 31.0 74.0 1.0 78.0
Score 71.0 36.0 2.0 51.0 58.0 2.0 17.0 68.0 29.0 36.0
Score 93.0 47.0 89.0 91.0 25.0 47.0 85.0 96.0 63.0 23.0
Score 9.0 79.0 33.0 41.0 68.0 19.0 74.0 81.0 57.0 47.0
Score 77.0 84.0 28.0 44.0 2.0 54.0 37.0 48.0 25.0 54.0
Score 20.0 74.0 33.0 57.0 15.0 65.0 85.0 59.0 21.0 80.0
Score 56.0 98.0 27.0 68.0 45.0 75.0 58.0 71.0 92.0 58.0
Score 7.0 70.0 83.0 74.0 26.0 52.0 71.0 40.0 75.0 87.0
Score 80.0 32.0 65.0 7.0 54.0 62.0 68.0 7.0 87.0 88.0
Score 65.0 12.0 68.0 22.0 5.0 26.0 36.0 92.0 79.0 40.0
Score 87.0 89.0 51.0 70.0 96.0 98.0 56.0 13.0 10.0 51.0
Score 52.0 71.0 13.0 86.0 88.0 54.0 11.0 20.0 26.0 18.0
Score 69.0 57.0 11.0 36.0 39.0 5.0 38.0 56.0 82.0 40.0
Score 95.0 54.0 54.0 77.0 52.0 74.0 100.0 82.0 35.0 7.0
Score 49.0 80.0 24.0 42.0 11.0 82.0 70.0 18.0 30.0 19.0
Score 46.0 26.0 3.0 56.0 54.0 50.0 2.0 9.0 26.0 47.0
Score 58.0 57.0 98.0 62.0 65.0 50.0 7.0 94.0 9.0 43.0
Score 86.0 86.0 32.0 81.0 63.0 49.0 60.0 61.0 93.0 5.0
Score 9.0 54.0 74.0 65.0 27.0 38.0 42.0 30.0 42.0 99.0
Score 41.0 37.0 30.0 70.0 77.0 86.0 58.0 48.0 53.0 99.0
Score 23.0 82.0 9.0 73.0 9.0 9.0 86.0 27.0 57.0 50.0
Score 52.0 97.0 91.0 90.0 58.0 11.0 56.0 16.0 53.0 89.0
Score 54.0 84.0 46.0 0.0 26.0 55.0 36.0 94.0 89.0 46.0
Score 94.0 75.0 32.0 16.0 77.0 9.0 87.0 21.0 58.0 59.0
Score 77.0 27.0 93.0 65.0 61.0 23.0 53.0 60.0 29.0 23.0
Score 41.0 26.0 34.0 21.0 24.0 57.0 34.0 78.0 99.0 90.0
Score 73.0 67.0 83.0 54.0 99.0 63.0 24.0 65.0 75.0 37.0
Score 55.0 76.0 30.0 85.0 92.0 57.0 31.0 69.0 82.0 43.0
Score 12.0 38.0 53.0 56.0 40.0 67.0 3.0 50.0 86.0 90.0
Score 48.0 89.0 86.0 77.0 80.0 83.0 92.0 38.0 67.0 0.0
Score 59.0 81.0 65.0 0.0 47.0 24.0 57.0 18.0 27.0 90.0
Score 32.0 72.0 46.0 54.0 92.0 54.0 41.0 99.0 0.0 87.0
Score 5.0 55.0 0.0 78.0 13.0 83.0 60.0 68.0 68.0 86.0
Score 0.0 0.0 91.0 66.0 38.0 22.0 2.0 82.0 32.0 12.0
Score 19.0 74.0 40.0 54.0 93.0 37.0 68.0 75.0 57.0 35.0
Score 13.0 81.0 36.0 39.0 50.0 3.0 44.0 19.0 100.0 16.0
Score 36.0 95.0 4.0 100.0 60.0 89.0 47.0 99.0 70.0 43.0
Score 29.0 46.0 12.0 92.0 35.0 28.0 17.0 74.0 38.0 85.0
Score 49.0 84.0 35.0 70.0 36.0 12.0 32.0 43.0 81.0 39.0
Score 87.0 32.0 89.0 71.0 11.0 0.0 93.0 51.0 10.0 39.0
Score 43.0 27.0 12.0 9.0 81.0 78.0 52.0 99.0 82.0 86.0
Score 51.0 41.0 50.0 73.0 83.0 65.0 51.0 44.0 89.0 5.0
Score 21.0 56.0 89.0 6.0 47.0 41.0 57.0 17.0 72.0 53.0
Score 12.0 39.0 51.0 18.0 96.0 75.0 23.0 39.0 75.0 39.0
Score 0.0 48.0 11.0 51.0 61.0 22.0 39.0 35.0 88.0 75.0
Score 33.0 53.0 23.0 68.0 88.0 69.0 48.0 40.0 19.0 100.0
Score 31.0 30.0 82.0 31.0 13.0 55.0 89.0 94.0 40.0 60.0
Score 90.0 5.0 19.0 26.0 68.0 60.0 77.0 63.0 51.0 6.0
Score 41.0 65.0 72.0 76.0 91.0 11.0 71.0 37.0 68.0 53.0
Score 24.0 80.0 70.0 73.0 61.0 4.0 79.0 59.0 37.0 73.0
Score 11.0 24.0 72.0 48.0 64.0 28.0 38.0 79.0 66.0 22.0
Score 22.0 13.0 14.0 83.0 2.0 21.0 95.0 100.0 55.0 55.0
Score 50.0 97.0 59.0 85.0 15.0 82.0 77.0 31.0 21.0 92.0
Score 81.0 9.0 45.0 56.0 16.0 55.0 66.0 69.0 79.0 78.0
Score 36.0 74.0 68.0 7.0 36.0 42.0 5.0 76.0 41.0 76.0
Score 30.0 35.0 68.0 59.0 92.0 50.0 9.0 50.0 98.0 97.0
Score 30.0 31.0 2.0 1.0 62.0 64.0 82.0 88.0 84.0 53.0
Score 4.0 46.0 55.0 54.0 61.0 42.0 81.0 77.0 25.0 27.0
Score 32.0 51.0 79.0 58.0 2.0 33.0 66.0 92.0 20.0 68.0
Score 70.0 76.0 52.0 24.0 2.0 21.0 6.0 98.0 63.0 37.0
Score 54.0 68.0 91.0 56.0 58.0 32.0 41.0 74.0 64.0 45.0
Score 37.0 48.0 29.0 42.0 4.0 93.0 10.0 29.0 97.0 40.0
Score 14.0 47.0 46.0 83.0 80.0 52.0 42.0 54.0 33.0 29.0
Score 15.0 2.0 100.0 12.0 9.0 84.0 52.0 53.0 53.0 6.0
Score 8.0 23.0 35.0 63.0 78.0 34.0 30.0 75.0 14.0 54.0
Score 16.0 90.0 13.0 80.0 32.0 29.0 99.0 21.0 34.0 80.0
Score 99.0 48.0 47.0 5.0 71.0 88.0 77.0 68.0 50.0 2.0
Score 45.0 15.0 18.0 38.0 49.0 8.0 90.0 13.0 71.0 33.0
Score 42.0 50.0 86.0 80.0 79.0 53.0 21.0 81.0 53.0 36.0
Score 16.0 14.0 51.0 14.0 19.0 97.0 50.0 49.0 8.0 2.0
Score 34.0 85.0 55.0 54.0 49.0 63.0 1.0 58.0 73.0 13.0
Score 8.0 98.0 9.0 7.0 70.0 78.0 41.0 18.0 94.0 74.0
Score 59.0 43.0 31.0 30.0 97.0 85.0 64.0 94.0 3.0 91.0
Score 29.0 34.0 6.0 17.0 43.0 78.0 67.0 17.0 50.0 34.0
Score 80.0 12.0 98.0 24.0 84.0 25.0 96.0 76.0 16.0 67.0
Score 87.0 89.0 11.0 86.0 5.0 39.0 83.0 98.0 27.0 13.0
Score 62.0 73.0 69.0 91.0 47.0 52.0 91.0 57.0 87.0 39.0
Score 22.0 64.0 86.0 64.0 10.0 88.0 6.0 62.0 91.0 26.0
Score 28.0 74.0 88.0 19.0 45.0 97.0 94.0 3.0 75.0 30.0
Score 27.0 11.0 11.0 55.0 39.0 30.0 39.0 54.0 99.0 86.0
Score 94.0 85.0 60.0 1.0 42.0 23.0 57.0 97.0 58.0 24.0
Score 78.0 7.0 30.0 94.0 26.0 75.0 100.0 11.0 99.0 11.0
Score 94.0 12.0 81.0 50.0 49.0 36.0 68.0 95.0 67.0 33.0
Score 5.0 9.0 39.0 23.0 31.0 29.0 23.0 22.0 57.0 46.0
Score 67.0 55.0 98.0 81.0 80.0 72.0 31.0 53.0 80.0 95.0
Score 68.0 57.0 17.0 34.0 26.0 38.0 46.0 55.0 74.0 24.0
Score 8.0 39.0 82.0 34.0 65.0 74.0 34.0 39.0 62.0 19.0
Score 83.0 97.0 14.0 84.0 71.0 66.0 62.0 13.0 8.0 82.0
Score 83.0 78.0 39.0 45.0 15.0 70.0 63.0 65.0 75.0 68.0
Score 8.0 32.0 75.0 8.0 53.0 67.0 22.0 4.0 34.0 46.0
Score 91.0 38.0 48.0 85.0 11.0 93.0 96.0 17.0 80.0 13.0
Score 46.0 90.0 14.0 41.0 0.0 40.0 97.0 74.0 0.0 25.0
Score 26.0 14.0 85.0 92.0 29.0 63.0 77.0 94.0 80.0 8.0
Score 99.0 60.0 48.0 94.0 23.0 37.0 74.0 57.0 2.0 96.0
Score 51.0 99.0 89.0 67.0 69.0 5.0 91.0 6.0 97.0 97.0
Score 41.0 21.0 55.0 63.0 68.0 55.0 1.0 60.0 11.0 54.0
Score 35.0 18.0 65.0 78.0 96.0 79.0 3.0 22.0 80.0 44.0
Score 64.0 62.0 37.0 12.0 81.0 71.0 50.0 29.0 33.0 82.0
Score 27.0 24.0 4.0 29.0 86.0 36.0 11.0 47.0 77.0 5.0
Score 40.0 14.0 44.0 95.0 78.0 90.0 42.0 41.0 99.0 83.0
Score 10.0 93.0 37.0 48.0 87.0 27.0 79.0 15.0 94.0 57.0
Score 92.0 100.0 24.0 81.0 61.0 93.0 68.0 8.0 54.0 46.0
Score 55.0 79.0 17.0 88.0 96.0 83.0 88.0 99.0 49.0 21.0
Score 88.0 46.0 20.0 17.0 74.0 76.0 12.0 53.0 89.0 22.0
Score 85.0 42.0 26.0 26.0 87.0 69.0 49.0 68.0 57.0 49.0
Score 59.0 31.0 21.0 74.0 56.0 3.0 94.0 95.0 26.0 18.0
Score 100.0 75.0 59.0 39.0 64.0 54.0 7.0 30.0 34.0 38.0
Score 58.0 100.0 60.0 96.0 100.0 70.0 71.0 19.0 58.0 7.0
Score 13.0 66.0 94.0 82.0 92.0 87.0 51.0 51.0 80.0 13.0
Score 36.0 3.0 32.0 29.0 83.0 95.0 20.0 42.0 53.0 95.0
Average 48.7 53.9 48.2 51.7 52.1 51.8 52.8 53.9 56.0 50.4
Table 3: School results for ten big schools.

As before let’s just look at the averages for each school.

School Average
big01 48.7
big02 53.9
big03 48.2
big04 51.7
big05 52.1
big06 51.8
big07 52.8
big08 53.9
big09 56.0
big10 50.4
Table 4: Big School Results

Notice that, like Table 2, the results are clustered about an average of around 50. Notice, though, that the numbers are not spread nearly as much.

Let’s put the two tables side-by-side for a better look

School Average School Average
sml01 32.1 big01 48.7
sml02 38.1 big02 53.9
sml03 56.6 big03 48.2
sml04 61.2 big04 51.7
sml05 40.3 big05 52.1
sml06 51.7 big06 51.8
sml07 54.3 big07 52.8
sml08 50.3 big08 53.9
sml09 62.8 big09 56.0
sml10 38.3 big10 50.4
Table 5: Averages for both small and big schools

The thing to notice is that the big schools show much less variability. In small schools, individual students who do very well or very poorly (we call them outliers) tend to have a large effect on the average. In larger schools, the increased number of results tends to “smooth out” the results; to make them less variable.

This is something that is well-known in mathematics. It even has a name: the Law of Large Numbers. Simply put, the more results you average together, the more tightly that average tends to cluster around the expected value.
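To see the effect directly, here is a minimal Python sketch (assuming school sizes of 15 and 120, as above) that simulates a thousand schools of each size and compares how widely their averages spread around the expected value of 50:

```python
import random
import statistics

random.seed(42)

def school_average(n_students):
    return sum(random.randint(0, 100) for _ in range(n_students)) / n_students

def spread_of_averages(n_students, n_schools=1000):
    """Standard deviation of the school averages for schools of a given size."""
    return statistics.stdev(school_average(n_students) for _ in range(n_schools))

print(f"15-student schools:  averages spread about +/- {spread_of_averages(15):.1f} points")
print(f"120-student schools: averages spread about +/- {spread_of_averages(120):.1f} points")
# Theory says the spread shrinks like 1/sqrt(n): roughly 29/sqrt(15) = 7.5 points for the
# small schools versus 29/sqrt(120) = 2.6 for the big ones, so only small schools wander
# far enough from 50 to top (or bottom) a ranking by chance alone.
```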

Now, this is where things get interesting. Recall that this is all about the fact that small schools get a built-in advantage due only to the fact that they are small. Let’s see what it looks like when all twenty schools are ranked from highest to lowest.

School Average
sml09 62.8
sml04 61.2
sml03 56.6
big09 56.0
sml07 54.3
big08 53.9
big02 53.9
big07 52.8
big05 52.1
big06 51.8
big04 51.7
sml06 51.7
big10 50.4
sml08 50.3
big01 48.7
big03 48.2
sml05 40.3
sml10 38.3
sml02 38.1
sml01 32.1
Table 6: All twenty schools ranked from highest to lowest.

Did you see what happened? The top schools were all small schools. They reached the top due to nothing other than the law of small numbers working in their favour. Random variability—two or three bright students, or the absence of two or three weaker students—had a profoundly positive effect on the school average.

Recall also that I mentioned there would be a twist. Notice that while the highest-ranking institutions were drawn from the pool of small schools, so, too, were the lowest-ranking ones, and for the same reason—namely the presence of a few weaker students or the absence of a few strong ones.

So, based on this little experiment, it’s plain to see that when ranked this way small schools will tend to come out on top simply because they are small and therefore largely exempt from the smoothing effect of the law of large numbers.

As for the small schools at the bottom: that happens too, and it’s likely that these are rarely mentioned because of selection bias on the part of whoever wishes to weave the numbers into a narrative that suits their own political ends. One wonders, though, how many small schools have been closed or otherwise penalized for nothing other than being the unfortunate victims of chance.

Closing Note: this is in no way intended to cast AIMS in any negative light. To the best of my knowledge neither they, nor the various Departments of Education, nor the various school districts ever tried to spin the reports into any grandiose claims regarding big and small schools. The false claims I have heard have generally been made by private individuals, each with their own axes to grind.

As for my own conclusion: Ranking systems, regardless of the context, whether it be health care, law enforcement, customer care or, as is the case here, school-based student achievement, serve a useful purpose but be wary of the law of large numbers before making any sweeping generalizations.


Asking Better Questions: Ends and Means in eLearning

In a previous post I considered the possibility that much of what is presented as “Innovation” is anything but that. With access to some fairly new and attractive or otherwise popular products and armed with even a slight grasp of how to operate them it’s relatively easy to create an appearance of innovation. Simply put, if you can get your hands on some new gear, in even a short while you can present quite a convincing front.

Worse again, it is equally easy to generate what passes for proof; to an untrained eye you can make it look like your so-called innovation is creating some real differences. Any of these strategies can give you reams of what looks like convincing evidence:

  • Deliberately pick enthusiastic students or teachers and pile on the anecdotes that endorse the desired point of view. People who rely on system-one (more or less intuitive) reasoning are easily swayed by stories so it won’t be hard to capitalize on that to get some people talking about how innovative the project is.
  • Stage the project in a relatively well-off school or class and then compare the results from this highly-biased “treatment” group to the population in general. Very few will dig deep enough to see that the superior achievement results predated, and were independent of, the treatment.
  • Rely on manufacturer or vendor supplied “research” when crafting reports, proposals and press releases.
  • Bluff; just preface your claims with clauses like “decades of research shows…” and leave it at that. You might be surprised to see how few—if anyone—will call you out. Besides it will be relatively easy to portray those that do as kooks or curmudgeons.

That said, you could instead opt to take the more difficult path and strive for some real gains.

Notwithstanding the cynical tone of the opening of this post, it needs to be emphasized that emerging technologies should be welcomed, albeit guardedly, in all places of learning. I’ve come by this knowledge the hard way, with ample first-hand experience of doing it both the right AND the wrong way—but generally having benefited from the experience in either case. It can be summed up succinctly: it’s best when you develop and refine an appropriate match between the technologies and the desired learning outcomes. This means, in particular, starting with the right sort of question:

  • Bad Question: How can (insert gadget name here) be used in the (insert subject name) classroom?
  • Better Question: What combination of equipment and methodology will foster better achievement in (insert subject name/outcome area)?

Notice the difference? Instead of placing the focus on the tools, place it on the learning.


You might say that, in the end, the two are the same. Yes, in both cases the goal is to do a better job. Take a closer look, though. Notice that the bad question is, in fact, all wrong. First, by selecting a particular device it sets serious limits on what can be done. This can even lead to the selection of inferior methods. Consider this: a teacher wants to see if physics achievement can be improved through the use of tablets in class after noticing that there are some good simulations available, and asks, “how can I use tablets in the classroom?” With the best of intentions the approach is changed, replacing hands-on activities with simulations. Now, while simulations are an excellent way to introduce topics, especially ones that cannot be done cheaply or safely, it makes little sense, when you think about it, to replace hands-on activities involving motion, sound, electricity and light with simulations in which the only physical interaction is sliding a finger along a glass screen! After all, physics is all about interacting with the physical world! How ironic! If, instead, the right question had been asked, no doubt the simulations would still have been used, but their use would have been balanced with follow-up real-world interactions.

Second, the selection of a particular device sets in place a condition in which demonstrable improvements are expected. That’s nice, but what if it’s the case that the new technology is in fact inferior? You might suggest, “no problem, the report will show this.” Think about it, though, and be careful to layer in some human nature.

Consider again the previous case involving tablets and physics. Suppose that the unit of study was about current electricity and the tablets were used to explore the topic through simulations in which students constructed virtual circuits involving batteries, resistors, lamps, switches and meters to measure voltage and current instead of doing the same with the real thing. At the end of the unit the evaluation would be based on what could be measured, either online or using pencil and paper, and NOT on actually constructing the circuits.

How likely would it be that students would be able to do the same with real circuits? Not likely.

How likely is it that they would do about the same on a test? Very likely.

What’s the difference? In which class would it be more likely that you would find someone who could help you wire your basement? If, on the other hand, the right question had been asked, again, in all likelihood the simulations would have been utilized, but their use would have been balanced and blended with hands-on activities too.

Focusing instead on the learning will have two likely outcomes:

  • You will likely not get famous, as it’s clearly about learning and not about you.
  • The project will show modest but useful results.

Whenever embarking on any effort to improve results in education it’s important to bear in mind one simple truth: you are not starting from scratch. The “traditional” methods that self-nominated reformers (most of whom have only limited classroom experience, other than the imagined stuff) so love to mock are in fact reasonably effective. The huge majority of people—those who’ve not been the beneficiaries of this enlightened practice but who have still managed to thrive nonetheless—bear testament to that. Existing methods are, perhaps, not as good as they could be but are still nonetheless effective. Reformers should bear in mind that the traditional methods they so disdain have several important advantages over proposed new ones. First, they are understood: in all likelihood, existing practitioners not only use them now but were probably taught using them. More importantly, though, traditional methods have been refined through extensive classroom use. Proposed methods, by contrast, are raw, untested and not well understood.

Far too often reformers boldly charge into classrooms armed with little more than vague ideas, shiny new equipment and an unhealthy combination of ignorance and arrogance. Students, parents, colleagues and administrators generally tolerate the ensuing activity since (a) it probably doesn’t interfere with them too much and (b) there is always the chance that some good might come of it. The proponent will usually get a little something—a write up in a journal, perhaps a trip to a conference, maybe even an award—but in the end the students will likely be left no better off and the effect on general classroom practice will be negligible.

It does not need to be that way, though. If, instead, the proponents asked the right question, one that focused on making some real improvement in student learning, then wins would be had all around. That is, better teaching and learning would result and, who knows, maybe the innovator’s career would get a boost anyway.


Managing the Distractions

I came across something like this “unhelpful high school teacher” meme the other day and it got me thinking about the distracted landscape our students occupy.


All too often the opinions you encounter on the web and in other parts of everyday life are one-sided; normally the work of someone with an axe to grind, someone wishing to provide just one side of a rather complicated issue—and this is no exception. There are very valid reasons why educators have to be skeptical about the unrestricted use of electronic devices such as laptops and tablets in class.

In my previous job my office was located on campus at a fairly large university. It gave ample opportunity to view the electronic habits of typical students and was a never-ending source of amazement—both the good and the bad kinds.

One incident in particular stands out. I wished to confer briefly with a colleague who was, at the time, teaching a large class (around 160+ senior education students) in one of two large lecture theatres located in the basement of the building we both inhabited. I decided to just head over to the class and chat with him before it started. Unfortunately, as is often the case, I was briefly distracted, and, by the time I arrived at the door the class had already started. Out of curiosity I looked in. My vantage point was from the centre back and, as the lecture theatre slopes toward the front, I had an excellent view of exactly what the students were doing.

Almost all of them had either a laptop or a tablet device open and active. What was interesting was the fact that the majority of the students were not just taking notes on the machines but also had a web browser open. Well over half of the students would periodically switch from the note taking application (typically a word processor) to the browser. The browsers had the usual suspects, of course (Facebook, Twitter and other social media applications) but a surprising number of students were also shopping online during class time. I’d estimate now that somewhere between 10 and 20 of the approximately 150 students were doing this! Only a very small fraction—I’d estimate now around 20 to 25%—seemed to be totally focused on the lecture; at least as evidenced by their keeping the notes application open throughout the five minutes or so I was watching.


I recall the moment quite well as it was one of those times when something became quite clear to me; a time that has sparked a considerable number of subsequent informal observations. Right then and there I decided to also take a look at the other lecture theatre. This one had a second-semester calculus class going on and, unlike the former one, was one in which electronic devices were not that well suited to taking notes (unless, of course, you had a touch screen or a stylus such as a Wacom device with which you could render handwriting. After all, typing calculus notes is not something anyone can do on the fly!). Guess what? Same thing! Once again I saw a sea of laptops and tablets. Not quite so many, of course—I’d estimate around 50% of the students had them open as opposed to over 90% in the education class. Once again, though, the screens were dominated by not just social media but also online shopping!

Just a thought—maybe someone should run their own set of observations and verify this. At any rate, this short anecdote lends a bit (yes, I know “piling on the anecdotes” is a very flawed form of research) of credibility to the notion that we all have of how distracted our students really are.


Which brings us to the point: as educators it is in our best interests, and those of our students, to find effective ways of managing the many distractions that electronic gadgets bring to our classrooms. While it is certainly true that electronic devices hold incredible promise for all aspects of education it must also be acknowledged that the devices are equally effective at pulling students away from the tasks that should be at hand. The same conduit that brings research, information and activities right to the students’ foregrounds is equally adept at bringing in off-topic interactions, irrelevant information and other distractions, particularly games that have nothing to do with learning.

Blocking unrelated content is a strategy that will never work. Go ahead and block Facebook at the Wi-Fi router. The students will hardly be slowed at all. Some will switch to getting it through their phones, which you cannot block. Others will switch to a different social media platform—new ones pop up almost weekly—and still others will just connect through a proxy server, which circumvents the router and firewall rules altogether. It’s a losing game of cat and mouse.

Blocking the use of electronic devices is equally counterproductive. First of all, it drags instruction back to the 19th century—and we cannot afford to do that. More importantly, though, the whole practice of “blocking” or “banning” is anathema to the whole idea of schools as places of learning.

So what, then? What is the magic bullet? As expected, because it’s nearly always the case, there is no one simple solution. There are, however, general strategies that can be applied and which will be found effective. Here are some suggestions:

  • Make personal contact with the students: When students turn to the web browser they are turning away from you, the instructor. The less personal you are to the students, the more they will do this.
  • Communicate your values clearly: Typically around 80% of people will respect your wishes so make sure they know what your wishes are. Make it clear to the students that you do value the use of electronic equipment but that they must also make the best use of their class time. To do this they should minimize distractions and, in particular, save the social networking and shopping for some other time. It’s also worth noting that of the remaining 20%, around three-quarters can be convinced to follow along too, especially if you move around the room to make it apparent that you are checking to see if students are engaged. It should also be noted that the small remainder—around 5% of the total—will do what they please regardless of what you do, and you would be well advised to regard this small group as “beyond the point of diminishing returns” so long as they do not distract others with their off-topic pursuits.
  • Find ways to leverage the potentially-distracting technology: You can always find ways to put the devices to some good use. Examples include: (1) getting the students to install “clicker” applications and build in “instant response” activities to your classes (2) provide electronic versions of partial notes (sometimes referred to as “gap notes”) that the students can complete online if they have annotation software (3) make effective use of simulations in class time if appropriate (4) use appropriate application software for in-class activities.
