Blackboard Thinks They Know How You Teach

In November of last year a blog post by Blackboard was making the rounds. A Blackboard study documented how instructors use their system, which was somehow conflated with knowing how teachers teach. We could talk about how terrifying this blog post is, or how devoid of solid analysis it is. Instead I’d like to turn over a new leaf and not harp on about privacy, or about a company using the aggregate data it has collected with your consent (although when you agreed to “allow Blackboard to take your data and improve the system,” you probably weren’t aware they’d produce this sort of “analysis”), but just tear into the findings.

Their arbitrary classifications (supplemental? complementary?) ignore the main driver of LMS use: grades, not instructor pedagogical philosophy – students go to the tools where they are going to get marked. Social? I guarantee that if you analyzed courses where use of the discussion tool (note, that’s use, not quality) counts for more than 30% of the final grade, you’d see a hell of a lot more use of discussions. It’s not that the instructor laid out a social course, with a student making an average of 50 posts throughout the course (break that down over 14 weeks of “activity” and it works out to roughly 3.5 posts per week – strangely similar to the standard “post once and respond twice” to get your marks); it’s that discussions are the only tool that gives students some power to author.

Time spent is also a terrible metric for measuring anything. Tabs get left open. People sit on a page because there’s an embedded video that’s 20 minutes long; other pages have text that’s scannable. And how are connections measured? Entry and exit times? What if there’s no exit – how does the system handle that? (Typically it waits for a set period of inactivity and then assumes an exit.) Were those values excluded from this study? If so, how many were removed? Is that significant?
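To make the inactivity-timeout problem concrete, here’s a minimal sketch of how that kind of log analysis tends to work. The timestamps and the 30-minute timeout are my assumptions for illustration, not anything from Blackboard’s actual pipeline:

```python
TIMEOUT = 30 * 60  # assume: inactivity beyond 30 minutes counts as an exit


def estimate_time_spent(click_times, timeout=TIMEOUT):
    """Sum the gaps between consecutive clicks, capping each gap at the timeout.

    A student who left a tab open for two hours and a student who read
    steadily for the same capped total look identical in the result.
    """
    total = 0
    for earlier, later in zip(click_times, click_times[1:]):
        gap = later - earlier
        total += min(gap, timeout)  # anything past the timeout is simply unobserved
    return total


# Two very different behaviours, same "time spent":
reading = [0, 600, 1200, 1800]   # steady activity every 10 minutes
tab_open = [0, 7200]             # one click, then a two-hour idle tab
print(estimate_time_spent(reading))   # 1800 seconds
print(estimate_time_spent(tab_open))  # also 1800 seconds (capped)
```

The point being: once gaps are truncated to a guess, “time spent” stops distinguishing engagement from an open tab.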

Now Blackboard will say that the scale of the data collected mitigates those outliers. I don’t buy that for a second, because LMSs are not great at capturing this data in the first place (in fact, server logs are not much better): there’s very little pinging for the active window, or the other tricks that can only really be assessed with eye-tracking software and computer activity monitoring. Essentially, garbage in, garbage out. We have half-baked data, viewed abstractly, with an attempt made at some generalizations.

Here’s a better way to get at this data: ask some teachers how they use the LMS. And why. It’s that simple.

If you look at Blackboard’s findings, you should frankly be scared if you’re Blackboard. Over half of your usage (53% “supplemental,” or content-heavy) could be handled by a password-protected CMS (Content Management System). That’s a system that could cost nothing (using Joomla or any of the many free CMS packages available) or be replicated using something institutions already have. The only benefit to institutions is that they can point at the vendor when stuff goes wrong, having outsourced the expertise to an external company instead of staffing it themselves.

If you take the findings further, only 12% of courses (“evaluative” and “holistic”) get at the best parts of the LMS, the assessment tools. So the other 88% of the courses reviewed do things in the system that can be replicated better elsewhere. Where’s the value added?

Student Engagement Idea: Upvote/Downvote Questions In Discussions

Way back at the beginning of time, I worked in Language Studies with a professor, Gerry Dion. Gerry was (and is) a great guy, and frankly, about ten years ahead of his time. In my first semester working with Gerry, my job was to maintain computers in a small lab and help students use the First Class communication system, which was a really good tool for class collaboration and instructor flexibility. Gerry taught “French, in a Canadian Context” (course code LL636 – part of the minutiae I carry around in my head). Anyways, this course was a flipped class with course materials on a CD-ROM. The first six weeks were French language basics; the second six weeks were cultural research. Students grouped up, did research, and then posted their essays to First Class. Everyone in the class would read each other’s essays. Then there was a multiple-choice final exam consisting of questions pulled from the essays.

I’m on a bit of a nostalgia kick lately, and well, I was thinking about this sort of activity, except within a modern LMS context. Here’s how a similar exercise using the discussion forums in D2L’s version 10.3 might work:

First, have students read each other’s work. Typically these are posted to the discussion forum, but a group dropbox (with the entire class enrolled in one group) might work as well.

Second, create a forum for each essay. Each student generates x questions (with potential answers) and posts them to the appropriate forums. This part of the exercise ensures that they at least read some of the work submitted.

Third, have students use the upvote feature (new in version 10.3) to vote each question up or down – voting up means you (as a student) could answer the question, voting down means you couldn’t answer it correctly. Students essentially identify the easier questions on a fairly large scale.

Fourth, to assess activity in the system, use the reporting feature attached to discussions to export a list of activity to CSV, then manipulate that CSV into a grades import (aligned with previously created columns).

Fifth, using the upvote data, select questions that did well and did not do well and create a quiz out of those – you could even take it a step further and have the system randomly select questions (once imported into the Question Library) from each difficulty level.
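The fourth step – massaging the activity export into a grades import – is the fiddly one, so here’s a rough sketch of what that manipulation might look like. The column names (“Username”, “Posts Authored”, “Participation”) are assumptions for illustration; D2L’s actual export headers will differ, so adjust them to match your files:

```python
import csv
import io


def activity_to_grades(activity_csv, out_of=10):
    """Read a discussion-activity export and emit grades-import rows,
    awarding one point per post authored, up to `out_of` points."""
    rows = []
    for record in csv.DictReader(io.StringIO(activity_csv)):
        posts = int(record["Posts Authored"])
        rows.append({
            "Username": record["Username"],
            "Participation": min(posts, out_of),  # cap the score at the max
        })
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["Username", "Participation"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()


# A tiny pretend export, run through the transformation:
export = "Username,Posts Authored\njsmith,12\nkjones,4\n"
print(activity_to_grades(export))
```

In practice you’d read from and write to real files rather than strings, but the shape of the job is the same: pull the columns you need, compute the grade, and write headers matching the grade columns you already created in the gradebook.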

Going back to my history lesson at the top: Gerry was not an easy marker, but he almost always did well on his student feedback – students learned from those sorts of exercises.