Student Engagement Idea: Upvote/Downvote Questions In Discussions

Way back at the beginning of time, I worked in Language Studies with a professor, Gerry Dion. Gerry was (and is) a great guy, and frankly, about ten years ahead of his time. In my first semester working with Gerry, my job was to maintain computers in a small lab and help students use the First Class communication system, which was a really good system for class collaboration and instructor flexibility. Gerry taught “French, in a Canadian Context” (course code LL636, which is part of the minutiae I carry around in my head). Anyways, this course was a flipped class with course materials on a CD-ROM. The first six weeks were French language basics, the second six weeks were cultural research. Students grouped up, did research, and then posted their essays to First Class. Everyone in the class would read each other’s essays. Then there was a multiple choice final exam made up of questions pulled from the essays.

I’m on a bit of a nostalgia kick lately, and well, I was thinking about this sort of activity, except within a modern LMS context. Here’s how a similar exercise using the discussion forums in D2L’s version 10.3 might work:

  • First, have students read each other’s work. Typically the essays are posted to the discussion forum, but a group dropbox (with the entire class enrolled in one group) would work as well.
  • Second, create a forum for each essay, and have each student write x questions (with potential answers) and post them to the appropriate forums. This part of the exercise ensures that they at least read some of the work submitted.
  • Third, have students use the upvote feature (new in version 10.3) to vote each question up or down – voting a question up means you (as a student) could answer it, voting it down means you couldn’t answer it correctly. Students essentially identify the easier questions on a fairly large scale.
  • Fourth, to assess activity in the system, use the reporting feature attached to discussions to export a list of activity to CSV, then manipulate that CSV into a grades import aligned with previously created grade columns (a rough sketch of that step follows below).
  • Fifth, using the upvote data, select questions that did well and questions that did not, and create a quiz out of those. You could even take it a step further and, once the questions are imported into the Question Library, let the system randomly select questions from each level of difficulty.
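
If you run that fourth step more than once, a small script can take care of reshaping the CSV. Here’s a minimal sketch in Python: it assumes the discussion export has a “Username” column and a per-student post count column, and that the grades file wants a matching grade item column plus an “End-of-Line Indicator” column – all of those names are guesses on my part, so check them against your own export and a sample grades import before trusting it.

```python
import csv

# Hypothetical column names -- verify these against your actual discussion
# statistics export and a sample grades import file before running.
EXPORT_USER_COL = "Username"
EXPORT_POSTS_COL = "Posts Authored"
GRADE_ITEM_COL = "Question Writing Points Grade"  # must match your existing grade item
MAX_POINTS = 5  # cap the score at the number of questions you asked for

# Read the exported discussion statistics.
with open("discussion_stats_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Write a grades import: one row per student, score = posts authored, capped.
with open("grades_import.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([EXPORT_USER_COL, GRADE_ITEM_COL, "End-of-Line Indicator"])
    for row in rows:
        score = min(int(row[EXPORT_POSTS_COL] or 0), MAX_POINTS)
        writer.writerow([row[EXPORT_USER_COL], score, "#"])
```

The resulting grades_import.csv can then be imported against the grade column you created earlier; if your export counts posts and replies separately, you would sum the two columns instead.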

Going back to my history lesson at the top: Gerry was not an easy marker, but he almost always did well on his student feedback – students learned from those sorts of exercises.

Ontario Ignite 2013 Recap

I feel like it’s valuable to post my notes about conferences and events I attend – I hope you find them useful as well.

I attended and presented at the Ontario Ignite Regional Conference, a Desire2Learn conference intended to gather users of the system together to exchange information. It’s a good chance to catch up with friends and to learn a whole bunch of stuff. Someone suggested it was like a mini-Fusion, which I think is a pretty apt comparison. These smaller conferences might even be better: with less choice, you’re more likely to hit on some tips that are interesting to you. I presented on putting the Polling Widget into your Org Level or Course Level homepage, which I’ve posted about here elsewhere. After that, I attended other presentations. Here’s a Coles Notes version (or a synopsis, for you non-Canadians).

COMPETENCIES

The University of Guelph is using the Competencies and Rubrics tools more than perhaps any other university (certainly more than mine). This presentation outlined the process for Guelph’s Applied Human Nutrition program, which has external competencies from the College of Dietitians of Ontario. The courses need to feed 147 competencies, which were previously tracked in Excel. Currently, content and quizzes feed these assessments, and both requirements need to be met to trigger the competency. Competencies allow two evaluation methods – all sub-options or any of the sub-options – which allows for some interesting choices when assessing criteria, though one has to be careful when using “any of the options” to meet the competency. Outcomes can be assessed on individual criteria of rubrics, and quizzes can have outcomes tied to individual quiz questions.
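
To make that “all versus any” distinction concrete, here’s a toy sketch in Python – not D2L’s implementation, just an illustration of why the “any” rule deserves caution: a single passed activity satisfies the competency even if every other linked activity was failed or never attempted.

```python
from dataclasses import dataclass, field

@dataclass
class Competency:
    """Toy model of a competency evaluated over child results or sub-competencies."""
    name: str
    rule: str                                     # "all" or "any"
    children: list = field(default_factory=list)  # bools (activity results) or Competency

    def achieved(self) -> bool:
        results = [c.achieved() if isinstance(c, Competency) else bool(c)
                   for c in self.children]
        return all(results) if self.rule == "all" else any(results)

# One passed activity out of three is enough under an "any" rule...
loose = Competency("Assess nutritional status", "any", [False, True, False])
print(loose.achieved())   # True -- possibly more lenient than intended

# ...while an "all" rule requires every linked activity to be met.
strict = Competency("Assess nutritional status", "all", [False, True, False])
print(strict.achieved())  # False
```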

LEARNING OUTCOMES 

The University of Guelph has been working on learning outcomes for many years; in 2008 came UDLEs and GDLEs (Undergraduate and Graduate Degree Level Expectations). In 2012, new undergraduate learning outcomes were developed for all programs (in some cases redeveloped). Guelph has been developing a curriculum mapping tool (CurricKit) since 2007, and in 2012 Guelph engaged in an analytics pilot with D2L.

Some best practices of curriculum development:

  • Faculty driven
  • Evidence based
  • Discipline oriented
  • Collaborative
  • Facilitator-guided
  • Learning centred

Guelph’s curriculum mapping survey asks faculty what they intend to assess in the course. They renamed Competencies to Outcomes on the tabs within the tools, which better aligns with how most people know this thing. In D2L, competencies at the org level feed competencies at the program level, which in turn feed competencies at the department level. Courses with individual templates are not going to work with this structure – programs need to sit separately because of the cross-disciplinary nature of courses at Guelph. I asked whether they only associate learning outcomes at the course level. No – outcomes are actually associated at the template level, and activities at the course offering level; it’s set up this way to facilitate visibility and to aggregate data up to the template. One open problem was how to validate course assessment as outcomes assessment: does the course assessment really add up to achievement of the outcomes? Can we assume that course-based assessment equals program-based assessment? The literature suggests not, so how do we rectify that?
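
To picture that template-versus-offering split, here’s a rough sketch – the names, course codes, and data shapes are purely illustrative, not D2L’s data model: outcomes hang off the course template, results come from activities in each offering, and achievement rolls up to the template.

```python
from collections import defaultdict

# outcome -> course offering -> per-student achievement results (True/False)
# Course codes and numbers below are made up for illustration only.
results = defaultdict(dict)
results["Critical Analysis"]["NUTR*1010 Fall 2012"] = [True, True, False]
results["Critical Analysis"]["NUTR*1010 Winter 2013"] = [True, False]

def aggregate_to_template(outcome: str) -> float:
    """Percent of achievements across all offerings sharing the template."""
    flat = [r for offering in results[outcome].values() for r in offering]
    return 100 * sum(flat) / len(flat) if flat else 0.0

print(f"{aggregate_to_template('Critical Analysis'):.0f}% achieved")  # 60% in this example
```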

BINDER

Etexts can include digital media (e.g., video). I asked whether licensed materials can be disabled after x amount of time – yes, that’s controlled by date permissions in Content. The vision: “Any learner today should have the best in class utility and access to all of their course content – all in one place, all digital anytime and anywhere.” Files from your iPad will be added to that instance of Binder – not uploaded to the cloud version of Binder. Binder for Android is coming, as well as a web client. Discoverbinder.com – Fusion or community.desire2learn.com will give you content to pull in.

DRM and Copyright controls in Binder

  • Date and time restrictions are respected by Binder
  • Instructors can turn off the Send to Binder feature as well
  • Students are notified that content is expiring

Binder apps provide annotation plus Dropbox and SkyDrive integration; within iOS you can send attachments from the Mail app to Binder (on that device). Binder allows annotation, exporting, and tagging.

EduBabble Indeed

This one may be a bit snarky because I’m on holiday and a day past my fortieth birthday (and I don’t suffer idiots well).

Well, isn’t this interesting? A pair of old white dudes write a scathing report based on ONE student’s anecdotal recitation of what happened to them in their Master’s work. Whoever wrote this media release – and indeed the “centre” behind it – is a joke. How can you even begin to call this a report that “highlight(s) the weak academic standards, biased teaching, and nonsensical edu-babble found in this course”? First of all, you’re basing this on one student’s memory of the course – there’s nothing scientific about this survey, there’s no report here; this doesn’t pass the sniff test for journalism (for which the standards are awfully low), and it certainly doesn’t pass the sniff test for academic research. Secondly, the two authors are clearly biased against current educational theories and practices. Of course they’re going to write a biased report, when one author openly outlines his common sense education platform, which includes standardized testing. Standardized testing gets trotted out as a way to make sure everyone is receiving the same education – but people aren’t the same, so why should their education be? I don’t learn the same way you do. Might I remind you of another common sense revolution that did wonders for education and health care? Why should we listen to this?

We shouldn’t.