So I'm the (self-proclaimed) technical lead for the ePortfolio tool (which McMaster has rebranded Learning Portfolio). Actually, I fell into the role when the person who was supposed to manage it ended up unable to deliver the initial rollout presentation due to illness. The tool has worked admirably, scaled well, and frankly done its job. There are warts on the tool - maybe barnacles are a better image (and to go a step further, barnacles can be removed with a lot of hard work). The way one assembles a presentation (a sharable portfolio) is very mid-90s, which makes the product a hard sell to students used to the ease of Tumblr or WordPress. Despite that major hurdle, we saw moderate success over the first semester: just over 21,700 items built in ePortfolio, with close to 3,500 of those being reflections. We have over 3,200 unique users, with just 17 courses using it. We're now starting phase two of the first year - periodic reminders to students - and we should see an uptick in usage, as well as the addition of two courses that will impact at least 1,500 students. Now the real question is: will students start to use it outside of classes?
With every passing year I spend in edtech, I pick this PDF up with some dread. It provides hope on many of its long-view items - hope that educational technology will get better, less manipulative, less data-driven, and more inventive, allowing teachers to do what they love (hopefully) better and differently. This year's report points at the Quantifiable Self and virtual assistants as things teaching and learning will be doing in five years.
On the surface these seem reasonable - however, I believe most institutions are doing this in some form already. The Quantifiable Self is really about identifying trends (mostly around health, through tools like FitBit or the Nike+ app) and using that physical information to push you to do better, walk more, and so on. In education, in a well-designed course, students are already doing this: taking self-assessments that build confidence in a field, confirming that the student "knows" something. LMS analytics also contribute - in our instance, Desire2Learn has the Student Success System, which gives individual feedback to students on how they're performing in the class and in school in general. This horizon technology is already here, not five years away at all. At some point there will be pushback (I hope) on all this data collection stored by private institutions - you, as the creator of that data, should be able to control it; it is, in fact, your intellectual property.
Virtual assistants? Oh, c'mon. Those are here in higher education now. Students are checking Google on their phones, which gives them more information (tailored to their search patterns) than they might need. On Android phones, Google Now provides contextual information - if you search for political science information every Wednesday at 7:00, Google Now will start feeding you information about political science at that time. Furthermore, Google then takes your new interest in political science into consideration in future searches. Siri and Iris are omnipresent in classrooms, used to fact-check, find alternate solutions to problems, or just alert the user to a new deal on shoes. Again, this is not on the horizon; it's here.
What is on the horizon is faculty using and integrating this technology into their classrooms in creative ways.
So, this announcement by the Ontario government to give $42 million to develop online courses - by getting universities to compete in some sort of race-to-the-bottom bloodsport of course development for the money to build courses that can be delivered online - is kinda quaint. I mean, I'd be impressed if this were 1997, or even 2003, but in 2014 it seems a bit shameful.
Let's face it - I've said it before, and I'll say it again - Ontario is a decade behind the western provinces (who easily lead innovation in this realm in Canada) and probably five or six years behind the maritime provinces, who do online learning out of necessity. The really shameful thing is that Ontario universities should've seen this gap growing for years and sought to address it sooner. I guess the last fifteen years of conservative government at the provincial level didn't help, given their drive to scale back education and turn it into a private business. $42 million is really a drop in the bucket - especially if you're setting up a whole education system outside of the existing university and college system, one that will allow transfer of credits into those systems. Which brings up an interesting issue: if I take a university credit, will colleges accept it towards a diploma they offer? What about the other way around?
Of course all universities will be in on this action, and some of the more developed online schools will have a leg up - but it's a nice development for someone like me, who needs four or five more credits for my degree and has had a lot of problems with either a home university accepting a credit but the visiting university not accepting me as a student (in favour of their own), or taking the course and not having the credit accepted at all.
What worries me most, though, is what happens if this initiative is ushered in (no doubt as the government goes to election) and gets a fair-to-middlin' reception. What happens then? This institute and initiative are certainly subject to a new government's impulses and legislation. I can easily see a conservative government saying "this doesn't work, let's let the private market run it better and cheaper."
And we know where that will end up.
In the past I've looked at previous posts about what I thought would happen and reflected on those ideas. It's not that I don't think reflection is valuable; it's just that I'm not that interested in navel-gazing (hell, I can see my navel getting bigger by the day).
This year, I'll outline some of the projects I'm currently involved with and will try to write about this year.
So for work, I'm working on two large-ish projects. One is a Productivity and Innovation Grant funded project led by the University of Guelph, around learning outcomes in D2L. The project is about ensuring there's alignment between course, program, and ultimately university-level outcomes - with reporting in D2L that suggests where there are holes in the alignment. It seems like it will improve the Analytics/Insights tool greatly with global reporting options, which is something I've really struggled with.
The other is around Learning Portfolios. The department I'm embedded with has gotten some funding from the University to advance Learning Portfolios (the ePortfolio tool in D2L) on campus, and it's looking like we will be responsible for this area from here on out. I think some improvements D2L is making to the way the tool works will only help adoption - however, there are still some major hurdles to overcome before there's widespread adoption. That's not to say adoption and use haven't grown greatly - they have - just that the impact of the use so far hasn't produced enough of a ripple to spread campus-wide. That's our job in year two. I'm putting in a Fusion 2014 proposal to co-present one of the really interesting stories from first semester, one that ties together blended learning, learning portfolios, and helping students reflect (in this case, on career choices).
Other than the banal things like redo the bathroom and visit more places, I'm putting out a record with my one band and releasing another record with my other band. Not very exciting unless you like hardcore punk.
While this is work-related, I want to put together a rubrics repository (like Rubistar, but much more focused on local courses, and local sharing) that has a series of rubrics saved covering higher education courses that the University teaches. This way, it gathers together some of the best work that faculty have done, recognizes them, allows them to set sharing permissions, and ultimately, choose to export as PDF or into D2L. This is a big project, and really not on anyone's timeline, but I want it to happen. It'll have to be open source, and to that end, maybe it doesn't just spit into D2L but into Blackboard or other systems too. The first iteration will of course work with our system (D2L) and then maybe we can branch out.
I'd love to help update the Feed2JS codebase to get it WCAG 2.0 compliant. I'd also love to blog more.
I've written a fair bit about my process for developing support materials - which is something I've become good at over the years. I haven't really spent any time writing about some of the other, smaller initiatives around the Learning Portfolio. One of which is getting students to set their own learning goals.
It's ironic that an institution (which has its own goals for students) wants students to be self-directed, yet everywhere in their classes students have very little choice in determining their own goals and how to achieve them. That self-determination is left to their own time and is, in every sense, not a part of school. However, the institution is certainly looking at ways students can determine, set, and track their own goals - and to that end, in consultation with Desire2Learn, we developed a widget that sits on the homepage: students click the link and enter their information on a form, which creates an artifact in their ePortfolio/Learning Portfolio. A small portion of the student population uses the Learning Portfolio (about 5%-10% of our full-time enrollment), but use is picking up. The Learning Goals widget accounts for a large percentage of that group - maybe 2% of our total population.
The idea with this widget is to get students thinking about their own goals and tying those goals to their academic learning. I don't know if there's enough being done around this - we're only at the beginning - but some thinking certainly needs to happen here. We aren't tracking whether these goals are ever met, or even the content of the goals; in fact, we can only make a guesstimate based on how many times the title of the widget shows up in the Learning Portfolio object-creation report.
I'd be interested in hearing whether anyone's doing major ePortfolio/capstone project work - trying to encapsulate a student's experience at university or college in that course/project - and about ways an ePortfolio might help that along. Sure, we know that portfolios are great for procedural/project-based learning, especially when guided. I'm more interested in whether this can be done institutionally, really adding a strong incentive to build this work.
Way back at the beginning of time, I worked in Language Studies with a professor, Gerry Dion. Gerry was (and is) a great guy and, frankly, about ten years ahead of his time. In my first semester working with Gerry, my job was to maintain computers in a small lab and help students use the First Class communication system, which was a really good system for class collaboration and instructor flexibility. Gerry taught "French, in a Canadian Context" (course code LL636, which is part of the minutiae I carry around in my head). Anyways, this course was a flipped class with course materials on a CD-ROM. The first six weeks were French language basics; the second six weeks were cultural research. Students grouped up, did research, and posted their essays to First Class. Everyone in the class would read each other's essays. Then there was a multiple-choice final exam consisting of questions pulled from the essays.
I'm on a bit of a nostalgia kick lately, and well, I was thinking about this sort of activity, except within a modern LMS context. Here's how a similar exercise using the discussion forums in D2L's version 10.3 might work:
First, have students read each other's work. Typically these are submitted to the discussion forum for posting, though a group dropbox (with the entire class enrolled in one group) might work as well. Second, create a forum for each essay, and have each student generate a set number of questions (with potential answers) and post them to the appropriate forums; this part of the exercise ensures that they at least read some of the submitted work. Third, have students use the upvote feature (new in version 10.3) to vote each question up or down - voting up means you could answer the question, voting down means you couldn't answer it correctly. Students essentially identify the easier questions at a fairly large scale. Fourth, to assess activity in the system, use the reporting feature attached to discussions to export the activity to a CSV, then manipulate that CSV into a grades import (aligned with previously created columns). Fifth, using the upvote data, select questions that did well and did not do well and create a quiz out of those - you could even take it a step further and let the system randomly select questions (once imported into the Question Library) from each difficulty level.
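The CSV massaging in the fourth step could be sketched like this - a hypothetical Python example, since the column names in D2L's discussion export and grades import vary by version; treat every header and the scoring rule below as placeholders to adapt:

```python
import csv
import io

def activity_to_grades(activity_csv, grade_item="Peer Questions Points Grade",
                       points_per_post=2, max_points=10):
    """Turn a discussion-activity export into a grades-import CSV.

    Hypothetical column names ("OrgDefinedId", "Posts") -- match them
    to whatever your own D2L export actually contains.
    """
    reader = csv.DictReader(io.StringIO(activity_csv))
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["OrgDefinedId", grade_item, "End-of-Line Indicator"])
    for row in reader:
        # simple scoring rule: points per post, capped at a maximum
        score = min(int(row["Posts"]) * points_per_post, max_points)
        writer.writerow([row["OrgDefinedId"], score, "#"])
    return out.getvalue()

sample = "OrgDefinedId,Posts\nstu001,3\nstu002,7\n"
print(activity_to_grades(sample))
```

The real work is in mapping the export's columns to the columns your grades import expects; once that mapping is known, the transform itself is a few lines.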
Going back to my history lesson at the top, Gerry was not an easy marker, but almost always did well on his student feedback - students learned from those sorts of exercises.
I feel like it's valuable to post my notes about conferences and events I attend - I hope you find them useful as well.
I attended and presented at the Ontario Ignite Regional Conference, a Desire2Learn conference intended to gather users of the system together to exchange information. It's a good chance to catch up with friends and learn a whole bunch of stuff. Someone suggested it was like a mini-Fusion, which I think is a pretty apt comparison. These smaller conferences might even be better, as you have less choice and are more likely to hit on tips that are interesting to you. I presented on putting the Polling Widget into your org-level or course-level homepage, which I've posted about here elsewhere. After that, I attended other presentations. Here's the Cliff's Notes version (or a synopsis, for you non-Canadians).
The first was from the University of Guelph, who use the Competencies and Rubrics tools more than maybe any other university (certainly more than mine). The presentation outlined the process for Guelph's Applied Human Nutrition program, which has external competencies from the College of Dietitians of Ontario. The courses need to feed 147 competencies, which were previously tracked in Excel. Currently, content and quizzes feed these assessments, with both requirements needing to be met to trigger the competency. Competencies allow two options - all sub-options or any of the sub-options - which allows for some interesting possibilities when assessing criteria, though one has to be careful when using "any of the options" to meet the competency. Outcomes can be assessed on individual criteria of rubrics, and quizzes can have outcomes tied to individual quiz questions.
The University of Guelph has been working on learning outcomes for many years: in 2008 came UDLEs and GDLEs (Undergraduate and Graduate Degree Level Expectations), and in 2012, new undergraduate learning outcomes were developed (in some cases redeveloped) for all programs. Guelph has been developing a curriculum mapping tool (CurricKit) since 2007, and in 2012 engaged in an analytics pilot with D2L.
Some best practices of curriculum development:
- Faculty driven
- Evidence based
- Discipline oriented
- Learning centred
Guelph's curriculum mapping survey asks faculty what they intend to assess with a course. They renamed Competencies to Outcomes on tabs within the tools, which better aligns with how most people know the concept. Competencies at the org level feed competencies at the program level, which feed competencies at the department level in D2L. Courses with individual templates won't work with this structure - programs need to sit separately due to the cross-disciplinary nature of courses at Guelph. I asked: do you only associate learning outcomes at the course level? No, it's actually at the template level; activities are at the course offering level, set up this way to facilitate visibility and to aggregate data to the template. One problem was how to validate course assessment as outcomes assessment: does the course assessment really add up to achievement of outcomes? Can we assume course-based assessment equals program-based assessment? The literature suggests not, so how do we rectify that?
Etexts can include digital media (e.g. video). I asked: can licensed materials be disabled after x amount of time? Yes - controlled by date permissions in Content. Vision: "Any learner today should have the best in class utility and access to all of their course content - all in one place, all digital anytime and anywhere." Files from your iPad will be added to that instance of Binder - not uploaded to the cloud version of Binder. Binder for Android is coming, as well as a web client. Discoverbinder.com - at Fusion or community.desire2learn.com - will give you content to pull in.
DRM and Copyright controls in Binder
- Date and time restrictions are respected by Binder
- Instructors can turn off send to Binder feature as well
- Students are notified that content is expiring
Binder apps provide annotation plus Dropbox and SkyDrive integration; within iOS, you can shift attachments in the Mail app to Binder (on that device). Binder allows annotation, exporting, and tagging.
This one may be a bit snarky because I'm on holiday and a day past my fortieth birthday (and I don't suffer idiots well).
Well, isn't this interesting? A pair of old white dudes write a scathing report based on ONE student's anecdotal recitation of what happened in their Master's work. Whoever wrote this media release, and indeed the "centre" behind it, is a joke - how can you even begin to call this a report that "highlight[s] the weak academic standards, biased teaching, and nonsensical edu-babble found in this course"? First of all, you're basing this on one student's memory of the course - there's nothing scientific about this survey, and there's no report here; it doesn't pass the sniff test for journalism (for which the standards are awfully low), and it certainly doesn't pass the sniff test for academic research. Secondly, the two authors are very biased against current educational theories and practices. Of course they're going to write a biased report, when one author clearly outlines his common-sense education platform, which includes standardized testing. Standardized testing gets trotted out as a way to make sure everyone receives the same education - but people aren't the same, so why should education be? I don't learn the same way you do. Might I remind you of another Common Sense Revolution that did wonders for education and health care. Why should we listen to this?
I've been meaning to write this post for a long time, but I've only gotten my head above water about it in the last few days. Bear with me, as semester start-itis hasn't really cleared my system and I'm definitely fighting off a cold with flu-like symptoms.
I'm very lucky to work at an institution with some pretty good vision from the top down. I'm not glad-handing anyone here; I honestly mean it. Patrick Deane, McMaster's President, has a vision and has put institutional money behind enacting it. He's identified the learning portfolio as something students can use to document their learning journey. Watch the YouTube video for his take.
Admittedly, I have a vested interest in this idea working out, because I'm front and centre with this initiative around what we call Learning Portfolios, and what most of you call ePortfolios. I've run over 20 workshops on the tool in the last year and a half, have become the de facto technical face of the thing, and, well, I spent my summer working on it so that I can support widespread use by faculty and students this year.
The summer has been a blur of activity - I've mostly had a chance to tweet about some of the things I'm doing, but here's a quick rundown:
Figure out how to change the wording of ePortfolio in the D2L instance to Learning Portfolio
You'd think this would be an easy thing - except the wording is buried in the Language options of the D2L admin settings. To my dismay, it wasn't a matter of changing one entry and having it propagate to the hundreds of instances of the word throughout the system - it was hundreds of entries. I think it took four hours to manually go through and adjust everything. After the upgrade to version 10.2 I needed to redo some of this work, as the upgrade process either created new language entries or overwrote something.
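For what it's worth, if your system offers a bulk export of language entries (ours didn't, hence the four hours of clicking), the substitution itself is trivial to script. A hypothetical Python sketch, assuming a simple two-column key/value CSV export - the format here is invented for illustration:

```python
import csv
import io

def relabel_terms(csv_text, old="ePortfolio", new="Learning Portfolio"):
    """Swap a product name across exported language entries.

    Assumes a two-column (term key, display value) CSV export;
    that format is an assumption, not what D2L actually provides.
    """
    reader = csv.reader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.writer(out)
    for key, value in reader:
        # only the display value changes; term keys stay stable
        writer.writerow([key, value.replace(old, new)])
    return out.getvalue()

sample = "EP.Home.Title,Welcome to your ePortfolio\n"
print(relabel_terms(sample))
```

Scripting it also means a post-upgrade re-run is a one-liner instead of another afternoon of manual edits.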
Document how to use it
I've always believed in local documentation - I frankly hate using vendor documentation because it's never specific enough to your particular install, nor does it illustrate why you would use the tool. I stayed away from the why, because that was for other people (students, namely) to decide. I did, however, create this repository of PDFs: 20 PDFs articulating and, well, showing students and faculty how to use the damn thing. There are about 10 more that I want to do, these having to do with social media connections. While some of this is covered in the vendor documentation, with us changing the name of the tool there was a high likelihood of confusion had we not done this. The whole thing took months of work and isn't close to being done. Also, I think we all recognize that students generally don't use these systems because they aren't as slick as Pinterest, Facebook, or Tumblr. The ePortfolio tool is definitely not as slick as Tumblr, nor as easily customizable. So anything that gets the technical details out of the way and into student hands is a desirable thing.
To that end, I wrote scripts for twelve videos - eight of them are on YouTube. I talked about these in another post, so I won't go into much detail other than to say many people have really dug them. I think we'll be doing more of these, not only for the Learning Portfolio but also for the LMS. With very little promotion, one video has 100 hits and the channel has 15 subscriptions, so it's fair to say people are interested in this type of support. More of these will have to be made.
Bend it to make it do what you want
One of the institutional goals is to help students take ownership of their learning and make explicit learning goals for themselves. Many self-directed goals aren't achievable in four years (or five, or even fifteen). We looked at some ways that might make sense for learners to create and select their own goals. Ultimately, with some help from Desire2Learn, we settled on a form accessible via a widget that lives on the homepage of the LMS. That form lets students fill out a learning goal (making it explicit), do some thinking about it, and track it throughout their time at McMaster. So we have a piece that does that in conjunction with the Learning Portfolio. I hope Desire2Learn is thinking about adding this sort of functionality to the ePortfolio tool, because it gives users a giant reason why the ePortfolio tool is better than a blog, Tumblr, or any other service at helping you make those connections.
Push out a template for students to use
Seeing as the institution wants to help students - what's the best way to help? There's no one best way, I'm sure, but one idea was to push out a template to any student who registered in a course between Fall 2012 and Fall 2013. Our system administrator looked at using the API to accomplish the push; however, there wasn't enough time to test the script properly, so we're doing it manually. I have a CSV extract of all 30,000+ students who had the student role in at least one course over the last calendar year. I chop that into 1,000-student segments, enroll those students in a course, then push the template from my account into those students' Learning Portfolios. Then I unenroll the students - after making the course active, due to a bug in D2L's system (you can't unenroll a person from a course while the course is inactive) - and repeat the process.
I ramped up the numbers to 1,000 (I started at 200, moved to 600, then 800, then 1,000), and that seems to be the best number to hold at. When I approached some of the ePortfolio team over the summer about our plans, they were concerned about a push to 20,000. I hope they work on the scalability of the tool going forward. Once I got into a rhythm at the 1,000-enrollment level, I knocked off 10,000 fairly quickly.
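The chop-into-segments step is easy enough to script. Here's a minimal Python sketch - the 1,000-row batch size is just the number that held up for us, and the column name is a placeholder for whatever your extract actually uses:

```python
import csv
import io

def batch_students(csv_text, batch_size=1000):
    """Split a student CSV extract into enrollment batches.

    batch_size=1000 is the size that worked for our pushes;
    the "OrgDefinedId" column name is a placeholder.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

# tiny demonstration: 5 students in batches of 2 -> batch sizes [2, 2, 1]
sample = "OrgDefinedId\n" + "\n".join(f"stu{n:03d}" for n in range(5))
batches = batch_students(sample, batch_size=2)
print([len(b) for b in batches])  # [2, 2, 1]
```

Each batch could then be written back out as its own enrollment CSV, so the manual part shrinks to the enroll/push/unenroll clicks.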
The process to install a polling widget on your institution's homepage is fairly straightforward. I tend to prefer self-hosted, open source solutions; thankfully, in my job we have that luxury. If you're attempting this with no knowledge of PHP or servers, you might have some issues. I'll try to explain as best I can, but comment if you get lost in the process and I'll be happy to clarify what I can.
The first step is to find a polling software solution; basically any polling software that creates an HTML/PHP page can be embedded. It's preferred that the page live behind HTTPS (a secure HTTP connection), so if you're self-hosting the polling solution as we are, you should put it behind the extra security. Why? Internet Explorer doesn't handle mixed secure and insecure content well, and will give the end user a pop-up with some unclear language that, in the end, only adds more hurdles to answering the poll. In fact, Firefox now has similar behaviour (with an even less apparent notification that needs intervention before fixing).
We're using this polling software: http://codefuture.co.uk/projects/cf_polling/ which serves our purposes quite nicely. It doesn't allow for question types other than multiple choice, so if you need that functionality you'll have to choose something else. For our polls, we've worded the questions so that they fit this mold. The extra bonus of this one is that it stores all the data in a flat file, not a database, so you only have one thing to maintain.
Within the PHP code, you can edit the options - the PHP file is well commented and shouldn't give you any issues. One trick I've run into is that the D2L widget editor doesn't refresh the data well, so if you make an error in the PHP, create a new file to upload rather than trying to overwrite the old one. I couldn't figure out why it wasn't letting me reset the data collected (I suspect the flat file is generated using the name of the PHP file, so updating the PHP won't force a reset of the data captured - though why it wouldn't overwrite the typo in the one answer, I'm not sure).
Another downside - and it's a big one if you want to use these numbers as more than a general indicator - is that this solution does not track users. If you do choose this route, be aware that the poll sets a cookie on the computer that answers it, not on the user who answered, so the same person could answer the poll multiple times. We don't particularly care about that, because we're using it for a general sense of how the community feels on these issues. With large enough data, even with some mischievous numbers, we'd be OK.
You'll need some basic CSS skills as well to edit how the page looks - there are three style options by default, but I've trimmed the script to exclude the extra options we aren't using. I've rewritten the CSS to more accurately reflect the branding and colour scheme we use at my institution.
I've included the text of the script below as an example of what we run and how we customize it. If you can't see it, visit the text on pastebin.
// include the cf polling class file (the filename here is a guess - match it to your install)
include 'cf_polling.php';
// your poll question
$poll_question = 'How well did the Discussion tool stimulate a conversation that improved understanding of the course material?';
// In this variable you can enter the answers (voting options),
// which are selectable by the visitors.
// Each vote option is appended to the $answers array. Example:
$answers[] = 'did not use';
$answers[] = 'a little bit';
$answers[] = 'a lot';
$answers[] = 'was crucial';
// Make new poll
$new_poll = new cf_poll($poll_question, $answers);
// if you are not using one_vote there is no need to use this.
// $new_poll -> setCookieOff(); //(new 0.93)
// One vote per IP address (and cookies if not off)
$new_poll -> one_vote();
// Number of days to run the poll for
$new_poll -> poll_for(28); // end in 28 days
// $new_poll -> endPollOn(02,03,2010); // (D,M,Y) the date to end the poll on (new 0.92)
// Set the poll container id (used for CSS)
$new_poll -> css_id('cfpoll2');
// check to see if a vote has been cast
$new_poll -> new_vote($_POST);
// echo/print poll to page
echo $new_poll -> poll_html($_GET);
So that's the backend of things. We currently set up polling questions manually, and will rotate through six different questions (which means six unique PHP scripts) in a semester. Every three weeks, we prepare a new script page by copying the previous one, editing the end date, question, and answers, and uploading it to the server.
Now getting it into a widget in a course (or at the organization level) is dead simple. Create a new widget, edit that widget and get to the HTML code view for the content of that new widget. Once there, put in this code:
<p style="text-align: center;"><iframe src="LOCATION OF YOUR FILE HERE" height="340" width="280" scrolling="no"></iframe></p>
Of course, you'll substitute the actual location of the PHP script you're using where I've written "LOCATION OF YOUR FILE HERE". Click Save to save the widget. You won't be able to preview it, so you'll have to have a bit of faith that your code (and my code) is correct. Add the widget to your homepage, and you're home for dinner.
Our experience with this was pretty surprising. The first time we ran the polls there were 36 responses in 10 minutes (during exams), 1,450 in 24 hours, and 2,655 after one week. After three weeks the final tally was 3,598. Now remember, that's votes, not individuals. Even if each student voted an average of 1.4 times - which would skew the numbers somewhat - that's still pretty representative (and corresponds with our internal numbers for the tool we surveyed about).
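The back-of-the-envelope math, assuming that 1.4-votes-per-person average holds (it's a guess, not a measured figure), works out to roughly 2,570 unique respondents:

```python
# rough estimate of unique respondents behind the vote tally
total_votes = 3598
votes_per_respondent = 1.4  # assumed average, not measured
unique_respondents = round(total_votes / votes_per_respondent)
print(unique_respondents)  # 2570
```

Against a full-time enrollment in the tens of thousands, that's a sample size most surveys would envy.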
Here's what the Poll looks like:
What do we hope to find with this? Well, personally, I wanted to see how the usage numbers in the Analytics tool compare with users' self-reporting. Does use of the tool translate into an impression of having used it? Are students even aware of the different tools in D2L?