i>Clicker Integration with D2L/Brightspace

I’m sure i>Clicker won’t be particularly happy with this post; that’s OK, because I’m not happy with them. The one thing you should feel very, very righteous about as an LMS administrator is sharing data. We have an internal policy that we’ll never share student numbers. I also think we shouldn’t share usernames; however, some folks feel that’s OK. I guess it depends on your username convention. First-dot-last as a convention is great, except in these cases when you’re trying to maintain privacy. Now, if you’re making a single sign-on connection through LTI, you can have the originating LMS username obfuscated, which essentially boils down to a paired database table with two usernames in it – the one handed off to the third party via LTI, and the one in the LMS. Typically, I’m a little bristly about this as well, because you become reliant on the system not changing how this is handled, and if there’s an HTTP call mixed in there, it could still be sniffed out in transit… but that’s not what this post is about.

Since last year, I’ve been bugged by i>Clicker to do an integration to make sign-up with their service seamless for students. There have also been requests from some of the heavier users for integration between i>Clicker and the D2L gradebook. The first request I’ve always put off because, frankly, I don’t do anything a private enterprise wants me to do; they aren’t my boss, nor are they a member of my community. They serve one purpose, and that’s collecting money. The second, however, does benefit a select user group on campus. We’d been thinking about this since last year’s summer of integration, and April had a couple of minutes free (more on that in forthcoming blog posts), so we scheduled some time to finally make it happen.

We scheduled a call to walk us through the integration and see if there was anything we had questions about. I don’t need anyone to walk me through an integration, but I often like to raise privacy, data collection policies and other awkward questions with the poor sucker on the other end of the line. Typically they are ill-informed. I didn’t even get to the awkward stage, as a request was made that was frankly shocking. I was asked to turn on passing the Org Defined ID – our student number – to facilitate the connection. Not just at the tool level, but in the configuration for the Tool Information. See below:

[Screenshot: the Tool Consumer configuration panel (config_tool_consumer)]

Now if I understand this panel correctly, it not only changes the configuration settings for the tool in question, but for all the tools. ALL the tools. So not only i>Clicker, but anything else you have connected using the External Learning Tools administration panel. I asked our technical account manager about changing it, and he basically said, “yeah, that’s not good.”

So in the middle of this exchange, where I explained how we don’t pass the student number under any circumstances, the i>Clicker representative seemed a little miffed about my protests. He wasn’t particularly nasty about it, but certainly didn’t seem to understand why this was an issue at all. Looking at i>Clicker’s website, students are asked for their ID. It’s one thing to ask a student to give them their ID (consent); it’s another to set up access to everyone’s IDs (no consent).

What makes me wonder is: how many other institutions even give this a thought? Surely we can’t be the first to balk at the idea of handing over student data like this? Or maybe we are being too paranoid? I mean, I guess there are people who have faith in third-party vendors, but I’d prefer a signed agreement between the two parties stating exactly what they are doing with the data, how they’re using it and how long it’s retained. That way, if the external party violates the agreement, the institution can hold them liable for the data breach – something a little stiffer than “oops”.

(Spare) Change

This post may only be worth a defunct penny, but it’s all I’ve got on a sunny, near-Spring, Friday afternoon. Given the accelerated rate at which we accept change in our devices, software, computers and media consumption habits, it’s shocking how glacially Learning Management Systems move. It’s as if they don’t think anyone would want something different. Even the new upstart, Canvas, does the same sort of thing (albeit under a nice shiny coat of paint). Every system’s sales pitch is about giving students (or learners) the power to do stuff, but ultimately the administrators of the system disallow that, for many good and some bad reasons.

Good reasons? Privacy is a huge one, although maybe becoming less and less important as time moves on. Security is another – for instance, giving student roles the power to create content would also reveal content that was due to come up later in the course. First Class was always a great tool in that it gave differing levels of control over users – so if you had a group, a student could have full capability to delete, add, create or remove anything in that group. If that group lived in a course (a conference, in First Class terms), the student could traverse higher into the course level but only access certain functions there.

Bad reasons? Convenience is one of them. “Everyone gives students this permission set, so we suggest that you do too”.

What it often means is that we have a pre-conceived notion of how a student should behave in an online space, and by hook or by crook, we’re going to make them fit in. Pounding square pegs into round holes doesn’t make sense in life; why do we insist on it in our online course delivery vehicles? Why can’t modern systems allow for a level of complexity that respects people? In some cases you do want content to be protected, but it’d be awesome if students could add their own content as well. Students present in your course? Wouldn’t it be great to have them able to access an area to share their presentations with each other?

This isn’t such a startlingly new idea. The criticisms that were laid out in 2004 are the same as they are now. Why? The Learning Management System is solving a problem based on a narrow definition of what a classroom is, rather than what learning is. Learning in 2015 is a much more nuanced, complex thing than it was even a decade ago. More students, larger classes, a more dynamic mix of genders, races, classes and sexualities – never mind learning preferences and personal history. How can a monolithic system solve that problem? By getting more complex. Is that solving the problem though? Based on my experience, no.

So where does that leave us? Well, I suspect at the tip of the iceberg. Maybe experiments like DS106 are part of the answer. Maybe some of the interesting analytics work will help us define who students are, and how to teach to them. Maybe it’s the abolishment of the course and the implementation of a much more agile, holistic package delivering real learning based on individual need. And that’s where my thoughts circle back to student empowerment. I’m sure you can piece together where this might be going, but maybe getting rid of the moronic, capitalist structure (yes, a course is a capitalist invention to assign a monetary value to a period of time with instruction) is the first step. A first step that won’t happen, but a good first step. Then we need to get real about student empowerment. Not talking about it, but actually doing it.

Extending D2L Reports Using Python

So before I get going on this long post, a little bit about me. I fancy myself a programmer. In fact, the first “real” job I had out of college (the second time I went through college – the first time through, the job was video store clerk) was programming a PHP/MySQL driven website in 2003-ish (on a scavenged Windows 2000 server running on an HP box). So I have some chops. Of course, educational technology has moved away from being a multiple-skillset job where a little programming, a little graphic design and a little instructional design were all part of what one did; now it’s been parsed out into teams. Anyways, I haven’t had much of an opportunity to write any code for a long time. The last thing I did seriously was some CSS work on my website about four months ago, which isn’t really programming.

With that all said, the LMS administration I do requires little programming skill, as being an admin, while having its perks, isn’t exactly a technical job. I do recognize that some admins deal with enrollments and SIS integrations that require a whole other skillset – work I could do, but don’t.

As part of my admin jobs, I run monthly reports on how many items are created in D2L’s ePortfolio – these reports include date and time, username and item type. We use this as a gauge of how active students are, and cross-referenced with other information it’s incredibly useful in letting us know things like: 22,383 items were created by 2,650 unique users in ePortfolio in November 2014 (actual numbers).
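
(As an aside, those headline numbers don’t strictly need Excel. A few lines of Python can tally a monthly export – this is only a sketch, and the “Username” column header is my guess at the report format, not the actual D2L header.)

```python
import csv

# Quick tally of a monthly ePortfolio report export.
# "Username" is an assumed column header, not necessarily the real D2L one.
def monthly_summary(path):
    users = set()
    items = 0
    with open(path, newline="") as report:
        for row in csv.DictReader(report):
            users.add(row["Username"])
            items += 1
    return items, len(users)

items, unique_users = monthly_summary("eportfolio-2014-11.csv")
print("{} items were created by {} unique users".format(items, unique_users))
```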

Using Excel, I can see whether users in a given month have ever used ePortfolio before, so I can also track how many users are new to the tool per month. However, the one thing I can’t easily look at is how frequently a user might interact with ePortfolio. With over 125,000 records in total, it’s not something I want to do by hand. So I wrote a script. In Python. A language I don’t know well.

If you want to avoid my lengthy diatribe about what I’m doing and skip right to the code, it’s on GitHub here: https://github.com/jonkruithof/python-csv-manipulation-d2l-reports. If you don’t want to click a link, the logic is summarized (and sketched out) below.

Basically the code does this:

Look at the username and see if it already exists in memory. If it does, find the difference since that user’s previous entry, write that information to a CSV, and then overwrite the existing entry with this one. If it does not exist, write it to memory. Loop through, and at the end you’ll have a large-ass CSV that you have to manipulate further to get some sense of whether users are frequently using the ePortfolio tool (which might indicate they see value in the reflection process). I do that in CSVed, a handy little Windows program that handles extremely large CSVs quite well. A quick sort by username (an improvement to the Python code coming sometime soonish) and we have something parseable by machine. I’m thinking of taking the differences in creation times and creating some sort of chart that shows the distribution of differences… I’m not sure what it’ll look like.
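
For the curious, a bare-bones sketch of that logic looks something like the block below. To be clear, this is not the script from the repo above – the column headers, the timestamp format and the file names are all assumptions about what the D2L export looks like.

```python
import csv
from datetime import datetime

# Sketch of the logic described above, not the actual script from GitHub.
# Column headers, timestamp format and file names are assumptions about the
# shape of the D2L ePortfolio report export.
INFILE = "eportfolio-report.csv"
OUTFILE = "eportfolio-differences.csv"
TIME_FORMAT = "%Y-%m-%d %H:%M"  # assumed format of the creation timestamp

last_seen = {}  # username -> datetime of that user's previous item

with open(INFILE, newline="") as src, open(OUTFILE, "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["username", "hours_since_previous_item"])
    for row in csv.DictReader(src):
        user = row["Username"]
        created = datetime.strptime(row["Created Date"], TIME_FORMAT)
        if user in last_seen:
            # Difference since this user's previous entry, written out in hours.
            delta = created - last_seen[user]
            writer.writerow([user, round(delta.total_seconds() / 3600, 2)])
        # Either way, remember this entry as the user's most recent one.
        last_seen[user] = created
```

Sorting the input by username and creation time before running it (the improvement mentioned above) is what keeps those differences meaningful.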

Publisher Integrations with D2L

So I’ve tried to write this post several times. Tried to work up some semblance of why anyone would care about Pearson or McGraw Hill, and about integrating either publisher’s attempt at crafting their own LMS into the one that our institution has purchased (at no small cost).

Then it hit me while going through Audrey Watters’ great writing (in fact, go over there and read the whole damn blog, then come back here for a quick hit of my snark). The stuff that I’ve been challenging Pearson and McGraw Hill on is stuff that we should be talking about. I’ll encapsulate my experiences with both, because they tie together nicely around this idea of giving private interests (e.g. business, and big business in this case) a disproportionate say in what happens in the classroom.

The concept is really twofold. One, why are we providing a fractured user experience for students? This disjointed, awkward meld is just a terrible experience between the two parties. Log in to the LMS, then log in again (once, then accept terms and conditions that are, well, unreadable for average users), then go to a different landing page, find the thing I need to do and do it. Two, students are being held hostage, forced to purchase access (again!) to course activities that, in any sense of justice, would be available to them for free. Compounding this is the fact that most of these assessments are marked and count towards their credit. And there’s really no opt-out mechanism for students. Never mind the fact that multiple choice tests are flawed in many cases; letting the publisher decide how to assess, using the language of their choice, is downright ugly.

Pearson

So Pearson approached us to integrate with their MyLab solution about two years ago. We flatly said no – that request would have to come from a faculty member. Part of that is fed by my paranoia about integrations, especially when they require data transfers to work; part of it is that, frankly, any company can request access to our system if we allow that sort of behaviour. Finally a faculty member came forward and we went ahead with an integration between Pearson Direct and D2L. I will say that the parties at D2L and Pearson were incredibly helpful, the process is simple and we had it set up in hours (barring an issue with using a Canadian data proxy, which needed some extra tweaking to get working correctly). The issue really is: should we be doing this? The LMS does multiple choice testing very, very well. MyLabs does multiple choice testing very, very well. Why are we forcing students to go somewhere else when the system we’ve bought for considerable sums of money does the very same thing well? Well, the faculty member wanted it. What we’ve found is that most faculty who use these sorts of systems inevitably find that students don’t like jumping around for their marks.

Additionally, when the company has a long laundry list of problems with high stakes multiple choice testing, how does this engender faith in their system?

McGraw Hill

Again, all the people at McGraw Hill are lovely. None of them, it seems, can answer a question about accessibility, security or anything else straight. That may be because they’re overworked; that may be because they don’t know. None of the stuff I’ve seen is officially WCAG compliant; however, it may have been created prior to that requirement being in place, so they may get a pass on that one. The LTI connection is very greedy, requiring all the options to be turned on to function – even though D2L obfuscates any user IDs, and what gets passed bears no resemblance to an authenticated user (thanks, IMS inspector!), it still needs to be there or else the connection fails. What kind of bunk programming is this? Why is it there if you don’t use it? Why require it? Those questions were asked, in that form, in two separate conference calls, and most went unanswered. I did receive a white paper about their security, which did little to answer my direct questions about what happens to the data (does it travel in a secure format? who has access to this feed?) after it leaves D2L and is in McGraw Hill’s domain.
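
For context, a basic LTI 1.1 launch is just a signed POST of name–value pairs, and the sketch below shows its general shape. The parameter names are the standard ones from the LTI 1.1 spec; every value is invented, and the grouping of “user information” fields is only meant to illustrate the kind of options being demanded. As described above, with D2L’s obfuscation on, those fields bear no resemblance to the authenticated user – yet the tool still insists they be sent.

```python
# Illustrative LTI 1.1 basic launch parameters. Parameter names come from the
# IMS LTI 1.1 spec; every value below is made up for illustration.
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "res-12345",
    "context_id": "course-67890",
    "roles": "Learner",
    # User information fields: per the post, D2L obfuscates these so they
    # bear no resemblance to the real, authenticated user.
    "user_id": "9f2c4a7e1b",
    "lis_person_name_full": "User 9f2c4a7e1b",
    "lis_person_contact_email_primary": "9f2c4a7e1b@example.invalid",
    # OAuth 1.0a signing fields (nonce, timestamp and signature omitted here).
    "oauth_consumer_key": "example-key",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
```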

Now, I’m paranoid about data and private companies. I immediately think that if you’re asking me to hand over data that I think is privileged (here’s a hint: it’s all privileged), you should at least be able to answer why, in the name of all that is unholy, you deserve to have access to it, and, if I agree to give it to you, what happens to that data when it leaves my system and sits on yours. That should be easy to answer and not require any sort of thought. It makes me wonder if anyone is asking these questions at all. They must be? I mean, everyone is thinking about this sort of stuff, right?

D2L Regional Conference (Ontario) 2014 Recap

Thought I published this in late October, but I guess not. Whoops.

So this conference was all about ePortfolio and the Competencies tool – every presentation I went to had something to do with one of those two topics. If neither of those is interesting to you, then, well, this’ll be a boring recap.

I took part in a focus group around course-level analytics – I don’t know if D2L got much out of our conversations, but they were good ones. I hope they heard that the analytics tools are good at what they do, but aren’t robust enough for the complexity of the task.

On to the presentations.

Tracking Overall Expectations in Brightspace

This presentation was by the Hamilton Wentworth District School Board administrators who are using the Competencies tools to track outcomes. It’s interesting that they use the language of learning goals (as a proxy for outcomes) and success criteria (as a proxy for activities). That language is much clearer for K-12, but would probably be a bit bristly for higher education. John Baker was in the meeting and suggested that there will be a D2L tool to copy outcomes from the standards source (whether that be a District School Board, a Provincial standard or other). If that comes to fruition, I could see the CEAB (Canadian Engineering Accreditation Board) lining up to get their standards in, which would be a big boon for D2L clients who have Engineering departments. They also shared their videos for how to use the Competency tool.

At lunch I was sitting with a friend and John Baker came by, and I asked him about future improvements to ePortfolio – as we’ve been struggling with the presentation creation aspect of the tool. He mentioned some things that they were working on, specifically learning goals (unfortunately I couldn’t ask him more about this) and automatic page creation based on tags.

D2L ePortfolio and All About Me

Durham CDSB uses ePortfolio as a way for students to think about where they are going. Specifically, these are K-6 students – elementary students, for those who don’t reside in Canada. It seems the program is all about identity forming and goal setting. Interesting aspects, and the approach is pretty much what you’d expect: students are prompted to answer questions about their future, and they periodically answer the questions over the six years. One key thing is that they’ve developed a workflow page, which I think goes a long way to helping students with their tool use. Another interesting aspect is that the teacher selects artifacts for the student, puts them into a collection and then shares that collection with the student.

Learning Portfolio: Year One

This was my co-presentation with Catherine Swanson, the Learning Portfolio Program Manager. We talked about some of the challenges of introducing a portfolio process across all disciplines and all facets of campus, and how we did. As an institution we did fine. In many of the cases where the ePortfolio tool was paired with a good use in a course (particularly where one has to make a choice or work through a messy process to “learn”), it was ultimately successful. Where the purpose was less directed, more fuzzy, well, things didn’t work as well.

 

AAEEBL Conference Notes – Day Three

And on the third day, we rested… well, not really. We heard about and connected with some people at Deakin University who are using the D2L ePortfolio tool in a collaborative way – collaboration being a real limitation that we’ve run into, in that even if the permissions on a shared item are set to “edit”, a presentation is still not editable. It’s a weird problem that seems to stump a lot of us, so it’ll be nice to talk to the folks at Deakin about how they’re making collaboration work.

ePortfolio Meets Peer Mentoring

At McMaster, we’re entering year two of the Learning Portfolio, and one of the hopes is that students who used it in year one continue to use it in year two, as well as mentor new students and offload some of the technical support that we do. Mercy College has a stipend for their peer mentors, who have to complete bi-weekly portfolio tasks which are submitted for assessment against the learning outcomes of the program (which include leadership, and personal and professional development). It’s an interesting scenario because our student success office has something similar and is not running into the problems Mercy College ran into around the time investment of putting stuff in their ePortfolio platform (Mercy College uses Digication).

Evaluating Artifact Selection to Improve EPIC Initiatives at Wentworth Institute of Technology

The first question on my mind was: what’s EPIC? Turns out EPIC is External, Problem-based, Interdisciplinary Curriculum. What they found was that students added anything that was graded above a B and was related to their major. Students found that there were three top motivators to keep things up to date – jobs, professors and completion of the course. The really interesting thing is that they formed a student-driven committee – basically an ePortfolio Advisory Board. I think this is something that we’ve missed here at McMaster.

ePortfolio and the Whole Learner: Integrating Curricular, Co-Curricular and Experiential Learning

Guttman Community College’s Laura Gambino presented this session, which was essentially a very specific use case – Guttman’s curriculum was designed in a unique way, essentially tailored to the specifics of their student body. This includes a lot of experiential learning, curriculum tied to community-based learning, and a first-year curriculum closely tailored to the Guttman learning outcomes. They use the ePortfolio for integrating learning between the courses – to help transfer and connect knowledge. The first-year curriculum is based around a theme like Urban Ecology, and the ePortfolio helps connect the specific course curriculum to that theme by collecting reflections on the theme and documenting community-based learning days. To “standardize” ePortfolio marking, Guttman has Assessment Days to look through portfolios and review processes; they also use ePortfolio as an LMS itself.

Future of Technology in Higher Education

Bryan Alexander’s keynote was a walk through potential futures of education, depending on how you read what the trends suggest about where we might be heading. I can’t really recap the key points because it was something you should really experience as a whole event. I will say that many of the scenarios were dire for education as we know it. That’s not necessarily bad; it suggests that we might be in for some turbulent times. The best quote of the day was during the keynote:

Social media will become media; media that isn’t social will be asocial… like Blackboard.

I’m not sure that this is entirely true; I have a bit of a different view of new media, in that it doesn’t replace or supplant existing media – it diminishes the prominence of the previous media structure. Social media is diminishing phone calls, letters, e-mail and that sort of long-form personal communication. If we look at previous media shifts, TV didn’t replace radio, DVD/VHS didn’t replace film, and so on. Where it does replace existing media (digital photography replacing commercial film photography, the telephone replacing the telegraph) it’s really the same media mode – with social media it’s a multimedia thing: social media is pictures, video, text and anything else people create and share. Maybe a slight criticism, but I get what Bryan was getting at: social media will become so ubiquitous that it’s just media. There will be very little differentiation between professional media created for consumption and amateur-created media, as the quality and accessibility of professional-grade tools for creating media keep improving.

AAEEBL Conference Notes – Day Two

Day two started as a sleepy morning in Boston – coffee and a bagel helped a fair bit. I was still jazzed over some of the ideas shared the previous day, so that excitement began to sink into where my brain goes next, which is how to do this badging server at my place. Trent Batson said at one of the breaks, “The beauty of badges is the metadata behind the badge.” Now, a recap of the sessions I attended.

Lessons Learned – Mobilizing an Institution to Embrace ePortfolios to Measure Essential Learning Outcomes

Before this session even started, I was skeptical. We all know that marking with ePortfolios takes longer, so how does one actually get an entire institution to take on that extra work? I mean, there’s a reason faculty use multiple choice, scantron-style assessment methods, right? It’s faster to mark. Well, this session didn’t totally answer the question, but it did mention two key concepts that the entire conference revolved around – learning outcomes (competencies or objectives, depending on your local syntax) and curriculum change from teacher-focused to student-focused. Faculty and students were looking for a better experience, and the only way to get it was through curriculum change. Stockton College took four years to identify the correct outcomes (at the institutional level), gain consensus, map out the connections to the institution level and the course and program level, create an assessment plan and select an ePortfolio platform (Blackboard and Digication).

Digital Connections: From College to Career

This presentation was about ePortfolio use in the Business Administration program at Tunxis Community College – many of the same lessons we’ve learned in our first year: you need to make it worth something, and you need to integrate other elements of academic life into the ePortfolio process. It’s also nice to see that students there resist the ePortfolio stuff but recognize the value at the end, when they see their own growth.

Iterating on the Academic Transcript: Linking Outcomes to ePortfolio Evidence

This session was about addressing the lack of information in a traditional transcript. Stanford University has flipped the transcript – it presents information along the outcomes of the program. So it would list the outcome (say, something like ethical reasoning), the courses that assess that outcome and the result of each course. ePortfolios sit beside this process as an unofficial record of what was achieved. Drexel University is currently in the process of doing this – except they have things called student learning priorities, which are measured from three areas: co-curricular, curricular and co-op. I see a lot of McMaster’s Learning Portfolio initiative in Drexel’s approach.

Cultivating Learning Cultures: Reflective Habits of Mind and the Value of Uncertainty

“I wanted to hijack the eportfolio to be about the learning process.” – Kathy Takayama

This keynote was one of those talks that sits with you for a while. I really had to think about this one, so my notes are not that great… but the gist of it was that ePortfolios are learning spaces for metacognition. Using open language (like “so far” and “at this time”) and integrating the language of uncertainty into the course (and, obviously, the requirements) gave Kathy a pathway for students to develop metacognition about their own learning.

Design Thinking: Digital Badges and Portfolios

This was a workshop that engaged us in thinking about developing a series of badges that could be offered at our institutions, and in giving that framework some thought. We primarily worked with the Jisc Open Badge Design Kit. We did have to consider what goes into a badge to make it powerful – what we came up with was that the criteria are part of the badge, and that students provide evidence of “earning” or achieving that badge. I’ll write something in the near future about this, but we’ll be exploring this area in the next few months as our MacServe program wants to issue badges, and I have to make it happen!

AAEEBL Conference Notes – Day One

As part of my personal effort to broaden my scope beyond just LMS work (which is what I’ve done for the last three years or so), I’m trying to attend all sorts of different educational technology related conferences. A lot of the work I’m doing sits at this weird intersection of outcomes-based assessment and evidence, which ePortfolios really understand. Coincidentally, McMaster has this Learning Portfolio initiative going on. Strange how these things line up? So, going back a few months, Tracy Penny Light was working with McMaster as a visiting professor on how we can better integrate the Learning Portfolio around campus, as well as how McMaster can start participating with ePortfolio campuses around the world – one of the suggested ways to start was to attend AAEEBL’s annual conference.

So after the brilliant people at Passport Canada got me an emergency passport (I had left mine in a car service the week before – and the car service was in Buffalo, and unable to return my passport), off to Boston I went. The sessions I was in were typically 20 minutes, so the notes won’t be as extensive as what I did for Fusion. Here are my notes:

ePortfolio in Study Abroad: A Model for Engaged Intercultural Learning

A couple of interesting ideas – Indiana University Purdue offers 70+ study abroad programs, and ePortfolio use in those courses is widespread across disciplines (Humanities, Biology and Liberal Arts were mentioned specifically). These study abroad programs are aligned with graduation outcomes – I didn’t catch whether or not they were assessed for graduation, but certainly they could be. I wonder what a university or college would be like if it built in some experiential component that required students to document what they’ve learned and show evidence of that learning as part of graduating?

ePortfolios in Graduate Education – Developing Critical Reflection and Lifelong Learning

Athabasca University uses ePortfolios at the Master’s level in their Distance Education program to assess for PLA (prior learning assessment) and as a program-long piece. One of the big takeaways was that they have to work really hard to steer students away from a showcase-style ePortfolio towards a more reflective, critical-practice portfolio. I wonder: if the end goal for us is to have users engage in this critical practice, do we have to get away from the showcase-style stuff we’re doing already? Or can we accept that cognitive dissonance, and really push students to use the Learning Portfolio for more than one reason? That is going to be a tough task.

Keynote: Catalyst For Learning

So I’ve heard about the Catalyst for Learning website – it comes up fairly frequently in ePortfolio circles – and it really is a valuable resource. Some interesting ideas were brought up during the keynote; the one that really resonated was the preliminary research suggesting that ePortfolio use on campus can aid retention by 20% – which is a huge number. Another was this sub-site on the use of ePortfolio as a curriculum change agent. The keys for success in implementing ePortfolios are to find opportunities that use inquiry and reflection, and to make the ePortfolio component of those teaching acts meaningful (beyond grades).

A small portion of the keynote was spent on scaling up – and that’s something that I’ve struggled with getting my head around. There’s the typical advice – connect ePortfolio use to institutional goals, engage students (well, duh!) – but two of the scalability points resonated and bode well for what we’ve done at McMaster. The first was “develop an ePortfolio team” – which I think we’ve done very well. We’re forming a Learning Portfolio advisory committee, which will include students and student government as well as faculty and staff. The second was really nice to hear, and that was “make use of evidence of the impact of ePortfolio use”. That’s the stuff that we’re digging into this year.

Building a Sound Foundation: Supporting University Wide Learning Portfolios

My name in blurry lights.


This was my presentation about the technical supports that we put in place to support campus-wide ePortfolio use. We did some informal data collection around the effectiveness of the stuff we built – what students actually used – and the resources I expected to be used were not used as much as the stuff I felt would not be used. Basically, I’m a bad judge of what people will use.

Two tough questions I got that I couldn’t answer: Is there evidence that faculty attendance at the workshops helps delivery? Does faculty taking the workshops filter down to the students?

Make It Do What It Do: The Intersection of Culturally Relevant Teaching, Digital Rhetoric in Freshman Writing Classrooms

I will say, this pairing of presentations was definitely the odd couple of the conference. What is not astounding is that this session blew my mind. I was drawing comparisons to all sorts of communication theory, Walter Ong’s oral tradition, cultural studies, bell hooks, Public Enemy songs… just a cornucopia of stuff firing off. Also, the quote of the conference right here:

“Uploading Word Documents to a predefined template emulates a violent history of technology that reinforces existing power paradigms”

So what was my takeaway from all this? Being a white dude, I have to remember that this technology and initiative comes from a white dude perspective. How do we diversify this initiative in a way that is respectful and not tokenizing? I guess there’s some element of diversification of ePortfolios – remembering that they are not some panacea, but come from a specific perspective, and while they may be used by any person, the pedagogy that surrounds them is almost certainly from a particular perspective.

How to Design an Assessment Program Using an ePortfolio: Linking Mission to Learning

While this session was stacked at the end of what was an exhausting day, it reinforced a lot of the things that we’re doing at McMaster: ePortfolios allow a channel of communication between institution and student; data from the assessment of (program-oriented) ePortfolios aren’t useful for deep analysis, but can reveal opportunities for curriculum improvement; and rubrics used to assess ePortfolios can be linked to program-level outcomes.

 

Badging

I’ve been involved, somewhat peripherally, with the Open Badging Initiative for the last six months or so. Initially, it was a way to start thinking about breaking the LMS (Integrated Learning Platform? aw, screw it, I don’t know what the thing is called anymore) out of the box it’s in and communicating what the LMS does well to other parties. I thought it could be a way to communicate skills – to think about developing a short-hand language, through the badge, to communicate with other people. It’s really a way to check all the boxes that get me excited currently. Open standards? Yep. Mutating a system to do something other than what was intended? Yep. Visually designing an image that communicates a value to another party? Yep. Exploring the value of a systematic education? Yep.

The problem is that I essentially stopped programming in 2004, when I really didn’t need it anymore. Sure, I’ve done a few things since, like hacking together a Perl script to parse values out of a text file and dump them into a database, but to use badges at this point, or at least at my institution, I need to get up to speed with programming and with handling JSON and XML if I’m going to start tinkering with our LMS and implementing badges. Ouch. Thankfully, I’ve got a few friends and colleagues who’ll help me get there.

For those of you who don’t know, badging is a way of giving value to something by awarding an image that represents that value. At its simplest, it works like the Scouts – demonstrate something and get a badge for demonstrating that you know it. It’s basically the same proposition as grades in higher education. The neat thing is that the badge doesn’t have to be tied to a number that’s arbitrarily set by someone (a teacher) or something (a computer, schooling system…). It can be tied to evidence or not, depending on the issuer of the badge and what they demand for earning the badge. That’s where badging is cool for me.
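
For those wondering what actually sits behind a badge, an Open Badges (1.x) assertion is a small blob of JSON pointing at a badge class, a recipient and, optionally, evidence. Here’s a sketch in Python – the field names follow the published 1.x assertion spec, but every URL and value is invented for illustration.

```python
import hashlib
import json
import time

# Sketch of an Open Badges 1.x assertion. Field names follow the published
# assertion spec; the URLs and values are invented for illustration.
salt = "notreallysecret"
recipient_email = "learner@example.org"

assertion = {
    "uid": "abc123",
    "recipient": {
        "type": "email",
        "hashed": True,
        "salt": salt,
        # Hash of the email with the salt appended, per the 1.x spec.
        "identity": "sha256$" + hashlib.sha256(
            (recipient_email + salt).encode()).hexdigest(),
    },
    "badge": "https://example.org/badges/eportfolio-mentor.json",
    "verify": {"type": "hosted",
               "url": "https://example.org/assertions/abc123.json"},
    "issuedOn": int(time.time()),
    # Optional pointer to what actually earned the badge.
    "evidence": "https://example.org/evidence/abc123",
}

print(json.dumps(assertion, indent=2))
```

That optional evidence field is the part I come back to a couple of paragraphs down.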

When you earn a badge that conforms to the Open Badges standard, it can be pushed to your backpack – your central repository of badges. I’ve embedded below a portion of my backpack for you to see how one might display achievements.

What makes badges a little better than a grade value is the evidence of learning that’s listed as part of the criteria. Now, in many cases, this is not as transparent as it should be. For instance, I’ve been working through Code School’s JavaScript introduction and jQuery courses, which issue badges. Their criteria display as a page that “confirms” I completed a module. Wouldn’t this page be much better if it shared exactly what I did to earn the badge? That would be powerful. I realize that there are all sorts of considerations for student privacy, and ideally they would be able to control what is shared with the badge (maybe an on/off switch with each badge issuer to allow for a simple description of what earned the badge, or a more detailed results page). That might lead to badges being more than a symbol of learning that doesn’t clearly communicate to the viewer what was learned.