Why Can’t Students Opt-Out of Data Collection in Higher Ed?

You know, for all the talk from EdTech vendors about being student-centred (and let’s face it, LMSs and most of the other products are not student-centred), and for all the focus on collecting data from student activity – why don’t these products have an easy opt-out (being student-centred and all that) so students can refuse to have their data collected?

What makes matters worse is that the data collection is hidden from students’ view. For instance, many LMSs track time spent on content items or files uploaded. This tracking is never made explicit to the student unless they go and dig into their own data – and doing that is incredibly difficult, and even then you don’t get a complete picture of what is being collected. If I were a more conspiratorially minded person, I’d suggest this was done on purpose, to make it hard to understand the amount of total surveillance that is accessible to a “teacher” or “administrator”. I’m not. I honestly believe that the total surveillance of students in the LMS is really about feature creep, where one request turned into more requests for more information. LMS vendors, for their part, want to keep their customers happy, so why not share whatever information they have with their clients – after all, it’s the clients’ data, isn’t it?

Actually, it’s not. It’s not the client’s data. It’s the individual’s data. To argue otherwise is to claim that what someone does is not their own – a hilariously outdated and illogical idea that erases personal agency.

The individual human user should be allowed to choose whether to share this data – with teachers, with institutions, or with the external companies that host the data under agreements the student probably never knew were signed. There’s an element of data as a human right that we should be thinking about.

As an administrator, I have a policy I have to adhere to, and a personal set of ethics that frankly are more important to me (and more stringent) than the obscurely written-in-legalese policy. With an unscrupulous, or outright negligent, LMS administrator, all bets would be off. They could do things in the LMS that no one, except another administrator, could track. Even then, the other administrator would have to know enough to look through hundreds of different changelogs, scattered across different tools and different courses, and essentially conduct a forensic search that could take a good long time before any damage was undone. Checks and balances (a turn of phrase I use purposefully, as I think we’ll see what a lack of checks and balances looks like in the US over the next few years) could be implemented as part of using the system, but aren’t, and that leaves education in a precarious situation.

Letting students say, “I only want my data shared with my College” or “I only want my data shared with company X for the purposes of improving the system” shouldn’t be hard to implement. So why hasn’t it been done? Or even talked about?
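To give a sense of how small the surface area is, here’s a minimal sketch of what a per-student consent record could look like, in Python. This is purely illustrative – the names and the default-deny rule are my own assumptions, not any vendor’s actual data model or API:

    # Hypothetical per-student data-sharing consent record.
    # Illustrative only -- no LMS actually exposes this model.
    from dataclasses import dataclass

    @dataclass
    class DataConsent:
        student_id: str
        share_with_institution: bool = False  # "only my College"
        share_with_vendor: bool = False       # "company X, to improve the system"
        share_with_instructors: bool = False

        def allows(self, audience: str) -> bool:
            # Default-deny: anything not explicitly granted is refused.
            return getattr(self, f"share_with_{audience}", False)

    # Before logging something like time-on-page, the LMS would check:
    consent = DataConsent(student_id="0042", share_with_institution=True)
    if consent.allows("institution"):
        pass  # record the event for the College
    if not consent.allows("vendor"):
        pass  # nothing leaves for the hosting company

A single check before each write, and the student – not the contract – decides where the data goes.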

In my opinion, the data that gets harvested (anonymously, of course) provides more value to the company – it tells them how people use the system – than the optics of an opt-out button would. It allows Blackboard to say how instructors use their system. We could talk about how terrifying that blog post is (instructor use is a proxy for how students use the system, because LMSs give instructors the power to construct the student’s experience), or how devoid of solid analysis it is. I’ll deal with the analysis later, so let’s just consider how this is entirely without context. Blackboard-hosted sites have been harvested by Blackboard (probably with the consent of the people who signed the contracts – not the instructors who actually create things on the system, or the students who engage with it) to tell you how you teach. In one of the cited pieces, Blackboard says they used this data to improve their notifications. If I put a proposal through ethics review saying I was going to look at improving notifications, and then released a post about how people used the whole system, that would be a problem. It may very well be within Blackboard’s rights (and I suspect it is), but it is ethically murky. The fact that they’ve released it to the public allows the public to ask these questions – but no one really has? Or if they have, I missed it.

The fact that Blackboard can do this (and I’m talking about Blackboard because they did it, but there’s a similar post from Canvas making the rounds, about discussion posts with video being more engaging or some such idea) without really clearing it with any client is chilling. It also constrains how we perceive we can use the LMS – it sets the standard for how it is used.

D2L Badges (If I Designed the Badges In the Image of the Work I’m Doing)

I’ve been working with badges for over a year now, and we’re finally starting to put together a program that will help identify skills, have students award other students badges, and essentially make up a co-curricular record for students.

Internally we struggled (and still struggle) with the software and the design – early iterations had easy-to-achieve badges that didn’t mean much; later ones were a bit more robust. I wanted to share what we’re doing, but people are also looking at researching our efforts, so I’ve had to abstract things a little to share the design, but not the specifics. So I looked around and thought about the things I know that might be transferrable. At the same time, D2L announced that the Learning Environment for continuous delivery clients will get a badging application in the September update. Aha! The badges of D2L. It’s not a perfect analogy, but bear with me – I think there are some important parallels that should be easy to decipher. First, some history.

Many people who have been through an LMS review will tell you that selecting the software is not just about features; it’s more about the people behind the software. Much like badges, it’s not about the image they project, but the metadata they contain. So for the purposes of “badging D2L” I’ve decided to take the well-known public faces of D2L and apply badges to them. It works as a parallel for the work we’re doing around building a co-curricular record.

The top of the hierarchy is easy: it’s the Top Dog badge. This badge is awarded to people who are involved in a wide variety of things, evangelize the D2L experience, are endlessly positive and feel like a trusted companion or a good friend. Who would be the best person to represent the Top Dog badge? Founder and CEO John Baker is clearly the most well-known of all D2L employees. Most customers and industry analysts (amazing that EdTech has analysts now) will think of John Baker when thinking about D2L. Always interested in his clients, and endlessly positive about the D2L products, John fits the top of the hierarchy (and the characteristics of our badges).

Just underneath the Top Dog badge, we have three others that one has to earn before getting the Top Dog badge: the Community badge, the Productivity badge and the Development badge.

The Community badge could be earned by contributing to and developing a sense of community around D2L products.

After John Baker, Barry Dahl is in the #2 spot of people well known within D2L – whether in the EdTech community (via the Desire2Blog he authored prior to joining D2L) or the Teaching and Learning community (via the D2L Brightspace Community, or the many regional forums that D2L hosts around North America). He has been a D2L award winner multiple times (prior to going to work for them, of course), been on the main stage at Fusion multiple times, done probably upwards of a hundred webinars, and is just generally well known throughout the D2L customer base. In fact, I’d say that if you became a client of D2L in the last 5 or 6 years you’d be very likely to know Barry – much like how I got to know Barry, through the community around D2L’s products. Never mind the fact that Barry was a “face” of D2L before being hired by D2L. He was critical of the product at times, but always out there helping the community around the D2L Learning Environment.

For the Productivity badge, the earner has to take part in different aspects of a project, responding to other members’ (and the community’s) ideas in an inclusive, respectful and productive manner.

The Productivity badge is represented by Ken Chapman. Ken, as VP of Market Strategy, is certainly one of the more prominent D2L people, due to his longevity with the company and his engagement in many different forums, including Fusion, webinars, client visits and the like. Funny: I went to D2L’s site to grab a picture of Ken for the badge, but he’s not listed on the “Leadership” page of their website.

The last badge in the first tier would be the Development badge. This badge would be earned by demonstrating development of oneself, the community and/or the project.

The Development badge is represented by Jeremy Auger, Chief Strategy Officer at D2L. Another long-term employee (one of the originals, I think) who has had pretty high visibility at Fusion and in all the other venues where you have a chance to get to know these people. One caveat here is that lots of D2L customers know who Jeremy is, but I’m betting very few can actually tell you what he does (I’m sure it’s something…). The badge is sufficiently vague, like a title bestowed on a chief strategy officer. I kid, I kid – clearly D2L’s strategic efforts in their three markets (K-12, Higher Ed, Corporate) must pass through Jeremy.

As you might well know, badges can be cumulative or solitary. In this imagined D2L badging hierarchy (as opposed to McMaster’s approach to badging co-curricular activities), they would be cumulative. So the next level down might be made up of two or more badges that could earn you the Community, Productivity and Development badges. If we were to drill further down D2L’s hierarchy, lesser badges might be represented by some other well-known D2L people.
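If you wanted to model that cumulative structure in software, it’s little more than a prerequisite map. A hypothetical Python sketch – the names come from this post, and this is emphatically not how D2L’s forthcoming badging tool works:

    # Cumulative badges as a prerequisite map: earning all three
    # first-tier badges unlocks Top Dog. Illustrative only.
    PREREQS = {
        "Top Dog": {"Community", "Productivity", "Development"},
        "Community": set(),      # solitary badges have no prerequisites
        "Productivity": set(),
        "Development": set(),
    }

    def can_earn(badge, earned):
        # A badge can be earned once every prerequisite badge is held.
        return PREREQS.get(badge, set()) <= set(earned)

    earned = {"Community", "Productivity"}
    print(can_earn("Top Dog", earned))  # False -- Development still missing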

For the Development side of things: probably Nick Oddson, Sr. VP of Product Development. Nick hasn’t been at D2L all that long, but he has made high-profile announcements at the past couple of Fusion conferences. If he sticks around, he will increasingly become one of the faces of D2L. Another person might be Paul Janzen, who’s very active in the Valence community and on the development side of things.

For the Productivity side of things: Mike Moore, another employee who came out of education, gets lots of face time related to the Insights analytics product line and learning outcomes (which are currently undergoing a revamp).

For the Community side of things: maybe a controversial pick (depending on when you became a D2L client you might look for Jason Santos here, but in my opinion he’s too new), but I’d choose Terri-Lynn Brown for this spot, since it’s based on who the best-known employees are, not the most “important” (whatever that means). Terri-Lynn is a Calgary resident and a former K-12 educator who is still very well known within that market, and across Canada.

After that, we could go on with this exercise, but I think you get the idea. If I had made this post a few years ago, it would be a radically different post – filled with names that resonated maybe a little more. I wonder if the changes we’ve felt as clients of D2L will spill forward into the product.

Like badges spilled forward into the latest update.

Fusion 2014 – Day Two Recap

As anyone who has gone to a conference before will recognize, it’s a bit more like a marathon than a sprint – you really have to pace yourself to get everything in and pay attention to the things you want to. I will say that, for the second year in a row, the food at the Fusion conference was really good. I ended up talking with Paul Janzen of D2L about our impending PeopleSoft integration and the summer of integrations (Blackboard Collaborate, Pearson, maybe McGraw Hill, iClicker and Top Hat) we’re doing at McMaster. Ken Chapman also joined us at breakfast and asked a little about what we’d like to see out of e-mail. Frankly, I hadn’t thought about e-mail in years, because we’ve been mired in hell trying to get the Google Mail integration working with IT (not that it doesn’t functionally work, but IT has stalled us on admin-level access since we asked a year and a half ago). I said that I’d personally prefer the system not do e-mail at all, but that would be a difficult task considering we have people who segment their academic teaching e-mail on the LMS rather than on their institutional e-mail. The problem for us is that we’re currently not configured to allow external e-mail. It will be interesting to see if IMAP/POP3 support comes to Brightspace sometime in the future – which would make a lot more sense.

Insights Focus Group

I wasn’t sure if I was going to be of any help in this, but I thought that, seeing as we’ve run some reports with the Insights tool, maybe I could glean better ways to deal with it. Basically, people had three concerns: org-level reports can’t be run against large data sets (one of our issues as well), the interface needs improvement, and ad hoc reports take too long to create. So those issues were at least noted. Let’s see how they get addressed going forward.

Blended Learning, Learning Portfolios and Portfolio Evaluation

Wendy Lawson and I co-presented this – however, it was mostly a Wendy show. She lived it, so she should have the floor. Basically, this presentation outlined our collaboration on Wendy’s Med Rad Sci 1F03 course – a professional development course for first-semester, first-year science students going into the Medical Radiation Sciences (X-Ray, Sonography, etc.).

We used the ePortfolio tool as the presentation mechanism, which I think worked well. I’m not sure we had a good flow for what we were going to show on each page, but other than that, it was a risk we felt was small enough not to impede our presentation.

We talked about how this redesigned course could use the Learning Portfolio to deliver the course in a blended manner (using ePortfolio/Learning Portfolio activities as the one-hour blended component) and how the students did with it. After working with Wendy on this presentation, I can say the stuff her students did was miles above what we see on average, and I think the weaknesses she acknowledged in the course will be addressed next year.

Vendor Sessions

I honestly skipped these because, well, I’m not interested in getting more spam in my work e-mail. Plus, my wife was having surgery (everything’s good!) at this time so I wanted to call and make sure we connected before surgery began.

Connecting Learning Tools to the Desire2Learn Platform: Models and Approaches

Attending this session was particularly self-serving – I wanted to say hi to the presenter, George Kroner, who I’ve followed on Twitter for what seems like a million years, and the Valence stuff is something I feel I should know. I have a decent enough programming background – I can hack things together. So why am I not actually building this stuff?

George walked through UMUC’s process for integrating a new learning tool, which can be broken down into three steps:

  1. Evaluate tool fit
  2. Determine level of effort to integrate
  3. Do it/Don’t do it

It seems so simple when I write out my notes, but it’s a really interesting set of steps. For instance, when determining the level of effort to integrate, you also have to think about post-integration support: who supports what once you integrate? Is it the vendor? Or your support desk? What’s the protocol for resolving issues – do people put in a ticket with you, and then you chase the vendor? Is the tool a one-time configuration that imports/exports nicely, or does it need to be configured every time it’s deployed in a course?

We did a Valence exercise next, and I merely want to link two tools that I’ll need when I circle back to doing some Valence work (soon, I swear!):

https://apitesttool.desire2learnvalence.com/ and http://www.jsoneditoronline.org/. The API test tool is a no-brainer really – I knew about it before but never knew where it was; it’s incredibly helpful for debugging your calls and seeing what you might get back. JSON Editor Online is new to me, and something I really, really needed. I’m a JSON idiot – for some reason, JavaScript never resonated with me. I’ve always preferred Python or PHP for web scripting, despite the power of JavaScript. Guess I’ll have to put on my big boy pants and learn it all over again. Maybe Dr. Chuck will do a JSON course like he’s done with Python?
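Since Python is where I’m comfortable, here’s roughly the loop I expect to use when I get back to Valence: let the API test tool generate a signed URL, fetch it, and pretty-print the JSON instead of pasting it into the online editor. A quick sketch – the host and the signed query string below are placeholders (the test tool produces the real thing), and whoami is just the simplest route to smoke-test with:

    import json
    import urllib.request

    # Paste the full signed URL from apitesttool.desire2learnvalence.com here;
    # this value is a placeholder, not a real host or token.
    SIGNED_URL = ("https://myschool.brightspace.com"
                  "/d2l/api/lp/1.0/users/whoami?x_a=...&x_b=...")

    # Fetch the route and parse the JSON body.
    with urllib.request.urlopen(SIGNED_URL) as resp:
        data = json.load(resp)

    # indent=2 gives the same readable view jsoneditoronline.org does.
    print(json.dumps(data, indent=2, sort_keys=True))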

Social Event at the Country Music Hall of Fame

The evening social event was great – the Hatch Show Print poster is actually something I might just hang in my office. There was some shop talk, some fun stuff, a drone… Oh yeah, this happened: