Dealing With D2L’s Data Hub with PowerBI

Where I work, we use D2L Brightspace as our LMS. We also used to use Insights, their data dashboard product, but we didn’t find value in it and turned it off in 2016 or so (just as they were revamping the product, which I think has gone through a couple of iterations since the last time I looked at it). The product in 2016 was not great, and while there was the kernel of something good, it just wasn’t where it needed to be for us to make use of it at a price point where we as an institution would see value. I will note that we also weren’t in a place where, as an institution, we saw a lot of value in institutional data (which has changed in the last few years).

Regardless, as a non-Performance+ user, we do have access to D2L’s Data Hub, which has somewhat replaced the ancient Reporting tool (that thing I was using circa 2013) and some of the internal data tools. The Data Hub is great, because it allows you access to as much data as you can handle. Of course, for most of us that’s too much.

Thankfully, we’re a Microsoft institution, so I have access to PowerBI, which is a “business intelligence tool” but for the sort of stuff that I’m looking for, it’s a pretty good start. We started down this road to understand LMS usage better – so the kind of things that LMS administrators would benefit from knowing: daily login numbers, aggregate information about courses (course activation status per semester), average logins per day per person and historical trending information.

The idea is that by building this proof-of-concept for administrators, we can then build something similar for instructors to give them course activity at a glance, but distinct from the LMS. Why not just license Performance+ and manage it through the Insights report builder? Well, fiscally, that’s more expensive than the PowerBI Pro licensing – which is the only additional cost in this scenario. Also, the Insights report builder is in Domo, and there are frankly fewer Domo experts out there than PowerBI experts. And the Insights tool is being revamped again… so there’s another reason to wait on that potential way forward. Maybe if the needs change we can argue for Performance+, but at this time it’s not in the cards, as we’d need to retrain ourselves on top of migrating existing workflows and managing different licenses. It would admittedly bring one efficiency: once built, the reports wouldn’t need manual updating, and they’d be managed inside the LMS, with that extra layer of security.

Where I’m at currently is that I’ve built a couple of reports and put them on a PowerBI dashboard. Really, relatively simple stuff. Nothing automated. Those PowerBI reports are built by downloading the raw data in CSV format from Brightspace’s Data Hub once a week, extracting the CSVs from their Zip archives (thankfully Excel no longer strips leading zeros automatically, so you can migrate or examine the data in Excel as a middle step), placing them in a folder, starting up PowerBI and then refreshing the data.
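The weekly unzip-and-file step is simple enough to script. Here’s a minimal sketch of it (the folder layout and function name are placeholders, not our actual structure):

```python
import zipfile
from pathlib import Path

def extract_data_hub_zips(download_dir: str, target_dir: str) -> list:
    """Unpack the CSVs from every Data Hub Zip in download_dir into target_dir."""
    out = Path(target_dir)
    out.mkdir(parents=True, exist_ok=True)
    extracted = []
    for zip_path in sorted(Path(download_dir).glob("*.zip")):
        with zipfile.ZipFile(zip_path) as zf:
            for member in zf.namelist():
                # Only pull the CSVs; skip any other files in the archive
                if member.lower().endswith(".csv"):
                    zf.extract(member, out)
                    extracted.append(member)
    return extracted
```

Point it at the browser’s download folder each week and PowerBI’s refresh picks up the new files from the target folder.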

The one report that I will share specifics about simply counts the number of enrollments and withdrawals each day, but every report we use follows the same process: download the appropriate data set, extract it from the Zip, and place it in the correct folder within the PowerBI file structure on a local computer. PowerBI can then publish the dataset to a shared Microsoft Teams site so that only the appropriate people can see it (you can also manage that in PowerBI for an extra layer of security).

I obviously won’t share all the work we’ve done in PowerBI; some of the data is privileged and should be for LMS admins only. However, I can share the specifics of the enrollments and withdrawals counts report because the dataset we need is, not shockingly, “Enrollments and Withdrawals” (not the Advanced Data Set). We also need “Role Details” to allow filtering based on roles. So if you want to know how many people are using a specific role in a time period, you can do it. We also need “Organizational Units” because, for us, that’s where the course code lives, which contains the semester information – and if we ever need a display of what’s happening in a particular course, we could do that too. Your organization may vary. If you don’t have that information there, you’ll need to pull some additional data, likely from Organizational Units and Organizational Parents.
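Conceptually, the three data sets link up on their ID columns, the same way the relationships get modelled in PowerBI. A rough pandas equivalent of that join, for anyone sketching this outside PowerBI (the column names here – RoleId, OrgUnitId, RoleName, Code – are assumptions; check the headers in your own exports):

```python
import pandas as pd

def join_data_hub_sets(enrollments: pd.DataFrame,
                       roles: pd.DataFrame,
                       org_units: pd.DataFrame) -> pd.DataFrame:
    """Attach role names and org unit codes to enrollment rows.

    Column names are assumptions based on typical Data Hub exports --
    verify against your own files before using.
    """
    return (enrollments
            .merge(roles, on="RoleId", how="left")       # brings in RoleName
            .merge(org_units, on="OrgUnitId", how="left"))  # brings in Code
```

With the org unit code attached, filtering to one semester is a one-liner, e.g. `joined[joined["Code"].str.contains("2023F", na=False)]` (assuming your codes embed the semester the way ours do).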

In PowerBI you can use the COUNTA function (which is similar to COUNTIF in Excel) to create a “measure” for the count of enrolls and a separate one for unenrolls. Those can be plotted (a good choice is a line chart with the two measures as separate series). Set up filters to filter by role (translating them through a link between Enrollments and Withdrawals and Role Details).
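For the curious, a rough pandas equivalent of those two measures – one count per day per action – would look like this (the “Action” and “Timestamp” column names are assumptions; adjust to your export’s actual headers):

```python
import pandas as pd

def daily_enrollment_counts(enrollments: pd.DataFrame) -> pd.DataFrame:
    """Count enrolls and withdrawals per day -- the equivalent of the
    two PowerBI measures, one column per action type.

    Assumes an 'Action' column holding 'Enroll'/'Withdraw' values and
    a 'Timestamp' column; names are assumptions, not Data Hub gospel.
    """
    df = enrollments.copy()
    df["Date"] = pd.to_datetime(df["Timestamp"]).dt.date
    return (df.groupby(["Date", "Action"])
              .size()
              .unstack(fill_value=0))  # one row per day, one column per action
```

Calling `.plot()` on the result gives essentially the same two-line chart.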

I will note here: drop every last piece of unused data. Not only is it good data-security practice, but PowerBI Pro has a 1 GB limit on its dataset size, which is a major limitation on some of the data work we’re doing. That’s where we actually start to get into issues, in that not much can be done to avoid it when you’re talking about big data (and the argument for just licensing Performance+ starts to come into play). While I’m a big fan of distributing services and having a diverse ecosystem (with expertise able to be drawn from multiple sources), I can totally see the appeal of an integrated data experience.

Going forward, you can automate the Zip downloads through Brightspace’s API, which would remove the tedious weekly update (or daily, if you have Performance+ and get the upgraded data set creation that’s part of that package). Also, doing anything that you want to share currently requires a Pro license for PowerBI, which is a small yearly cost, but there’s some risk as Microsoft changes bundles and pricing, and that may be entirely out of your control (like it is where I work – licensing is handled centrally by our IT department, and cost recovery is a tangled web to navigate). Pro licenses have data limits, and I’m sure the Enrollments data is sitting at 2 GB and growing. So you may hit some data caps (or, in my experience, it just won’t work) if you have large usage. That’s a huge drawback, as most of the data in large institutions will surpass the 1 GB limit. For one page of visualizations, PowerBI does a good job of remembering your data sanitization process, so as long as the dataset itself doesn’t change, you are good.
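As a sketch of what that automation might look like: once you have an OAuth token and a data set’s download link (the “list data sets” call that produces that link, and its response shape, are things to confirm against D2L’s API reference for your LMS version), the fetch itself is straightforward:

```python
import requests
from pathlib import Path

def download_data_set(download_url: str, token: str, dest_dir: str) -> Path:
    """Stream one Data Hub Zip to disk.

    The download_url would come from Data Hub's list-data-sets API call;
    that endpoint and its JSON shape are assumptions to verify, but the
    authenticated download itself is just an HTTP GET with a bearer token.
    """
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    out_file = dest / download_url.rstrip("/").split("/")[-1]
    resp = requests.get(download_url,
                        headers={"Authorization": f"Bearer {token}"},
                        stream=True, timeout=120)
    resp.raise_for_status()
    with open(out_file, "wb") as fh:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            fh.write(chunk)
    return out_file
```

Chain that with the extraction step on a scheduler and the weekly manual routine disappears.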

Now, if you’re looking to do some work with learning analytics and generate the same sorts of reports, Microsoft may not be the right ecosystem for you, as there aren’t many (if any) learning analytics systems using Microsoft and their Azure Event Hubs (the rough equivalent of AWS’s Kinesis Firehose – both are ingestion mechanisms to get data into the cloud). That lack of community will make it ten times harder to get any sort of support, and Microsoft themselves aren’t particularly helpful. Just looking at the documentation, AWS’s is much more readable and understandable without a deep knowledge of all the related technologies.

A Weird Tidbit from 2009

From: Jon Kruithof
(email)
Mohawk College, Hamilton, ON, Canada media education
Date: 12/08/2009 06:22:08
A prevailing undercurrent is that education, like medicine, is a bitter pill to swallow. I agree with Ryan Cameron, and do think that educators have to be media moguls in some sense. It seems to me that e-learning in general will not change, but the educator behind it will have to adapt. The educator will have to be a scriptwriter, web designer, filmaker, image manager and do it well, and for less. With that critical consumption of media and the ability to discern fact from fiction will become more important. As we move further into the future, and into virtual worlds, identity and who one is will become critical to manage and control.

From Downes’ work here: https://www.downes.ca/post/53334

Ryan Cameron might be this particular person, and dang, I got it right in some ways. Look at the successful MOOCs – they all have personality and tell stories while still cramming the learning in. I’m pretty sure this comment pre-dates Khan Academy (actually, no, it started in 2008, but the comment predates its ascent into the popular consciousness in 2012). And now we have the AI apocalypse (or more likely, the AI hype curve, followed by mainstream acceptance, much like Google Search has become the de facto internet search). I don’t know if we’re at the tipping point of every educator needing to be well-versed in media techniques. But consumers and creators of media need to be aware of AI, and that everything they do can be taken from them, analyzed, dissected and filed away for acontextual (not a real word) retrieval.

And the one thing computers still don’t have on people is identity – and that’s where we can really retain our humanity. If you’re worried about AI, right now, it’s got a little (prefabricated) personality. Once it starts to develop a cohesive one, one that doesn’t change based on the varied inputs, then you need to worry. In my most nihilistic, I think it’s perfect that this endgame-capitalism that values a race to a zero sum for everything is essentially building an entire replacement for the human that built it. I still have hope that we can cast off this AI-hype like blockchain and VR and treat it like the Google replacement it’s trying to be.

Why are Third Party Vendors Such Arses?

The short answer is that they’re not. They’re experiencing culture shock between higher education and capitalism. Their goals and higher education’s are entirely different, and sometimes diametrically opposed. Sometimes they’re not, but I’ll leave that for the Marxists out there to critique.

I’ll outline a few examples – no names, except where I need a name for a tool because it’s too hard to keep using “middleware,” which could mean anything from a database to an API connector to something like IFTTT. I’m not writing this to shame edtech vendors or name-call, but if you are a vendor and you do these sorts of things – maybe consider stopping.

Hyper aggressive sales.

You’ve all seen this, or gotten emails day after day from the same vendor telling you about their great product. Or, you’ve been a teacher, and they call you periodically. Or more frequently. Daily even. I’ve gotten relentless edtech bros emailing me on LinkedIn then at work. By the way, if you do this and it’s part of your company culture, you do know that I mark that stuff as spam, right? All it does is create one of two things for a relationship… you either gain someone who just capitulates to you (but resents you) or you anger someone (who then holds a grudge for longer than an eon). Neither of those are great, but one is a sale. In an extreme case, you might get a cease and desist from a CIO who is tired of your harassment.

Circumventing process.

EdTech workers have definitely been asked for this sort of stuff continually. Move fast and break things is not a good mantra for education, nor public institutions. If your company wants to do it your way, rather than a standard LTI 1.3 kind of way, and then refuses to budge because your API way (to simply manage single-sign-on!) is already built, you’re an ass. If you are ever told, “we don’t just enable every option in LTI 1.3 settings” and you turn around and suggest you need all those data options – you most definitely don’t. If we have a process that we tell you takes months to go through, no, it can’t go quicker. It’s literally my job to ensure the security of the data in the system you’re trying to connect to, so work with me, not against me. It’s not my fault you left it to the last minute before semester and are trying to rush the integration through, literally using teachers as a sacrificial wedge to bypass security, privacy and accessibility. You know what that makes you.

Oh, and when the vendor agreement allows an instructor to sign off for an entire institution? That’s no good.

Data greediness.

Outlined above a little bit, but when you ask for an API integration, you should be able to easily answer “What API calls are you making?”. If you have an LTI 1.3 integration, and we ask “what do you use this data for?” you should be able to answer that within minutes of asking. Dancing around that question just raises my suspicions. You might actually need all that data. In 20 years of doing this work, and probably working on 100+ integrations with the LMS and other tools, it’s happened twice. Those two vendors were very quick to respond with what they use each data point for, how long they kept it, and why they needed it for those functions. That’s excellent service. Also that wasn’t the sales person… so yeah. Oh, and 99% of integrations between the LMS and something else can be done with LTI 1.3. Vendors out there, please use the standards. And get certified by IMS Global/1EdTech. It goes a long way to building your reputation.

Third-party malfeasance.

OK, it’s not that bad, but a new trend I’ve started seeing is a vendor using another vendor to manage something (usually data). EdLink is the sort of thing I’m thinking about here. EdLink allows easy connections between two unrelated vendors with no established connection method. So think, connecting McGraw Hill to your Student Information System (not the actual example I’m thinking of to be clear, we don’t have, or want, to connect McGraw Hill to our SIS). To be honest, this doesn’t bother me as much as some of the other grievances I’ve got – but obfuscating your partnerships and running all your data through a third-party that we don’t have an agreement with, is definitely something that raises an eyebrow or three. As one starts to think about what-if scenarios (also my job) it makes clarity around who has your data at what time and for how long all the more difficult. The service doesn’t bother me, as long as the middle-person in the scenario is an ethical partner of the vendor you’re engaging with. In many cases, you need to have a level of trust in the partner, and if they’ve shown themselves as less than trustworthy, then well, you’ve got a problem.
Again, I’m sure EdLink is fine, but when a vendor uses EdLink, and is presented with that fact, it’s a challenge for security experts as they not only have to do one analysis, but two. I understand why a vendor might try to frame EdLink as their own service, but it’s undeniable that it isn’t. So just be honest and upfront. You may pass by a team that doesn’t prioritize this level of detail, but we are not blind. We will figure it out.

One other big challenge with third-parties acting on behalf of a vendor is that if there’s a problem, you typically have to go through the vendor to access the middle person’s support team to get it rectified. This adds a layer of complexity AND time to something that was likely intended to save time and hassle for the vendor.

Engagement = Coercion?

I started writing this in 2018, and I still struggle with the ideas that I’m trying to express with this idea. Ultimately I’m talking about the power structure in classrooms, or online environments and how those who are uncomfortable with those power structures can do very little with the environments themselves to dilute the power differential.

I’m often troubled by the term engagement. If attention to a thing is the most important commodity in modern capitalism, is engagement worth more than mere attention? We have seen with things like video games that attention is one thing, but engagement is a whole other metric. There’s an emotional component to engagement that isn’t there with mere attention.

And is engagement a coercion strategy? Are we asking students to become invested in something based on the value we think it will add to a student’s learning, even using marks as a lever to get students to do what we want them to do?

PAY ATTENTION!

I totally remember having that yelled at me. The lie embedded in that line is that it’s missing the obvious – pay attention to what? In most cases, the missive should read “Pay attention to me!” No wonder I did so mediocre in school – I don’t really react well to that, and I think most of us don’t react well to that sort of pandering. That sort of coercive attention-grabbing never worked well for me. Maybe it’s the belief I hold that “good work gets noticed,” and as I’ve gotten older I still think that’s somewhat true. But it might be 20 years later, after it’s made the rounds and the artist has died. Or the author. Or the creator. You see that on YouTube now: someone put out a video in early 2009, it comes up in a search, and it’s great. And has 234 views. You have a whole algorithm in the way of finding things organically now, with no real way to control how you are served content (and that’s probably the Google killer – an adjustable algorithm that feeds you not only what you want but how you want it). Content creators that are successful aren’t successful because of their content per se; it’s that they know how to manipulate the algorithm to get you to see the content.

Which brings us back to metrics and measurements. Modern engagement metrics have shifted language such that it’s really attention, not engagement, that they’re measuring. It literally took close to four years for me to figure out what bothered me about engagement metrics. Now I know it’s not about engaging with someone; it’s about getting them to pay attention to you. It explains a lot about why modern advertising methods are all about “engagement,” which is theoretically deeper than “attention” – but the metrics they use (click-throughs, time on page) really don’t speak to engagement so much as attention.

Now when I’m talking about engagement in a classroom that’s likely different than engagement in an online environment – but engagement in an online environment is measured using the same methods that are about attention – which is a bit of a passé way to look at engagement in a classroom. Paying attention to a lecture is different than engaging with a lecture – and paying attention to a post in the LMS is different than engaging with a post in the LMS. Engaging is more associated with doing something in an educational context. So in many ways applying the common way of measuring engagement is not going to elicit much useful information in an educational context. Yet I still see people building courses in such a way to try and leverage attention, rather than engagement.

What do you think will engage people better? A well formed discussion question, or a lengthy video with interactive “engagement” in the form of questions? I would think that a well formed discussion question might linger longer in one’s mind.

What’s New?

Well, another semester start, and still the tickets pour in. Another season’s change and change is in the air. The department I’m in has gone through a self-study, and the results suggest that we’re not great at communicating and we need a strategy. Those are good suggestions and things we can enact. In other news, I’ve decided to apply to begin my Master’s, more on that when I’m accepted. I’m also a part of the Open Education Ontario cohort of Open Rangers, which is exciting and a touch scary. Maybe not scary, but I’m such a novice at open, even though I’m wholeheartedly behind the idea of open.

I think I’ll start blogging again, at least intermittently. There’s so much good going on right now, oh wait…. well not politically. But education is in a place that can really help with some of the struggles online right now.

I’ve spent a lot of time in my own head struggling with what I want to do with my life. I have felt trapped, unchallenged and basically in a rut. I couldn’t do what I’ve done in the past – quit and find something else – because I’ve been the primary breadwinner and, you know, wanted to keep the house and keep eating. I’m also turning 45 in a couple of weeks, and that’s weighing heavily on me as well. So instead of trying to start something exciting and new with a significant impact on my life, I decided to stay and try to do something. Admittedly, there have been some things that have made a pathway forward available, but I don’t want to say more in case the powers that be go in a different direction. Needless to say, if things work out, I’ll be challenged. I also think I’m ready for that challenge.

Oh, I started a music blog and podcast called General Admission. It’s only a monthly thing as a “creative” outlet (which involves buying media toys like lavalier microphones and podcast hosting) that really talks about punk primarily. If you only want to sample the best, check out Podcast #2, which talks about a similar riff throughout several different genres of music.  I hope to get to that level of thought and interest for every episode, but admittedly, if you’re not as into the New Bomb Turks, or The Dicks (forthcoming episode) then you might find that not every episode is to your liking.

Dead Drop no.2

A second hodgepodge of things I’ve found on the Internet.

How to Build a Twitter Bot

http://www.zachwhalen.net/posts/how-to-make-a-twitter-bot-with-google-spreadsheets-version-04/

https://www.labnol.org/internet/write-twitter-bot/27902/

https://www.fastcompany.com/3031500/how-twitter-bots-fool-you-into-thinking-they-are-real-people

The third article is the most interesting because it’s from 2014 (remember Klout scores?) and really shows how little (and how much) has changed in the Twitter bot universe. Essentially the heart of the strategy is the same, but mixed with a way to learn (through AI, or natural language processing, which is clearly being done in some more sophisticated campaigns) and adapt, I suspect we are on the cusp of a future where everyone you meet online might not actually be a person. My Philip K. Dick collection became a lot less sci-fi and a lot more prescient.

More Bots (Cisco Spark/Microsoft Teams):

https://msdn.microsoft.com/en-us/microsoft-teams/botsconversation

https://developer.ciscospark.com/bots.html

Both Cisco Spark and Microsoft Teams are available where I work. Both have vague collaboration mandates; both can be used for communication. At this point, Spark might be more advanced with third-party enhancements and bots. Teams might end up being the platform of choice if students are involved. I’m hoping to build a chatbot to handle the common sorts of requests our LMS support team gets (i.e. the stuff we have a predefined reply for in our ticketing system, which isn’t attended to after 4:30 PM).

Natural Language Processing:

https://research.google.com/pubs/NaturalLanguageProcessing.html

https://medium.com/@jrodthoughts/ambiguity-in-natural-language-processing-part-ii-disambiguation-fc165e76a679

https://chatbotsmagazine.com/the-problem-with-chatbots-how-to-make-them-more-human-d7a24c22f51e

Natural language is key for the chatbot above, and of course the first attempt will be ruthlessly primitive I’m sure.

Chatbots in General:

https://www.theguardian.com/technology/2016/sep/18/chatbots-talk-town-interact-humans-technology-silicon-valley

Blackboard Thinks They Know How You Teach

In November of last year a blog post by Blackboard was making the rounds. A Blackboard study documented how instructors use their system which was somehow conflated as being equivalent to knowing how teachers teach. We could talk about how terrifying this blog post is, or how it’s devoid of solid analysis. Instead I’d like to turn over a new leaf and not harp on about privacy, or a company using the aggregate data they’ve collected with your consent (although you probably weren’t aware that when you agreed to “allow Blackboard to take your data and improve the system” that they’d make this sort of “analysis”), but just tear into the findings.

Their arbitrary classifications (supplemental? complementary?) are entirely devoid of the main driver of LMS use: grades, not instructor pedagogical philosophy – students go to the tools where they are going to get marked. Social? I guarantee if you did an analysis of courses that weight use of the discussion tool (note, that’s not about quality) at greater than 30% of the final grade, you’d see a hell of a lot more use of discussions. It’s not that the instructor laid out a social course, with a student making an average of 50 posts throughout the course (which, broken down over 14 weeks of “activity,” works out to 3.5 posts per week – strangely similar to the standard “post once and respond twice” to get your marks); it’s that discussions are the only tool that gives some power to students to author.

Time spent is also a terrible metric for measuring anything. Tabs get left open. People sit on a page because there’s an embedded video that’s 20 minutes long. Other pages have text which is scannable. Connections are measured how? Is it entry and exit times? What if there’s no exit, how does the system measure that (typically it measures for a set amount of activity and then if none occurs the system assumes an exit)? Were those values excluded from this study? If so, how many were removed? Is that significant?
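That inactivity-timeout heuristic is easy to illustrate. This is a generic sketch of how such tools typically estimate “time spent” – not any vendor’s actual code, and the 30-minute default is arbitrary:

```python
from datetime import timedelta

def estimate_session_minutes(events, timeout_minutes=30):
    """Estimate time spent from a list of page-view timestamps.

    Gaps between consecutive events longer than the timeout are treated
    as an exit, so that stretch contributes nothing to the total. A
    generic sketch of the common heuristic, not any LMS's real logic.
    """
    times = sorted(events)
    timeout = timedelta(minutes=timeout_minutes)
    total = timedelta()
    for prev, cur in zip(times, times[1:]):
        gap = cur - prev
        if gap <= timeout:
            total += gap  # count only gaps short enough to look "active"
    return total.total_seconds() / 60
```

Note the garbage baked into either direction: a tab left open past the timeout contributes nothing, while a 29-minute coffee break counts in full as “engagement.”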

Now Blackboard will say that because of the scale of the data collected that those outliers will be mitigated. I don’t buy that for a second because LMS’s are not great at capturing this data from the start (in fact server logs are not much better), because there’s very little pinging for active window or other little tricks that can only be really assessed using eye tracking software and computer activity monitoring. Essentially garbage-in, garbage-out. We have half baked data that is abstractly viewed and an attempt is made at some generalizations.

Here’s a better way to get at this data: ask some teachers how they use the LMS. And why. It’s that simple.

If you look at Blackboard’s findings, you should frankly be scared if you’re Blackboard. Over half of your usage (53% “supplemental” or content-heavy) could be handled by a password-protected CMS (Content Management System). That’s a system that could cost nothing (using Joomla or any of the many free CMS packages available) or be replicated using something institutions already have. The only benefit institutions get is that they can point at the vendor when stuff goes wrong and save money by outsourcing expertise to an external company.

If you take the findings further, only 12% of courses (“evaluative” and “holistic”) get at the best parts of the LMS, the assessment tools. So 88% of courses reviewed do things in the system that can be replicated better elsewhere. Where’s the value added?

Why Can’t Students Opt-Out of Data Collection in Higher Ed?

You know, for all the talk from EdTech vendors about being student centred (and let’s face it, LMS’s and most of the other products are not student centred) and all the focus on data collection from student activity – why don’t products have an easy opt-out (being student centred and all that) to not allow data to be collected?

What makes matters worse in many ways is that the data collection is hidden from students’ view. For instance, many LMS’s track time spent on content items or files uploaded. This tracking is never made explicit to the student unless they go and dig into their own data – and doing that is incredibly difficult, and you don’t get a complete picture of what is being collected. If I were a more conspiratorially minded person, I’d suggest that it was done on purpose to make it hard to understand the amount of total surveillance that is accessible by a “teacher” or “administrator”. I’m not. I honestly believe that the total surveillance of students in the LMS is really about feature creep, where one request turned into more requests for more information. LMS vendors, on the other hand, want to keep their customers happy, so why not share what information they have with their clients – after all, it’s their data, isn’t it?

Actually, it’s not. It’s not the client’s data. It’s the individual’s data. To argue otherwise is to claim that what someone does is not their own – it reduces agency to a hilariously outdated and illogical idea.

The individual, human user should be allowed to share or not share this data – with teachers, with institutions, or with external companies that host that data under an agreement with an institution that was probably signed without them even knowing it. There’s an element of data as a human right that we should be thinking about. As an administrator I have a policy I have to adhere to, and a personal set of ethics that frankly are more important to me (and more stringent) than the obscurely written-in-legalese policy. An unscrupulous, or outright negligent, LMS administrator would mean that all bets are off. They could do things in the LMS that no one, except another administrator, could track. Even then, the other administrator would have to know enough to look at the hundreds of different changelogs, scattered across different tools and different courses, and do essentially a forensic search that could take a good long time before undoing any damage. That lack of checks and balances (a turn of phrase that appears purposefully, as I think we’ll see what a lack of checks and balances looks like in the US over the next few years) – checks that could be implemented as part of the system, but aren’t – leaves education in a precarious situation.

The idea that the data locked in the LMS without the students being able to say, “I only want my data shared with my College” or “I only want my data shared with company X for the purposes of improving the system” shouldn’t be hard to implement. So why hasn’t it been done? Or, even talked about?

In my opinion, the data that gets harvested (anonymously, of course) provides more important information to the company about how people use the system than an opt-out button would provide in optics. It allows Blackboard to say how instructors use their system. We could talk about how terrifying that blog post is (instructor use is a proxy for how students use the system, because LMS’s give instructors the power to construct the student’s experience), or how devoid of solid analysis it is. I’ll deal with the analysis later, so let’s just consider how this is entirely without context. Blackboard-hosted sites have been harvested (probably with the consent of the people who signed the contracts – not the instructors who actually create things on the system, or the students who engage in it) by Blackboard to tell you how you teach. In one of the cited pieces, Blackboard says that they used this data to improve their notifications. If I put this through an ethics review saying I was going to look at notification improvements, and then released a post about how people used the whole system, it may very well be within their rights (and I suspect it is), but it is ethically murky. The fact that they’ve released it to the public allows the public to ask these questions – but no one really has. Or if they have, I missed it.

The fact that Blackboard can do this (and I’m talking about Blackboard because they did it, but there’s a similar post from Canvas that’s making the rounds about discussion posts with video being more engaging or some such idea) without really clearing this with any client is chilling. It also constrains how we perceive we are able to use the LMS (it sets the standards for how it is used).

Learning Technologies Symposium 2016 Recap

McMaster University holds a Learning Technologies Symposium every year and this year’s event was spread over two days just after (Canadian) Thanksgiving.  I have a bit of a biased view, as I’ve been one of the organizers over the last four years so unlike my other recap type posts where I share the things I’ve learned attending the sessions, this one will document the things that I did and the things I learned.

A few days before, we had some session cancellations, so I, being the diligent jack-of-all-trades, offered to run a few sessions to fill in the gaps. The first was on using PebblePad as a peer-review platform, the second was a session I co-presented with a colleague on WebEx, our new web conferencing tool, and lastly I ran a quick introduction to digital badges.

I also had to dip back into my history as a media developer to help sort out (with the help of our current digital media specialists) how to get a video camera (actually two video cameras) with HDMI outs into a WebEx broadcast of our keynote (the answer: it’s best to run them through a video mixer, in this case a Roland VR-50HD). It came together fairly well, with the caveat that WebEx is persnickety when you don’t connect devices in the correct order (i.e. it accepted the video feed fine but didn’t switch the audio to match; a quick audio reset fixed the issue).

So I missed most of Barbara Oakley’s keynote, which dealt with a lot of scientific research into how we learn and pay attention, as well as much of the discussion that followed, as I was busy with the setup and teardown of the streaming rig.

The first session I attended was one I was presenting: 15 minutes on peer review, paired with another session on Individual Assessment and Personal Learning in the context of an Italian language course (which, as an aside, uses an open textbook). Unfortunately, no one showed up. Still, it was great to catch up with Wendy, who was presenting in the same slot as I was, and we chatted about all sorts of educational technology things.

Onto day two… I missed the first block of sessions preparing for my own session on WebEx (and catching up on e-mail from the previous week, when I was off). The WebEx introduction session went really well; people were happy with the visual fidelity and audio quality. They also seemed intrigued by the testing option, but who knows if anyone will actually take it up. I’m really hoping we can do something with it – maybe review classes at the end of a semester could take advantage of it?

Day two had a series of presentations on some of the VR/AR work being done on campus right now, which is new and exciting even though I’m not involved in it. Every project I’ve heard about is using the technology in a way that makes sense and should enhance learning (unlike in the past, where technology was used for flash and wow without any consideration of whether it made sense). I wish I hadn’t presented twice on day two, because those sessions would’ve been great to see for more than a few seconds.

At lunch, David Porter (whom I didn’t realize I was following on Twitter), the new CEO of eCampusOntario (our version of BC Campus), gave an overview of how he sees eCampusOntario developing and essentially threw in a job recruitment pitch as well. We’ll see how the latest round of funding for research and projects goes, because the call was very specific and rigorous – so that vision is really well defined. I did start a proposal, but of course it was due on October 31st, and I was in Texas watching my favourite band play their last show (NSFW), so I was a little busy.

After lunch I gave a presentation on digital badges (do we need them?) that briefly introduced digital badges and tried to answer whether we need them (the précis: no, we don’t need them, but you may still want to use them in cases where you want to explicitly assess skills or document experiences; transcripts and portfolios can handle the rest). This was paired with an instructor who uses our LMS for exams in a relatively unique way. The exam is a paper-based math exam, mostly generating matrices. The students complete the exam using pencil, paper, and calculators, then are given half an hour to transcribe their work into D2L. The instructor then uses the auto-marking feature to grade the exams using fill-in-the-blank questions. It’s a pretty clever way to make marking easier.

After closing remarks – and a giveaway – it was all wrapped up for another year.


A Buffet of Educational Technology Thoughts

If you’ve read anything on this blog, you know that I’m subject to “oh look, shiny!”, constantly distracted and going in one hundred directions. This post comes as close as anything to the way my brain works.

First up, we’re scrapping Blackboard Collaborate as our web conferencing tool and installing WebEx. As a conferencing tool it’s light years ahead in terms of usability and functionality. I’m sure some of our more advanced users will find the quirks, but hopefully we can manage to stay one or two steps ahead of them. We had been Collaborate clients for years, having migrated over from a self-hosted Elluminate install. Over time, the product, and its terrible Java interface, caused our users issues. We did integrate it directly with our D2L installation, which solved a lot of the interface issues, but then we were hit with conversion errors that can’t be fixed by the user and instead prompt a ticket to Blackboard support. While Blackboard support has been excellent in this particular case, they haven’t been great over the years. Combine that with the fact that Blackboard has been promising a lot and not producing a whit of evidence that they’ll be able to pull it off. If they weren’t so big, I’d be calling all their promises vaporware, but I fully expect they’ll be able to deliver eventually. It’s the “eventually” part that’s the problem.

Second, I’m working through how we can roll out blogs effectively to faculty who want their students to blog but want a campus install to do it from. I know WordPress Multisite is the way to go, but it’s going to be slow going, as we need to work with other groups on campus to make this one happen. I personally think that having an academic blog is an important part of going to university and becoming an academic: how else do people disseminate their findings to the public without the filter of a news organization? How else do academics form their own personal learning networks? I’m a huge believer in blogging as a form, and I see it as a reflective practice more often than not. It’s also a space I can use to hear how ideas sound, and it helps me articulate ideas better (by slowing my brain down to typing speed, which is much slower than my mouth goes).

Third is the upgrade to Turnitin, which will practically force us to convert our existing connection between D2L and Turnitin to the new LTI integration between the two. As always, this is a last-minute addition to our semester startup, so it’s an added complexity that we didn’t really want to think about but will have to consider over the next few days. While Turnitin is forcing everyone to upgrade, there is an opt-out process, but from what I know (and I’ll know more later this week when we chat with our academic integrity office), we don’t know what that really means. How does opting out affect us? Can we revert if everything craps out and nothing works post-upgrade?

Fourth, I’ve been asked to sit on a portfolio advocacy committee that will push portfolio use to “the next level” campus-wide. I have a few ideas, but I’ve never been fond of sitting on committees; I’m more fond of the work that comes out of them. I guess it’s progress to have someone at the table who knows what it takes and whether it’s currently feasible, rather than facing down, after the fact, that what was proposed can’t technically be done. My boss is sneaky good at eliminating my ability to point the finger at other people’s decisions, so I guess this one will partially be on me.