Fusion 2014 – Unconference and Day One Recap

Instead of a big post I’m going to break my experiences up into three distinct posts because a) it’ll get me to post more frequently, b) that’s something I want to do and c) no one wants to read a monolithic block of text.

So I flew out of Buffalo, and it was an interesting time crossing the border, where I got the third degree about where I was going and what I was doing. I think they thought I was being paid to speak at a conference; next time I’ll have to change the language I use and say something like “attending a conference.” After the border and the pornoscanners at the airport, I arrived in Nashville. Now, I’m not that worldly, but I’ve been to a few places, and Nashville is not one of my favourites. It’s not that the city is particularly terrible, but it’s not particularly walkable and it has, well, public transportation issues. Outside of those quibbles (which are big problems for me) it’s a fine city with some fine people.

The Unconference

One of the best things to happen at Fusion over the last 5 or 6 years is the Unconference. I missed the first few because I was never able to actually get to Fusion, but the last couple of years I’ve been able to go. It’s an event kick-off that is fun, social, and often leads to previously undiscovered ideas and new ways to break D2L. I didn’t stick around for the full discussion because I was a bit tired, but the one thing I did learn was that VHS (Virtual High School) uses JavaScript to develop interactive elements of courses. Now that’s not a particularly shocking example, but combine it with the Valence API and maybe you could do some in situ testing and push results to the gradebook. Later a few of us went out for a nightcap and a good time was had by all.
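That gradebook idea is straightforward to sketch. Assuming you already have a signed Valence context, pushing a score is a PUT against the grade-value route; the API version, IDs, and auth headers below are placeholders rather than a tested configuration.

```python
# Hedged sketch: push a numeric score into the D2L gradebook via Valence.
# The route follows the documented pattern, but the version number and the
# signing of headers are assumptions you would adapt to your instance.
import json
import urllib.request

LE_VERSION = "1.4"  # placeholder Learning Environment API version


def grade_value_route(org_unit_id, grade_object_id, user_id):
    """Build the Valence route for setting one user's grade value."""
    return (f"/d2l/api/le/{LE_VERSION}/{org_unit_id}"
            f"/grades/{grade_object_id}/values/{user_id}")


def push_score(base_url, signed_headers, org_unit_id, grade_object_id,
               user_id, points):
    """PUT a numeric score; signed_headers must carry the app/user signature."""
    body = json.dumps({
        "GradeObjectType": 1,       # numeric grade object
        "PointsNumerator": points,
    }).encode("utf-8")
    req = urllib.request.Request(
        base_url + grade_value_route(org_unit_id, grade_object_id, user_id),
        data=body, headers=signed_headers, method="PUT")
    return urllib.request.urlopen(req)
```

A JavaScript activity could call a small server-side shim like this after a student finishes an interactive element, which is the in situ testing idea in a nutshell.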

Fusion Day One

Typically the first day has a ton of beginner and introduction sessions in the morning, so I ended up meeting with my co-presenter to go over our session for the next day. The sessions I did attend were incredibly useful for me, and I learned a ton about how other places develop in-house solutions. In fact, most of my attendance was at sessions around External Learning Tools or the Valence API.

Keynote

So John Baker’s keynote threw the audience a little; the big takeaway was that Desire2Learn is now D2L, and the Learning Environment, or the LMS, is now called Brightspace. I guess there’s a thinly veiled jab at the competitors being the dark side, but I can’t say that I understand the need to change names. Lots of people at the conference have suggested that Desire2Learn seems a very 1990s thing, reminiscent of the dot-com boom/bust. I can’t say that they’re wrong. However, it would’ve been nice to have been told that officially. I’m a bit of a smartass when it comes to names, so my immediate nature is to shorten this to its logical short form, BS. Not necessarily flattering. I don’t think D2L is big enough to have gotten out in front of that and shortened it to B, which in and of itself is not a good acronym either (B product? B movie?).

I’m not the only person looking for a short form for it either. I don’t know the difference between Brightspace and the LE (so is the new version called Brightspace version 1?), or whether the existing products are still called the LE 10.3… so many questions. None of them answered.

I’m sure many will talk about Chris Hadfield’s inspirational speech, it was great and all, and I certainly appreciate what he’s done. I just don’t see the connection to the conference that he brings.

Integrating Neat Tools and Activities into your Course through LTI

This session was all about External Learning Tools – which we’ve had a summer of dealing with so far. This particular session covered integrations with SoftChalk, SWoRD and TitanPad. I’m familiar with SoftChalk through a series of courses I’m taking at Brock University, and I can say that I’ve never been particularly impressed with the product – perhaps that’s the way Brock is using it, or the way the course was developed, or a limitation of Sakai, Brock’s LMS. Either way, this session demonstrated SoftChalk activities hosted in Content connecting back to add grades to the Gradebook – certainly a more interesting way to deal with whatever you design in SoftChalk.

SWoRD was a particularly interesting case – although I don’t know how robust or deep the integration was (I suspect D2L was merely passing enrollment data to SWoRD). SWoRD is a peer assessment tool that might be an alternative to something like PeerScholar.

I’m always happy to see Etherpad clones, and TitanPad was used as an example here. If you host an Etherpad clone at your institution, you can pass usernames to the Etherpad for automatic attribution in the document. I’m not sure how robust Etherpad is for, say, classes of 600+, but that would be an interesting experiment.
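For the curious, the username hand-off is just a query parameter: Etherpad Lite reads `userName` off the pad URL. A minimal glue function (the base URL is made up, and you'd pull the display name from the LMS launch) might look like:

```python
# Build an Etherpad Lite pad URL that pre-sets the visitor's display name.
# Etherpad reads the userName query parameter; everything else here is
# an illustrative assumption.
from urllib.parse import quote


def pad_url(base, pad_id, display_name):
    """Return a pad link carrying the user's display name for attribution."""
    return f"{base}/p/{quote(pad_id)}?userName={quote(display_name)}"
```

Dropping that link into an LTI launch or a Content topic would give you per-user colouring in the pad without asking students to set their own names.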

One thing the D2L presenter said was that in the configuration of external tools, checking the option to send User ID actually sends an anonymized version of the username. That’s interesting, because the language in the external tools dialog would benefit from adding this tidbit. We’ve turned it off in most cases (and seem to have no issue with students/instructors logging into the external tool) because we thought it would violate our University privacy rules.

The Secret to APIs

The second session of the day for me was around the use of Valence (D2L’s API) to create personal discussions in a course, each with an enrollment of one student and the instructor. The big takeaway was that in courses with enrollments set, you can save a ton of time by writing a script to do the repetitive, boring stuff: create a group of one, enroll a student in it, then create a discussion topic and restrict it to that group. It was interesting to see C# used as the middleware programming language – I thought C# was out of favour, but maybe not? PHP would’ve been easier, and Perl/Python might’ve been faster to complete the task. Either way, this is the work that earned Ryan Mistura the Desire2Excel award in the student category. Cool stuff.
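The repetitive part of that workflow is easy to picture in Python. To be clear, this is a sketch of the idea, not Ryan's actual code: `client` stands in for whatever Valence wrapper you use, and every method name is hypothetical.

```python
# Sketch of the "personal discussion" loop: for each student, create a
# single-member group, enroll them, then create a discussion topic
# restricted to that group. `client` is a hypothetical Valence wrapper.
def create_personal_discussions(client, org_unit_id, forum_id, student_ids):
    topics = []
    for student_id in student_ids:
        # A group of one for this student
        group_id = client.create_group(org_unit_id,
                                       name=f"Personal-{student_id}")
        client.enroll_in_group(org_unit_id, group_id, student_id)
        # A topic visible only to that group (instructors see all topics)
        topic_id = client.create_topic(org_unit_id, forum_id,
                                       name=f"Journal-{student_id}",
                                       group_restriction=group_id)
        topics.append(topic_id)
    return topics
```

For a 400-seat course, that is 1,200 clicks in the UI reduced to one script run, which is exactly the kind of boring work an API is for.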

Solution Spotlight/D2L Year Recap

Nick Oddson and Ken Chapman handled the recap of the D2L year, focusing on the extensibility of the platform. They did point out that there is a 40% faster time to resolution because they’ve increased their support and SaaS service teams. Which is good, because their service was slow before. I have noticed that their support turnaround is probably the best it’s been in years.

The looking forward part of their talk was interesting – it seems like they talked a lot about either 10.3 improvements (that were already announced last year, and available now), or stuff that we can’t see yet. Perhaps a chart:

10.3 features and features unreleased to the public:
  • Wiggio
  • Discussions (Grid View restore)
  • Binder – Windows 8 support
  • Quizzing UI/UX improvements
  • Content Notifications
  • Student Success System
  • LeaP – adaptive learning paths
  • Course Catalog (currently being used on Open Courses)
  • Visual Course Widgets (customization, I presume at a cost)
  • Built-in practice questions in Content (contextualized learning)
  • Gamification built into the Learning Environment (I assume 10.4/LSOne/Brightspace)

I suspect the amount of talk about predictive modelling is something they want to build primarily for remedial use and for online courses. As a market strategy, that makes some sense: some of D2L’s bigger clients are primarily online universities.

Blackboard Collaborate Integration with Desire2Learn, Uhh D2L, LE uhh Brightspace 10.3

I think I did that right?

Back in June we took a few weeks and integrated Blackboard Collaborate (our web conferencing tool) with our instance of the Learning Environment (Brightspace just doesn’t feel right). We are currently running 10.2 SP9 of the LE.

Reflections? Well, for such a simple integration (and really, the D2L interface is waaaaay better than the Blackboard Collaborate interface) it took a hell of a long time. We had to purchase and get D2L to install the IPSCT pack – so if you’re entering into an agreement with D2L and may want to do this later, definitely spend the cash up front. From start to unveil it was over six weeks – now, that’s not solid work on just this. After D2L installed the IPSCT pack, we had to contact Blackboard support to get our credentials. Seeing as we’ve had total turnover in who supports Blackboard Collaborate, our new Collaborate support person was not on the list of approved contacts – which is funny, because she’s the one who does all the tickets. So we contacted our account manager. No response. It turns out that, well, they were no longer our account manager; that’s why we hadn’t heard from them in over 9 months. Great. So support couldn’t do anything, and neither could our phantom account manager. Finally we got to the bottom of who our new Blackboard account manager is, they straightened out the mess, and our person is now an approved contact. After that it still took a week to get our credentials for test and prod.

Configuration on test went smoothly enough – if you’ve ever worked with External Learning Tools in the LE, it’s the same as any other configuration in that tool: you have the address to make the connection, the secret key and password, you check a few more boxes, and then off you go. Now, everyone who gets enrolled in the LE gets a Default Role at the org level and then gets assigned a more applicable role at the course offering level, which means for us, you have to configure not only the Instructor, Student and TA roles, but the Default Role as well. While this is a pain to do, it’s also easy to forget to do – and that’s what we promptly did. A day or two was spent tearing out what’s left of my hair, until the lightning struck and it sparked the engine enough to get it firing again.

Fast forward a couple of weeks: we got some time to implement it on prod, and yet again forgot what we had done to make it work. A week later we said something to the effect of “Fudge, Default Role…”, ran off to the LE and fixed our error. Sometimes it’s not the technology that fails you…

What I Want in a xMOOC

Listen, I hear you – many will respond to my title and say “nothing, I want nothing from an xMOOC and I hope they all become the passing fad that they are”. I feel your sentiment, but I think there’s value in xMOOCs, as bad as the pedagogy is behind them.

1. As a training program, which most xMOOCs are, they can be incredibly useful. For base knowledge and introductory subjects, these are appropriate tools to get learners to the next step. What I want to see from the Courseras and Udacitys is more flexibility – open enrollments, work at your own pace, maybe even mini-credentials per unit (via badging?) and, most importantly, multiple pathways for learning. I would love to see an online course develop multiple methods of instruction that, as a student, I could opt into if I’m having trouble with a topic. That would signal to me that it’s about learning, not about profit margins – because frankly, developing multiple methods of instruction for an online course is incredibly expensive. Putting on a venture capitalist’s hat (and it’s ill-fitting on me, I will admit), this could give a company an advantage: being able to leverage the data to determine which learning method is best suited to a student/learner, based on prior successes.

2. Better mechanisms for assessment. All the MOOC platforms have great testing banks, but not much else in the way of advancing assessment. Coursera’s peer evaluation is a step towards something good, but it relies on peer assessment without any repercussions: I could do a garbage job of assessing someone and it won’t affect me. There needs to be a balancing here to ensure that peer assessment is valued as important. Programmatically, it’d be easy to do: make sure that feedback exists, make sure it’s longer than x characters, and make sure that there’s a mechanism for providing ways to improve (could be as simple as an explicit “ways to improve this criterion” field).
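That balancing check really is only a few lines of code. A sketch of the gate, where the field names and the length threshold are entirely made up:

```python
# Hypothetical acceptance check for a peer review: feedback must exist,
# meet a minimum length, and include an explicit improvement suggestion.
# A platform would tune MIN_LENGTH per assignment.
MIN_LENGTH = 100  # assumed character threshold


def review_is_acceptable(review):
    """Return True only if the review meets the minimal quality bar."""
    feedback = (review.get("feedback") or "").strip()
    improvement = (review.get("ways_to_improve") or "").strip()
    return len(feedback) >= MIN_LENGTH and len(improvement) > 0
```

Rejecting (or down-weighting) reviews that fail the check is one cheap way to make the assessor accountable without a human in the loop.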

3. Speaking of open, for the courses that are free, I’d like to see a commitment to openness, meaning that the materials created are able to be repurposed in other contexts, easily acquired, clearly labelled and ideally in a repository. Yeah, like that will happen. The only open in xMOOC is open enrollment.

Learning Portfolio Writing Prompts

One of the problems with asking first year students to “reflect” is that, typically, they don’t know how to reflect. It’s not a skill that a lot of students come to University prepared with, nor one they have developed. Yet it’s a critical skill to have – to think about what you’ve done and identify what worked well, what needs improvement and what can change.

Reflection is really not a simple process, but it’s crucial to learning, and really important to deep learning. Think of all the life lessons you’ve learned, and I bet you’ve thought about them often, and sometimes deeply. They change you. Similarly, good educational experiences (whether that’s reading a book, attending a lecture, practicing a lab, or just trying something out) cause you to think about them, and again, sometimes deeply. It’s that deep learning that the Learning Portfolio wants to get at.

The activities we unveiled in the first year of the Learning Portfolio were good – but mostly course-based. Anecdotally, we didn’t see a lot of extracurricular activity, or if we did, it was part of the program. One potential reason was that we didn’t give students a reason to actually use the tool. So one way to solve that will be to post writing prompts, to offer students something to reflect on and a reason to use the tool on their own. Each writing prompt will help students connect their academic work to their “outside” life, connecting academia to reality. I consider this sort of thing “translational” – an effort to break academia down into understandable language for the average person. In the process, hopefully students will engage with thinking about what they’re doing, set a goal for themselves and maybe get a little bit more out of their experience here.

Learning Portfolio Showcase at McMaster University

The culmination of a significant portion of my work around the learning portfolio at McMaster this year was the Learning Portfolio Showcase.

In the morning, Randy Bass gave a talk about where learning might be going – I’ve embedded it here to provide some context for how our university is framing the discussion.

As usual, Randy gave a great talk. This is my second (maybe even third) time listening to Randy speak, and it’s well worth the investment of time. In the afternoon, there was more discussion around the use of the Learning Portfolio – from potential employers, from faculty, from staff and from students. Below is a short overview of some of the things we’ve learned.

One of the big things I’ve learned this year is that the tool we’re using (D2L’s ePortfolio tool) really dictates how things get done. It would be nice to have a timed reflection option (currently we use Quizzes for this process, and that’s got problems in and of itself); it would be nice to allow students to act as teachers in certain contexts, dictated by the instructor of the course. I know the tool is getting an upgrade, but the upgrades can’t come soon enough. Once we can use the ePortfolio app (and it’s not D2L’s fault we can’t use it – it’s because of the method of single sign-on that we use) that will change some contexts, but maybe not all.

With all the statements about student-centred learning, faculty are asking for the ability to simplify access to ePortfolio content. I can see some benefit in the K-12 market for this behaviour as well (as much as I don’t particularly like it). If faculty are to guide students in good reflection, they should be able to randomly select someone’s work (maybe blindly, from a group of people).

Survey Says? A: Integration!

We’re in the midst of developing an LMS survey – which has broader implications, as we also want to ask about other services my department provides (Blackboard Collaborate for web conferencing and iClickers for classroom response). All of a sudden, this quick survey has turned into a potentially really long thing that people will be unlikely to answer. Never mind that it’s late in the semester (exams started yesterday), so I expect that user responses will be less than stellar.

With that said, it will be interesting to see what people think of D2L as we’ve introduced it. In the three years I’ve been here we’ve made some significant upgrades (from 9.2 to 10 in one year, and to 10.2 the next), and with those upgrades have come some significant growing pains. Next year should see us integrate Blackboard Collaborate, Pearson and McGraw-Hill into our instance of D2L, which will hopefully solve some issues for faculty (namely the butt-ugly interface you have to use with Blackboard Collaborate). We’re also planning on fixing the partial e-mail issue and, additionally, integrating PeopleSoft into our process, as the old Student Information System is going to get shut off. Cripes, that looks like a lot of change on paper – and I’m not quite sure how things will get managed. So yeah, if I’m not blogging that may be why (never mind the two presentations I’m co-presenting at Fusion, or the one I’m doing at AAEEBL!). Busy, busy summer.

Well, what does that mean for you, good reader? Probably more of the same – sporadic updates, maybe some neat charts and graphs, some preliminary findings from year one of our Learning Portfolio initiative, and more snarky comments about what MOOCs have become.


ePortfolios So Far…

So I’m the (self-proclaimed) technical lead for the ePortfolio tool (which McMaster has rebranded Learning Portfolio). Actually, I fell into the role when the person who was supposed to manage it ended up not being able to deliver the initial rollout presentation due to illness. The tool has worked admirably, scaled well, and frankly done its job. There are warts on the tool – maybe barnacles are better imagery (and to go a step further, barnacles can be removed with a lot of hard work). The way one assembles a presentation (a sharable portfolio) is very mid-’90s, which makes the product a hard sell to students used to the ease of Tumblr or WordPress. Despite that major hurdle, we saw moderate success over the first semester: just over 21,700 items built in ePortfolio, with close to 3,500 of those being reflections. We have over 3,200 unique users, with just 17 courses using it. We’re now starting phase two of the first year, which will be periodic reminders to students, and we should see an uptick in usage – as well as the addition of two courses which will impact at least 1,500 students. Now the real question is: will students start to use it outside of classes?

NMC Horizon Report 2014

Hmmm.

With every passing year I spend in edtech, I always pick this PDF up with some dread. It provides hope in many of its long-view items – hope that educational technology will get better, less manipulative, less data-driven, and more inventive, allowing teachers to do what they love (hopefully) better and differently. This year’s report points at the Quantifiable Self and Virtual Assistants as things teaching and learning will be doing in five years.

On the surface, these seem reasonable – however, I believe most institutions are doing this in some form already. The Quantifiable Self is really about identifying trends (mostly around health, through tools like FitBit or the Nike+ app) and using that physical information to push you to do better, walk more and so on. In education, in a well-designed course, students are already doing this – taking self-assessments that build confidence in a field, confirming that the student “knows” something. LMS analytics also contribute to this – in our instance, Desire2Learn has the Student Success System, which gives individual feedback to students on how they’re performing in the class and in school in general. This horizon technology is already here, not five years away at all. At some point, there will be pushback (I hope) on all this data collection saved by private institutions – you, as the creator of that data, should be able to control it; it is in fact your intellectual property.

Virtual Assistants? Oh c’mon. That’s here in higher education now. Students are checking Google on their phones, which gives them more information (tailored to their search patterns) than they might need. On Android phones, Google Now provides contextual information that can be used in context – if you search for political science information every Wednesday at 7:00, Google Now will start feeding you information about political science at that time. Furthermore, Google then takes your new interest in political science into consideration in future searches. Siri and Iris are omnipresent in classrooms, used to fact-check, find alternate solutions to problems, or just alert the user to a new deal on shoes. Again, this is not on the horizon; it’s here.

What is on the horizon is faculty using and integrating this technology into their classrooms in creative ways.

Ontario Online Initiative

So, this announcement by the Ontario government – $42 million to develop online courses, by getting Universities to compete in some sort of race-to-the-bottom bloodsport of course development for money to build courses that can be delivered online – is kinda quaint. I mean, I’d be impressed if this were 1997, or even 2003, but in 2014 it seems a bit shameful.

Let’s face it – I’ve said it before and I’ll say it again – Ontario is a decade behind the western provinces (who easily lead innovation in this realm in Canada) and probably five or six years behind the maritime provinces, who do online learning out of necessity. The really shameful thing is that Ontario Universities should’ve been able to see this gap growing for years and sought to address it sooner. I guess the last fifteen years of conservative government at the provincial level didn’t help, as they were looking to scale back and turn education into a private business. $42 million is really a drop in the bucket – especially if you’re setting up a whole education system outside the existing University and College system, one that will allow transfer of credits to those systems. Which brings up an interesting issue: if I take a University credit, will Colleges accept it towards a diploma they offer? What about the other way around?

Of course all Universities will be in on this action, and some of the more developed online schools will have a leg up. But it’s a nice development for someone like me, who needs 4 or 5 more credits for my degree and has had a lot of problems with either a home University accepting a credit but the visiting University not accepting me as a student (in favour of their own), or taking the course and not having the credit accepted at all.

What worries me most, though, is this: what if this initiative is ushered in (as no doubt the government goes to election) and receives a fair-to-middlin’ reception? What happens then? This initiative is certainly subject to a new government’s impulses and legislation. I can easily see a conservative government saying “this doesn’t work, let’s let the private market run it better and cheaper.”

And we know where that will end up.

Happy New Year

In the past I’ve looked back at previous posts about what I thought would happen, and reflected on those ideas. It’s not that I don’t think reflection is valuable, it’s just that I’m not that interested in navel gazing (hell, I can see my navel getting bigger by the day).

This year, I’ll outline some of the projects I’m currently involved with and will try to write about this year.

Work Projects

So for work, I’m working on two large-ish projects. One is a Productivity and Innovation Grant funded project led by the University of Guelph, around learning outcomes in D2L. The project encapsulates ensuring there’s alignment between course, program and, ultimately, University-level outcomes – with reporting from D2L that suggests where there are holes in the alignment. It seems like it will improve the Analytics/Insights tool greatly with global reporting options, which is something I’ve struggled with.

The other is around Learning Portfolios. The department I’m embedded with has gotten some funding from the University to advance Learning Portfolios (the ePortfolio tool in D2L) on campus, and it’s looking like we will be responsible for this area from here on out. I think the improvements D2L is making to the way the tool works will only help adoption – however, there are still some major hurdles to overcome before it’s widespread. That’s not to say that adoption and use haven’t grown greatly; they have – it’s just that the use so far has not produced enough of a ripple to spread campus-wide. That’s our job in year two. I’m putting in a Fusion 2014 proposal to co-present one of the really interesting stories from first semester, one that ties together blended learning, learning portfolios and helping students reflect (in this case, on career choices).

Personal Projects

Other than the banal things like redo the bathroom and visit more places, I’m putting out a record with my one band and releasing another record with my other band. Not very exciting unless you like hardcore punk.

While this is work-related, I want to put together a rubrics repository (like Rubistar, but much more focused on local courses, and local sharing) that has a series of rubrics saved covering higher education courses that the University teaches. This way, it gathers together some of the best work that faculty have done, recognizes them, allows them to set sharing permissions, and ultimately, choose to export as PDF or into D2L. This is a big project, and really not on anyone’s timeline, but I want it to happen. It’ll have to be open source, and to that end, maybe it doesn’t just spit into D2L but into Blackboard or other systems too. The first iteration will of course work with our system (D2L) and then maybe we can branch out.
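To make the rubric repository idea a little more concrete, here is a rough sketch of the data model it might start from. Every name here is hypothetical, and the real thing would need storage, authentication and proper PDF/LMS exporters on top.

```python
# Hypothetical starting point for a rubrics repository: a rubric with
# criteria and levels, an author-controlled sharing flag, and a flat
# export hook that PDF or LMS exporters could build on.
from dataclasses import dataclass, field


@dataclass
class Criterion:
    name: str
    levels: dict  # e.g. {"Excellent": 4, "Good": 3, "Weak": 1}


@dataclass
class Rubric:
    title: str
    author: str
    shared: bool = False               # author chooses whether to share
    criteria: list = field(default_factory=list)

    def to_rows(self):
        """Flatten to (criterion, level, points) rows for an exporter."""
        return [(c.name, level, points)
                for c in self.criteria
                for level, points in c.levels.items()]
```

Keeping the model export-format-agnostic like this is what would let a later iteration target Blackboard or another system instead of only D2L.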

I’d love to help update the Feed2JS codebase to get it WCAG 2.0 compliant. I’d also love to blog more.