Why are Third Party Vendors Such Arses?

The short answer is that they’re not. They’re experiencing culture shock between higher education and capitalism. Their goals and higher education’s are entirely different, and sometimes diametrically opposed. Sometimes they’re not, but I’ll leave that for the Marxists out there to critique.

I’ll outline a few examples, no names except where I need a name for a tool, because it’s too hard to keep using “middleware” that could mean anything from a database to an API connector to something like IFTTT. I’m not writing this to shame edtech vendors or name call, but if you are a vendor and you do these sorts of things – maybe consider stopping.

Hyper-aggressive sales.

You’ve all seen this, or gotten emails day after day from the same vendor telling you about their great product. Or you’ve been a teacher, and they call you periodically. Or more frequently. Daily, even. I’ve had relentless edtech bros emailing me on LinkedIn and then at work. By the way, if you do this and it’s part of your company culture, you do know that I mark that stuff as spam, right? All it does is create one of two outcomes for a relationship… you either gain someone who just capitulates to you (but resents you) or you anger someone (who then holds a grudge for longer than an eon). Neither of those is great, but one is a sale. In an extreme case, you might get a cease and desist from a CIO who is tired of your harassment.

Circumventing process.

EdTech workers have definitely been asked for this sort of stuff continually. Move fast and break things is not a good mantra for education, nor for public institutions. If your company wants to do it your way rather than a standard LTI 1.3 kind of way, and then refuses to budge because your API way (to simply manage single sign-on!) is already built, you’re an ass. If you are ever told, “we don’t just enable every option in LTI 1.3 settings” and you turn around and insist you need all those data options – you most definitely don’t. If we have a process that we tell you takes months to go through, no, it can’t go quicker. It’s literally my job to ensure the security of the data in the system you’re trying to connect to, so work with me, not against me. It’s not my fault you left it to the last minute before the semester and are trying to rush the integration through, literally using teachers as a sacrificial wedge to bypass security, privacy and accessibility. You know what that makes you.

Oh, and when the vendor agreement allows an instructor to sign off for an entire institution? That’s no good.

Data greediness.

I outlined this a little above, but if you ask for an API integration, you should be able to easily answer “What API calls are you making?” If you have an LTI 1.3 integration and we ask “what do you use this data for?”, you should be able to answer within minutes of being asked. Dancing around that question just raises my suspicions. You might actually need all that data – but in 20 years of doing this work, and probably 100+ integrations with the LMS and other tools, that’s happened twice. Those two vendors were very quick to respond with what they use each data point for, how long they keep it, and why they need it for those functions. That’s excellent service. Also, that answer didn’t come from the sales person… so yeah. Oh, and 99% of integrations between the LMS and something else can be done with LTI 1.3. Vendors out there, please use the standards. And get certified by IMS Global/1EdTech. It goes a long way to building your reputation.
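
And if you’re a vendor genuinely unsure what data your LTI 1.3 tool receives, it’s easy to check: the launch arrives as a signed JWT and every piece of data is a named claim. Here’s a minimal sketch (assuming Python and the PyJWT library; the claim values below are made up for illustration) of listing exactly what an institution is handing over:

```python
# Minimal sketch: inspect the claims in an LTI 1.3 launch token.
# Assumes the PyJWT library (pip install pyjwt). In production you would
# verify the signature against the platform's JWKS; here we only decode
# the payload to see what data the tool is actually being handed.
import jwt

# Stand-in for a real id_token from an LTI 1.3 launch (hypothetical values).
sample_launch = jwt.encode(
    {
        "iss": "https://lms.example.edu",
        "https://purl.imsglobal.org/spec/lti/claim/message_type": "LtiResourceLinkRequest",
        "https://purl.imsglobal.org/spec/lti/claim/roles": [
            "http://purl.imsglobal.org/vocab/lis/v2/membership#Learner"
        ],
        "email": "student@example.edu",  # PII - does the tool really need this?
        "name": "Example Student",
    },
    "not-a-real-secret",
    algorithm="HS256",
)

# Decode without verification, purely to enumerate what is being shared.
claims = jwt.decode(sample_launch, options={"verify_signature": False})
for claim, value in sorted(claims.items()):
    print(f"{claim}: {value}")
```

If a vendor can’t produce something like that list, and say what each claim is for, that tells you everything you need to know.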

Third-party malfeasance.

OK, it’s not that bad, but a new trend I’ve started seeing is a vendor using another vendor to manage something (usually data). EdLink is the sort of thing I’m thinking about here. EdLink allows easy connections between two unrelated vendors with no established connection method. So think connecting McGraw Hill to your Student Information System (not the actual example I have in mind, to be clear – we don’t have, or want, to connect McGraw Hill to our SIS). To be honest, this doesn’t bother me as much as some of the other grievances I’ve got, but obfuscating your partnerships and running all your data through a third party that we don’t have an agreement with definitely raises an eyebrow or three. As one starts to think about what-if scenarios (also my job), it makes clarity around who has your data, at what time and for how long, all the more difficult. The service itself doesn’t bother me, as long as the middle person in the scenario is an ethical partner of the vendor you’re engaging with. In many cases you need to have a level of trust in the partner, and if they’ve shown themselves to be less than trustworthy, then, well, you’ve got a problem.
Again, I’m sure EdLink is fine, but when a vendor uses EdLink and is presented with that fact, it’s a challenge for security experts because they now have to do not one analysis but two. I understand why a vendor might try to frame EdLink as their own service, but it’s undeniable that it isn’t. So just be honest and upfront. You may get past a team that doesn’t prioritize this level of detail, but we are not blind. We will figure it out.

One other big challenge with third-parties acting on behalf of a vendor is that if there’s a problem, you typically have to go through the vendor to access the middle person’s support team to get it rectified. This adds a layer of complexity AND time to something that was likely intended to save time and hassle for the vendor.

Engagement = Coercion?

I started writing this in 2018, and I still struggle with what I’m trying to express here. Ultimately I’m talking about the power structure in classrooms, or online environments, and how those who are uncomfortable with those power structures can do very little within the environments themselves to dilute the power differential.

I’m often troubled by the term engagement. If attention to a thing is the most important commodity in modern capitalism, is engagement worth more than mere attention? We have seen with things like video games that attention is one thing, but engagement is a whole other metric. There’s an emotional component to engagement that isn’t there with mere attention.

And is engagement a coercion strategy? Are we asking students to become invested in something based on the value we think it will add to a student’s learning, even using marks as a lever to get students to do what we want them to do?

PAY ATTENTION!

I totally remember having that yelled at me. The lie embedded in that line is that it’s missing the obvious – pay attention to what? In most cases, the missive should read “Pay attention to me!” No wonder I was so mediocre in school – I don’t react well to that, and I think most of us don’t react well to that sort of pandering. That sort of coercive attention grabbing never worked well for me. Maybe it’s the belief I hold that “good work gets noticed,” and as I’ve gotten older I still think that’s somewhat true. But it might be 20 years later, after it’s made the rounds and the artist has died. Or the author. Or the creator. You see that on YouTube now: someone put out a video in early 2009, it comes up in a search and it’s great. And has 234 views. There’s now a whole algorithm in the way of finding things organically, with no real way to control how you are served content (and that’s probably the Google killer – an adjustable algorithm that feeds you not only what you want but how you want it). The content creators who are successful aren’t successful because of their content per se; it’s that they know how to manipulate the algorithm to get you to see the content.

Which brings us back to metrics and measurements. Modern engagement metrics have shifted language such that it’s really attention, and not engagement, that they’re measuring. It literally took me close to four years to figure out what bothered me about engagement metrics. Now I know: it’s not about engaging with someone, it’s about getting them to pay attention to you. It explains a lot about why modern advertising methods are all about “engagement,” which is theoretically deeper than “attention” – but the metrics they use (click-throughs, time on page) don’t really speak to engagement so much as to attention.

Engagement in a classroom is likely different from engagement in an online environment – but engagement in an online environment is measured using the same methods that are really about attention, which is a passé way to look at engagement in a classroom. Paying attention to a lecture is different from engaging with a lecture, and paying attention to a post in the LMS is different from engaging with a post in the LMS. Engaging is more associated with doing something in an educational context. So in many ways, applying the common way of measuring engagement is not going to elicit much useful information in an educational context. Yet I still see people building courses in ways that try to leverage attention rather than engagement.

What do you think will engage people better: a well-formed discussion question, or a lengthy video with interactive “engagement” in the form of questions? I would think that a well-formed discussion question might linger longer in one’s mind.

What’s New?

Well, another semester start, and still the tickets pour in. Another season’s change and change is in the air. The department I’m in has gone through a self-study, and the results suggest that we’re not great at communicating and we need a strategy. Those are good suggestions and things we can enact. In other news, I’ve decided to apply to begin my Master’s, more on that when I’m accepted. I’m also a part of the Open Education Ontario cohort of Open Rangers, which is exciting and a touch scary. Maybe not scary, but I’m such a novice at open, even though I’m wholeheartedly behind the idea of open.

I think I’ll start blogging again, at least intermittently. There’s so much good going on right now, oh wait…. well not politically. But education is in a place that can really help with some of the struggles online right now.

I’ve spent a lot of time in my own head struggling with what I want to do with my life. I have felt trapped, unchallenged and basically in a rut. I couldn’t do what I’ve done in the past – quit and find something else – because I’ve been the primary breadwinner and, you know, wanted to keep the house and keep eating. I’m also turning 45 in a couple of weeks, and that’s weighing heavily on me as well. So instead of trying to start something exciting and new with a significant impact on my life, I decided to stay and try to do something. Admittedly, there have been some things that have made a pathway forward available, but I don’t want to say more in case the powers that be go in a different direction. Needless to say, if things work out, I’ll be challenged. I also think I’m ready for that challenge.

Oh, I started a music blog and podcast called General Admission. It’s only a monthly thing, a “creative” outlet (which involves buying media toys like lavalier microphones and podcast hosting) that talks primarily about punk. If you only want to sample the best, check out Podcast #2, which traces a similar riff through several different genres of music. I hope to get to that level of thought and interest for every episode, but admittedly, if you’re not as into the New Bomb Turks, or The Dicks (forthcoming episode), then you might find that not every episode is to your liking.

Dead Drop no.2

A second hodgepodge of things I’ve found on the Internet.

How to Build a Twitter Bot

http://www.zachwhalen.net/posts/how-to-make-a-twitter-bot-with-google-spreadsheets-version-04/

https://www.labnol.org/internet/write-twitter-bot/27902/

https://www.fastcompany.com/3031500/how-twitter-bots-fool-you-into-thinking-they-are-real-people

The third article is the most interesting because it’s from 2014 (remember Klout scores?) and really shows how little (and how much) has changed in the Twitter bot universe. The heart of the strategy is essentially the same, but mix it with a way to learn (through AI, or natural language processing, which is clearly being done in some of the more sophisticated campaigns) and adapt, and I suspect we are on the cusp of a future where anyone you meet online might not actually be a person. My Philip K. Dick collection just became a lot less sci-fi and a lot more prescient.

More Bots (Cisco Spark/Microsoft Teams):

https://msdn.microsoft.com/en-us/microsoft-teams/botsconversation

https://developer.ciscospark.com/bots.html

Both Cisco Spark and Microsoft Teams are available where I work. Both have vague collaboration mandates, and both can be used for communication. At this point, Spark might be more advanced with third-party enhancements and bots. Teams might end up being the platform of choice if students are involved. I’m hoping to build a chatbot to handle the common sorts of requests that our LMS support team gets (i.e. the stuff we already have a predefined reply for in our ticketing system, which isn’t attended to after 4:30 PM).
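
As a rough illustration of what that first chatbot might look like (a sketch only – the questions, keywords and canned replies are made up, not pulled from our actual ticketing system), even simple keyword matching against predefined replies gets you surprisingly far:

```python
# Rough sketch of a keyword-matching support bot for common LMS questions.
# The keywords and canned replies below are illustrative; a real version
# would pull the predefined replies from the ticketing system.

CANNED_REPLIES = {
    ("reset", "password"): "You can reset your password at the campus account portal.",
    ("quiz", "submit"): "If a quiz won't submit, try another browser and contact support with your course code.",
    ("grades", "missing"): "Grades appear once your instructor releases them; check the Grades tool first.",
}

FALLBACK = "I couldn't match that question - a person will follow up during business hours."


def reply(message: str) -> str:
    """Return the first canned reply whose keywords all appear in the message."""
    words = message.lower()
    for keywords, answer in CANNED_REPLIES.items():
        if all(keyword in words for keyword in keywords):
            return answer
    return FALLBACK


if __name__ == "__main__":
    print(reply("How do I reset my password?"))
    print(reply("My quiz won't submit!"))
    print(reply("Something completely different"))
```

Obviously keyword matching falls over quickly, which is where the natural language processing links below come in.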

Natural Language Processing:

https://research.google.com/pubs/NaturalLanguageProcessing.html

https://medium.com/@jrodthoughts/ambiguity-in-natural-language-processing-part-ii-disambiguation-fc165e76a679

https://chatbotsmagazine.com/the-problem-with-chatbots-how-to-make-them-more-human-d7a24c22f51e

Natural language processing is key for the chatbot above, and of course the first attempt will be ruthlessly primitive, I’m sure.

Chatbots in General:

https://www.theguardian.com/technology/2016/sep/18/chatbots-talk-town-interact-humans-technology-silicon-valley

Blackboard Thinks They Know How You Teach

In November of last year a blog post by Blackboard was making the rounds. A Blackboard study documented how instructors use their system, which was somehow conflated with knowing how teachers teach. We could talk about how terrifying the post is, or how devoid it is of solid analysis. Instead I’d like to turn over a new leaf and not harp on about privacy, or about a company using the aggregate data they’ve collected with your consent (although you probably weren’t aware that when you agreed to “allow Blackboard to take your data and improve the system” they’d make this sort of “analysis”), and just tear into the findings.

Their arbitrary classifications (supplemental? complementary?) entirely ignore the main driver of LMS use: grades, not instructor pedagogical philosophy – students go to the tools they are going to be marked on. Social? I guarantee that if you did an analysis of courses that weight use of the discussion tool (note, that’s not about quality) at greater than 30% of the final grade, you’d see a hell of a lot more use of discussions. It’s not that the instructor laid out a social course with students making an average of 50 posts throughout the course (which, if you break that down over 14 weeks of “activity,” works out to about 3.5 posts per week – strangely similar to the standard “post once and respond twice” to get your marks); it’s that discussions are the only tool that gives students some power to author.

Time spent is also a terrible metric for measuring anything. Tabs get left open. People sit on a page because there’s an embedded video that’s 20 minutes long. Other pages have text that’s scannable. And how are connections measured? Is it entry and exit times? What if there’s no exit – how does the system measure that? (Typically it watches for a set period of inactivity and then assumes an exit.) Were those values excluded from this study? If so, how many were removed? Is that significant?
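
To make the problem concrete, here’s a sketch of the inactivity-timeout style of estimate described above (the event data and timeout are hypothetical, and this is not any vendor’s actual algorithm) – note how an abandoned tab inflates “time spent” right up to the cutoff, while offline reading registers nothing:

```python
# Sketch of how "time spent" is often estimated from click events with an
# inactivity timeout. Event times are hypothetical; this just illustrates
# why the metric is noisy.

INACTIVITY_TIMEOUT = 30 * 60  # assume a session is considered over after 30 idle minutes


def time_spent(event_times: list[int]) -> int:
    """Estimate seconds spent, capping each gap between events at the timeout.

    The final event has no matching "exit" event, so no time is counted for it.
    """
    total = 0
    for earlier, later in zip(event_times, event_times[1:]):
        total += min(later - earlier, INACTIVITY_TIMEOUT)
    return total


# Two clicks five minutes apart, then a tab left open for two hours:
print(time_spent([0, 300, 7200]) / 60)  # 35.0 "minutes of engagement"
# One click, then twenty minutes of actual reading with no further events:
print(time_spent([0]) / 60)             # 0.0 minutes recorded
```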

Now Blackboard will say that because of the scale of the data collected, those outliers will be mitigated. I don’t buy that for a second, because LMSs are not great at capturing this data in the first place (in fact server logs are not much better): there’s very little pinging for the active window or other little tricks, and that sort of thing can only really be assessed using eye-tracking software and computer activity monitoring. Essentially garbage in, garbage out. We have half-baked data that is abstractly viewed, and an attempt is made at some generalizations.

Here’s a better way to get at this data: ask some teachers how they use the LMS. And why. It’s that simple.

If you look at Blackboard’s findings, you should frankly be scared if you’re Blackboard. Over half of your usage (53% “supplemental,” or content-heavy) could be handled by a password-protected CMS (Content Management System). That’s a system that could cost nothing (using Joomla or any of the millions of free CMS packages available) or be replicated using something institutions already have. The only benefit institutions get is that they can point at the vendor when stuff goes wrong, and save money by outsourcing expertise to an external company.

If you take the findings further, only 12% of courses (“evaluative” and “holistic”) get at the best parts of the LMS, the assessment tools. So 88% of courses reviewed do things in the system that can be replicated better elsewhere. Where’s the value added?

Why Can’t Students Opt-Out of Data Collection in Higher Ed?

You know, for all the talk from EdTech vendors about being student centred (and let’s face it, LMSs and most of the other products are not student centred) and all the focus on data collection from student activity – why don’t products have an easy opt-out (being student centred and all that) so data isn’t collected?

What makes matters worse in many ways is that the data collection is hidden from students’ view. For instance, many LMSs track time spent on content items or files uploaded. This tracking is never made explicit to students unless they go and dig into their own data – and doing that is incredibly difficult, and you don’t get a complete picture of what is being collected. If I were a more conspiratorially minded person, I’d suggest that it was done on purpose to make it hard to understand the amount of total surveillance that is accessible by a “teacher” or “administrator.” I’m not. I honestly believe that the total surveillance of students in the LMS is really about feature creep, where one request turned into more requests for more information. LMS vendors, on the other hand, want to keep their customers happy, so why not share what information they have with their clients – after all, it’s their data, isn’t it?

Actually, it’s not. It’s not the client’s data. It’s the individual’s data. To argue otherwise is to claim that what someone does is not their own – it reduces agency to a hilariously outdated and illogical idea.

The individual human user should be allowed to share, or not share, this data – with teachers, with institutions, or with external companies that host that data under an agreement with an institution that was probably signed without them even knowing it. There’s an element of data as a human right that we should be thinking about. As an administrator I have a policy I have to adhere to, and a personal set of ethics that frankly are more important to me (and more stringent) than the obscure, written-in-legalese policy. With an unscrupulous, or outright negligent, LMS administrator, all bets would be off. They could do things in the LMS that no one, except another administrator, could track. Even then, the other administrator would have to know enough to look through the hundreds of different changelogs, scattered across different tools and different courses, and essentially do a forensic search that could take a good long time before undoing any damage. That lack of checks and balances (a turn of phrase that appears purposefully, as I think we’ll see what a lack of checks and balances looks like in the US over the next few years) – checks that could be implemented as part of using the system, but aren’t – leaves education in a precarious situation.

Letting students say “I only want my data shared with my College” or “I only want my data shared with company X for the purposes of improving the system” shouldn’t be hard to implement. So why hasn’t it been done? Or even talked about?

In my opinion, the data that gets harvested (anonymously, of course) provides more important information to the company about how people use the system than the optics of an opt-out button would be worth. It allows Blackboard to say how instructors use their system. We could talk about how terrifying that blog post is (instructor use is a proxy for how students use the system, because LMSs give instructors the power to construct the student’s experience), or how devoid of solid analysis it is. I’ll deal with the analysis later, so let’s just consider how this is entirely without context. Blackboard-hosted sites have been harvested by Blackboard (probably with the consent of the people who signed the contracts, not the instructors who actually create things on the system, or the students who engage with it) to tell you how you teach. In one of the cited pieces, Blackboard says they used this data to improve their notifications. If I put something through ethics review saying I was going to look at improving notifications, and then released a post about how people used the whole system, it may very well be within their rights (and I suspect it is), but it’s ethically murky. The fact that they’ve released it to the public allows the public to ask these questions – but no one really has? Or if they have, I missed it.

The fact that Blackboard can do this (and I’m talking about Blackboard because they did it, but there’s a similar post from Canvas making the rounds about discussion posts with video being more engaging, or some such idea) without really clearing it with any client is chilling. It also constrains how we perceive we are able to use the LMS (it sets the standards for how it is used).

Learning Technologies Symposium 2016 Recap

McMaster University holds a Learning Technologies Symposium every year, and this year’s event was spread over two days just after (Canadian) Thanksgiving. I have a bit of a biased view, as I’ve been one of the organizers over the last four years, so unlike my other recap-type posts where I share the things I learned attending the sessions, this one will document the things that I did as well as the things I learned.

A few days before, we had some session cancellations, so I, being the diligent jack-of-all-trades, offered to run a few sessions to fill in the gaps. The first was on using PebblePad as a peer-review platform, the second was a session I co-presented with a colleague on WebEx, our new web conferencing tool, and lastly I ran a quick introduction to digital badges.

I also had to dip back into my history as a media developer to help sort out (with the help of our current digital media specialists) how to get a video camera (actually two video cameras) with HDMI outs into a WebEx broadcast of our keynote (the answer was that it’s best to run them through a video mixer, in this case a Roland VR-50HD). It came together fairly well, with the caveat that WebEx is persnickety when you don’t connect devices in the correct order (i.e. it accepted the video feed fine but didn’t switch the audio to match; a quick audio reset fixed the issue).

So I missed most of Barbara Oakley’s keynote, which dealt with a lot of scientific research into how we learn and pay attention, and a lot of the discussion that followed as I was busy with the setup and breakdown of the streaming rig.

The first session I attended was one I was presenting in, for 15 minutes, about peer review – the other session in the slot was about Individual Assessment and Personal Learning in the context of an Italian language course (which, as an aside, uses an open textbook). Unfortunately, no one showed up. It was still great to catch up with Wendy, who was presenting in the same slot as me, and we chatted about all sorts of educational technology things.

Onto day two… I missed the first block of sessions preparing for my own session on WebEx (and catching up on e-mail from the last week, when I was off). The WebEx introduction session went really well; people were happy with the visual fidelity and audio quality. They also seemed intrigued by the testing option, but who knows if anyone will actually take it up. I’m really hoping we can do something with it – maybe review classes at the end of a semester could take advantage of it?

Day two had a series of presentations on some of the VR/AR work being done on campus right now, which is new and exciting even though I’m not involved in it. Every project I’ve heard about is using the technology in a way that makes sense and should enhance learning (unlike in the past, where technology was used for flash and wow without any consideration of whether it made sense). I wish I hadn’t presented twice on day two, because it would’ve been great to see those sessions for more than a few seconds.

At lunch David Porter (who I didn’t know I was following on Twitter), the new CEO of eCampusOntario (our version of BC Campus), gave an overview of how he sees eCampusOntario developing and essentially threw a job recruitment pitch out there as well. We’ll see how the latest round of funding for research and projects goes, because the call was very specific and rigorous – so that vision is really well defined. I did start a proposal, but of course it was due on October 31st, and I was in Texas watching my favourite band play their last show (NSFW), so I was a little busy.

After lunch I did a presentation on digital badges (do we need them?) that was a brief introduction to digital badges and tried to answer whether we need them (the précis: no, we don’t need them, but you may still want to use them in cases where you want to explicitly assess skills or document experiences – transcripts and portfolios can handle the rest). This was paired with an instructor who uses our LMS for exams in a relatively unique way. The exam is a paper-based math exam, mostly generating matrices. The students complete the exam using pencil, paper and calculators, then are given half an hour to transcribe their work into D2L. The instructor then uses the auto-marking feature to grade the exams as fill-in-the-blank questions. It’s a pretty clever way to make marking easier.
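
The marking logic behind that setup is simple: each matrix entry becomes one blank, compared against an expected value. A sketch (purely illustrative – D2L handles this internally through its quiz tools, and the answer key and student response here are made up):

```python
# Illustrative sketch of fill-in-the-blank auto-marking for a transcribed
# matrix exam. D2L does this internally via its quiz tools; the answer key
# and student response below are made up, and a real exam would have many
# such questions.

ANSWER_KEY = {          # expected entries of a 2x2 result matrix
    "a11": 4, "a12": -2,
    "a21": 0, "a22": 7,
}


def mark(responses: dict[str, float]) -> float:
    """Return the fraction of blanks that exactly match the answer key."""
    correct = sum(
        1 for blank, expected in ANSWER_KEY.items()
        if responses.get(blank) == expected
    )
    return correct / len(ANSWER_KEY)


student = {"a11": 4, "a12": -2, "a21": 1, "a22": 7}  # one transcription slip
print(f"Score: {mark(student):.0%}")  # Score: 75%
```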

After closing remarks – and a giveaway – it was all wrapped up for another year.

A Buffet of Educational Technology Thoughts

If you’ve read anything on this blog, you know that I’m subject to “oh look, shiny!”, constantly distracted and going in one hundred directions. This post will get as close as anything to the way my brain works.

First up, we’re scrapping Blackboard Collaborate as our web conferencing tool and installing WebEx. As a conferencing tool it’s light years ahead in terms of usability and functionality. I’m sure some of our more advanced users will find the quirks, but hopefully we can manage to stay one or two steps ahead of them. We had been Collaborate clients for years, having migrated over from a self-hosted Elluminate install. Over time, the product, and its terrible Java interface, caused our users issues. We did integrate it directly with our D2L installation, which solved a lot of the interface issues, but then we were hit with conversion errors that can’t be fixed by the user and instead prompt a ticket to Blackboard support. While Blackboard support has been excellent in this particular case, they haven’t been great over the years. Combine that with the fact that Blackboard has been promising a lot and not producing a whit of evidence that they’ll be able to pull it off. If they weren’t so big, I’d be calling all their promises vaporware, but I fully expect they’ll be able to deliver eventually. It’s the eventually part that’s the problem.

Second, I’m working through how we can roll out blogs effectively to faculty who want their students to blog, but want a campus install to do it from. I know WordPress Multisite is the way to go, but it’s going to be slow going, as we need to work with other groups on campus to make this one happen. I personally think that having an academic blog is an important piece of the process of going to university and becoming an academic – how else do people disseminate their findings to the public without the filter of a news organization? How else do academics form their own personal learning network? I’m a huge believer in blogging as a form, and I see it as a reflective practice more often than not. It’s also a space I can use to see how ideas sound, and it helps me articulate ideas better (by slowing my brain down to typing speed, which is much slower than my mouth goes).

Third is the upgrade to Turnitin, which will practically force us to convert our existing connection between D2L and Turnitin to the new LTI connection between the two parties. As always, this is a last-minute addition to our semester startup, so it’s an added complexity that we didn’t really want to think about but will have to consider over the next few days. While Turnitin is forcing everyone to upgrade, there is an opt-out process, but from what I know (and I’ll know more later this week when we chat with our academic integrity office) we don’t really understand what that means. How does opting out affect us? Can we revert if everything craps out and nothing works post-upgrade?

Fourth, I’ve been asked to sit on a portfolio advocacy committee that will push portfolio use to “the next level” campus-wide. I have a few ideas, but I’ve never been fond of sitting on committees – I’m more fond of the work that needs to get done outside of them. I guess it’s progress when the committee includes someone who knows what it takes and whether it can currently be done, rather than everyone facing down the fact that what they proposed isn’t technically feasible. My boss is sneaky good at eliminating my ability to point the finger at other people’s decisions, so I guess this one will partially be on me.

PebblePad (Academic) Year One

So as we round out another academic year, now seems as good a time as any to talk about my view of PebblePad.

In this first year, I created 9000+ accounts, whose owners logged in to the system a total of almost 39,000 times (I only account for 319 of those!), working in a total of 8700+ workbooks or one of 12,500+ templates, creating 9700+ portfolios and submitting almost 7300 things for assessment in 99 active workspaces (PebblePad’s language for a course space). Personally, that resulted in 37.5 hours of overtime, 30 consultations with at least 20 different faculty, and 30 presentations to student groups ranging from as few as 20 to as many as 200.

PebblePad has received one major upgrade (and one forthcoming in a few weeks), and a couple of minor patches. The upgrades were smooth as silk from an administrator perspective, because they happened in the wee hours of the morning and were usually done by the time I logged into the system as part of my morning ritual.

How’s it been? Busy. The work in the previous paragraph would have been enough to keep me busy were it my only task, but in this time I’ve also had to support the LMS and bring badging online (that alone is another story that will be blogged about shortly). I did crunch some numbers, and I spent approximately 75% of my time on ePortfolio-related work – whether that be documentation, consultations or working with the tool itself.

The administration of PebblePad is dead simple because there’s very little to do beyond the initial configuration (in fact, I’ve only been back in to look at the Turnitin linkage and to run monthly reports). The only glaring omission is that the system can’t tell you how many people logged in over a month-long period – but I can approximate that by looking at the last 28/30/31 days’ activity page on the first of the month. It’s not exact, but it’s close enough for now.

I did notice throughout the year that there were very few inquiries from students about how to do anything. They didn’t contact me, anyway. They might’ve gone to instructors, but anything complex that had to be done usually had accompanying documentation or a video. I do know that most of my support requests were about getting login credentials (the system was not connected to our central authentication system for the academic year, due to technical issues that are mostly out of our control). That issue will be out of the way this month, so it should be clear sailing for folks next year.

We had people use PebblePad (well, Pebble+ specifically) for the sort of thing you’d expect a portfolio platform to do – collect disparate experiences, assemble them into a coherent statement about the experience or about themselves, and submit it for assessment (or for credit for an experience). We also had people use portfolios as a way to assess (and have students self-assess) against programmatic outcomes. We had a few rogues use the peer grading capability of Atlas (the assessment piece of PebblePad) without using the portfolio piece at all – it’s great that people are pushing the envelope so early. We had a group use PebblePad as their platform rather than the LMS, as they had previously done. We have a group using PebblePad to replace existing paper-based workbooks with digital ones. All good foundational work in our third year of ePortfolio/Learning Portfolio initiatives on campus, and the first year of PebblePad.

What could I have done better?

Well, I think I could’ve pushed for more help to ensure that some of the things I did were done the right way. For instance, I ended up doing too much work for faculty, instead of training them to use the system well and then letting them configure and play. Part of that was the late start date of the system (September 2nd), and part of that was me trying to make things easy for people so they could focus on teaching rather than the technical stuff. I wonder if I’ve set a terrible precedent by taking on too much work, which will make what we do unsustainable.

I could’ve asked better questions about why students were doing this, rather than getting on with the work – I do that in other aspects of my job, so why I didn’t really challenge people with portfolio work is a bit puzzling. I suppose it could’ve been that there was just so much to do, that I didn’t want to get into it with many people. That has to change. Even though I’m not an Instructional Designer, sometimes I’m the only person (along with my colleagues who hold the same title) instructors will come to with their ideas about using technology.

Technology Has Missed Education During the Internet Age

Holy crap! My entire life is a sham! The whole Web 2.0 thing we’ve been writing about for years didn’t exist! OK, snark over. I’d like to point out that this piece is very slanted in favour of the current VC-funded educational technology movement, written by a guy who could profit greatly from moving cash out of public education into his privately controlled hands (and we can talk for hours about whether that’s a good idea or not). I would’ve responded there, but TechCrunch requires you to sign in with your Facebook account, and that’s my personal life attached to that service, not my professional one. Oh well, the flame wars would’ve been epic. Here’s the first juicy quote…

Despite its importance, education seems to have been missed by the Internet revolution. When I walk down the hall of a middle school, not much seems to have changed since I was a student some 15 years ago.

OK, the halls won’t tell you anything. The halls are going to be the same. Although if you were there when students were in the halls, you would notice the very common sight of a smartphone or even the odd tablet – but you’d actually have to look for that. If you look in the K-12 classroom, though, you’ll see a lot more instructors using different types of technology. The most interesting change isn’t in the schools at all, but in students’ homes, where they connect to the school board LMS or, at their teacher’s request, look things up on the Internet to help them understand. And there are plenty of more advanced teachers out there – you can look squarely at Google’s Teacher Academy and awards, Apple’s Certified Educator, and Microsoft’s Teacher Academy for evidence to the contrary. If that doesn’t really hold water for you, what about TeacherTube, EduBlogs or any of the other K-12 friendly sites that have existed since the mid-2000s?

Luckily, that is changing. There are a growing number of entrepreneurs working to reshape education. Every year, thoughtful new solutions come online to solve a piece of the problem. Innovators are working hard to create students who can learn how to learn, who can think critically and who can lead.

Correct, things are changing. Entrepreneurs trying to reshape education are a real threat to education, much like hypercapitalism is a threat to our sovereignty. Entrepreneurs will only seek to reinforce the common pedagogies, ignoring those who are working in collaborative or communal modalities. Capitalism likes behaviourist, didactic pedagogies because they’re easy to replicate in a software environment. Thoughtful solutions? No. Profit opportunities? Yes. And 99% of them fail. Look at the first EdTech boom (circa Web 2.0, or 2005) – how many of those companies have lasted? Very few.

But before we can talk about the future, let’s review the past…

Sure, but I don’t trust you to get it complete, right, or even anywhere close to objective. Setting aside that just over half of the graphic deals with the pre-industrial era (and ignores the Gutenberg press), why do we need pre-industrial history at all, when education was a privilege of the rich – arguably up until the 1950s, but that’s an entirely different blog post – and a radically different thing? Oh, and the cherry-picked examples (much like the ones I’m using here) are problematic at best, misleading at worst. Apparently between 1996 (when the White House offered a billion dollars for computers in schools) and 2006 (Khan Academy) nothing happened.

Except a whole lot happened. There’s this thing called the LMS that happened. Videotaping lectures (a practice dating back to the 1970s) and digitizing them onto CD-ROM, then DVD, then the Internet, happened. Several ePortfolio companies started. Oh yeah, this thing called WordPress. What about Wikipedia? Yeah, nothing important there. Open source software doesn’t really work in a venture capitalist world.

…and the current edtech landscape.

Misses so much that you’d hardly have time to write out what it misses.

Luckily, there are some great companies who are working to change this by focusing on helping parents give their children a solid educational foundation. The space is dominated by apps and gamification, which appeals to children’s natural curiosity and provides research-driven cognitive and non-cognitive activities to help facilitate development. On-demand app-based learning is a great supplement for families who cannot afford pre-K.

It’s not luck; it was a gap perceived by startups, and a few companies made a few apps that were decent for pre-K. I don’t know about pre-K efficacy, but I do want to question the idea that poor people are using apps because they can’t afford pre-K. I’d suspect that poor people aren’t using anything, because they can’t afford the devices that apps run on, nor the monthly fees (if they exist). So the people using these apps are probably those who would be able to help educate their children anyway, pre-K or no pre-K.

Also the space is dominated by apps because there’s no organization to sell enterprise level software to.

The common school was built for everyone. It was open to all races, classes and backgrounds. It taught a common curriculum to every student. It was designed to process thousands of students and get them to a base level of competency. It was the era of mass production’s answer to educating a mass of students to prepare them to enter the workforce.

While it was built for everyone, it most certainly was not open to all races (uhhh, segregation?) or classes (often the poor sent no one to school because the farm needed workers). This whitewashing of educational history needs to stop. Mass production didn’t start in the post-Civil War era; it is commonly associated with Fordist principles, which coincide with the factory assembly line of the 1910s. Also, this whole passage glosses over the societal uses of school – socialization, networking, collaboration… of course it does.

But the era of the assembly line is over. We are in an age of mass customization, fueled by technology. Seth Godin recently asked, “What is school for? If you’re not asking that, you’re wasting time and money.” We need to question the traditional approaches to education and embrace new modes of learning to help create the next generation of leaders.

To create the next generation of leaders? Leaders develop themselves. We don’t need to target leaders, we need to target the other 99%. Seth Godin sucks. You have to ask what school is for to understand what purpose it serves in society (hint: it’s not to create leaders), not to use time well or earn money. At least one thing is right here: mass customization is here. But if we’re just swapping new UIs onto the old teaching methods, are we really improving things? I don’t think so.

Students are going to university because it is “the right thing to do,” often without a thought to the ROI on their education or the work opportunities after school. Only 19 percent of full-time college students graduate in four years, which dramatically increases the cost of their degree.

Why does education need to have an ROI? It’s not a business, it’s an education. The graduating-in-four-years trope is problematic, yes, but mostly because students are working more than they used to – precisely because of the cost of education – and can’t afford not to work. Part of that is the privatization of schools, part of that is the outsourcing of public funds into the hands of profit-motivated companies, part of that is the rising cost of administration, and part of it is the slowness of universities to adapt to a five-year model rather than four. None of these problems can be fixed with an app.

Over the past several years we’ve seen the rise of the modern edtech industry. There have been massive investments in the space, and the success of these firms will dictate the future of the edtech landscape.

More successful exits (like Lynda) will help to propel the industry forward. Investments in the first generation of edtech have also made it difficult for the second generation of companies to attract investment, as investors have been watching this first cohort closely to gauge results.

You do know that Lynda started in the mid-90s, right? You know, in that gap you illustrated between 1996 and 2006? Its “exit,” which I can only assume means the purchase of Lynda.com by LinkedIn as a value-add for LinkedIn users, hasn’t worked out quite so well for them. If you believe the purchase was meant to prop up the value of LinkedIn’s stock, as some do, then really it’s not about education at all, but finance.