Dealing With D2L’s Data Hub with PowerBI

Where I work, we use D2L Brightspace as our LMS. We also used to use Insights, their data dashboard product, but we didn’t find value in it and turned it off in 2016 or so (just as they were revamping the product, which I think has gone through a couple of iterations since the last time I looked at it). The product in 2016 was not great, and while there was the kernel of something good, it just wasn’t where it needed to be for us to make use of it at the price point where we as an institution would see value. I will note that we also weren’t in a place where, as an institution, we saw a lot of value in institutional data (which has changed in the last few years).

Regardless, as a non-Performance+ user, we do have access to D2L’s Data Hub, which has somewhat replaced the ancient Reporting tool (that thing I was using circa 2013) and some of the internal data tools. The Data Hub is great, because it allows you access to as much data as you can handle. Of course, for most of us that’s too much.

Thankfully, we’re a Microsoft institution, so I have access to PowerBI, which is a “business intelligence tool” but for the sort of stuff that I’m looking for, it’s a pretty good start. We started down this road to understand LMS usage better – so the kind of things that LMS administrators would benefit from knowing: daily login numbers, aggregate information about courses (course activation status per semester), average logins per day per person and historical trending information.

The idea is that by building this proof-of-concept for administrators, we can then build something similar for instructors to give them course activity at a glance, but distinct from the LMS. Why not just license Performance+ and manage it through the Insights report builder? Well, fiscally, that’s more expensive than the PowerBI Pro licensing – which is the only additional cost in this scenario. Also, Insights report builder is in Domo, and there are frankly fewer experts in Domo than in PowerBI. Actually, the Insights tool is being revamped again… so there’s another reason to wait on that potential way forward. Maybe if the needs change, we can argue for Performance+, but at this time it’s not in the cards, as we’d need to retrain ourselves, on top of migrating existing workflows and managing different licenses. It would probably reveal an efficiency in that, once built, we don’t have to manually update, and it’s managed inside the LMS so it’s got that extra layer of security.

Where I’m at currently is that I’ve built a couple of reports and put them on a PowerBI dashboard. Really, relatively simple stuff. Nothing automated. Those PowerBI reports are built on downloading the raw data in CSV format from Brightspace’s Data Hub once a week, extracting those CSVs from the zip format (thankfully Excel no longer strips leading zeros automatically, so you can examine the data in Excel as a middle step), placing them in a folder, starting up PowerBI and then refreshing the data.
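The weekly zip shuffle can at least be semi-automated on the local side. Here’s a minimal sketch of the “extract the CSVs and drop them in the folder PowerBI reads from” step – the folder names are placeholders for however you organize your own file structure:

```python
import zipfile
from pathlib import Path

def extract_data_hub_zips(download_dir, data_dir):
    """Pull every CSV out of the Data Hub zips in download_dir
    and place them in data_dir for PowerBI to refresh against.
    Returns the list of CSV paths written."""
    out = Path(data_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    for zip_path in sorted(Path(download_dir).glob("*.zip")):
        with zipfile.ZipFile(zip_path) as zf:
            for name in zf.namelist():
                if name.lower().endswith(".csv"):
                    zf.extract(name, out)
                    written.append(str(out / name))
    return written
```

Run it against your downloads folder before opening PowerBI and the only manual step left is clicking refresh.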

The one report that I will share specifics about just counts the number of enrollments and withdrawals each day, but every report we use follows the same process – download the appropriate data set, extract it from the zip, place it in the correct folder within the PowerBI file structure on a local computer. PowerBI can then publish the dataset to a shared Microsoft Teams team so that only the appropriate people can see it (you can also manage that in PowerBI for an extra layer of security).

I obviously won’t share all the work we’ve done in PowerBI; some of the data is privileged and should be for LMS admins only. However, I can share the specifics of the enrollments and withdrawals counts report, because the dataset we need for enrollments and withdrawals is, not shockingly, “Enrollments and Withdrawals” (not the Advanced Data Set). We also need “Role Details”, to allow filtering based on roles. So you want to know how many people are using a specific role in a time period? You can do it. We also need “Organizational Units” because, for us, that’s where the course code lives, which contains the semester information – and if we ever need to see a display of what’s happening in a particular course, we could. Your organization may vary. If you don’t have that information there, you’ll need to pull some additional information, likely in Organizational Units and Organizational Parents.

In PowerBI you can use the COUNTA function (which, like COUNTA in Excel, counts non-blank values) to create a “measure” for the count of Enroll actions and a separate one for Unenroll. Those can be plotted (a good choice is a line chart with two lines on the y-axis). Set up filters to filter by role (translating them through a link between Enrollments and Withdrawals and Role Details).
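If you want to sanity-check the numbers your measures produce, the same counting logic is trivial to run against the raw CSV outside PowerBI. A sketch – note the “Action” and “EnrollmentDate” column names are assumptions here, so check them against your own extract:

```python
import csv
from collections import Counter

def daily_counts(csv_path):
    """Tally Enroll and Unenroll actions per day from the
    Enrollments and Withdrawals CSV. Column names ('Action',
    'EnrollmentDate') are assumptions -- verify in your extract."""
    enrolls, unenrolls = Counter(), Counter()
    with open(csv_path, newline="", encoding="utf-8-sig") as f:
        for row in csv.DictReader(f):
            day = row["EnrollmentDate"][:10]  # keep only the date part
            if row["Action"] == "Enroll":
                enrolls[day] += 1
            elif row["Action"] == "Unenroll":
                unenrolls[day] += 1
    return enrolls, unenrolls
```

If these totals and the dashboard disagree, the problem is usually in the relationships or filters, not the data.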

I will note here: drop every last piece of unused data. Not only is it good data security practice, but PowerBI Pro has a 1 gig limit on its reporting size, which is a major limitation on some of the data work we’re doing. That’s where we start to actually get into issues, in that not much can be done to avoid it when you’re talking about big data (and the argument for just licensing Performance+ starts to come into play). While I’m a big fan of distributing services and having a diverse ecosystem (with expertise able to be drawn from multiple sources), I can totally see the appeal of an integrated data experience.

Going forward, you can automate the zip downloads through Brightspace’s API, removing the tedious weekly update (or daily, if you have Performance+ and get the upgraded data set creation that’s part of that package). Also, doing anything you want to share currently requires a Pro license for PowerBI, which is a small yearly cost, but there’s some risk as Microsoft changes bundles and pricing, and that may be entirely out of your control (like it is where I work – licensing is handled centrally by our IT department – and cost recovery is a tangled web to navigate). Pro licenses have data limits, and I’m sure the Enrollments data is sitting at 2GB and growing. So you may hit some data caps (or, in my experience, it just won’t work) if you have large usage. That’s a huge drawback, as most of the data in large institutions will surpass the 1GB limit. For one page of visualizations, PowerBI does a good job of remembering your data sanitization process, so as long as the dataset itself doesn’t change, you are good.

Now, if you’re looking to do some work with learning analytics, and generate the same sort of reports – Microsoft may not be the right ecosystem for you as there’s not many (if any) learning analytics systems using Microsoft and their Azure Event Hubs (which is the rough equivalent of the AWS Firehose – both of which are ingestion mechanisms to get data into the cloud). That lack of community will make it ten times harder to get any sort of support and Microsoft themselves aren’t particularly helpful. In just looking at the documentation, AWS is much more readable, and understandable, without having a deep knowledge of all the related technologies.

A Weird Tidbit from 2009

From: Jon Kruithof
(email)
Mohawk College, Hamilton, ON, Canada media education
Date: 12/08/2009 06:22:08
A prevailing undercurrent is that education, like medicine, is a bitter pill to swallow. I agree with Ryan Cameron, and do think that educators have to be media moguls in some sense. It seems to me that e-learning in general will not change, but the educator behind it will have to adapt. The educator will have to be a scriptwriter, web designer, filmaker, image manager and do it well, and for less. With that critical consumption of media and the ability to discern fact from fiction will become more important. As we move further into the future, and into virtual worlds, identity and who one is will become critical to manage and control.

From Downes’ work here: https://www.downes.ca/post/53334

Ryan Cameron might be this particular person, and dang, I got it right in some ways. Look at the successful MOOCs – they all have personality and tell stories while still cramming the learning in. I’m pretty sure this comment pre-dates Khan Academy (actually, no, it started in 2008, but the comment predates its ascent into the popular consciousness in 2012). And now we have the AI apocalypse (or more likely, the AI hype curve, followed by mainstream acceptance, much like Google Search became the de facto internet search). I don’t know if we’re at the tipping point of every educator needing to be well-versed in media techniques. But consumers and creators of media need to be aware of AI, and that everything they do can be taken from them, analyzed, dissected and filed away for acontextual (not a real word) retrieval.

And the one thing computers still don’t have on people is identity – and that’s where we can really retain our humanity. If you’re worried about AI, right now, it’s got a little (prefabricated) personality. Once it starts to develop a cohesive one, one that doesn’t change based on the varied inputs, then you need to worry. In my most nihilistic, I think it’s perfect that this endgame-capitalism that values a race to a zero sum for everything is essentially building an entire replacement for the human that built it. I still have hope that we can cast off this AI-hype like blockchain and VR and treat it like the Google replacement it’s trying to be.

Why are Third Party Vendors Such Arses?

The short answer is that they’re not. They’re experiencing culture shock between higher education and capitalism. Their goals and higher education’s are entirely different, and sometimes diametrically opposed. Sometimes they’re not, but I’ll leave that for the Marxists out there to critique.

I’ll outline a few examples, no names except where I need a name for a tool, because it’s too hard to keep using “middleware”, which could mean anything from a database to an API connector to something like IFTTT. I’m not writing this to shame edtech vendors or name-call, but if you are a vendor and you do these sorts of things – maybe consider stopping.

Hyper aggressive sales.

You’ve all seen this, or gotten emails day after day from the same vendor telling you about their great product. Or, you’ve been a teacher, and they call you periodically. Or more frequently. Daily even. I’ve gotten relentless edtech bros emailing me on LinkedIn then at work. By the way, if you do this and it’s part of your company culture, you do know that I mark that stuff as spam, right? All it does is create one of two things for a relationship… you either gain someone who just capitulates to you (but resents you) or you anger someone (who then holds a grudge for longer than an eon). Neither of those are great, but one is a sale. In an extreme case, you might get a cease and desist from a CIO who is tired of your harassment.

Circumventing process.

EdTech workers have definitely been asked for this sort of stuff continually. Move fast and break things is not a good mantra for education, nor public institutions. If your company wants to do it your way, rather than a standard LTI 1.3 kind of way, and then refuses to budge because your API way (to simply manage single-sign-on!) is already built, you’re an ass. If you are ever told, “we don’t just enable every option in LTI 1.3 settings” and you turn around and suggest you need all those data options – you most definitely don’t. If we have a process that we tell you takes months to go through, no, it can’t go quicker. It’s literally my job to ensure the security of the data in the system you’re trying to connect to, so work with me, not against me. It’s not my fault you left it to the last minute before semester and are trying to rush the integration through, literally using teachers as a sacrificial wedge to bypass security, privacy and accessibility. You know what that makes you.

Oh, and when the vendor agreement allows an instructor to sign off for an entire institution? That’s no good.

Data greediness.

Outlined above a little bit, but when you ask for an API integration, you should be able to easily answer “What API calls are you making?”. If you have an LTI 1.3 integration, and we ask “what do you use this data for?” you should be able to answer that within minutes of asking. Dancing around that question just raises my suspicions. You might actually need all that data. In 20 years of doing this work, and probably working on 100+ integrations with the LMS and other tools, it’s happened twice. Those two vendors were very quick to respond with what they use each data point for, how long they kept it, and why they needed it for those functions. That’s excellent service. Also that wasn’t the sales person… so yeah. Oh, and 99% of integrations between the LMS and something else can be done with LTI 1.3. Vendors out there, please use the standards. And get certified by IMS Global/1EdTech. It goes a long way to building your reputation.

Third-party malfeasance.

OK, it’s not that bad, but a new trend I’ve started seeing is a vendor using another vendor to manage something (usually data). EdLink is the sort of thing I’m thinking about here. EdLink allows easy connections between two unrelated vendors with no established connection method. So think, connecting McGraw Hill to your Student Information System (not the actual example I’m thinking of, to be clear – we don’t have, or want, a McGraw Hill connection to our SIS). To be honest, this doesn’t bother me as much as some of the other grievances I’ve got – but obfuscating your partnerships and running all your data through a third party that we don’t have an agreement with is definitely something that raises an eyebrow or three. As one starts to think about what-if scenarios (also my job), it makes clarity around who has your data, at what time, and for how long all the more difficult. The service doesn’t bother me, as long as the middle-person in the scenario is an ethical partner of the vendor you’re engaging with. In many cases, you need to have a level of trust in the partner, and if they’ve shown themselves as less than trustworthy, then well, you’ve got a problem.
Again, I’m sure EdLink is fine, but when a vendor uses EdLink, and is presented with that fact, it’s a challenge for security experts as they not only have to do one analysis, but two. I understand why a vendor might try to frame EdLink as their own service, but it’s undeniable that it isn’t. So just be honest and upfront. You may pass by a team that doesn’t prioritize this level of detail, but we are not blind. We will figure it out.

One other big challenge with third-parties acting on behalf of a vendor is that if there’s a problem, you typically have to go through the vendor to access the middle person’s support team to get it rectified. This adds a layer of complexity AND time to something that was likely intended to save time and hassle for the vendor.

(Ex) Twitter

I was an early adopter of Twitter, circa 2007 or so, but didn’t really get going for a couple years. I really loved it. While Jack Dorsey wasn’t an ideal CEO, and his vision and mine often differed, I didn’t believe his motives were anything but to build a business that he could eventually sell and disappear off into the clouds with millions of dollars. So, I understood why he and his company left hateful speech up (it engages/enrages people), though eventually someone would be so heinous as to get it taken down. Then 2016 happened, and well, America elected a guy who says the racist, sexist, hateful things out loud. In 2022, that Elon Musk fella bought it and I knew it was going to become a worse place. To be honest, I kinda stopped with Twitter (and all social media) during the pandemic. Sure, I’d browse through feeds, and out of all of them, Twitter was still useful – someone would post something at least once a day that would be click-worthy.

Twitter is no longer useful. With different people scattered across a multitude of places (Mastodon for some, Bluesky for others, and any number of other places for many more), educational technology is worse off.

In the face of the rampant racism, led by a man-child who does it all for the lols, I can’t be there any more, and I don’t want my 14-year-old account to represent me there anymore. I left it on the slight chance this wouldn’t happen, but it did.

I haven’t really thought about how I’ll replace the serendipity of Twitter. Of finding someone working on bullshit detection, or commentary about how Google’s ethics team was rapidly fired – or more important things like the Arab spring uprising and the Hong Kong protests. Or more mundane things like Mandarake’s posts about keshi from their store in Nakano Broadway. If you follow that link, welcome to my new obsession, Onion Fighter.

Now I did the right thing and downloaded my archive. I don’t know if they make sense to republish without the context of the surrounding conversation… or if it’s even feasible. Oh well, end of an era. Hope we find each other, wherever we end up!

Ch-Ch-Changes

I’m trying really hard to write more and this summer has been a challenge because I swore off doing what I normally do, which is too much. I took a semester or two off from my Master’s and got my head down in some work at work. I should probably update y’all on what I am doing because I typically write about that periodically – maybe not often enough?

So, I’m currently in my third year as Lead Learning Technologist – which is kind of middle management, kind of feet-on-the-ground type of role. I don’t have budgetary oversight, but I do oversee the LMS team and play a role in guiding that ship through the stormy weather. The LMS team is small (two front-line support analysts and one senior systems administrator who manages the backend processes) but rockstars from top to bottom. My role is to plan out the projects, lend a hand where possible, do some of the long-term visioning, put out some short-term fires, and help clean up messes. Ultimately, I’m not entirely responsible, but I am in many ways entirely responsible. That kind of middle spot is somewhat frustrating, but also rewarding. I get some freedom still, but I still have to run the major decisions up the flagpole. I also am the conduit between central IT and the LMS team, which sits within Teaching and Learning.

It took a while, but I let go of the last of the old job (which was front-line support and LMS administrator, plus more) which was doing the LMS updates for the university about six months ago. It felt weird, and I will admit I left that until the last because I frankly loved keeping my toe in the water. Perspective is a weird thing in that the closer I come to the top (and it’s only been one minor step up) I can see that everyone is as frustrated as I am at the middle management problem – you’re always constrained by someone else’s decisions. I’m lucky in that I can have that clarity and I couldn’t imagine how I would function if I wasn’t able to see it and it was hidden from me.

We’re doing two large projects at the moment, finally launching a course outline library tool (and re-architecting some decisions that weren’t exactly built into the first iteration of the design of the tool’s functionality) and doing an LMS RFP – which is overdue, because we were in a funded project for several years that precluded our ability to RFP, and then the pandemic happened. So this is the first time since… forever. I’m acting as technical lead on both of those projects. Technical lead is a weird thing that probably deserves its own chunk of writing (so wait for it, coming in 2025!).

I’ve been trying my best to help guide some things around AI without getting too involved in the policy, making diversity and equity a component of technological purchasing, and thinking a lot about process. I also spent a good chunk of time on this chapter in this ebook/anthology celebrating the 50th Anniversary of the Teaching and Learning centre at McMaster.

Other than that, it’s just business as usual.

ETEC 512: Applications of Learning Theories to the Analysis of Instructional Settings

This course was good, well designed, facilitated by a helpful, gracious, insightful instructor – and still sapped me of the will to live. I guess the big take-home for me is that I’m not particularly keen to chat about theorists, especially psychologists (who are doing a difficult job in trying to understand the most complex part of the human body).

While I understand the need to fill out this course to ensure that people have some knowledge of how people learn, it missed a lot of the educational theorists (maybe thinking that we’d already know them?) and in our offering, no one took Vygotsky and did a group project on his ZPD, so we missed a huge chunk of what I think a lot of modern course designs take into consideration.

As I’m writing this and reflecting back on the last few courses, I’m feeling like some curmudgeon, complaining about every little thing. While that’s not entirely untrue, I’m prone to that sort of whinging. So here’s a little more balanced attempt at what I think.

The course was structured to give a sampling of several different viewpoints on how people learn, and while it’s difficult to demand both depth and breadth in a subject, this course and its readings, as designed, tried to do that. I think ultimately it was unsatisfying (and again, a course doesn’t have to be satisfying in the sense of eating a nice meal) because the things I wanted out of it (discussion about learning design in the context of online learning, different theories of how and why people learn online, how theories impact educational technology) were never there explicitly. Now, it’s pretty simplistic to observe that LMSs replicate the very teacher-centric approach to technology in the classroom. Has nothing more been done to expand on this in the last two decades? I cannot fault the course, the facilitator or anything else, and there is in fact nothing stopping me (except time) from diving into Vygotsky and reading the ever-loving hell out of it. It just felt so jam-packed with theory that it never really dipped into the practical side. Again, I could’ve done that myself, but when a course is framed with grades, well, you’re going to try to achieve good grades, and some of the ancillary learning (and reading) goes by the wayside.

Actually another takeaway from the course is that Vygotsky’s ZPD probably has more applications in an online learning context than any other theory outside of the more modern ones (Connectivism, Rhizomatic Learning) that attempt to describe learning.

ETEC 511: New Foundations of Educational Technology

This was a core course and to me the framing of the course was slightly confusing. We talked about tools, and the two phenomenological positions that tools might occupy (tools control and condition us; tools are controlled by us). To me that was the key feature of the course, but it was clouded by some distracting approaches to the readings – there was never a clear linkage back to the core concept of the course, and while that makes for a challenging course… it also makes for a confusing one. The assessments never made a clear connection to the theoretical approach – in fact the rubrics had to be consulted to see the connections, which again could be the way the instructor approaches the course, or could be the way the course itself was constructed. I liked the use of other tools; however, I really, really wish this program would be genuinely student-centered and allow US to select the tools we want to use for communication. There’s a lot of hand-waving about being student-focused (at one point, the instructor made a point of saying “the LMS is terrible for teaching”, to which I wanted to respond: the LMS isn’t doing the teaching… it’s the place we the students are looking to keep track of stuff). We used Slack, which I have a personal set of problems with (the threading of the chat is limited at best; search is abysmal; I really have a problem with the way sub-channels? group conversations? are managed), and which seemed to be more the instructor’s choice than a collaborative effort.

And if one were concerned about student data sitting in a private, for-profit system hosted in the US like Slack, then Mattermost – available free to any UBC user – would make a ton of sense… but alas.

Technical choices aside (although in an educational technology course I don’t think you can put them aside), this course was disjointed and the assessments were all over the place – the individual assignments worth 5% apiece, some written, some requiring media elements to be designed. There was no equivalency in the time spent between them. I can write a page in a couple hours of focused work. I can create a video in about a day. In the end, I didn’t really want to engage with any of them, as they all duplicated effort from the weekly readings and discussions we’d already had on the topics. While I did find the variety of topics engaging, some of the assignments made some gross errors of assumption. Like, I can’t control the use of my phone. Or, I don’t use my technology critically. I’ve been working in technology-related fields since the late ’90s. I was early in on designing web pages. I saw some of the first JavaScript written to alter people’s behaviour on webpages (this was 1997-era advertising designed to draw people’s mouse pointers to elements – think image maps with gravity wells that slowed mouse speed and subtly drew the pointer to hover over objects with pop-up descriptions). I taught a course on searching the web as Google moved to a semantic engine for analyzing search results, shifting their focus from quality search to engagement on search and selling advertising. The majority of the general populace may not be attentive to attention; but the people in a Master’s-level program about technology should be paying attention. Professionals in the field damn well better be. I’m sure that particular assignment about attention could be framed more neutrally.

I realize the design has to hit both audiences for these courses – teachers new to the field and educational professionals who are seeking a post-graduate level degree (like myself).

I was shocked that there were no readings whatsoever about danah boyd’s work, or Ursula Franklin, or Neil Postman (beyond the one article), or, well, any of the history of the Internet. I’m lucky to have lived through it, but if you’re talking about the foundations of educational technology, you’re talking about the foundations of the world wide web. If you’re talking about the foundations of educational technology outside of the basic roots of web-based instruction – you really need to start talking about Audrey Watters’ most recent book, Teaching Machines. If you’re talking about online communities you need to include Howard Rheingold’s works. I guess the foundations course I’d design is wildly different from what UBC has done. That’s fine, and probably the perspective I need to hear rather than the perspective I’d want to hear. Most of that work was done outside academia. It’s not lost on me that most of the educational technology work is historically at-risk, as it’s been published on the open web and not in academic journals.

Outside of that, I really, really loved the first thing we did in the course, which was take time to think about settler relationships with indigenous populations through text analysis. It was a thoughtful exercise and I’m constantly thinking about how I can fold that into our work as educational technologists.

How To Use D2L Awards Across Multiple Courses

Parts of this post were drafted in 2018. I’ve left them as-is and finished the blog post because I think it’s kind of crucial to understand that things have (and haven’t) changed a lot in this space. The D2L Awards tool is in desperate need of improvements, which is related to the underlying release conditions logic limitations and the way courses are positioned in the LMS. Those are structural issues, and entirely not D2L’s fault. In fact, it’s an education problem. Anyways, to the post:

I’ve been working with D2L Awards since they became available at my institution, around the Fall of 2014. I’d spent some time prior to that adding my two cents on the Open Badges Community calls, and tried to add the higher education perspective where I could (around the badges specification and higher education policy). One of the great values of badges is that they are very transportable – they essentially belong to the earner, with some caveats (like the badge hasn’t been revoked, or expired). To me this makes a lot of sense when you think of skills developing and documenting learning, which is the areas I’ve been working in the last few years.

So when D2L announced that the Awards tool would issue Digital Badges, I was very very happy. Well, truth be told, I wasn’t happy at all, because I had been working on installing our own badge issuing server and integrating it with D2L, so all of that work (the previous year or so) was down the tubes. But the upside was that it was an integrated experience and worked out of the box (so to speak). One of the first challenges was to get global badges, or the sort of thing that might transcend a course to work. The theory was that if you earned several badges from several courses (in D2L admin-speak, course offerings) you’d need to have some sort of way to know that. The somewhat simple approach is to use a higher level organization unit to manage that for you.

Typically, in an LMS structure you have the top level, or Organization level; underneath that some form of Department or Faculty; and underneath that, Courses. D2L Brightspace also has these things called Templates above the Courses; other LMSs might have those structures, maybe not. Much of that structure is determined by your institution’s Student Information System (typically Banner or PeopleSoft, though it may be renamed to suit the institution’s whims).

An example organization structure with various levels of groups

To allow badges to be issued as a result of other courses, or as part of an unofficial grouping (think HR-related training), you will have to create a shadow structure that connects the Courses and Templates to a shadow Department. You could use the existing Departments to do this as well, but it’s generally safer to do this in a shadow organization rather than the real one. There’s little danger of doing anything damaging in this space, but you will need to be in and out of here doing enrollments. Some SISs already have enrollments at Department levels (we don’t) – so you definitely don’t want to mess with what your SIS does. If your SIS doesn’t do enrollments at the higher levels (excepting Organization and Course levels), then you could use existing structures, but you then risk breaking things if the SIS changes Departments or enrollments shift.

An example organization structure with various levels of groups, with a shadow structure to facilitate outcome achievement to levels above the course

The other benefit of a shadow structure is that you could combine things in unofficial ways. For instance, you could connect all the Community Education courses together across the institution, or connect experiential learning, or co-op… you get the idea.

Essentially, you don’t use the Awards tool to create the relationship; you use the Competencies tool, with the Award as the outcome of the relationship. The Competency feeds up the hierarchy of organizational units, and you can then trigger awarding a Badge or Certificate from the activity that granted it (in whole or in part) at a higher level. The student would get a badge from an organization they may or may not see (depending on the D2L permissions at that level, if any).

ETEC 511 – Project Retrospective

I feel that this was an interesting project to be a part of. Coming into it late eliminated some of my ability to influence the direction of the group, but I think my role became that of an early critic – asking pointed questions so that I could understand why decisions were made – and then helping support them. One of the things I often did in meetings was try to tie our decisions back to course readings and course content, and to think about the project less as a project and more as an assignment. I think I also served as a bit of a wrangler of ideas, trying to limit scope creep.

One other thing I did, in my section (accessibility legislation), was to refine the information into questions and answers, so that the section became conversational. I recall talking about making our language plain, simple, and understandable, how we could use that as an engagement strategy, and how that would in turn make the tool more usable – which was, I think, my most important contribution to the process. I will admit to my passenger-ness on the project; I did not feel 100% like it was my project (again, coming in late made that a challenge), so I always saw my role as a bit of a servant to the whole.

From a Nielsen (2003) perspective, Twine was easy to use, and the basics were quick to learn. The simplicity of the tool's design (questions and answers with expositional text) was easy to construct in Twine. The complexity of the subject matter, however, made it difficult to manage how each component piece linked. Had we not limited the choices we discussed, we absolutely would have been tangled in a web of, well, Twine. I think that the affordances of Twine's output, especially in the way we designed the tool, kept the complexity down, which in turn allowed the tool to ultimately be more usable.

I don't know if Twine ended up being the best choice, as we had to bend it quite a lot to make it do what we wanted. The tool ultimately configured us as designers, because we were pretty locked into Twine. While it ended up perfectly fine, it did limit us in some ways, given our technical understanding of what was possible and the time it would have taken to push Twine beyond what it is built for. I wonder if this would have been simpler to build in HTML, and whether that would have delivered a more accessible tool in the end.

References

Nielsen, J. (2003). Usability 101: Introduction to usability. Useit. 

ETEC 511: IP#8 – Attention

This assignment includes a requirement to do an attentional record. Here it is:

| Time | Activity | Distractions |
|------|----------|--------------|
| 9-10 | Work tasks, email, tickets | Moving from task to task – completed chunks then moved to a different task |
| 10-11 | Work meeting | Cataloging was a distraction |
| 11-12 | 11-11:30 lunch; 11:30-12 work tasks, email, MS Teams | Moving from task to task – completed chunks then moved to a different task |
| 12-1 | Work meeting | Discussion raised some questions that I searched for; an MS Teams message came in from another team member, and I answered it |
| 1-2 | Work meeting (different) | |
| 2-3 | Video interview (2-6) | Needed to be present during this; phone turned off, no distractions |
| 3-4 | | |
| 4-5 | | |
| 5-6 | | |
| 6-7 | Dinner prep and dinner, finish work | No phone/computer during dinner prep; work required focus and attention |
| 7-8 | TV (listening, not actively engaged) | Doodle on phone, check email, play games |
| 8-9 | TV (listening, not actively engaged) | Doodle on phone, check email, play games |
Attentional Record

This is a bit of a curious exercise, as it asks you to turn this data into some kind of visual. But all my visual storytelling skills tell me that a visual is not going to add any additional information; abstracting the data one step further would actually obfuscate it and make analysis harder. So I'm not going to do that.

This exercise was interesting for me, as the exercise itself was more distracting than my normal process. Typically, I am not a distracted person. I quite often choose not to look at my phone, or check email, or get distracted from what I am doing. If I am "distracted," chances are I'm bored (which is also how I relax: just not paying attention to anything, and no longer being actively engaged). Setting out an activity where I have to pay attention to my attention – well, that's a recipe for doubling down on my already disciplined approach to work, tasks, and life. So I don't know that I have some great revelatory technique for dealing with distractions. I'm not some ascetic monk; I just believe that being in the moment and present is important. In many ways, that's what Citton (2017) is talking about in the Joint Attention section of the book: "we are always attentive in a particular situation" (p. 83). In educational situations, attention varies depending on the student and their role, as if attention is social and co-constructed. However, there are social norms that drive attention (albeit younger students might adhere to them better than middle school students, who are more likely to be testing social norms). While I don't necessarily agree that attention is co-constructed, it is most certainly socially constructed (and our current social media world confirms this). Peer groups can "pay attention" to certain musical acts, and knowing those musical acts secures your social status. Those relationships are social. Families and friends are often the most important people in driving attention, and in my chart, the times when I'm with friends and family are also the ones where my attention is most undivided.

That sounds so high and mighty to write… but it's true. The attention that I pay has the most value when I value the people around me. Thinking beyond this particular chart, and into the territory of when I do use my phone for entertainment: it's in transit, between places, and alone.

I will also say that Citton's omission of Neil Postman's critiques of mass media as entertainment (and thus attention) is a gap that I paid attention to after reading the chapter.