LAK 2016 – Learning Design

We start the next session with a presentation from some Australian researchers on their OLT-funded project on how (tertiary) teachers can use learning analytics, i.e. given all of the data that we can collect, what would be meaningful for teachers, and when would they want to use it. They talked about the actions that teachers had taken in response to the learning analytics they had seen, and were disappointed that teachers had not used the analytics in their courses that much – but as we are really at the start of this area, I wouldn't expect much else.

Their framework built the teacher in as a critical factor, providing feedback on all aspects, with a focus on what kinds of learning interventions could be undertaken in response to the analytics. Some of the analytics that they explored included temporal aspects, such as how and when students were accessing materials, including pre- and post-activity around course events such as tutorials or assessments, as well as analytics specific to the LMS and the social media tools that might be used in the course.
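To make the temporal side concrete, here's a minimal sketch of the kind of pre/post-event access analysis they described. The log format, column names, and two-day window are my own assumptions for illustration, not details from the paper:

```python
import pandas as pd

# Hypothetical LMS access log: one row per student click, with a timestamp.
log = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3],
    "timestamp": pd.to_datetime([
        "2016-03-01 09:00", "2016-03-03 14:00",
        "2016-03-02 10:00", "2016-03-04 18:00",
        "2016-03-03 08:00",
    ]),
})

# A course event (e.g. a tutorial) whose surrounding activity we want to see.
event_time = pd.Timestamp("2016-03-03 12:00")
window = pd.Timedelta(days=2)

pre = log[(log.timestamp >= event_time - window) & (log.timestamp < event_time)]
post = log[(log.timestamp >= event_time) & (log.timestamp <= event_time + window)]

print("accesses in the 2 days before the event:", len(pre))
print("accesses in the 2 days after the event:", len(post))
```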

[Image: Learning analytics framework]

Unfortunately, the presenters didn't get a chance to talk about the part of their system that teachers used to reflect on their learning models, so I'll have to re-read the paper to learn more about this.

The second presentation looked at analysing data from a larger number of courses to explore how students engaged with different course designs, ranging from traditional chalk and talk to active learning and social learning courses. They found that courses following social constructivist models had much higher student engagement at the commencement of the course, although towards the end of the course engagement was similar to that of other learning models. A large amount of chalk and talk was negatively correlated with student engagement, while active learning was positively correlated. The key design factors that correlated with engagement were communication and experiential learning, although student perceptions of what they liked in learning designs were the opposite. Their hope is that by sharing this data with students, they might be able to motivate students to engage more with the learning design components that they tend to dislike.
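As a rough illustration of the kind of analysis this involves (not their actual method or data), correlating the share of time a course design gives to each activity type against an engagement measure might look like this; all column names and figures are made up:

```python
import pandas as pd

# Hypothetical per-course data: the percentage of study time each course
# allots to a learning-design category, plus an average engagement measure
# (e.g. weekly minutes in the VLE). All values are illustrative.
courses = pd.DataFrame({
    "communication_pct": [5, 20, 35, 10, 25],
    "assimilative_pct":  [70, 40, 25, 60, 35],  # roughly "chalk and talk"
    "experiential_pct":  [5, 15, 20, 10, 20],
    "engagement":        [120, 180, 240, 140, 200],
})

# Pearson correlation of each design category with engagement.
print(courses.corr()["engagement"].drop("engagement"))
```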

Which design factors were correlated with performance? The number one predictor of better retention was communication, but no factor was a clear indicator across learning models in terms of both performance and retention. I'll have to look more closely at this paper – a lot of data was presented very quickly, but a recently published journal paper by the same authors presents the data in more detail. Do we want our students to be happy, to be engaged, or to pass?

The third paper explored self-regulation strategies and how analytics can affect them. The presenter discussed some of the difficulties with our current approach to analytics, and the need to move away from a course focus towards supporting individual differences and personalised learning. Students within the same learning environment will approach it in different ways: some will only access some of the resources, and some will access none, which can be explored using different user profiles – I was a little disappointed to see the discussion of user profiles littered with unnecessary stereotypes; we should be beyond this.

[Image: What is wrong with our current approach to learning analytics?]

The approach taken in this paper was to cluster students by their self-regulated learning strategies, using data on physical attendance, online lecture viewing, online activity in the LMS, online assessments, and a self-regulation strategy survey. The clustering identified three groups: (1) no clear strategy, (2) little SRL and reliance on external regulation, and (3) SRL, falling back on external regulation when stuck. They didn't initially find any correlation in online access, but they did see some clustering in how students were using the resources. Unfortunately, they did not find any correlation with student performance. We need more research on how students use resources, and the order in which they access them, if we are to improve the predictive value of LMS usage.
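For a sense of how such a clustering might be set up (a sketch under my own assumptions, not the authors' pipeline), standardising the attendance, usage, and survey features and then running k-means with three clusters would look something like this:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-student features: attendance rate, online lecture views,
# LMS activity, online assessment attempts, and an SRL survey score.
# Random data stands in for the real measurements.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# Standardise so no single feature dominates the distance metric, then
# look for three groups, matching the three profiles reported.
X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)

for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} students")
```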

Papers:

A Conceptual Framework linking Learning Design with Learning Analytics, Aneesha Bakharia, Linda Corrin, Paula de Barba, Gregor Kennedy, Dragan Gasevic, Raoul Mulder, David Williams, Shane Dawson and Lori Lockyer

The impact of 151 learning designs on student satisfaction and performance: social learning (analytics) matters, Bart Rienties and Lisette Toetenel

Student differences in regulation strategies and their use of learning resources: implications for educational design, Nynke Bos and Saskia Brand-Gruwel
