Wednesday, March 18, 2015

10:15am - 12:00pm - Location: Theater

1A: MOOCs - Assessments, Connections and Demographics

Wednesday, March 18, 2015

10:15am - 12:00pm

1B: Student Engagement and Behaviour

Wednesday, March 18, 2015

10:15am - 12:00pm

1C: Indicators and Tools for Awareness

#129 (Long) *Best Paper Nominee*: On the Validity of Peer Grading and a Cloud Teaching Assistant System; Tim Vogelsang and Lara Ruppertz - We introduce a new grading system, the Cloud Teaching Assistant System (CTAS), as an additional element alongside instructor grading, peer grading and automated validation in massive open online courses (MOOCs). The grading distributions of the different approaches are compared in an experiment with 476 exam participants. Of these, 25 submissions were graded by all four methods, while 451 submissions were graded only by peer grading and automated validation. The results of the experiment suggest that neither the CTAS nor peer grading simulates instructor grading (Pearson's correlations: 0.36, 0.39). If the CTAS, rather than the instructor, is assumed to deliver accurate grading, peer grading can be concluded to be a valid grading method (Pearson's correlation:
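The core comparison here is a Pearson correlation between pairs of grading methods over the same submissions. A minimal sketch of that comparison, using invented grades rather than the authors' data:

```python
# Sketch: correlate two grading methods over the same submissions.
# The grade lists are invented placeholders, not the study's data.
from scipy.stats import pearsonr

instructor_grades = [85, 70, 92, 60, 78]  # hypothetical instructor grades
peer_grades = [80, 75, 88, 65, 70]        # hypothetical mean peer grades

r, p = pearsonr(instructor_grades, peer_grades)
print(f"Pearson's r = {r:.2f} (p = {p:.3f})")
```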

#17 (Practitioner): How Should We Quantify Student Engagement?; Perry Samson - Student engagement is widely thought to be a key predictor of student motivation and achievement. Engagement has been defined as "both the time and energy students invest in educationally purposeful activities." Unfortunately, this definition does not identify which specific student actions to include in a quantification of engagement. This interactive presentation invites participants to consider how they would quantify student engagement using technology. The discussion will be informed by lessons learned at the University of Michigan, where a rich database of student participation in class has been collected and related to student outcomes.
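As a starting point for that discussion, one naive quantification is a weighted sum of counted classroom actions. The actions and weights below are illustrative assumptions, not Samson's definition:

```python
# Sketch: a hypothetical weighted engagement index for one student-session.
# Action names and weights are invented for illustration only.
weights = {"attended": 1.0, "answered_question": 2.0,
           "asked_question": 3.0, "took_notes": 1.5}

def engagement_score(actions: dict) -> float:
    """Weighted sum of counted actions for one student in one session."""
    return sum(weights[a] * n for a, n in actions.items() if a in weights)

print(engagement_score({"attended": 1, "answered_question": 4, "asked_question": 1}))
```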

#171 (Practitioner): Using an Activity Dashboard to Support Awareness and Reflection in a European Virtual Seminar; Maren Scheffel, Hendrik Drachsler and Marcus Specht - In order to support students in online learning environments in becoming more aware of and reflecting on their activities, we have implemented an Activity Dashboard within the learning environment of an online course. The Activity Dashboard provides feedback, visualised in radar diagrams and bar charts. At the end of the course an evaluation will be run and the different learning groups will be compared with one another. We are also looking into comparing the previous runs, where students had no dashboard, and hope to see differences in the behaviour of the students who have been supported with the Activity Dashboard.
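For illustration, a minimal matplotlib sketch of the kind of bar-chart feedback such a dashboard might show; the activity categories and counts are invented:

```python
# Sketch: one student's weekly activity counts against the group average.
# Categories and numbers are invented placeholders.
import matplotlib.pyplot as plt

activities = ["forum posts", "replies", "files shared", "wiki edits"]
student = [12, 7, 3, 5]     # hypothetical counts for one student
group_avg = [9, 10, 4, 6]   # hypothetical group averages

x = range(len(activities))
plt.bar([i - 0.2 for i in x], student, width=0.4, label="you")
plt.bar([i + 0.2 for i in x], group_avg, width=0.4, label="group average")
plt.xticks(list(x), activities, rotation=20)
plt.ylabel("count this week")
plt.legend()
plt.tight_layout()
plt.show()
```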

#57 (Long): Examining Engagement: Analysing Learner Subpopulations in Massive Open Online Courses (MOOCs); Rebecca Ferguson and Doug Clow - Massive open online courses (MOOCs) are now being used across the world to provide millions of learners with access to education. Many learners complete these courses successfully, or to their own satisfaction, but the high numbers who do not finish remain a subject of concern for platform providers and educators. In 2013, a team from Stanford University analysed engagement patterns on three MOOCs run on the Coursera platform. They found four distinct patterns of engagement that emerged from MOOCs based on videos and assessments. However, not all platforms take this approach to learning design. Courses on the FutureLearn platform are underpinned by a social-constructivist pedagogy, which includes discussion as an important element. In this paper, we analyse engagement patterns on four FutureLearn MOOCs and find that only two of the previously identified clusters apply in this case. Instead, we see seven distinct patterns of engagement: Samplers, Strong Starters, Returners, Mid-way Dropouts, Nearly There, Late Completers and Keen Completers. This suggests that patterns of engagement in these massive learning environments are influenced by decisions about pedagogy. We also make some observations about approaches to clustering in this context.
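A minimal sketch of the general approach, clustering per-week activity vectors with k-means; the synthetic data, feature encoding and choice of k = 7 are placeholder assumptions, not the authors' pipeline:

```python
# Sketch: cluster learners by their weekly activity profiles.
# Data and parameters are invented placeholders.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows: learners; columns: activity level in each of 8 course weeks.
weekly_activity = rng.integers(0, 4, size=(500, 8))

km = KMeans(n_clusters=7, n_init=10, random_state=0).fit(weekly_activity)
for label in range(7):
    print(f"cluster {label}: {(km.labels_ == label).sum()} learners")
```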

#119 (Long): Exploring Networks of Problem-Solving Interactions; Michael Eagle, Andrew Hicks, Barry Peddycord III and Tiffany Barnes - Intelligent tutoring systems and other computer-aided learning environments produce large amounts of transactional data on student problem-solving behavior. Previous work modeled the student-tutor interaction data as a complex network and successfully generated automated next-step hints as well as visualizations for educators. In this work we discuss the types of tutoring environments that are best modeled by interaction networks and how the empirical observations of problem solving result in common network features. We find that interaction networks exhibit the properties of scale-free networks, such as vertex degree distributions that follow a power law. We compare datasets from two versions of a propositional logic tutor, as well as two different representations of data from an educational programming video game. We find that statistics such as degree assortativity and the scale-free metric allow comparison of the network structures across domains and provide insight into student problem-solving behavior.
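A minimal sketch of the network statistics mentioned, computed with networkx on a standard scale-free graph rather than tutor data; the unnormalized s-metric below is used as a stand-in for the scale-free metric:

```python
# Sketch: degree statistics on a placeholder scale-free graph.
import networkx as nx

# Barabási-Albert graphs are a standard example of scale-free structure.
G = nx.barabasi_albert_graph(n=1000, m=2, seed=0)

degrees = [d for _, d in G.degree()]
print("max degree:", max(degrees))
print("degree assortativity:", nx.degree_assortativity_coefficient(G))

# Unnormalized s-metric: sum of deg(u) * deg(v) over all edges
# (normalization against a maximal-s graph is omitted here).
s = sum(G.degree(u) * G.degree(v) for u, v in G.edges())
print("s-metric (unnormalized):", s)
```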

#20 (Long): The LATUX Workflow: Designing and Deploying Awareness Tools in Technology-Enabled Learning Settings; Roberto Martinez-Maldonado, Abelardo Pardo, Negin Mirriahi, Kalina Yacef, Judy Kay and Andrew Clayphan - Designing, deploying and validating learning analytics tools for instructors or students is a challenge requiring techniques and methods from different disciplines, such as software engineering, human-computer interaction, educational design and psychology. Whilst each of these disciplines has consolidated design methodologies, there is a need for more specific methodological frameworks within the cross-disciplinary space defined by learning analytics. In particular, there is no systematic workflow for producing learning analytics tools that are both technologically feasible and truly underpin the learning experience. In this paper, we present the LATUX workflow, a five-stage workflow to design, deploy and validate awareness tools in technology-enabled learning environments. LATUX is grounded in a well-established design process for creating, testing and re-designing user interfaces. We extend this process by integrating the pedagogical requirements to generate visual analytics that inform instructors' pedagogical decisions or intervention strategies. The workflow is illustrated with a case study in which collaborative activities were deployed in a real classroom.

#85 (Short): Socioeconomic Status and MOOC Enrollment: Enriching Demographic Information with External Datasets; John D. Hansen and Justin Reich - To minimize barriers to entry, massive open online course (MOOC) providers collect minimal demographic information about users. In isolation, these data are insufficient to address important questions about socioeconomic status (SES) and MOOC enrollment and performance. We demonstrate the use of third-party datasets to enrich demographic portraits of MOOC students and answer fundamental questions about SES and MOOC enrollment. We derive demographic information from registrants' geographic location by matching self-reported mailing addresses with data available from Esri at the census block group level and the American Community Survey at the zip code level. We then use these data to compare neighborhood income and levels of parental education for U.S. registrants in HarvardX courses and the U.S. population as a whole. Overall, HarvardX registrants tend to reside in more affluent neighborhoods. U.S. HarvardX registrants on average live in neighborhoods with median incomes approximately 0.45 standard deviations higher than the U.S. population. Parental education is also associated with a higher likelihood of MOOC enrollment. For instance, a seventeen-year-old whose most educated parent has a bachelor's degree is more than five times as likely to register as a seventeen-year-old whose most educated parent has a high school diploma.
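The headline figure is a standardized difference: how many standard deviations the registrants' mean neighborhood income sits above the national mean. A sketch with invented numbers:

```python
# Sketch: standardized difference of neighborhood median incomes.
# All figures are invented placeholders, not census or HarvardX data.
import numpy as np

us_mean, us_sd = 53_000, 24_000  # hypothetical national mean and SD
registrant_incomes = np.array([61_000, 72_000, 58_000, 90_000, 55_000])

effect = (registrant_incomes.mean() - us_mean) / us_sd
print(f"standardized difference: {effect:.2f} SD")
```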

#95 (Short): Towards Better Affect Detectors: Effect of Missing Skills, Class Features and Common Wrong Answers; Yutao Wang, Neil Heffernan and Christina Heffernan - The well-studied affect detectors of Baker et al. for boredom, frustration, confusion and engaged concentration, built with the ASSISTments dataset, were used to predict state test scores, college enrollment, and even whether a student majored in a STEM field. In this paper, we present three attempts to improve upon current affect detectors. The first attempt analyzed the effect of missing skill tags in the dataset on the accuracy of the affect detectors. The results show a small improvement after correctly tagging the missing skill values. The second attempt added four features related to student classes for feature selection. The third attempt added two features describing students' common wrong answers for feature selection. Results showed that two of the four detectors were improved by adding the new features.
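A minimal sketch of the experimental pattern in the second and third attempts: append candidate features and compare cross-validated detector performance; the data, feature names and model are placeholder assumptions, not the authors' detectors:

```python
# Sketch: does adding candidate features improve a detector's AUC?
# All data and features are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
base = rng.normal(size=(400, 10))   # stand-in for original detector features
extra = rng.normal(size=(400, 2))   # e.g. common-wrong-answer features
y = rng.integers(0, 2, size=400)    # affect label (e.g. bored / not bored)

for name, X in [("base", base), ("base + new", np.hstack([base, extra]))]:
    auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                          cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```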

#116 (Short): Learning Analytics beyond the LMS: the Connected Learning Analytics Toolkit; Kirsty Kitto, Sebastian Cross, Zak Waters and Mandy Lupton - We present the Connected Learning Analytics (CLA) toolkit, which enables data to be extracted from social media and imported into a Learning Record Store (LRS), as defined by the new xAPI standard. A number of implementation issues are discussed, and a mapping is introduced that enables the consistent storage, and subsequent analysis, of xAPI verb/object/activity statements across different social media and online environments. A set of example learning activities is proposed, each facilitated by the learning analytics beyond the LMS that the toolkit enables.
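For readers unfamiliar with xAPI, a minimal sketch of the actor/verb/object statement shape that an LRS stores; the learner, verb choice and tweet URL are illustrative, and a real deployment would send the statement to the LRS's statements endpoint with proper authentication:

```python
# Sketch: a minimal xAPI statement describing a social-media activity.
# Names and URLs are illustrative placeholders.
import json

statement = {
    "actor": {"name": "Example Learner", "mbox": "mailto:learner@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/shared",
             "display": {"en-US": "shared"}},
    "object": {"id": "https://twitter.com/example/status/1",
               "definition": {"name": {"en-US": "a course-related tweet"}}},
}
print(json.dumps(statement, indent=2))
```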

#54 (Short): How do you connect? Analysis of Social Capital Accumulation in connectivist MOOCs; Srećko Joksimović, Nia Dowell, Oleksandra Skrypnyk, Vitomir Kovanović, Dragan Gašević, Shane Dawson and Arthur C. Graesser - Connections established between learners via interactions are seen as fundamental to connectivist pedagogy. Connections can also be viewed as learning outcomes, i.e. learners' social capital accumulated through distributed learning environments. We applied linear mixed effects modeling to investigate whether social capital accumulation, interpreted through learners' centrality in course interaction networks, is influenced by the language learners use to express themselves and communicate in two connectivist MOOCs. Interactions were distributed across three social media platforms, namely Twitter, blogs and Facebook. Results showed that learners in a cMOOC connect more easily with individuals who use a more informal, narrative style, while still maintaining a deeper cohesive structure in their communication.
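A minimal sketch of a linear mixed-effects model in this spirit, regressing centrality on language measures with a random intercept per learner; the column names and data are invented placeholders, not the study's variables:

```python
# Sketch: mixed-effects model of centrality vs. language measures.
# All columns and values are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "centrality": rng.normal(size=200),
    "narrativity": rng.normal(size=200),  # hypothetical language measure
    "cohesion": rng.normal(size=200),     # hypothetical language measure
    "learner": rng.choice([f"L{i}" for i in range(50)], size=200),
})

model = smf.mixedlm("centrality ~ narrativity + cohesion", df,
                    groups=df["learner"])
print(model.fit().summary())
```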

#74 (Short): Exploring College Major Choice and Middle School Student Behavior, Affect and Learning: What Happens to Students Who Game the System?; Maria Ofelia San Pedro, Ryan Baker, Neil Heffernan and Jaclyn Ocumpaugh - Choosing a college major is a major life decision. Interests stemming from students' ability and self-efficacy contribute to eventual college major choice. In this paper, we consider the role played by student learning, affect and engagement during middle school, using data from an educational software system used as part of regular schooling. We use predictive analytics to leverage automated assessments of student learning and engagement, investigating which of these factors are related to a chosen college major. For example, we already know that students who game the system in middle school mathematics are less likely to major in science or technology, but what majors are they more likely to select? Using data from 356 college students who used the ASSISTments system during their middle school years, we find significant differences in student knowledge, performance, and off-task and gaming behaviors between students who eventually choose different college majors.  
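A minimal sketch of the predictive-analytics step, relating middle-school behavior features to an eventual major via multinomial logistic regression; the features, major categories and data are invented, not the ASSISTments dataset:

```python
# Sketch: predict an eventual major category from behavior features.
# Features, labels and data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(356, 4))  # e.g. knowledge, performance, off-task, gaming
majors = rng.choice(["STEM", "business", "humanities", "health"], size=356)

clf = LogisticRegression(max_iter=1000).fit(X, majors)
probs = clf.predict_proba(X[:1])[0].round(2)
print(dict(zip(clf.classes_, probs)))
```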

#118 (Short): Developing an Evaluation Framework of Quality Indicators for Learning Analytics; Maren Scheffel, Hendrik Drachsler and Marcus Specht - This paper presents results from the continuous process of developing an evaluation framework of quality indicators for learning analytics (LA). Building on a previous study, a group concept mapping approach that uses multidimensional scaling