Wednesday, March 18, 2015

10:15am - 12:00pm - Location: Theater

1A: MOOCs - Assessments, Connections and Demographics

Wednesday, March 18, 2015

10:15am - 12:00pm

1B: Student Engagement and Behaviour

Session Chair:

Wednesday, March 18, 2015

10:15am - 12:00pm

1C: Indicators and Tools for Awareness

Session Chair:

#129 (Long) *Best Paper Nominee*: On the Validity of Peer Grading and a Cloud Teaching Assistant System; Tim Vogelsang and Lara Ruppertz - We introduce a new grading system, the Cloud Teaching Assistant System (CTAS), as an additional element alongside instructor grading, peer grading and automated validation in massive open online courses (MOOCs). The grading distributions of the different approaches are compared in an experiment with 476 exam participants: 25 submissions were graded by all four methods, and 451 submissions were graded only by peer grading and automated validation. The results of the experiment suggest that neither the CTAS nor peer grading simulates instructor grading (Pearson's correlations: 0.36 and 0.39). If the CTAS, rather than the instructor, is assumed to deliver accurate grading, peer grading is concluded to be a valid grading method (Pearson's correlation:
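
As an illustration of the kind of comparison reported above, a minimal SciPy sketch (the grade vectors are invented, not the study's data) computes Pearson's correlation between two grading methods applied to the same submissions:

    # Minimal sketch: compare two grading methods on the same submissions.
    # The grade vectors below are invented for illustration only.
    from scipy.stats import pearsonr

    instructor_grades = [14, 9, 17, 11, 20, 7, 15, 12, 18, 10]
    peer_grades       = [12, 11, 16, 13, 18, 9, 13, 14, 17, 12]

    r, p_value = pearsonr(instructor_grades, peer_grades)
    print(f"Pearson's r = {r:.2f} (p = {p_value:.3f})")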

#17 (Practitioner): How Should We Quantify Student Engagement?; Perry Samson - Student engagement is widely thought to be a key predictor of student motivation and achievement. Engagement has been defined as "both the time and energy students invest in educationally purposeful activities." Unfortunately, this definition does not identify which specific student actions should be included in a quantification of engagement. This interactive presentation invites participants to consider how they would quantify student engagement using technology. The discussion will be informed by lessons learned at the University of Michigan, where a rich database of student participation in class has been collected and related to student outcomes.

#171 (Practitioner): Using an Activity Dashboard to Support Awareness and Reflection in a European Virtual Seminar; Maren Scheffel, Hendrik Drachsler and Marcus Specht - In order to support students in online learning environments to become more aware of and reflect on their activities, we have implemented an Activity Dashboard within the learning environment of an online course. The Activity Dashboard provides feedback, visualised in radar diagrams and bar charts. At the end of the course an evaluation will be run and the different learning groups will be compared with one another. We are also looking into comparing the previous runs, where students had no dashboard, and hope to see differences in the behaviour of the students who have been supported with the Activity Dashboard.

#57 (Long): Examining Engagement: Analysing Learner Subpopulations in Massive Open Online Courses (MOOCs); Rebecca Ferguson and Doug Clow - Massive open online courses (MOOCs) are now being used across the world to provide millions of learners with access to education. Many learners complete these courses successfully, or to their own satisfaction, but the high numbers who do not finish remain a subject of concern for platform providers and educators. In 2013, a team from Stanford University analysed engagement patterns on three MOOCs run on the Coursera platform. They found four distinct patterns of engagement that emerged from MOOCs based on videos and assessments. However, not all platforms take this approach to learning design. Courses on the FutureLearn platform are underpinned by a social-constructivist pedagogy, which includes discussion as an important element. In this paper, we analyse engagement patterns on four FutureLearn MOOCs and find that only two of the clusters identified previously apply in this case. Instead, we see seven distinct patterns of engagement: Samplers, Strong Starters, Returners, Mid-way Dropouts, Nearly There, Late Completers and Keen Completers. This suggests that patterns of engagement in these massive learning environments are influenced by decisions about pedagogy. We also make some observations about approaches to clustering in this context.
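
The paper does not reproduce its analysis code here, but engagement clustering of this kind is often approximated by clustering per-learner activity profiles; the sketch below (synthetic data, k-means with seven clusters standing in for the paper's own method) illustrates the idea with scikit-learn:

    # Illustrative sketch only: cluster learners by weekly activity profiles.
    # Data are randomly generated; the paper's feature set and clustering method are not reproduced.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # 1,000 learners x 8 course weeks: counts of steps visited per week (synthetic).
    activity = rng.poisson(lam=3.0, size=(1000, 8))

    X = StandardScaler().fit_transform(activity)
    labels = KMeans(n_clusters=7, n_init=10, random_state=0).fit_predict(X)

    for cluster in range(7):
        profile = activity[labels == cluster].mean(axis=0)
        print(f"cluster {cluster}: mean weekly activity {np.round(profile, 1)}")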

#119 (Long): Exploring Networks of Problem-Solving Interactions; Michael Eagle, Andrew Hicks, Barry Peddycord III and Tiffany Barnes - Intelligent tutoring systems and other computer-aided learning environments produce large amounts of transactional data on student problem-solving behavior. Previous work modeled the student-tutor interaction data as a complex network, and successfully generated automated next-step hints as well as visualizations for educators. In this work we discuss the types of tutoring environments that are best modeled by interaction networks and how empirical observations of problem solving result in common network features. We find that interaction networks exhibit the properties of scale-free networks, such as vertex degree distributions that follow a power law. We compare datasets from two versions of a propositional logic tutor, as well as two different representations of data from an educational programming video game. We find that statistics such as degree assortativity and the scale-free metric allow comparison of network structures across domains and provide insight into student problem-solving behavior.
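
For readers unfamiliar with the network statistics mentioned, the following NetworkX sketch computes a degree distribution, degree assortativity, and the (unnormalised) scale-free metric on a toy graph; the graph is synthetic, not the authors' interaction-network data:

    # Toy sketch: the network statistics discussed above, computed on a synthetic graph.
    # A Barabasi-Albert graph is used only because its degree distribution follows a power law.
    import networkx as nx
    from collections import Counter

    G = nx.barabasi_albert_graph(n=500, m=2, seed=42)

    degree_counts = Counter(d for _, d in G.degree())
    print("degree distribution (degree: count):", dict(sorted(degree_counts.items())))

    # Degree assortativity: do high-degree vertices tend to connect to other high-degree vertices?
    print("degree assortativity:", round(nx.degree_assortativity_coefficient(G), 3))

    # Unnormalised scale-free metric: sum of deg(u) * deg(v) over all edges.
    s = sum(G.degree(u) * G.degree(v) for u, v in G.edges())
    print("scale-free metric s(G):", s)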

#20 (Long): The LATUX Workflow: Designing and Deploying Awareness Tools in Technology-Enabled Learning Settings; Roberto Martinez-Maldonado, Abelardo Pardo, Negin Mirriahi, Kalina Yacef, Judy Kay and Andrew Clayphan - Designing, deploying and validating learning analytics tools for instructors or students is a challenge requiring techniques and methods from different disciplines, such as software engineering, human-computer interaction, educational design and psychology. Whilst each of these disciplines has consolidated design methodologies, there is a need for more specific methodological frameworks within the cross-disciplinary space defined by learning analytics. In particular, there is no systematic workflow for producing learning analytics tools that are both technologically feasible and truly underpin the learning experience. In this paper, we present LATUX, a five-stage workflow to design, deploy and validate awareness tools in technology-enabled learning environments. LATUX is grounded in a well-established design process for creating, testing and re-designing user interfaces. We extend this process by integrating the pedagogical requirements needed to generate visual analytics that inform instructors' pedagogical decisions or intervention strategies. The workflow is illustrated with a case study in which collaborative activities were deployed in a real classroom.

#85 (Short): Socioeconomic Status and MOOC Enrollment: Enriching Demographic Information with External Datasets; John D. Hansen and Justin Reich - To minimize barriers to entry, massive open online course (MOOC) providers collect minimal demographic information about users. In isolation, this data is insufficient to address important questions about socioeconomic status (SES) and MOOC enrollment and performance. We demonstrate the use of third-party datasets to enrich demographic portraits of MOOC students and answer fundamental questions about SES and MOOC enrollment. We derive demographic information from registrants' geographic location by matching self-reported mailing addresses with data available from Esri at the census block group level and the American Community Survey at the zip code level. We then use these data to compare neighborhood income and levels of parental education for U.S. registrants in HarvardX courses and the U.S. population as a whole. Overall, HarvardX registrants tend to reside in more affluent neighborhoods: U.S. HarvardX registrants on average live in neighborhoods with median incomes approximately 0.45 standard deviations higher than the U.S. population. Parental education is also associated with a higher likelihood of MOOC enrollment. For instance, a seventeen-year-old whose most educated parent has a bachelor's degree is more than five times as likely to register as a seventeen-year-old whose most educated parent has a high school diploma.
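
The enrichment step described above is, in essence, a join between registrant records and public census tables followed by a standardized comparison; the pandas sketch below uses invented values and column names (not the Esri/ACS data used by the authors) to illustrate the shape of that computation:

    # Illustrative sketch: join registrants to neighborhood income data by zip code and
    # express the registrant/population gap in standard-deviation units. All values,
    # columns and the toy "national" distribution are invented.
    import pandas as pd

    registrants = pd.DataFrame({"user_id": [1, 2, 3, 4],
                                "zip_code": ["02138", "02139", "10001", "60601"]})
    acs = pd.DataFrame({"zip_code": ["02138", "02139", "10001", "60601", "73301"],
                        "median_income": [95000, 81000, 67000, 58000, 43000]})

    enriched = registrants.merge(acs, on="zip_code", how="left")

    us_mean = acs["median_income"].mean()      # stand-in for the national distribution
    us_sd = acs["median_income"].std()
    gap_in_sd = (enriched["median_income"].mean() - us_mean) / us_sd
    print(f"registrants' neighborhoods are ~{gap_in_sd:.2f} SD above the (toy) national mean")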

#95 (Short): Towards Better Affect Detectors: Effect of Missing Skills, Class Features and Common Wrong Answers; Yutao Wang, Neil Heffernan and Christina Heffernan - The well-studied Baker et al. affect detectors for boredom, frustration, confusion and engaged concentration, built with the ASSISTments dataset, have been used to predict state test scores, college enrollment, and even whether a student majored in a STEM field. In this paper, we present three attempts to improve upon current affect detectors. The first attempt analyzed the effect of missing skill tags in the dataset on the accuracy of the affect detectors. The results show a small improvement after correctly tagging the missing skill values. The second attempt added four features related to student classes for feature selection. The third attempt added two features describing students' common wrong answers for feature selection. Results showed that two out of the four detectors were improved by adding the new features.
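
Adding candidate features and checking whether detector accuracy improves is, in spirit, a feature-addition experiment; a minimal scikit-learn sketch (synthetic data and hypothetical feature groups, not the ASSISTments detectors themselves) might look like this:

    # Sketch of the kind of feature-addition comparison described above.
    # Features and labels are synthetic; detector details are not reproduced here.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n = 2000
    base_features = rng.normal(size=(n, 10))   # e.g. existing interaction features
    new_features = rng.normal(size=(n, 2))     # e.g. common-wrong-answer features
    y = (base_features[:, 0] + 0.5 * new_features[:, 0] + rng.normal(size=n) > 0).astype(int)

    for name, X in [("baseline", base_features),
                    ("baseline + new features", np.hstack([base_features, new_features]))]:
        auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                              cv=5, scoring="roc_auc").mean()
        print(f"{name}: mean AUC = {auc:.3f}")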

#116 (Short): Learning Analytics beyond the LMS: the Connected Learning Analytics Toolkit; Kirsty Kitto, Sebastian Cross, Zak Waters and Mandy Lupton - We present a Connected Learning Analytics (CLA) toolkit, which enables data to be extracted from social media and imported into a Learning Record Store (LRS), as defined by the new xAPI standard. A number of implementation issues are discussed, and a mapping that will enable the consistent storage and subsequent analysis of xAPI verb/object/activity statements across different social media and online environments is introduced. A set of example learning activities is proposed, each facilitated by the beyond-the-LMS learning analytics that the toolkit enables.
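
The xAPI statements mentioned above follow a fixed actor/verb/object structure; the sketch below constructs one such statement and posts it to a hypothetical LRS endpoint (the URL, credentials and IDs are placeholders, not part of the CLA toolkit):

    # Minimal sketch of storing an xAPI statement in a Learning Record Store.
    # The endpoint, credentials and IDs are placeholders for illustration only.
    import requests

    statement = {
        "actor": {"mbox": "mailto:learner@example.org", "name": "Example Learner"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/commented",
                 "display": {"en-US": "commented"}},
        "object": {"id": "http://example.org/activities/discussion-thread-42",
                   "definition": {"name": {"en-US": "Course discussion thread"}}},
    }

    response = requests.post(
        "https://lrs.example.org/xapi/statements",
        json=statement,
        headers={"X-Experience-API-Version": "1.0.1"},
        auth=("lrs_user", "lrs_password"),
    )
    print(response.status_code, response.text)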

#54 (Short): How do you connect? Analysis of Social Capital Accumulation in connectivist MOOCs; Srećko Joksimović, Nia Dowell, Oleksandra Skrypnyk, Vitomir Kovanović, Dragan Gašević, Shane Dawson and Arthur C. Graesser - Connections established between learners via interactions are seen as fundamental for connectivist pedagogy. Connections can also be viewed as learning outcomes, i.e. learners' social capital accumulated through distributed learning environments. We applied linear mixed effects modeling to investigate whether social capital accumulation, interpreted through learners' centrality in course interaction networks, is influenced by the language learners use to express themselves and communicate in two connectivist MOOCs. Interactions were distributed across three social media platforms: Twitter, blogs and Facebook. Results showed that learners in a cMOOC connect more easily with individuals who use a more informal, narrative style, yet still maintain a deeper cohesive structure in their communication.
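
A linear mixed effects model of the kind described can be expressed compactly in statsmodels; the sketch below uses synthetic data and assumed variable names (centrality as the outcome, two language measures as fixed effects, course as the grouping factor), not the authors' actual variables:

    # Sketch of a linear mixed effects model relating network centrality to language measures.
    # Variable names, the grouping factor and the synthetic data are assumptions for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    n = 300
    df = pd.DataFrame({
        "narrativity": rng.normal(size=n),
        "deep_cohesion": rng.normal(size=n),
        "course": rng.choice(["cmooc_a", "cmooc_b"], size=n),
    })
    df["degree_centrality"] = (0.3 * df["narrativity"] + 0.2 * df["deep_cohesion"]
                               + rng.normal(scale=0.5, size=n))

    # Random intercept per course; fixed effects for the two language measures.
    model = smf.mixedlm("degree_centrality ~ narrativity + deep_cohesion",
                        data=df, groups=df["course"])
    print(model.fit().summary())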

#74 (Short): Exploring College Major Choice and Middle School Student Behavior, Affect and Learning: What Happens to Students Who Game the System?; Maria Ofelia San Pedro, Ryan Baker, Neil Heffernan and Jaclyn Ocumpaugh - Choosing a college major is a major life decision. Interests stemming from students' ability and self-efficacy contribute to eventual college major choice. In this paper, we consider the role played by student learning, affect and engagement during middle school, using data from an educational software system used as part of regular schooling. We use predictive analytics to leverage automated assessments of student learning and engagement, investigating which of these factors are related to a chosen college major. For example, we already know that students who game the system in middle school mathematics are less likely to major in science or technology, but what majors are they more likely to select? Using data from 356 college students who used the ASSISTments system during their middle school years, we find significant differences in student knowledge, performance, and off-task and gaming behaviors between students who eventually choose different college majors.  
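
At its simplest, the group comparison described above asks whether the mean of a middle-school behaviour measure differs across students grouped by eventual major; the sketch below (synthetic data, a one-way ANOVA standing in for the paper's analysis) illustrates that kind of test:

    # Illustrative sketch: test whether a behaviour measure (e.g. proportion of time spent
    # gaming the system) differs across eventual college-major groups. All values are synthetic.
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(4)
    gaming_stem = rng.normal(0.10, 0.05, 120)
    gaming_humanities = rng.normal(0.14, 0.05, 110)
    gaming_business = rng.normal(0.16, 0.05, 126)

    f_stat, p_value = f_oneway(gaming_stem, gaming_humanities, gaming_business)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")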

#118 (Short): Developing an Evaluation Framework of Quality Indicators for Learning Analytics; Maren Scheffel, Hendrik Drachsler and Marcus Specht - This paper presents results from the continuous process of developing an evaluation framework of quality indicators for learning analytics (LA). Building on a previous study, which used a group concept mapping approach based on multidimensional scaling and hierarchical clustering, the study presented here applies the framework to a collection of LA tools in order to evaluate it. Using the quantitative and qualitative results of this study, the first version of the framework was revisited so as to work towards an improved version of the evaluation framework of quality indicators for LA.
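
Group concept mapping typically combines multidimensional scaling of a dissimilarity matrix with hierarchical clustering of the resulting coordinates; the sketch below (random dissimilarities, not the study's sorting data) shows that pipeline with scikit-learn and SciPy:

    # Illustrative pipeline: multidimensional scaling followed by hierarchical clustering.
    # The dissimilarity matrix is random; the study's participant sorting data are not reproduced.
    import numpy as np
    from sklearn.manifold import MDS
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(2)
    n_statements = 40
    d = rng.random((n_statements, n_statements))
    dissimilarity = (d + d.T) / 2            # symmetrise the random matrix
    np.fill_diagonal(dissimilarity, 0.0)

    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dissimilarity)

    clusters = fcluster(linkage(coords, method="ward"), t=6, criterion="maxclust")
    print("statements per cluster:", np.bincount(clusters)[1:])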

Wednesday, March 18, 2015

1:00pm - 2:30pm - Location: Theater

2A: Institutional Perspectives

Wednesday, March 18, 2015

1:00pm - 2:30pm - Location:

2B: Students At Risk

Wednesday, March 18, 2015

1:00pm - 2:30pm

2C: Practice Across Boundaries

#146 (Practitioner): Developing policy for the ethical use of learning analytics at the Open University; Sharon Slade and Avinash Boroowa - Institutions are increasingly collecting, analysing and using student data with the aim of improving student satisfaction and success. The use of learning analytics within the Open University is relatively new and, as such, existing policies relating to potential uses of student data have received fresh scrutiny to ensure continued relevance and completeness. In response, the Open University has addressed a range of ethical issues relating to its approach to learning analytics through the implementation of a new policy. This presentation details the process undertaken and summarises the key principles on which the policy is built.

#142 (Practitioner): OU Analyse: Analysing at-risk students at The Open University; Jakub Kuzilek, Martin Hlosta, Drahomira Herrmannova, Annika Wolff and Zdenek Zdrahal - The OU Analyse project aims at providing early prediction of 'at-risk' students based on their demographic data and their interactions with the VLE. Four predictive models have been constructed from legacy data using machine learning methods. In Spring 2014 the approach was piloted on two introductory university courses with about 1,500 and 3,000 students, respectively. Since October 2014 the predictions have been extended to cover 10+ courses at different levels. For presenting predictions, the OU Analyse dashboard, which provides a course overview and a view of individual students, has been implemented. The presentation will include a demonstration of the OU Analyse system.

#60 (Research Panel): Learning Analytics: European Perspectives; Rebecca Ferguson, Adam Cooper, Hendrik Drachsler, Gábor Kismihók, Alejandra Martínez Monés, Kairit Tammets and Anne Boyer - Since the emergence of learning analytics in North America, researchers and practitioners have worked to develop an international community. The organization of events such as SoLAR Flares and LASI Locals, as well as the move of LAK in 2013 from North America to Europe, has supported this aim. There are now thriving learning analytics groups in North America, Europe and Australia, with smaller pockets of activity emerging on other continents. Nevertheless, much of the work carried out outside these forums, or published in languages other than English, is still inaccessible to most people in the community. This panel, organized by Europe's Learning Analytics Community Exchange (LACE) project, brings together researchers from five European countries to examine the field from European perspectives. In doing so, it will identify the benefits and challenges associated with sharing and developing practice across national boundaries.

#145 (Practitioner): Riding the tiger: dealing with complexity in the implementation of institutional strategy for learning analytics; Kevin Mayles - Implementing strategy for learning analytics across an institution is a complex task. The Open University UK is undertaking a major change programme to enhance its use of analytics to drive student success. The strategy has been developed around capability in three areas: data availability, creation of insight and the ability to apply analytics in practice to impact the student learning experience. Lessons learned from this case study in institutional strategy formulation and the management of structural, socio-political and emergent complexities during implementation will be highlighted and shared.

#96 (Long): Who, When, and Why: A Machine Learning Approach to Prioritizing Students at Risk of Not Graduating High School on Time; Everaldo Aguiar, Himabindu Lakkaraju, Nasir Bhanpuri, David Miller, Ben Yuhas, Kecia Addison, Shihching Liu, Marilyn Powell and Rayid Ghani - Several hundred thousand students drop out of high school every year in the United States. Interventions can help those who are falling behind in their educational goals, but given limited resources, such programs must focus on the right students, at the right time, and with the right message. In this paper, we describe an incremental approach that can be used to select and prioritize students who may be at risk of not graduating high school on time, and to suggest what may be the predictors of particular students going off-track. These predictions can then be used to inform targeted interventions for these students, hopefully leading to better outcomes.
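
One simple way to read the "select and prioritize" step is as ranking students by a model's predicted probability of not graduating on time; the sketch below (synthetic records, a gradient-boosting classifier standing in for the paper's models) illustrates that ranking:

    # Illustrative sketch: rank students by predicted risk so interventions can be prioritized.
    # Features and labels are synthetic; this is not the models evaluated in the paper.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    n = 5000
    X = np.column_stack([
        rng.normal(2.5, 0.7, n),      # e.g. GPA
        rng.poisson(5, n),            # e.g. absences
        rng.integers(0, 2, n),        # e.g. prior retention flag
    ])
    y = (0.8 * X[:, 1] - 2.0 * X[:, 0] + 3 * X[:, 2] + rng.normal(0, 2, n) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

    risk = model.predict_proba(X_test)[:, 1]
    priority_order = np.argsort(-risk)        # highest predicted risk first
    print("top 10 students to contact (test-set indices):", priority_order[:10])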

#23 (Long) *Best Paper Nominee*: Student privacy self-management: implications for learning analytics; Paul Prinsloo and Sharon Slade - Optimizing the harvesting and analysis of student data promises to clear the fog surrounding the key drivers of student success and retention, and offers potential for improved student success. At the same time, concerns are increasingly voiced about the extent to which individuals are routinely and progressively tracked as they engage online. The internet, the very thing that promised to open up possibilities and to break down communication barriers, now threatens to narrow those possibilities again through the panopticon of mass surveillance. Within higher education, our assumptions and understanding of issues surrounding student attitudes to privacy are influenced both by the apparent ease with which the public appear to share the details of their lives and by our paternalistic institutional cultures. As such, it can be easy to allow our enthusiasm for the possibilities offered by learning analytics to outweigh consideration of issues of privacy. This paper explores issues around consent and the seemingly simple choice to allow students to opt in or opt out of having their data tracked. We consider how three providers of massive open online courses (MOOCs) inform users of how their data are used, and discuss how higher education institutions can work toward an approach that engages and more fully informs students of the implications of learning analytics for their personal data.

#164 (Practitioner): Open-source Academic Early Alert and Risk Assessment API; Sandeep Jayaprakash, Josh Baron, Gary Gilbert, Eitel Lauria, Erik Moody and James Regan - The presentation sums up the technical decisions and design strategies involved in building an open-source academic early alert system. The system is an automation of the Open Academic Analytics Initiative (OAAI), a multi-year EDUCAUSE research grant focused on improving student retention rates. The system uses predictive analytics at its core to identify students who are potentially at academic risk of not completing a course. The presentation details the data integration and data mining stages used to formulate a potent prediction model using big data approaches. It also introduces an open-source early alert Application Programming Interface (API) developed using a generalized learning analytics processor framework that can potentially support a wide range of learning analytics solutions.

#56 (Long): OpenCourseWare Observatory – Does the Quality of OpenCourseWare Live up to its Promise?; Sahar Vahdati, Christoph Lange and Sören Auer - A vast amount of OpenCourseWare (OCW) is now being published online to make educational content accessible to larger audiences. Awareness of such courses among users and the popularity of systems providing them are increasing. However, subjective experience suggests that OCW is frequently cursory, outdated or non-reusable. In order to obtain a better understanding of the quality of OCW, we assess quality in terms of fitness for use. Based on three OCW use case scenarios, we define a range of dimensions according to which the quality of courses can be measured. From the definition of each dimension a comprehensive list of quality metrics is derived. In order to obtain a representative overview of the quality of OCW, we performed a quality assessment on a set of 100 randomly selected courses obtained from 20 different OCW repositories. Based on this assessment we identify crucial areas in which OCW needs to improve in order to live up to its promise.
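
A fitness-for-use assessment across several dimensions is often operationalized as a weighted aggregate of per-metric scores; the short sketch below uses invented dimensions, scores and weights purely to illustrate that kind of scoring, not the authors' actual metric list:

    # Illustrative scoring sketch: aggregate per-dimension quality metrics for one OCW course.
    # Dimension names, scores and weights are invented for illustration.
    course_scores = {          # each metric scored on a 0-1 scale
        "up_to_date": 0.4,
        "completeness": 0.7,
        "reusability": 0.5,
        "availability_of_source_files": 0.2,
    }
    weights = {"up_to_date": 0.3, "completeness": 0.3,
               "reusability": 0.2, "availability_of_source_files": 0.2}

    overall = sum(course_scores[dim] * weights[dim] for dim in course_scores)
    print(f"overall quality score: {overall:.2f} out of 1.00")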