Dr. Dan Davis (https://dan7davis.github.io/) recently received his doctorate from Delft University of Technology for his thesis titled Large Scale Learning Analytics: Modeling Learner Behavior and Improving Learning Outcomes in Massive Open Online Courses. The PhD project was funded by the Leiden-Delft-Erasmus Centre for Education and Learning. In his project, he used and advanced learning analytics techniques in open, online education at scale by designing and experimenting with personalized and adaptive learning environments. Dan designs, develops, and evaluates instructional interventions at scale to gain a deeper understanding of how the design of online learning environments affects learner engagement and success. He is currently a director of product management at Pearson.
In addition to congratulating him on his recent accomplishment, we asked him a few questions to gather his latest insights on developments in open, online education:
Q1: Could you briefly explain how you came to research the topics you have chosen in your thesis? What is it about these topics that first interested you?
During my bachelors and masters, I studied mass communication and technology. I’ve always been intrigued by how information moves and is transferred from person to person. During my masters, Georgetown had just begun its partnership with edX and was in the process of planning its first MOOCs. I started by helping film the video lectures for Georgetown’s Bioethics MOOC, then progressed to helping build the course in edX, and that ultimately culminated in my masters thesis, in which I analyzed the data from the course I had invested so much time in creating. That was my first exposure to learning analytics, and I found it fascinating enough to pursue further in a PhD.
Q2: What are the key learnings from your research and from doing research?
There are two key takeaways I hope my research contributes to the online education industry and research community: (i) online course instructors and designers have a responsibility to constantly experiment with their course design and delivery mechanisms. The space is still very much in its infancy and many questions of efficacy remain open; the online learning community is uniquely positioned to innovate and experiment with new designs in learning and pedagogy. (ii) One cannot simply apply learning science principles that have been found effective in traditional learning environments “out of the box” to online learning environments; many nuances must be addressed to enact the same cognitive processes among online learners for the desired effects to take place.
Q3: What were some of the difficulties you encountered and how did they influence your research topic?
The first and primary difficulty I encountered was the issue of non-compliance. When we delivered an experimental treatment to learners in an online course, we observed compliance rates ranging from 14% to 33%. That becomes very problematic for statistical power if one wants to draw generalizable conclusions about the effect of an intervention. The other difficulty I encountered was the fact that I did my PhD in a computer science department. As I mentioned before, my background was in communication, so it took a lot of hard work to play catch-up and acquire the computer science skills that were integral to performing large-scale learning analytics.
Q4: One of your research questions is “How can MOOC environments be improved to advance the possibilities of experimentation?”. How should MOOC environments be improved, and where do you think the development of MOOC environments is, or should be, heading?
I think the more pressing question here relates to the field of learning analytics. I am not really concerned with the future of MOOCs as much as I am with the future of online education. And for online education to flourish, we need the mindset and the data to drive positive change towards improving efficacy. To do that, we need to focus the field of learning analytics on actionable research: how can we harness educational data to improve learning outcomes and engage learners more effectively?
Q5: Do you see any tension between open, online education at-scale and personalized adaptive learning? How far do you think we are from personalized adaptive learning?
I think we are decades away from truly personalized, adaptive learning. In order to reach that milestone, we have to know precisely what every type of learner needs in every possible situation in their journey towards a learning outcome. Integral to that journey towards adaptivity, I believe, is a vast series of randomized trials like the ones in my thesis that aim to uncover what tends to work for which learners in which situations. However, while I ran around 11 of these experiments, I believe hundreds of thousands more are required to reach the level of granularity needed for personalized and adaptive learning to be attainable.
Q6: Of all the propositions that you have provided, which proposition do you find most difficult to defend? And why?
I find proposition 7 (The face-to-face student-teacher relationships formed at a university campus are irreplaceable and will become a luxury available only to a select few as more education is moved online.) the most difficult to defend. This is due to a tension I have felt throughout my time researching MOOCs and online education in general. My formal education background consists exclusively of small-scale, intimate learning environments. For my bachelors and masters, I never had a class with more than 30 students enrolled per professor. And now, here I am trying to find new ways to scale up online education and, in some ways, create more distance between professor and student. I mitigate that tension by working towards a future where large-scale, distance learning can also achieve those sorts of deep connections.
Q7: What advice would you give universities regarding the use of technology to enhance education?
Always look for new ways to innovate and test new designs in learning and teaching. And then share your results, no matter how positive, negative, or boring they may seem. Every bit of information is valuable for pushing the industry forward and serving learners as best we can.