Process notes for an evaluation study

As I mentioned yesterday, I have just begun a study to find out how refugees can best learn from MOOCs and other online resources, with the very kind support of the Kiron Higher Education Organisation’s staff, volunteers and learners, who have agreed to participate as a case study. I will give a bit more background and context to the study here for anyone who may be interested in following it. First, though, it’s important to note that Kiron themselves, and several of their Higher Education (HE) partners, are already undertaking research in this area: for example, by comparing the performance of refugees in MOOCs with that of enrolled students in a parallel mainstream HE course (where both groups take the identical assessment), and by mapping the performance outcomes of Kiron’s own learners who have participated in live online tutorials against those who have not. I hope that my research will augment and supplement this work.

I plan to carry out the research openly (following Pitt et al., 2016, and many other friends and colleagues in the open education sector) – maintaining strict anonymity for the refugee participants so as not to compromise them in any way – but sharing my processes, dilemmas and interim findings here under an open licence. The current study will continue until the end of 2017, and I aim to have a paper ready to share in the new year. This is to fit in with the requirements of my PhD programme, which has three modules in the first two years, each one culminating (hopefully!) in a publishable piece. This paper is for the second module, which is on evaluation in HE. (The first was on policy and change in HE – I will be presenting it at the ALT conference in September and blogging about it soon.) My research plan, as described here, has been developed with help from my PhD module tutor, Murray Saunders.

The focus of the research will be on what, in the learners’ perception, has helped them to succeed in learning online. The evaluation will be ‘developmental’ (Patton, 2011), meaning that the findings could be used to trigger change or enhancements to Kiron’s support programme for refugee learners. Further potential uses of the evaluation by other groups might include informing support programmes aimed at inclusivity or widening participation in higher education through open education.

I hope that this study will be the start of a more in-depth investigation into the learning strategies and open education support processes that enable refugees to succeed in higher education. Whilst I am individually responsible for the outputs of my PhD, I will be working in close partnership with the Kiron team and the research participants (a group of Kiron’s learners I met last weekend), and I hope to collaborate with other researchers working in the fields of open education and migration; I will also be sharing work-in-progress with my PhD peers at Lancaster and with fellow members of the GO-GN (Global OER Graduate Network).

[Image: Clarifying points in the brainstorm during discussion in focus group session with Kiron learners on 5/8/2017 (Photo by Donya Zikry)]

Next – a few notes about my research methodology and methods for this project. This will be an in-depth qualitative evaluation, with a relatively small number of participants (approximately 15). My methodology will be grounded theory/categorical analysis. The main idea behind this approach is that the researcher constructs themes from what the research participants have said, and then organises the data accordingly, thereby providing a new lens through which to look at the question or issue being explored.

My data-generation methods (I borrow the term from Pat Thomson, who makes a well-argued case for not talking about ‘data collection’) will involve progressive focusing, meaning that I will have a series of communication events with the participants, each time gathering a bit more information from them, analysing what they have said, and selecting points for more in-depth focus. I will then ask the participants for further information and repeat the analytic process. My data-generation ‘events’ will include focus groups (which I briefly described in the previous blog post), emails to the participants with prompts for them to reply via email, WhatsApp messages with prompts for them to send me text responses or recorded voice messages, face-to-face interviews (where possible) and online interviews. I will focus some of the research on how these different formats may have furthered (or hindered!) the generation of data. I will audio-record all verbal responses and interviews, but will not make full transcripts of these; instead, in the interests of efficiency, I will make notes and tag the specific points in my notes where I want to go back to the recording and transcribe a segment of speech for content analysis – the sketch below illustrates one simple way of keeping track of this.
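For anyone who keeps this kind of note-tagging digital, here is a minimal, purely illustrative sketch in Python of how timestamped notes and transcription flags could be stored against each recording. The file name, themes and note content below are invented for illustration only; they are not part of Kiron’s systems or of my actual study data.

```python
# Purely illustrative sketch: tagging points in interview/focus-group notes
# with timestamps, so that only flagged segments are transcribed later.
# All names and content are invented examples, not real study data.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Note:
    timestamp: str            # position in the recording, e.g. "00:14:32"
    summary: str              # brief note made while listening
    transcribe: bool = False  # flag: return to this segment and transcribe it?
    themes: List[str] = field(default_factory=list)  # emerging categories

@dataclass
class RecordingLog:
    recording_file: str
    notes: List[Note] = field(default_factory=list)

    def add(self, timestamp, summary, transcribe=False, themes=None):
        self.notes.append(Note(timestamp, summary, transcribe, themes or []))

    def segments_to_transcribe(self):
        # only the notes flagged for transcription
        return [n for n in self.notes if n.transcribe]

# Example usage (invented content)
log = RecordingLog("focus_group_2017-08-05.mp3")
log.add("00:03:10", "Participant describes studying offline on a phone",
        transcribe=True, themes=["access", "self-direction"])
log.add("00:12:45", "General discussion of WhatsApp study groups",
        themes=["peer support"])

for note in log.segments_to_transcribe():
    print(note.timestamp, "-", note.summary, "| themes:", ", ".join(note.themes))
```

In practice this could just as easily be a spreadsheet; the point is simply that each flagged timestamp points back to a specific segment of audio for later transcription and content analysis.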

My analysis of the data will be informed by conceptual frameworks from the literature, particularly Garrison et al.’s Community of Inquiry framework (2010), with possible adaptations as recommended by Jaffer et al. (2017), and Saunders et al.’s (2005) notion of ‘provisional stabilities’. The evaluation will also focus on the usefulness of these conceptual frameworks.

I would love to receive thoughts and questions on this process as I go along, and any links to similar research in progress, so please feel free to comment below.

References

Garrison, D.R., Anderson, T. & Archer, W., 2010. The first decade of the community of inquiry framework: A retrospective. The Internet and Higher Education, 13, pp.5–9. Available at: http://dx.doi.org/10.1016/j.iheduc.2009.10.003.

Jaffer, T., Govender, S. & Brown, C., 2017. “The best part was the contact!”: Understanding postgraduate students’ experiences of wrapped MOOCs. Open Praxis, 9(2), pp.207–221. Available at: http://openpraxis.org/index.php/OpenPraxis/article/view/565/312.

Patton, M.Q., 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, New York: Guilford Press.

Pitt, R. et al., 2016. Open Research, Milton Keynes: OER Hub. Available at: http://oro.open.ac.uk/48035/1/OpenResearch.FINAL_.pdf.

Saunders, M., Charlier, B. & Bonamy, J., 2005. Using Evaluation to Create “Provisional Stabilities”: Bridging Innovation in Higher Education Change Processes. Evaluation, 11(1), pp.37–54. Available at: http://evi.sagepub.com/content/11/1/37.abstract.
