Narrative case studies (2x 600 words)

(1) Outputs and outcomes: A reflection on facilitating learning workshops

This case study reflects on my role as a Teaching Assistant in designing and facilitating weekly seminars for eleven MPA students on Making Decisions: Evidence and Evaluation, led by Professor David Eaves. It illustrates how I planned, adapted, and delivered sessions as students learned to analyse complex policy problems (A1, A2, K2, K3, V1). Each 90-minute seminar aimed to integrate readings and lectures by blending academic theory with the craft of policy analysis. For the module assignment, students were required to develop individual policy memos addressing a self-defined policy problem. The seminars therefore required formats beyond plenary discussion – creating space for experimentation, application, and reflection as students shaped their analysis (A2, K3, V1).

After a positive introductory seminar, I entered the second session intent on sustaining momentum – discussing all core readings, drawing out insights, and quickly moving into shaping students' coursework. Coming from a professional policy background, where workshops prioritise outputs and deliverables, I imported the same pace into a pedagogical setting. In retrospect, this misjudged the cognitive demands of learning. Attempting to cover dense readings and then switch into applied coursework created overload (K3). Discussion remained lively, but I felt I dominated the conversation, driving rather than facilitating learning. Without clear learning objectives, the session blurred into mini-lecturing rather than active inquiry (A2, V3).

This became a turning point. Drawing on insights from the Teaching Associate Programme, I redesigned the third seminar as a workshop with alternating paces – silent reflection, peer feedback, and group discussion – structured around Moore’s Strategic Triangle (Donahue, 2017), the week’s key reading. I defined a clear learning objective, aligned with Bloom’s revised taxonomy (Anderson & Krathwohl, 2001): for students to analyse their policy context through Moore’s Strategic Triangle and evaluate how imbalances across its dimensions shape policy feasibility. This intentional use of constructive alignment between objectives, activity, and reflection marked a shift from output-driven to learning-centred facilitation (A1, K2, K3, V3).

This also de-centred my role, positioning me as a facilitator rather than an instructor – designing my voice out of discussions. The result was sharper reasoning and more collaborative exchange amongst the students; each shift in activity re-energised the room and prompted them to interrogate one another's assumptions. I also found that preparing slides and prompts in advance reduced my own cognitive load, allowing me to focus on dialogue rather than the sequencing of tasks – enhancing the seminar's flow and inclusivity (A2, A4, K4, V1). Subsequent student feedback on this session highlighted its clearer structure, greater enjoyment, and increased confidence in applying the framework (K5).

While this session was successful and I received positive feedback, it still felt over-scoped. I was still trying to cover everything rather than allowing students the space to dwell on the material and play with it. What I learnt here was that in a learning context, the objective is cognitive development rather than artefact generation. This contrasted with professional facilitation, where tangible artefacts and actions often define success. This experience suggests that effective teaching depends on balancing process and productivity – a point I have since been very conscious of when planning subsequent seminars (A5, K5, V5).

This experience reframed my perception of learning. Clear objectives anchor the seminar, but pacing defines its depth. Students need space to exhaust their initial reasoning before new prompts can build insights. Shifting my mindset from output to process made seminars more engaging, inclusive, and intellectually generative. This approach reduces cognitive overload and fosters deeper engagement (K3, V3). Moving forward, this reflection has defined how I design and facilitate learning – prioritising quality of thinking over coverage and continuing to refine my practice through reflective development (A5, K5, V5).

References

(2) Designing feedback mechanisms to build evaluative judgement

This case study reflects on my role as a Teaching Assistant in designing and facilitating weekly seminars for eleven Master of Public Administration students on Making Decisions: Evidence and Evaluation, led by Professor David Eaves. It examines how I adapted feedback mechanisms in response to students' varied needs and levels of confidence, and how I refined my guidance to strengthen both the giving and receiving of feedback (A2, A3, K2, K3, V1).

My background in design shapes my belief that everything is a prototype to be shared early, revised often, and improved through critique. Initially, I applied this principle directly in seminars, encouraging students to share early drafts, test ideas informally, and use peer critique as a low-stakes rehearsal for their thinking. I quickly learned that this assumption was limited. Some students embraced early sharing; others found it anxiety-inducing, especially when English was not their first language or when they felt less secure in their analytical skills (V1, V2).

This reflection prompted me to reconsider how I asked students to share their thinking with me and with their peers, both inside and outside of seminars. Pushing students too early, whether conceptually or in how openly they shared, risked overwhelming them; limiting feedback to supportive comments, or limiting the opportunities for it, risked flattening the analytical demands of the module. The question became not simply when to give feedback but what kind of feedback would best support learning at different stages and encourage sharing. This is where I felt peer feedback could balance the exchange: recipients gain direction, while feedback-givers develop evaluative judgement – all while students become more comfortable with sharing. My task, then, was to design environments that enabled thoughtful critique while protecting psychological safety and creating opportunities to offer one-to-one support where needed (A1, A2, A3, K3, V1, V3).

These reflections shaped the design of the final seminar, where students needed to rehearse their policy memo defence. At this stage they required less direct feedback from me and more structured opportunities to critique each other. Working iteratively through several options with the other PGTAs – triads, whole-group pitches, rapid rotations, and full-class simulations – we assessed their strengths and limitations. Triads mirrored the assessment but offered uneven visibility; whole-group pitches built confidence but constrained depth; rapid rotations maximised participation but sacrificed nuance. Through this iteration we realised that what was fundamentally needed was structured practice in defending arguments, deliberate exposure to high-quality questioning, opportunities to observe and analyse others’ reasoning, and time for reflection (A1, K5).

We adopted a single full defence simulation involving three students (A, B, and C), where each defended once and acted as a peer critic twice. The remaining students served as examiners: they read the memo in advance, observed the defence, and contributed to a structured reflection after each round – effectively critiquing the defence and questioning. This format balanced authenticity and inclusivity: it mirrored the real assessment while ensuring that every student occupied an active cognitive role – defending, questioning, or analysing (A2, A3, A4, K2, K3, V1, V2).

What made this mechanism robust was its capacity to activate evaluative judgement across the whole class. Students learned to pose sharper questions by watching peers do so; they refined their reasoning by observing others defend their arguments; and they strengthened reflective capacity through post-round discussions. My role shifted from directing answers to scaffolding inquiry, prompting students to probe assumptions, identify evidence gaps, and articulate trade-offs (A2, K3, V3). To support this, I drew on an earlier seminar activity using de Bono's Six Thinking Hats (2017), which gave students analytical lenses through which to frame their questions and critiques.

This approach represents a deliberate, evidence-informed design choice aligned with PSF principles of constructive alignment, inclusive participation, and effective facilitation of learning. It reinforced that feedback is not a bolt-on component but a process that must be designed, paced, and mediated if it is to build genuine analytical capacity (A5, K5, V5).