When designing my methodology for this action research project, I carefully considered my fast-paced working environment and daily responsibilities. The 3D workshop sees students coming and going throughout the day, so I needed an efficient method to collect feedback about the new leaflets.
Initially, I considered observations as a primary data collection method. Observations would allow me to gather real-time insights into how the leaflet affected students’ ability to set up their files for laser cutting. This approach could capture environmental and contextual data often missed in surveys or interviews. However, several challenges made observations impractical. Observer bias, where behaviours aligning with expectations are emphasised, was a concern. Additionally, observations require time-intensive data collection, which wasn’t feasible during the workshop’s busiest period. Ethical concerns about privacy and obtaining consent further complicated the process, especially given the high demand in the workshop.
After careful consideration, I opted for surveys as my primary methodology. Surveys offer a time-efficient and flexible approach to data collection that is less prone to observer bias. They allow for gathering diverse types of data, including opinions and behaviours, while offering participants anonymity, which helps reduce social desirability bias. As accessibility is a key goal of this project, surveys provide a comfortable way for participants to share honest feedback. Surveys can also be distributed widely, enabling students to complete them off-site and at their convenience—a perfect fit for their busy schedules.
Designing Survey Questions
Evaluating teaching and learning resources is essential to improving teaching practices. As Rowley (2003) highlights, student surveys, while not fully objective, can provide valuable feedback. My survey questions focused on five key areas for analysis:
- Visual Design – layout, impact, and composition.
- Text and Content – readability and messaging.
- Structure – organisation and clarity.
- Colour Use – visual appeal and effectiveness.
- Typography – font selection and impact.
To collect a range of data, I included both open and closed questions. Closed questions, such as tick boxes and rating scales, simplify responses and increase participation, while open-ended questions provide qualitative insights. Keeping the survey concise, with just five questions, ensured it was manageable for students without overwhelming them.
The Final Survey Questions (https://forms.office.com/e/38K74tMTEH?origin=lprLink):
1. How helpful was the laser cutting file preparation leaflet? Choose one. (Scale from 'Unhelpful & unclear' to 'Excellent, very clear and helpful')
2. What aspects of the laser cutting file prep leaflet worked well? Tick the ones you agree with.
   - Clear language
   - Infographics
   - Colours
   - Images
   - QR codes
   - Information
   - Layout
   - Accessible
   - Other (space to write answer)
3. Would you feel confident preparing your file for laser cutting using just this leaflet? (Yes, I am confident / No, I'd need additional help / Unsure)
4. What would you add or change to the leaflet to improve it? (Open text response)
5. How likely are you to recommend the 3D workshop at LCC to a friend or classmate? (Scale of 1 to 10, with 10 being extremely likely)
Survey Distribution
Initially, I used printed surveys (see fig.1) in the workshop for immediate feedback. However, discussions with my tutor and peers highlighted potential biases and accessibility issues with in-person surveys. To address these concerns, I transitioned to a Microsoft Forms survey, enabling off-site completion and ensuring anonymity and consent. A QR code (see fig.2) was also displayed in the workshop for easy access. While this approach risked a lower response rate, since students would have to complete the survey in their own time, it better aligned with ethical and accessibility considerations.


Final Thoughts
Choosing surveys as my methodology balanced practicality and ethical standards within my busy workshop environment. This approach supported my goal of improving accessibility and developing resources that meet student needs effectively.
References
Rowley, J. (2003), “Designing student feedback questionnaires,” Quality Assurance in Education, Vol. 11 No. 3, pp. 142-149.