About

This section describes the practice of using research and evidence to inform the design of interventions and to draw conclusions from evaluation and measurement.

We reference a number of papers that discuss considerations, frameworks and perspectives in the area of evaluation and evidence-led practice, before presenting some case studies that show how research and evidence have been used or gathered. These range from small-scale pilots to larger programmes.

In proposing these selected publications, we have noted the insights they might offer career practitioners, service managers and policymakers.


Contents


  1. Practices and outcomes - Demonstrations of achieving different outcomes in a range of settings

  2. Further illustrations and discussions - Examples and case studies of evaluation in practice, ranging from small-scale pilots to larger programmes

  3. Future research questions - Candidate topics for future research based on the CDI’s discussions with stakeholders.

1. Practices and outcomes

Selected publications that describe practices and outcomes for different challenges are listed below. For each, we give the citation (with a link), the insight it offers, and a brief description. We have mostly included open-access sources; where a source requires payment, this is noted next to the link by “(Paid)”.


Bimrose, J. (2004). What is effective guidance? Report by the University of Warwick for the Department for Education and Skills. (Link)

Defining successful outcomes of guidance

This 204-page report, funded by the Department for Education and Skills, was commissioned over two decades ago, but contains a wealth of information from n=57 international case studies that provide perspectives on the question posed by the report’s title. A combination of literature reviews and research into client expectations is used to provide a range of potential outcome metrics by which to judge the success of guidance, noting also that expectations differ between stakeholders.

Maguire, M. (2004). Measuring the outcomes of career guidance. International Journal for Educational and Vocational Guidance, 4, 179-192. (Link)(Paid)

Considering measures in light of contextual factors that affect outcomes from guidance

This paper is oft-cited in evaluation research. The author draws attention to the various factors that can characterise a career guidance intervention and influence its outcomes. From these reflections, the author proposes how to approach the selection of suitable evaluation measures. Implications are discussed for practice, research and policy-making.

Crust, G. (2007). The impact of career related interventions in higher education. Journal of the National Institute for Career Education and Counselling, 17(1), 16-22. (Link)

Making the case for evaluation

The paper sets out the case for evaluating career services and their effectiveness, using the context of a higher education setting. (Similar arguments to those proposed in this paper could be made in many other settings.) The topics covered span the commercial (cost-effectiveness), effectiveness (the need to target the capability gaps of potential users, e.g. career management skills, in order to help them effectively), and standards (the value of implementing an underlying process of change and of eliciting feedback to drive further improvement).

Baudouin, R., Bezanson, L., Borgen, B., Goyer, L., Hiebert, B., Lalande, V., Magnusson, K., Michaud, G., Renald, C., & Turcotte, M. (2008). Demonstrating Value: A Draft Framework for Evaluating the Effectiveness of Career Development Interventions. Canadian Journal of Counselling and Psychotherapy, 41(3). (Link)

Creating an effective evaluation framework for practice

This article was written in Canada, against a backdrop where evaluation of practice was viewed as an exception rather than the norm. The authors develop and propose an evaluation framework that permits linking the services provided with the client outcomes being achieved. The paper starts with a review of some existing evaluation frameworks from the literature, but recognises that no one evaluation model is “best” in all regards. Criteria are suggested for what makes a “good” evaluation framework. The paper thereby offers both a practical tool and insights into the criteria of an effective evaluation framework.

Dany, F. (2014). Time to change: The added value of an integrative approach to career research. Career Development International, 19(6), 718-730. (Link)(Paid)

Conducting and interpreting research to reach deeper insights

The paper calls for an integrative approach to research. A criticism is levelled at some research approaches in the careers field, which are reviewed, that they “stick to narrow views of [what a] career [is]”. The paper provides examples that invite career differences to be re-examined, seeking alternative explanations to those offered by some treatments. The author proposes that adopting this wider perspective will create richer discussions amongst researchers.

Hiebert, B., Schober, K., & Oakes, L. (2014). Demonstrating the impact of career guidance. In Handbook of career development: International perspectives (pp. 671-686). New York, NY: Springer. (Link)(Paid)

Making the case for evaluation, and reviewing approaches to determine a choice of framework

This book chapter is contextualised by an age in which there is an onus on career guidance practitioners to “prove it works”. The importance of the topic is set out, and the chapter describes some alternative approaches for documenting the impact of career guidance services that account for the emphasis on evidence-based practice and outcome-focused intervention observed by the researchers. Drawing on examples from Canada, the US and Europe, the authors draw attention to a number of different frameworks that have been employed to measure impact.

Haug, E. H., & Plant, P. (2016). Research-based knowledge: researchers’ contribution to evidence-based practice and policy making in career guidance. International Journal for Educational and Vocational Guidance, 16, 137-152. (Link)(Paid) 

Considering the role of, and opportunity for, valuable practitioner research

The paper examines researchers’ contribution to evidence-based practice and policymaking in career guidance, with a specific focus on the need for stronger involvement of the voice of users.

Neary, S., & Johnson, C. (2016). CPD for the career development professional. Crimson Publishing. (Link)

Recognising the role of research in practitioner CPD.

This publicly available book chapter does not introduce new research per se, but draws together thinking to explain the role of research in a practitioner’s professional practice. A broad definition is taken of what constitutes “research”, with varied suggestions on where to source it. References are provided to other texts which discuss the wider concepts and rationale for “evidence-based practice”.

Spokane, A. R., & Nguyen, D. (2016). Progress and prospects in the evaluation of career assistance. Journal of Career Assessment, 24(1), 3-25. (Link)

Reviewing the historical literature and adopting community based evaluation criteria

The paper reviews research into the evaluation of career assistance from 1970 to 2014, finding n=23 studies, which are then examined for common conclusions and recommendations. The authors suggest that the current body of work could usefully be augmented with research showing community-level impacts as well as individual ones.

Ali, S. R., Flanagan, S., Pham, A., & Howard, K. (2017). Translating the career development knowledge base for practitioners and policy makers. In The Handbook of Career and Workforce Development (pp. 227-242). Routledge. (Link)(Paid)

Communicating practice research outcomes in a way to influence public policy 

This book chapter discusses the potential for research in practice to be “translated” so that it can be used by a policy audience. It provides a framework by which research outcomes can be communicated, and a three-step process - “research; translation; and institutionalization” - to see the evidence implemented. The chapter reviews situations in, amongst others, the US Departments of Labor and Education to show the importance of putting evidence into an understandable form, about what types of programs work in different contexts for different populations. (The format also, implicitly, suggests considerations at the stage of designing a study so that it can generate outcomes in this format.)

Childs, R. (2019). Developing a methodology for evaluating the impact of career guidance in the modern age. Thesis submitted in partial fulfilment of the degree of Professional Doctorate in Occupational and Business Psychology (DOBPsych), Kingston University, UK. (Link)

The process of developing and testing a methodology and instrument for evaluating the impact of career guidance.

This thesis describes the process of developing a methodology for evaluating the impact of career guidance interventions. A literature review summarises methodologies published in peer-reviewed journals since 1987. The study’s recommendations were to develop a framework that could be used to guide and combine results from different studies, together with a measure that could be used as a benchmark by a wide range of researchers and practitioners. This led to an empirical study in which a potential benchmark measure was developed and then piloted on two very different samples to establish usability, acceptability, reliability and sensitivity.

Elliott, J., Stankov, L., Lee, J., & Beckmann, J. F. (2019). What did PISA and TIMSS ever do for us?: The potential of large scale datasets for understanding and improving educational practice. Comparative Education, 55(1), 133-155. (Link)

Using large public datasets alongside qualitative studies to increase insight and impact

The authors suggest that a gulf exists between researchers who use large datasets and other researchers who develop deeper qualitative understanding of individuals and groups and how they make career choices. They suggest how analysis of large datasets can be employed alongside the latter type of research to provide richer and deeper insights into, for instance, cross-cultural and regional differences between the career experiences of different groups. Specific examples are given in the paper of using PISA and LEO data, which measure outcomes at two stages of early adulthood.

Whiston, S. C., Mitts, N. G., & Li, Y. (2019). Evaluation of career guidance programs. International handbook of career guidance, 815-834. (Link) (Paid)

Designing an evaluation activity having critically considered past evaluation studies

The authors first examine previous research on the effects of career guidance programs and interventions, discussing the effectiveness of career guidance programs, which modalities are preferable in providing career guidance, which clients benefit from these interventions, and the outcome measures typically used in their evaluation. Second, the authors provide a summary of how to conduct an evaluation of a career guidance program, utilising a six-step process for evaluating career counselling programs proposed in previous studies.

Hansen, J. S. (2021). Critical reflection and ethical responsibility in career counselling practice. In H. Koštálová & M. Cudlínová (Eds.), A practitioner’s guide to uncharted waters of career counselling: A critical reflection perspective (pp. 87-89). EKS. (Link)

Applying critical reflection and ethics in practice

This is a chapter from a book that promotes critical reflection on practice, and is only one of the areas in which the book touches on aspects of evidence-led and ethical practice. Ethical dilemmas are noted, such as when career guidance places undue additional pressure on secondary school students, and ways of thinking through such problems are proposed.

Robertson, P. J. (2021). Evidence-based practice for career development. In P. J. Robertson, T. Hooley, & P. McCash (Eds.), The Oxford handbook of career development (pp. 353–370). Oxford University Press. (Link)(Paid)

Thinking through an evidence-led strategy for career guidance and mitigating risks and obstacles

The author discusses the ambition of an evidence-led approach to career guidance, and highlights some obstacles and challenges to that goal. Firstly, there are innate differences between career guidance and the medical profession, which is often seen as a “standard” for such practice. Secondly, “policymaking and practice are political processes and research evidence is necessary but not sufficient to influence decision-making.” It is therefore suggested that “to best inform practice, research evidence should be combined with local knowledge, practitioner experience, and input from service users”.

Rice, S., Hooley, T., & Crebbin, S. (2021). Approaches to quality assurance in school-based career development: policymaker perspectives from Australia. British Journal of Guidance & Counselling, 50(1), 110–127. (Link)

Managing quality (in secondary schools)

This paper reports on research into how Australian secondary schools manage quality in career guidance. The paper answers questions on how policy-makers define quality, the measures and mechanisms they use, and how these approaches should be classified. Policy-makers are found to use a wide range of approaches to embedding quality in schools; these are arranged in terms of their frequency of use.

Cedefop et al. (2022). Towards European standards for monitoring and evaluation of lifelong guidance systems and services (Vol. I). Luxembourg: Publications Office of the European Union. Cedefop working paper No 9. (Link)

Establishing routines for evaluation and quality (in the context of adult guidance and lifelong learning)

Evaluation approaches developed over 2009-15 made good progress in Europe, but “are still short of providing clear methodological indications on the implementation of the suggested indicators and the reality and context of current monitoring and evaluation practices”. This first report of three examines “individual support to careers and learning to shed light on the efficacy of current upskilling, reskilling and activation measures by examining career development and guidance systems and services for adults.”

Konuk, M., & Yilmaz, A. (2023). Investigation of experimental studies in the field of career counselling. International Journal of Education Technology & Scientific Researches, 8(25). (Link)

Designing research to evaluate career guidance

This study reviewed the approaches and models used in studies conducted in the field of career counselling and vocational guidance between 2013 and 2023, identifying both recurring themes and new areas. The content analysis also surfaced themes such as the most common dependent variable (topped by ‘career decision’).

Percy, C., & Hooley, T. (2023). Lessons for career guidance from return-on-investment analyses in complex education-related fields. British Journal of Guidance & Counselling. (Link)

Building return-on-investment cases for careers guidance to influence policy or institutional decision-making

The authors tackle the problem of finding a methodology to calculate the return on investment (ROI) for career guidance, as a foundation for making investment cases. A review was carried out of 32 studies in different countries that measured return on investment in education and related settings, concluding that there was a high degree of inconsistency between them. A practical method is ultimately proposed, while the discussion in the paper provides insights that can stimulate a critical appraisal of different approaches.
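
For orientation, return on investment is conventionally computed as net monetised benefits divided by costs - a baseline formulation offered here for illustration, not necessarily the method the authors propose:

ROI = (monetised benefits − costs) / costs

For example, a programme costing £100,000 that generates £130,000 of monetised benefits has an ROI of (130,000 − 100,000) / 100,000 = 0.3, or 30%. Choices such as which benefits to monetise, and over what time horizon, help explain why such estimates can vary widely between studies.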

Winter, D. (2023). A framework for analysing careers and employability learning outcomes. Journal of the National Institute for Career Education and Counselling, 51(1), 15-25. (Link)

Taking a strategic and critical perspective on measuring outcomes that draws attention to inadvertent biases

Set against the context of greater integration of careers education within the curriculum, the author questions how to create suitable measurement frameworks that overcome ideological biases and account for the different forms of “capital” that a graduate might accrue on the path towards the labour market (e.g. sociological). A framework is suggested which profiles career interventions in terms of the “depth” of learning (we can ‘discern’, ‘acquire’, ‘adapt’ or ‘enhance’ while learning) across the various “domains” where learning could occur (e.g. forms of capital such as social networks and personal identity).

Bridgeman, J., & Giraldez-Hayes, A. (2024). Using artificial intelligence-enhanced video feedback for reflective practice in coach development: benefits and potential drawbacks. Coaching: An International Journal of Theory, Research and Practice, 17(1), 32-49. (Link)

Using AI to provide augmented feedback to coaches and counsellors from videos of client interactions

One of the applications mentioned for AI within coaching and counselling is the opportunity to provide feedback to the practitioner. One way to do this is to use AI to ‘watch’ and ‘analyse’ videos of client interactions. This paper explores such a practice: n=15 coaches who had deployed it were interviewed. Benefits were reported in terms of the insights it offered, leading to greater self-awareness. Drawbacks included nervousness around using new technology and around seeing one’s own performance. Future research is suggested.

Cedefop (2024). Learning outcomes going global: a multifaceted phenomenon. Luxembourg: Publications Office. (Link)

Building learning outcomes into guidance to enable global comparisons for better learning

This study from Cedefop examines the international trend towards measuring learning outcomes across different education systems, in careers and beyond. The move to learning outcomes is described as “one of the most significant trends to have influenced European VET over the past two decades.” The learning outcomes approach facilitates new benefits, such as the ability to compare international policies and to create foundations for designing lifelong learning systems. A stakeholder analysis of these measures is also included.

Hughes, D., Mc Cormack, D., Neary, S., & King, P. (2024). Praxis in guidance and counselling: new frontiers. British Journal of Guidance & Counselling, 1-6. (Link)

Understanding the value, to clients, professionals and the evidence base, of involving practitioners in research studies

The authors draw attention to the fact that much research is conducted by academics without the input of practitioners - a feature of past studies that has been observed by other researchers. Reasons for this ‘praxis gap’ are cited. However, the authors make the case for involving practitioners more, drawing attention to the ways this can be done and to how it can increase the value of a study: for instance, “a credible and sustainable model of professionalisation in careers practice depends on narrowing the gaps between both theory and practice.”



2. Further illustrations and discussions

The following publications are examples and case studies of evaluation studies that test the impact of career interventions. They range from smaller-scale pilots through to larger-scale programmes using multiple-stakeholder inputs. The papers also often refer to the use of an underlying theory or framework to guide the evaluation, which differs based on context.



Hughes, D., & Gration, G. (2006). Performance Indicators and Benchmarks in Career Guidance in the United Kingdom. University of Derby. (Link)

Selecting indicators by which to measure career guidance effectiveness

The report reviews the use of indicators of the effectiveness of career guidance in Europe, and benchmarks the UK against other nations. A range of potential indicators that could be used is reviewed, covering inputs, processes and outputs. The use of indicators by different UK institutions involved in the careers landscape is described.

ETI (2009), Evaluating the Quality of Careers Information, Advice and Guidance provided by Career Information, Advice and Guidance Providers (Link)

Assessing information quality used in guidance using a holistic framework of measures

While the findings of this study are now dated, it provides a systematic method for assessing the quality of information (as well as wider CIAG) provision from different service providers, looking at both the information itself and policy and infrastructure dimensions (amongst others).

Frigerio, G. (2010). Narratives of employability: Effective guidance in a higher education context. A qualitative evaluation of the impact of guidance. Higher Education Career Services Uni (Link)

Conducting a small scale case study project to better understand expectations of service users and the impact of the service.

This study is an example of a small-scale research study at Warwick University, taking the form of six case studies of students who underwent a career consultation. Expectations were elicited before the event, and outcomes were reviewed two months later. While the author stresses that the findings are specific to the context where the study was conducted, it shows a practical example of a small exploratory study used to increase understanding of students’ expectations, build them into practice, and review the effectiveness of that practice.

Reese, R.J., & Miller, C. (2010). Using Outcome to Improve a Career Development Course: Closing the Scientist-Practitioner Gap. Journal of Career Assessment, 18, 207 - 219. (Link)

Anticipating unexpected or anomalous outcomes

In this study, the authors test a follow-up to a previous study to understand whether modifications that had been made led to the desired improvements. The authors found a large increase in the effect size of a career class on students’ self-efficacy, which was sustained into a second year of the course. The “uneven” nature of the results, however, prompted the researchers to evaluate further improvements and to include measurements for outcome data. The paper provides an example of iterative, outcome-driven improvement.

Hiebert, B., Schober, K., & Oakes, L. (2014). Demonstrating the impact of career guidance. In Handbook of career development: International perspectives (pp. 671-686). New York, NY: Springer New York. (Link)(Paid)

Reviewing the different ways that evidence for career guidance impact and value has been presented

This chapter discusses the challenge of demonstrating the value of career guidance services. The authors review international practices and present alternative approaches for documenting the impact of career guidance services that embrace the current emphasis on evidence-based practice and outcome-focused intervention. In particular, to address a “prove it works” challenge, the authors provide examples of two approaches to dealing with this situation. (More studies have followed since this one in 2014).

Jacquin, P., & Juhel, J. (2017). An individual mixed‐evaluation method for career intervention. The Career Development Quarterly, 65(1), 16-28. (Link)

Demonstrating impact on clients over the course of a career programme

The paper addresses the challenge career counsellors face from policy makers to demonstrate the value of their services. A mixed-methods approach to demonstrating impact is proposed: the method used five items related to a client’s career decision self-efficacy and studied the evolution of those items throughout an intervention by one career counsellor (43 days) to show improvements.

Frigerio, G. (2018). Making connections through practitioner research. In Graduate Careers in Context (pp. 179-192). Routledge.

Embarking on a practitioner research initiative to measure impact

This chapter focuses on the role of the career development practitioner in integrating theory with practice through engaging in practitioner research. It uses the systems theory framework, developed to show the complexity of career development, in which individuals and a range of other people mutually influence one another.

Whelan, N., et al. (2018). EEPIC - Enhancing Employability through Positive Interventions for improving Career potential: The impact of a high support career guidance intervention on the wellbeing, hopefulness, self-efficacy and employability of the long-term unemployed - a study protocol for a randomised controlled trial. Trials, 19, 1-18. (Link)

Designing an evaluation activity with a control group

The paper provides a detailed outline for an example study that involves a single-centre randomised, controlled, partially blinded trial. A total of 140 long-term unemployed job-seekers from a disadvantaged urban area will be randomly assigned to two groups: (1) an intervention group; and (2) a ‘service as usual’ group. Each group will be followed up immediately post intervention and six months later.

Maree, J. G. (2019). Group career construction counseling: A mixed‐methods intervention study with high school students. The Career Development Quarterly, 67(1), 47-61. (Link)(Paid)

Measuring an intervention using multiple tests and synthesising the results

This study investigated the value of group career construction counselling with high school students (n = 57). The paper describes the intervention, as well as the range of tests and evaluations completed by students to measure different facets of pre- and post-intervention attitudes and capabilities: (a) the Career Adapt‐Abilities Scale–South Africa (CAAS‐SA), (b) the Career Interest Profile and (c) the Maree Career Matrix. Results showed that the students’ career adaptability scores had improved meaningfully and that no gender‐based differences had been introduced. However, differences were detected between pre‐ and posttest Control and Confidence subscale scores for both the boys and the girls.

Hanson, J., Moore, N., Neary, S., & Clark, L. (2021). An evaluation of the North East of England pilot of the Gatsby Benchmarks of good career guidance. University of Derby (Link)

Designing a comprehensive evaluation into a major or complex intervention.

The evaluation used longitudinal research to examine the impacts of a career programme implemented in schools in the North East of England. It was particularly focussed on the impact of the programme on helping schools and students to reach Gatsby Benchmarks, which are indicators of progress in career knowledge, experience and capability. There were six different components to the measurement and evaluation exercise, which combined to give a robust and holistic understanding of how well the pilot performed. The evidence used for this work has helped to “make the case” for careers education in other schools, showing the wider value of such work.

The Careers & Enterprise Company (2021). Swindon and Wiltshire Careers Hub: Evaluation Guide for Careers Activities and Programmes: Edition 1 – February 2021. London: The Careers & Enterprise Company. (Link)

Creating a matrix of research subjects and research instruments to evaluate a programme

This document offers an example of a research approach employed by the Careers and Enterprise Company to measure the impact of a schools intervention programme. The document describes a matrix of stakeholders and research instruments that were used to give a rounded view of the programme, and come to robust conclusions that account for multiple stakeholder perspectives. Several methods are described for getting feedback, with some innovative ways offered for engaging target respondents.

Dodd, V., Hanson, J., & Hooley, T. (2022). Increasing students’ career readiness through career guidance: measuring the impact with a validated measure. British Journal of Guidance & Counselling, 50(2), 260–272. (Link)

Measuring career readiness amongst secondary school students

This research (1) details the development of a career readiness measure and (2) tests the relationship between career guidance interventions and career readiness among secondary school students over three separate studies. One factor across nine items was found to effectively capture career readiness. Greater participation in career guidance activities was also found to be significantly associated with increased career readiness.

HM Treasury (2024). The Green Book (2022). (Link)

Making cases that align with government practice

The Green Book is guidance issued by HM Treasury on how to appraise policies, programmes and projects. It also provides guidance on the design and use of monitoring and evaluation before, during and after implementation. The book is not prescriptive but provides a range of accepted methodologies and practices.

3. Future research questions

From the CDI’s discussions with stakeholders, we have heard an appetite for more practitioner-based research, particularly recognising that there are very many variables to test. Related suggestions for future research activities in this area included:

  • A pilot fund to enable practitioners or researcher-practitioners to conduct a series of faster, small-scale tests that can shed light on the impact of changing different key variables in interventions (e.g. online vs offline delivery, group size, dosage effects with different client groups, etc.).
  • Deconstruction of the career guidance interview into its components, noting the decisions taken by a practitioner at different junctures, and evaluating the best decisions that practitioners could take in different scenarios.
  • Development of metrics that can serve as useful proxies for mid- and longer-term outcomes from career interventions, that can be asked of clients shortly after a career intervention, and which mitigate the need for costly longitudinal studies.
  • Increasing the use of appropriate, proven career theories by practitioners, for both designing services and evaluating them, through training and support.