EDPJ 5020 UM Impact Evaluation of Teach for Australia Leadership Program Paper


EDPJ 5020

The University of Manchester


Description

Submission instructions: The task must be submitted online as a Word-format document (a PDF can be submitted in addition, but not as a replacement for the Word document).


Task: Prepare an outline of a program evaluation for an educational policy or program in your area of interest. This may be the same program that you chose for the logic model and presentation, in which case ensure you include improvements based on feedback from your logic model presentation. It is also possible to do this task on a new program; however, in that case you will need to create a new logic model.

Your outline should be no more than 2000 words (excluding references). Use the template provided. Evaluation plans generally include the following elements:
• A description of the program that will be evaluated
• An outline of the stakeholder group(s) for whom the evaluation was conducted
• A logic model incorporating program logic and the planned logic for the evaluation
• A clearly identified Evaluation type and strategy
• The evaluation research questions and a rationale for your plans to answer them
• Planned data access and collection, with rationale for these choices.
• The positioning of the evaluation (e.g. insider for insider, outsider for insider, etc.)
and any ethical issues
• Some attempt to explain how you plan to analyse the data (only preliminary work on
this is expected)
• Concluding comments, including some reflection on the strengths and weaknesses of
your approach
• Citations and a full reference list (including a direct reference for the program itself)
• A reference to the program you will evaluate (citation and URL link to website)
Your outline is not limited to these elements, but it should include them. Additional guidance
for this assignment will be provided in class.

Additional notes:
You should ensure that work which you submit for assessment is your own. If you use
specific ideas and arguments from authors you have read, as well as direct or reworded
quotations from them, always clearly acknowledge them and include the reference in the
reference list. This applies to electronic sources, such as web pages, as well as hard-copy
material.

Unformatted Attachment Preview

Adapted from: USA CDC Program evaluation resources

EDPA5013: Program Evaluation
Assignment 3: Evaluation plan (Individual)

Stronger Smarter Leadership Program
Impact evaluation of Stronger Smarter Leadership Program

[Marker comment RW1: This exemplar is a credit grade. Nice clear plans and well communicated, but some detail is neglected (design and sampling) and the rationale for the PE approach is missing. A good logic model, but further detail is possible. Further: there could be better use of references and more concise writing reflecting evaluative and critical thinking.]

Prepared by: [name withheld]

1. INTRODUCTION

The Stronger Smarter Institute was started in 2006 by Dr. Chris Sarra. The institute's program is built on the successful practices that greatly impacted Indigenous students during Dr. Sarra's tenure as a principal. [Marker comment RW2: Needs reference.]

Evaluation Purpose

This evaluation strives to determine whether the Stronger Smarter Leadership Program has resulted in its intended outcomes. In particular, the purpose of the evaluation is to assess, after a period of three years, whether the program has resulted in the following:
• Enhanced leadership capabilities and cultural competence of school leaders
• Improvement in attitudes and results of Indigenous students

The evaluation would provide insight into how effective the Stronger Smarter Program is in achieving its intended outcomes. It also aims to inform the Stronger Smarter Institute on future program development. [Marker comment A3: Therefore this is an impact evaluation? Should make the PE type clearer, and provide a rationale for its choice. Marker comment A4: For example, could be clearer on this specific PE approach and how this relates to the purpose.]

Stakeholders

The stakeholders for this evaluation comprise the Stronger Smarter Institute, the Indigenous community, the participating school leaders, students and funders. Table F.1 shows the detailed breakdown of the stakeholders' assessment and engagement plan. A committee involving the Stronger Smarter Institute, the Indigenous community, the participating school leaders, funders and parents would be formed to facilitate discussion throughout the evaluation.

Table F.1: Stakeholder Assessment and Engagement Plan [Marker comment A5: Very thorough and clear.]

Stakeholder: Stronger Smarter Institute (Primary)
• Interest or perspective: May fear the program may not have its intended impact; may see the program evaluation as a personal judgement.
• Role in the evaluation: Defining program context; identifying data sources; interpreting findings; assisting to formulate questions for community, students and school leaders.
• How and when to engage: Consulted in evaluation planning and process; consulted in crafting of questionnaires; consulted in interpretation and analysis of findings; consulted throughout the process.

Stakeholder: Participating school leaders, i.e. school leaders in Queensland (Primary) [Marker comment A6: Where? Could be more specific: which state? States?]
• Interest or perspective: May want to determine the success of the program; may want to identify areas for improvement; may want to know whether to participate or continue to participate in the program.
• Role in the evaluation: Assisting to formulate questions for students; providing participants' perspective; identifying data sources from school; interpreting findings.
• How and when to engage: Consulted in evaluation planning and process; crafting of questions for students during the evaluation; as a data source in the survey (questionnaire) and focus group discussion during the evaluation; informed of findings.

Stakeholder: Participating students and their parents (Secondary)
• Interest or perspective: May want to know the benefits of participating in the program.
• Role in the evaluation: Providing students' perspective.
• How and when to engage: Consulted in evaluation planning and process; as a data source in the survey (questionnaire) and interviews during the evaluation; informed of findings.

Stakeholder: Indigenous community (Secondary)
• Interest or perspective: May want to know the benefits of participating in the program.
• Role in the evaluation: Providing community context; interpreting findings.
• How and when to engage: Consulted in evaluation planning and process; as a data source in the interviews during the evaluation; informed of findings.

Stakeholder: Funders (Tertiary)
• Interest or perspective: May want to know the benefits of funding the program. [Marker comment A7: In this case...?]
• Role in the evaluation: Providing funding context.
• How and when to engage: Consulted in evaluation planning and process; informed of findings.

Stakeholder: Evaluators (Primary)
• Interest or perspective: Want to determine the effectiveness of the program.
• Role in the evaluation: Defining information needed; developing the evaluation; developing recommendations.
• How and when to engage: Direct role in the evaluation; involved throughout the process.

2. DESCRIPTION OF WHAT IS BEING EVALUATED

Need

Indigenous students in Australia are faring worse than their non-Indigenous counterparts in schooling (Klenowski, 2009). While improvements have been shown over the years, Indigenous Australian students constitute only 0.7% of total university enrolment (Chancellor & Anderson, 2013). More alarmingly, a large proportion of Indigenous students are not meeting the benchmarks for literacy at Year 7, and the number of Indigenous students completing a Year 12 certificate is significantly lower than for their non-Indigenous counterparts (Klenowski, 2009).

The Stronger Smarter Program aims to help Indigenous students do better in school. It delivers a program that seeks to build the capacity of schools and transform the learning outcomes of Indigenous students. Since it was established in 2006, the program has been implemented in many schools, and it is now timely to determine its effectiveness.
Context

Many reasons are suggested for the underachievement of Indigenous students. Firstly, there may be a lack of local community support for Indigenous students' education (Tripcony, 2000). Elders in the community may be concerned that schooling could corrupt Aboriginal values (Tripcony, 2000). In addition, the elders are highly likely not to have attended school themselves, and as such they may find schooling irrelevant for their children. [Marker comment A8: Should also comment on the context for the EVALUATION, including other programs that may be concurrent, etc.]

Another reason could be that schools may lack the capacity to engage Indigenous students (Tripcony, 2000). It has been found that Indigenous children find schooling irrelevant to them due to cultural differences (Tripcony, 2000). Schools would need to develop skills and knowledge in these areas to engage Indigenous students and make schooling relevant to them.

The Stronger Smarter program aims to improve Indigenous student outcomes by building the leadership capacity and cultural competence of the school. It would also target community involvement to help Indigenous students engage in school.

Target Population

The target population for the Stronger Smarter program is schools in Australia with Indigenous students. [Marker comment A9: Who is this? How many? In what settings? Provide a brief description.]

Stages of Development

The program has been in place since 2006 and is well into its seventh year of operation. It was implemented initially in schools in Queensland, but has now expanded to cover the whole of Australia. [Marker comment RW11: How many? What's the 'scale' of this program?]

Resources/Inputs

Staff from the Stronger Smarter Institute, funding from sponsors, the Ration Shed Museum as the venue for the program, schools (school leaders and teachers), the local community, Indigenous Education, students and parents, Queensland University of Technology, online forums for communicating with Stronger Smarter staff, and a time period of one year are the key inputs of the Stronger Smarter Program. [Marker comment A10: Well researched, detailed.]

Activities

The Stronger Smarter program comprises five phases for participants to attend. Phase one is a six-day residential camp in Queensland where participants focus on building collaborative and cultural competence and leadership capacity, to enhance their ability to facilitate change and engage the Indigenous community (Stronger Smarter Institute, 2013). In phase two, participants return to their schools and engage the school and community in plans to transform the school (Stronger Smarter Institute, 2013). Participants then discuss these plans with the Stronger Smarter Institute back in Queensland in phase three (Stronger Smarter Institute, 2013). The plans are implemented in phase four, followed by a ceremony in phase five where the Stronger Smarter Institute acknowledges and celebrates school transformation plans and positive student outcomes (Stronger Smarter Institute, 2013). [Marker comment RW12: How long does the whole program take?]

Outputs

As a result of participating in the Stronger Smarter Program, schools would have conceptualised a school transformation plan to engage Indigenous students in the school.

Outcomes

The main outcomes on which the evaluation would focus are the intermediate outcomes, as follows:
1. School leaders and community with better leadership capacity and cultural competence;
4. Increased numeracy and literacy for Indigenous students;
5. Increased attendance for Indigenous students;
6. Increased engagement for Indigenous students.
For the rest of the outcomes, please refer to logic model F.2.

Ultimately, the Stronger Smarter program seeks to improve schooling for Australian students. After attending the Stronger Smarter Program, participating schools would continue to work on their plans and engage the students and community to seek a better school experience for the students.

Logic Model

Figure F.2: Logic Model [figure not reproduced in this preview] [Marker comment A13: Clear and concise, well done; additional detail on the quantum would be helpful. Marker comment RW14: The number 1 box output looks more like outcomes. Number 8 I would change to 'additional school activities for execution of plan'.]

3. EVALUATION DESIGN

Evaluation Questions

The following questions were prioritised for the evaluation:
1. Have the school leaders in SSSP schools in Qld demonstrated increased capacity in leadership and cultural competence over a period of three years? [Marker comment A15: How are these defined?]
2. Have the Indigenous students in SSSP schools in Qld shown enhanced student achievement over a period of three years? [Marker comment A16: In what? Be specific.]
Q1 stems from Outcome 1 (see F.2), while Q2 follows from Outcomes 4, 5 and 6 (see F.2).

Evaluation Design

The evaluation follows an impact evaluation model and would adopt a mixed methods approach, with a pre-post with comparison group design (Owen, 2006). A mixed methods approach allows for the collection of data from multiple sources; by triangulating these data, the evaluation would be more robust. The pre-post with comparison group design would consider intention-to-treat effects in its application. As such, it will involve all schools in the state of Queensland. This design not only compares the effect on schools that have participated in the program with those that have not, it also allows for causal deductions. [Marker comment RW17: Could be sharper, more concise. Sampling approach not evident (apart from the 'intention to treat' approach): will all participants be included in the research data collection? Marker comment RW18: This is a quasi-experiment; make that clear and acknowledge the weaknesses. More detail on the comparison group is needed. Marker comment A19: Good clear design and focus; should be echoed in the questions, etc.]

4. DATA COLLECTION

Data Collection Methods

To address Q1, a questionnaire adapted from the Revised Self-Leadership Questionnaire (Houghton & Neck, 2002) and the Multicultural Counseling Awareness Scale: Form B (Kocarek, Talbot, Batka, & Anderson, 2001) would be administered to school leaders before the start of the program and again at the end of the three years. These questionnaires have been found to be reliable and valid (Houghton & Neck, 2002; Kocarek et al., 2001), but need to be rephrased in the context of schools. The questionnaire seeks to address how school leaders perceive their leadership and cultural competence after three years. It would target all school leaders in the state of Queensland. Next, focus group discussions would be conducted involving the school leaders, community and staff separately. These discussions would focus on whether the school leaders have demonstrated cultural awareness in their leadership. A representative sample (by school) would be involved in the focus group discussions.

For Q2, a questionnaire adapted from the National Survey of Student Engagement (Kuh, 2001a) would be administered to students before the commencement of the program and again after three years. The survey is reliable and valid (Kuh, 2001b), and would be adapted for the local Australian context. The survey attempts to find out whether students are engaged in school. In addition, a representative sample of students, teachers and parents would be interviewed to determine the students' attitudes towards school. Finally, secondary data pertaining to attendance, NAPLAN results and conduct grades would be sought from schools for analysis. [Marker comment RW20: A good section. Marker comment A21: Well researched and specific.]

Data Collection Method and Evaluation Question Link

Table F.3: Evaluation Questions and Associated Data Collection Methods (Attribute | Data Collection Method | Source of Data) [Marker comment A22: Needs to be defined somewhere. Marker comment A23: Well researched. Marker comment A24: Good idea.]

Q1. Have the school leaders demonstrated increased capacity in leadership and cultural competence over a period of three years?
• Leadership competence | Questionnaire adapted from the Revised Self-Leadership Questionnaire | School leaders
• Cultural competence | Questionnaire adapted from the Multicultural Counseling Awareness Scale: Form B | School leaders
• Leadership integrated with cultural awareness | Focus group discussion | School leaders, staff, community

Q2. Have the Indigenous students shown enhanced student achievement over a period of three years?
• Student engagement | Questionnaire adapted from the National Survey of Student Engagement | Students [Marker query: Is it a student questionnaire or teacher completed?]
• Student attitudes towards education | Interview | Students, parents, teachers
• Student achievement and performance | Secondary data | Attendance, NAPLAN results, conduct grades

5. DATA ANALYSIS

Indicators and Standards

Table F.4: Indicators and Success (Criteria or Indicator | Standards: what constitutes "success"?) [Marker comment A25: Give details or a reference to explain this to your readers. Cohen's d? % improvement? Or a shift in the proportion not meeting minimum national standards on NAPLAN?]

Q1. Have the school leaders demonstrated increased capacity in leadership and cultural competence over a period of three years?
• Cultural competence component in questionnaire | Medium effect size
• Leadership capacity component in questionnaire | Medium effect size

Q2. Have the Indigenous students shown enhanced student achievement over a period of three years?
• NAPLAN | Medium effect size
• Student engagement component in questionnaire | Medium effect size
• Attendance | Significant difference at the 95% confidence level

Analysis [Marker comment A26: A strong plan. Should also mention comparison group and comparative analyses.]

The effectiveness of the program would be measured as growth in the following:
1. Cultural competence of school leaders
2. Leadership capacity of school leaders
3. Student engagement
4. Student results (NAPLAN)
Data will be analysed using Excel and STATA.
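As a preliminary illustration only, the sketch below shows how one pre-post comparison could be computed. The plan names Excel and STATA as the analysis software; Python is used here purely as a sketch, and the file name, column names and data layout are hypothetical. It compares gain scores between program and comparison schools (the quasi-experimental contrast noted by the markers) and reports Cohen's d, one common way to operationalise the "medium effect size" standard in Table F.4 (conventionally d of around 0.5).

    # Preliminary sketch only: pre-post-with-comparison-group analysis of one
    # questionnaire scale. File name and column names are hypothetical.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("leader_cultural_competence.csv")  # columns: group, pre, post
    df["gain"] = df["post"] - df["pre"]

    program = df.loc[df["group"] == "program", "gain"]
    comparison = df.loc[df["group"] == "comparison", "gain"]

    # Welch's t-test on gain scores: did program schools improve more than
    # comparison schools over the three years?
    t_stat, p_value = stats.ttest_ind(program, comparison, equal_var=False)

    # Cohen's d on the gain scores, using the pooled standard deviation, to
    # judge the "medium effect size" standard (roughly d = 0.5).
    n1, n2 = len(program), len(comparison)
    pooled_var = ((n1 - 1) * program.var(ddof=1) +
                  (n2 - 1) * comparison.var(ddof=1)) / (n1 + n2 - 2)
    cohens_d = (program.mean() - comparison.mean()) / pooled_var ** 0.5

    print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")

The same pattern would apply, under those assumptions, to each questionnaire scale and to the NAPLAN and attendance measures, which would also address the marker's note that the comparison group and comparative analyses should be mentioned.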
We will report descriptive summary statistics for each of the measures both before and after program implementation, along with national norms where available (NAPLAN and student engagement). Inferential statistics such as ANOVA/MANOVA and paired-sample t-tests would be used to study the differences before and after program implementation, and between schools that have and have not participated in the program. Exploratory analyses comparing effects for Indigenous students with those for the rest of the student population would also be conducted. Effect sizes would also be examined to determine the magnitude of the effects.

Qualitative data will be coded and categorised according to themes using NVivo (Braun & Clarke, 2006). We would present these themes with illustrative quotes from the data.

6. EVALUATION COMMUNICATION AND MANAGEMENT

The findings of the study would be disseminated in several ways. Firstly, a concise report of 10 pages would be provided for the Stronger Smarter Institute and funders. A brief report would also be published for participating schools and made available online for the public's information. To encourage schools to participate in the evaluation, schools can also request their own school's analysis. In addition, the Stronger Smarter Institute, participating schools and funders would be invited to attend a sharing session on the findings of the evaluation. [Marker comment A27: Good, clear, specific plans.]

A budget of $314,000 is proposed for the three-year study. This will come as sponsorship from the Stronger Smarter Institute and research funds from the University of Sydney. [Marker comment RW28: Very detailed. A strong example.]

Timeline

Table F.5: Timeline and Budget (S/N. Activity | Stakeholders | Timeline | Budget)

1. Formation of steering committee for evaluation | Representatives from school leaders, parents, Stronger Smarter representatives, funders, Indigenous community members and evaluation team | October 2013 to November 2013 | $1,000 (travelling expenses)
2. Discussion and firming up of evaluation plans, including participants involved, data collection methods and sources | Steering committee | November 2013 to February 2014 | $1,000 (travelling expenses)
3. Crafting of questionnaires for school leaders and students | Stronger Smarter representatives, school leaders, evaluation team | March 2014 to June 2014 | $50,000 (purchase of copyright to questionnaires and travelling expenses)
4. Pilot testing of questionnaires | Stronger Smarter representatives, school leaders, students, evaluation team | July 2014 to August 2014 | $10,000 (printing of questionnaires and travelling expenses)
5. Finalisation of questionnaires | Evaluation team | August 2014 to November 2014 | $1,000 (travelling expenses)
6. Pre-testing of questionnaires | School leaders, students (including Indigenous students and the population at large), evaluation team | March 2015 to May 2015 | $50,000 (printing of questionnaires and travelling expenses)
7. Pre-test focus group discussions and interviews | Selected parents, teachers, school staff, community members, school leaders and evaluation team | March 2015 to May 2015 | $50,000 (room rental, refreshments and travelling expenses)
8. Analysis of pre-test | Evaluation team | June 2015 to September 2015 | Free
9. Post-testing of questionnaires | School leaders, students (including Indigenous students and the population at large), evaluation team | March 2018 to May 2018 | $50,000 (printing of questionnaires and travelling expenses)
10. Post-test focus group discussions and interviews | Selected parents, teachers, school staff, community members, school leaders and evaluation team | March 2018 to May 2018 | $50,000 (room rental, refreshments and travelling expenses)
11. Analysis of post-test | Evaluation team | June 2018 to September 2018 | Free
12. Preparation of findings | Steering committee | October 2018 to December 2018 | $1,000 (travelling expenses)
13. Publication of reports and sharing session | All | January 2019 | $50,000 (publication fees)
Total budget: $314,000

References [Marker comment A29: Good use of references.]

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101. doi:10.1191/1478088706qp063oa
Chancellor, A. V., & Anderson, I. (2013). Indigenous Australians and higher education. In Forward thinking: Emerging answers to education's big questions (p. 6).
Houghton, J. D., & Neck, C. P. (2002). The revised self-leadership questionnaire: Testing a hierarchical factor structure for self-leadership. Journal of Managerial Psychology, 17(8), 672-691.
Klenowski, V. (2009). Australian Indigenous students: Addressing equity issues in assessment. Teaching Education, 20(1), 77-93. doi:10.1080/10476210802681741
Kocarek, C. E., Talbot, D. M., Batka, J. C., & Anderson, M. Z. (2001). Reliability and validity of three measures of multicultural competency. Journal of Counseling & Development, 79(4), 486-496. doi:10.1002/j.1556-6676.2001.tb01996.x
Kuh, G. D. (2001a). Assessing what really matters to student learning: Inside the National Survey of Student Engagement. Change: The Magazine of Higher Learning, 33(3), 10-17.
Kuh, G. D. (2001b). The National Survey of Student Engagement: Conceptual framework and overview of psychometric properties. Bloomington, IN: Indiana University Center for Postsecondary Research, 1-26.
Owen, J. M. (2006). Program evaluation: Forms and approaches. Crows Nest, NSW: Allen & Unwin.
Stronger Smarter Institute. (2013, July). Stronger Smarter Leadership Program. Retrieved 17 September 2013, from http://strongersmarter.com.au
Tripcony, P. (2000). The most disadvantaged? Indigenous education needs. New Horizons in Education (December 2000), 59-81.


EDPA5013: Program Evaluation
Assignment 3: Program Evaluation plan

{Evaluand/Program Name}
Title for Evaluation Plan

Prepared by: {Your name (usyd enrolment name)}
{submission Date}

1. INTRODUCTION (500 words max, incl. table)

This section provides information about the purpose of the evaluation and how it will meet stakeholders' needs.

Evaluation Purpose
• What does this evaluation strive to achieve?
• What is the purpose of this evaluation?
• What type of evaluation will you use, and why?
• How will findings from the evaluation be used?

Stakeholders
Do NOT repeat information provided in your table, but you can make additional comments or clarifications where necessary.
Table elements:
• Who are the stakeholders for this evaluation?
• How do you plan to engage these stakeholders when implementing the evaluation plan (e.g., participate in collecting data, help to interpret findings)?

Table F.1: Stakeholder Assessment and Engagement Plan
Columns: Stakeholder Name | Stakeholder Category {primary, secondary, tertiary} | Interest or Perspective | Role in the Evaluation | How and When to Engage

2. DESCRIPTION OF WHAT IS BEING EVALUATED (600 words max, not including the logic model, which should be pasted as a picture/JPG or GIF; the model should be no larger than one page, normal margins)
This section provides detailed information about what you are evaluating: the evaluand. In this section, describe the need, context, target population, and stage of development of what is being evaluated. You will also provide information on inputs, activities, outputs, and outcomes, and will develop a logic model (tabular or graphical depiction) of what you are evaluating.

Context
• What context/environment exists for what is being evaluated? (i.e., what environmental factors may affect the performance of what is being evaluated?)

Logic model elements: do NOT repeat what is in the logic model, but make any necessary comment or clarification.

Resources/Inputs
• What resources are available to support what is being evaluated (e.g., staff, money, space, time, partnerships, technology)?

Activities
• What specific activities are undertaken (or planned) to achieve the outcomes?

Outputs
• What products (e.g., materials, units of service delivered) are produced by your staff as a result of the activities performed?

Outcomes and Impacts
• What are the program's intended outcomes (intended outcomes are short-term, intermediate, or long-term)?
• What do you ultimately want to change as a result of your activities (long-term outcomes or impacts)?
• What occurs between your activities and the point at which you see these ultimate outcomes (short-term and intermediate outcomes)?

Logic Model
• Provide a logic model for what is being evaluated.

3. EVALUATION DESIGN (approx. 500 words)

This section provides information on how you will design your evaluation. Provide information on evaluation questions and the evaluation design.

Evaluation Questions
• What specific research questions do you intend to answer through this evaluation?

Evaluation Design
• What is the strategy and design for this evaluation (e.g., survey, case study, experimental, pre-post with comparison group, time series, post-test only)?
• Why was this design selected?
• Will a sample be used? If so, how will the sample be selected?

4. DATA COLLECTION (400 words max, incl. table)

This section provides information on how you will collect/compile data for your evaluation. In the table, provide as much detailed information as you can on the data you hope to access/collect, and align these plans to the evaluation/research questions you have.

Data Collection Methods
• What methods will be used to collect or acquire the data? (Make it clear whether new data will be collected/compiled or whether secondary data will be used.)
• Use consistent language and pay attention to specific alignment between data collection and the evaluation/research questions proposed.
• From whom or from what will data be collected (source of data)?

Data Collection Method and Evaluation Question Link
Table F.3: Evaluation Questions and Associated Data Collection Methods
Columns: Evaluation Question | Data Collection Method | Source of Data

5. DATA ANALYSIS (100 words, table only)

In this section, provide information on what indicators/standards you will use to judge success and how you will analyse your evaluation findings.

Table elements:
Indicators and Standards
• What are some measurable or observable elements that can tell you about the performance of what is being evaluated?
• What constitutes "success"? (i.e., by what standards will you compare your evaluation findings?)

Analysis
• What method will you use to analyse your data (e.g., descriptive statistics, inferential statistics, content analysis)?
Table F.4: Indicators and Success
Columns: Evaluation Question | Criteria or Indicator | Standards (What Constitutes "Success"?)

6. EVALUATION COMMUNICATION AND MANAGEMENT (100 words max, plus timetable in appendix)

This section provides information about how the evaluation will be managed and implemented, and who will participate in what capacity. It will also provide a timeline for conducting activities related to this evaluation. You may find that some of the tables suggested here fit better in other sections of the plan. Regardless of how you structure your plan, it is important that you carefully think about each of these implementation steps and who is responsible for doing what by when.

Dissemination plan (50 words)
• How will the evaluation be reported and disseminated to stakeholders?

Evaluation Budget (50 words)
• What is the cost of this evaluation?
• Where will the monetary resources come from to support the evaluation?

Timeline table or figure (include as an appendix)
Develop a table or figure that summarises the major activities included in implementing the evaluation, the persons involved in this implementation, and associated timelines. Include:
• When will planning and administrative tasks occur?
• When will any pilot testing occur?
• When will formal data collection and analysis tasks occur?
• When will information dissemination tasks occur?
• Upon mapping all of the above on a single timeline, are there any foreseeable bottlenecks or sequencing issues?

Explanation & Answer

Hello, here is the solution. Let me know if you have any questions.

EDPA5013: Program Evaluation

Assignment 3: Program Evaluation plan

Teach for Australia Leadership Program

Impact Evaluation of Teach for Australia Leadership Program

Prepared by:

{Your name (usyd enrolment name)}

{submission Date}


1. INTRODUCTION
Teach For Australia is a non-profit institute founded in 2009 by Melodie Potts Rosevear
in partnership with the University of Melbourne. The institute's Leadership Development
Program aims to address educational inequality in Australia ("Teach For Australia," 2020).
Its vision is an educational environment in which all learners, irrespective of their
background, attain an exceptional education ("Our vision and values - Teach For Australia,"
2020).

The institute is guided by strong values, such as empowering greatness, collaboration,
humility and learning, resilience, and innovation, in realizing its mission of recruiting future
leaders and inspiring, empowering and connecting them to the importance of equity in
education ("Our Mission - Teach for Australia," 2020).
Evaluation Purpose
The evaluation's main aim is to assess the effectiveness of the Teach For Australia
Leadership Development Program in achieving its intended vision and objectives.
Specifically, the evaluation will address the following questions:
1. Have the associates become high-quality leaders?
2. Has the program helped to enhance equity in schools?
3. How effective have the associates been in meeting schools' expectations?
The type of evaluation chosen for this program is impact evaluation, because the aim is to
assess the program's effectiveness and to inform decisions about improving it. The
information from this evaluation will help identify the program's milestones and suggest
possible changes to enhance its effectiveness.
Stakeholders

This evaluation's main stakeholders would be the Teach For Australia organization, selected
school leaders, sponsors, evaluators, and students. Table F.1 shows the stakeholder
assessment and engagement plan. The stakeholders will engage in active discussions
throughout the evaluation process.

Table F.1: Stakeholder Assessment and Engagement Plan

Stakeholder: Teach For Australia organization (Primary)
• Interest or perspective: Concerned about the effectiveness of the program; may want to learn new ideas.
• Role in the evaluation: Defining the program; identifying sources of data; explaining findings.

Stakeholder: Selected school leaders (Primary)
• Interest or perspective: Identifying improvement areas; determining the need to participate in the program.
• Role in the evaluation: Providing leaders' perspective.

Stakeholder: Students (Secondary)
• Interest or perspective: Determining the program's effectiveness.
• Role in the evaluation: I...

