Tuesday, 17 November 2009
Assignment # 5: Final Survey Questionnaire
Survey Questionnaire
To take this survey please click here:
https://survey.usask.ca/survey.php?sid=17783
Please read the explanation below of previous work: the plan and the preliminary versions of this survey questionnaire.
Friday, 6 November 2009
Three Preliminary Versions of Survey Questionnaire
Assignment # 5 - ECUR 809
The focus of this assignment is on clients’ level of satisfaction and their suggestions for the improvement of the Spanish Intermediate Program (SIP) offered by the Community Center in the City of Ottawa.
Overview:
This assignment aims at improving the SIP in light of clients' suggestions and responses to a survey questionnaire at the Community Center in the City of Ottawa. "Clients" in this case refers to the students or participants of the program “Spanish Intermediate and Conversation” (complemented with tennis teaching) offered at the Community Center in Ottawa. According to Jody L. Fitzpatrick, James R. Sanders and Blaine R. Worthen (2004, pp. 100-111), there are standards and criteria for judging programs that can markedly improve the efforts of teachers and the whole community: the purpose of the program, its rationale and objectives, its content, its process and implementation (instructional technology and activities or strategies), and its evaluation together form an overall judgment about its effectiveness. Each is briefly explained below:
1. Purpose and objectives: Community Center programs are offered seasonally to all members of the community. The rationale states that the community believes the programs enhance clients' growth, quality of life, and academic achievement, and that teachers have the primary responsibility for that growth. The City of Ottawa and the Community Center also believe that teachers' performance enhances instruction, client satisfaction, and achievement.
Objectives, or specific statements of what the project sets out to accomplish:
each teacher shall develop a program for his or her students, in his or her area of expertise, under general guidelines provided by the City of Ottawa.
Specific objectives are outputs to be achieved: immediate, concrete results that are direct products of project activities. Each program is to be reviewed by the teacher and students so that changes or improvements can be made according to their immediate needs, for example: enhancing student achievement, performance, and satisfaction; making more productive use of time; increasing professional and personal interactions and discussions; sharing responsibility and leadership more broadly; and increasing knowledge, involvement, and continuous learning.
2. Content: the specific tasks needed to complete the course content in the classroom, as included in the program kit or educational package created and published by the City of Ottawa and the Community Center.
3. Process and implementation: the strategies and activities taken to carry out the program, including registering for the course; attending lectures, lessons, seminars, and workshops; reading and doing research; peer coaching and mentorship; creating personal portfolios, videos, and recordings; lesson planning and meetings; discussing content; organizing groups and setting drills, examples, exercises, and models; using instructional media, training opportunities, equipment, and educational materials and supplies; keeping a reflective journal; developing collaboration and learning groups; speaking the language (Spanish) in a real setting (playing tennis); and other initiatives that enhance instruction and student achievement.
4. Satisfaction relates to effectiveness: the impact of the SIP on students' satisfaction reflects, in some measure, the effectiveness of the program. There should also be long-term outcomes such as increasing knowledge, involvement, and ownership, and continuous, permanent learning.
Plan of Evaluation:
Who should be involved?
Engage stakeholders: teachers, administrators, supervisors, coordinators, volunteers, and students; however, the focus of this work will be on the students or participants of the Community Center, particularly of Spanish Intermediate and Conversation (SIP).
How might they be engaged?
Students will be invited through meetings, email, and survey questionnaires.
Focus of the Evaluation:
What are you going to evaluate?
Description of the Community Center's programming (see logic model 1 below)
Clients' satisfaction – students' reactions (see logic model 2 below)
What is the purpose of the evaluation?
The purposes of this evaluation are to assess (a) the level of satisfaction with the organization, design, and implementation (teaching) of the program, and (b) clients' suggestions and recommendations for better achieving the goals of the program, namely the enhancement of student learning and achievement.
Who will use the evaluation? How will they use the information?
- Teachers, administrators, supervisors, coordinators, volunteers and students.
- To assess the level of satisfaction with the organization, design and teaching of the program and to propose improvements or changes that can help teachers and students to meet the goals.
What questions will the evaluation seek to answer?
General questions:
Does the Community Program (CP) help participants or clients in their personal and professional growth and satisfaction? Is the program meeting the goals set out by the Center? What are students' reactions to the program? Are they satisfied with their achievement of goals and their performance?
Specific questions:
Do the objectives, content, and activities match properly? How does the teacher implement them?
Do the programs encourage students to develop personally and professionally?
Is the program being used in the way it was intended?
How is the program perceived by students? Are they satisfied with their performance in the programs?
What information do you need to answer the questions?
Indicators – How will I know? The level of satisfaction of participants.
When is the evaluation needed? At the beginning, in the middle, and at the end of the program(s).
What evaluation design will you use? It takes the Consumer-Oriented Evaluation Approach into consideration, but the focus is on the Goal-Oriented Approach.
Assessment and evaluation are best addressed from the viewpoint of students' reactions to (1) teachers and teaching, (2) class assignments, and (3) assignment materials.
Collect the information
What sources of information will you use?
Existing information:
Web site, programs, written materials provided by the Community Center, teachers' materials, and samples of students' work and/or experiences (videos, photos, etc.).
People:
Teachers and administrators, but the focus will be on the students' satisfaction.
What data collection method(s) will you use?
An e-mail survey questionnaire to students, and to teachers (a larger sample).
A questionnaire interview (a small sample of four students).
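To illustrate how the collected survey responses might later be tabulated, here is a minimal sketch in Python; the response data and satisfaction labels are invented for illustration and are not part of the actual survey:

from collections import Counter

# Each record is one student's (hypothetical) answer to an overall
# satisfaction question about the SIP.
responses = [
    "very satisfied", "satisfied", "neutral",
    "satisfied", "dissatisfied", "very satisfied",
]

counts = Counter(responses)
total = len(responses)

# Report each satisfaction level with its share of all answers.
for level, n in counts.most_common():
    print(f"{level}: {n} ({n / total:.0%})")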
________
Bibliography:
Jody L. Fitzpatrick, James R. Sanders and Blaine R. Worthen, Program Evaluation: Alternative Approaches and Practical Guidelines (Boston: Allyn and Bacon, 2004), p. 100.
City of Ottawa: www.ottawa.ca (The Glebe Community Center)
Plan of evaluation: http://learningstore.uwex.edu/pdf/G3658-02.pdf
Versions of the survey questionnaire: Different versions were developed while designing and testing the survey. Each included a sample checklist and a variety of question types, such as scale-rating, short-answer, and open-ended questions; the final version is now on the web site. Two preliminary versions were designed and evaluated together with four students, who provided suggestions on the clarity of questions, wording, style, and importance. A third version was developed and posted on the University of Saskatchewan web site. The students answered the questionnaire, but their concern was with correcting some discrepancies between items and the characteristics of the actual program; they also made suggestions to improve clarity, wording, and style. The fourth version is the final one, now on the University of Saskatchewan web site. Below are the preliminary versions as they were presented to the students for their evaluation:
A. Version One (preliminary version)
Short answer: yes or no
1. Does the content cover a significant portion of the program competencies?
2. Is the content up-to-date?
3. Is the course level appropriate for most students?
4. Are objectives, competencies, or tasks stated in the student materials?
5. Are tests included in the materials?
6. Are performance checklists included?
7. Is a student’s guide included that explains how to manage and work through the course theory and practice?
8. Is the material presented in a logical sequence?
Scale rating: quality and satisfaction judgments. Use +, 0, or - to rate the quality of, or your satisfaction with, specific aspects of the course:
Quality and satisfaction of objectives, competencies, and/or tasks_____
Degree of match between learning activities and objectives______
Quality of tests and degree of match with objectives________
Quality of and satisfaction with performance checklists, and degree of match with objectives________
Quality and satisfaction of directions for how students are to proceed through the materials_______
Quality of visuals, videos, games, experiences, practices_______
Overall design of the learning activities for individualized instruction_____
Quality of and satisfaction with safety practices_____
Satisfaction with the degree of freedom from bias with respect to sex, race, origin, age, religion, etc.________
Quality of and satisfaction with the content list or course content map, and the competencies covered by the course_________
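As an aside, here is a minimal sketch (hypothetical Python; the item names and marks are invented) of how the +, 0, and - ratings above could be converted to numbers and averaged per item once the completed forms come back:

# Map the survey marks to numeric scores so items can be compared.
SCORE = {"+": 1, "0": 0, "-": -1}

# Hypothetical ratings collected for two of the items above.
ratings = {
    "match between learning activities and objectives": ["+", "+", "0", "-"],
    "quality of visuals, videos, games, practices": ["+", "0", "+", "+"],
}

# Average each item's numeric scores, from -1 (worst) to +1 (best).
for item, marks in ratings.items():
    mean = sum(SCORE[m] for m in marks) / len(marks)
    print(f"{item}: {mean:+.2f}")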
Short answer: brief comment
Does the course have basic elements, such as those listed below? Please mark with an “x” and make a comment if necessary:
a) Clearly stated outcomes/objectives____
b) Sufficient directions_____
c) Other materials required____
d) Prerequisite knowledge base___
e) Fit with knowledge base and existing programs___
Process information: what is the nature and frequency of interactions among students or clients and relevant others? ________________________________
Have these interactions been evaluated?____________________________
Open-ended questions: please explain or illustrate
- Is evaluation an integral part of (a) the development and (b) the implementation of the program?
- Is there evidence of effectiveness available regarding the course?
B. Second Version: modified based on testing the survey with four individuals:
Short answer: yes or no
1. Is the program content of Intermediate Spanish up-to-date?
2. Is the program level appropriate for most students?
3. Are objectives, competencies, or tasks satisfactorily stated?
4. Is the program presented in a logical sequence?
5. Are you satisfied that the program has the basic elements listed below?
Scale rating: choice decision-making. Please write Very Good (VG), Good (G), or Bad (B) in the spaces below, and comment if necessary:
a) Outcomes, objectives, competencies or tasks____
b) Directions or instructions for how students are to proceed through the materials___
c) Materials ____
d) Prerequisite knowledge base ___
e) Performance checklists____
f) Student’s guide_____
g) Fit with knowledge base and program___
h) Tests____
Scale rating: judgments. Use +, 0, or - to rate the quality of, or your satisfaction with, specific aspects of the course:
Degree of match between learning activities and objectives______
Quality of tests and degree of match with objectives________
Quality of and satisfaction with performance checklists, and degree of match with objectives________
Quality of visuals, videos, games, experiences, practices_______
Overall design of the learning activities for individualized instruction_____
Quality of and satisfaction with safety practices_____
Satisfaction with the degree of freedom from bias with respect to sex, race, origin, age, religion, etc.________
Quality of and satisfaction with the content list or course content map, and the competencies covered by the program_________
Open-ended question:
Please feel free to make any suggestions or comments that can help us improve our Spanish Intermediate program (complemented with tennis instruction) for next Spring/Summer:
________________________________________________________________
Third version of the survey questionnaire: It was posted on the web site and the students evaluated it. They made various corrections and suggestions to improve it; based on their comments, most items were rewritten and the questionnaire was reorganized and redesigned.
Preview of Survey #17783
Please see below the Planning Program Evaluation steps that served as the basis for the whole work.
Wednesday, 14 October 2009
Logic Models ECUR 809 Assignment # 4
(A) Logic Model 1 - Flow Chart: General
Worksheet Flowchart 1
Logic Model 2 - Flow Chart: More Specific
21010006-WorksheetFlowchart-ECUR-809
B. Description of the Logic Model - Assignment # 4:
- the scope of the logic model (how much it covers);
- the number of levels included;
- the description of the levels included;
- the direction of information flow;
- the amount of text;
- the visual layout.
Each of these variables is described in turn, below.
Scope of the logic model: The flow chart is a logic model designed for evaluating the whole set of programs at the Community Center. It begins with vision, mission, values, motivations, expectations, and so on, and in the end the purpose of the evaluation is to find out whether the programs are making a difference. The community center offers complex, multi-component programs that may require separate logic models for each program component or activity. Thus, I designed one general model and one more specific model that can help me make the case for evaluating clients' satisfaction in the community programs (which include Spanish and tennis teaching). Assessment and evaluation are best addressed from the viewpoint of students' reactions to (1) teachers and teaching, (2) class assignments, and (3) assignment materials.
Number of levels: The first flowchart logic model includes several "levels" (goals, population of interest, long- and short-term objectives, and indicators). The second includes strategies, activities, and process indicators.
Description of levels: There is no standard terminology for logic models, so the first one uses general terms and the second applies more specific ones. The discussion will begin with stakeholders: Who should be involved or engaged? Teachers, administrators, supervisors, coordinators, volunteers, and students; the focus of my work will be on the students or participants of the Community Center programs. How might they be engaged? Students will be invited through staff meetings, email, and survey questionnaires.
Direction of information flow: Both flowchart logic models flow from left to right, starting with the objectives and focus of the evaluation: What am I going to evaluate?
The Community Center's programming (logic model 1) and clients' satisfaction, that is, students' reactions and satisfaction (logic model 2).
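As a rough sketch of this left-to-right flow (the Python structure below is my own invention, not the actual worksheet), each level of logic model 1 feeds the next, ending in indicators:

# Levels of logic model 1, ordered left to right as in the flow chart.
logic_model_1 = [
    ("goals", "enhance growth and quality of life of clients"),
    ("population of interest", "Community Center program participants"),
    ("objectives", "student satisfaction and achievement"),
    ("indicators", "level of satisfaction of participants"),
]

# Print the flow in reading order.
for level, content in logic_model_1:
    print(f"{level} -> {content}")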
What is the purpose of the evaluation?
The purposes of this evaluation are to assess the extent to which (a) the organization and its programs help members of the community in their personal and professional growth, and (b) the participants or students are meeting the goals of the programs, namely the enhancement of student satisfaction and achievement.
Amount of text: The amount of text included in a logic model can vary greatly; it can be sparse and in point form, or highly detailed. As a matter of preference and function, my logic models include the information needed to present the most important issues:
Who will use the evaluation? How will they use the information?
- Teachers, administrators, supervisors, coordinators, volunteers and students.
- To assess the effectiveness of the programs and make changes and improvements to help teachers and students to meet the goals.
- To improve students' achievement and satisfaction.
What questions will the evaluation seek to answer?
General questions:
Do Community Programs (CP) help participants or clients in their personal and professional growth and satisfaction? Are the programs meeting the goals set out by the Center? What are students' reactions to those programs? Are they satisfied with their achievement of goals and their performance?
Specific questions:
Does the community offer varied programs? Do teachers have adequate resources to implement them? Do they see growth in their students as a result of the CP?
Do the programs encourage students to develop personally and professionally?
Are the programs being used in the way they were intended?
How are the programs perceived by students? Are they satisfied with their performance in those programs? What are the benefits to students?
Visual layout: There are many ways to approach visuals and overall layout. This is a highly subjective issue, but an important one, since good visual design can greatly enhance the understandability of a logic model. In these cases I tried to avoid confusion and to focus on the following questions:
What information do I need to perform the evaluation and answer the questions?
Indicators – How will I know? The level of satisfaction of participants.
When is the evaluation needed? At the beginning, in the middle, and at the end of the program(s).
What evaluation design will you use? The Consumer-Oriented Evaluation Approach.
Assessment and evaluation are best addressed from the viewpoint of students' reactions to (1) teachers and teaching, (2) class assignments, and (3) assignment materials.
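As a rough illustration of the beginning/middle/end timing mentioned above, here is a minimal sketch (hypothetical 1-5 satisfaction scores; the data are invented) comparing mean satisfaction at each stage of the program:

from statistics import mean

# Hypothetical satisfaction scores (1 = very dissatisfied,
# 5 = very satisfied) gathered at the three evaluation points.
satisfaction = {
    "beginning": [3, 4, 3, 2],
    "middle": [4, 4, 3, 4],
    "end": [5, 4, 4, 5],
}

for stage, scores in satisfaction.items():
    print(f"{stage}: mean satisfaction = {mean(scores):.2f}")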
Collect the information:
What sources of information will you use?
Existing information:
Web site, programs, written materials provided by the Community Center, teachers' materials, and samples of students' work and/or experiences (videos, photos, etc.).
People:
Teachers and administrators, but the focus will be on students' satisfaction.
What data collection method(s) will you use?
An e-mail survey questionnaire to students, and to teachers (a larger sample).
A questionnaire interview (a small sample of four students).
About Assignment # 5 - ECUR 809: The focus of my assignment will be on students' testimonials of their experience with the CP (a sample of four students). My focus will be on the programs "Spanish Intermediate and Tennis Instruction".
Source: "Logic Models" Online:
http://www.thcu.ca/infoandresources/publications/logicmodel.wkbk.v6.1.full.aug27.pdf
Tuesday, 13 October 2009
ECUR 809 Assignment # 3
ECUR 809 Assignment # 3: Evaluation of an Organization - Performing an Evaluation Assessment.
Determining the feasibility and direction of my evaluation:
I have selected a community center and its programs (Adult General Interest programs such as Intermediate Spanish) as the organization to use as a model for the rest of the course: http://www.gnag.ca/index.php
http://www.gnag.ca/index.php?page=154 I live close by, so I can access individuals for input in my work. I chose the City of Ottawa (http://www.city.ottawa.on.ca/), specifically a neighbourhood organization, because during the Spring/Summer I taught Spanish, complemented with tennis lessons (www.moretennis.blogspot.com), as part of the "Ultra Play" program: http://www.ottawatennis.com/detail.php?news_id=294
Please see below an overview of my chosen organization:
Organization: A Community Center in the City of Ottawa, ON Canada
Program: "Adult General Interest - Spanish: Intermediate/conversational"
Model of Evaluation Assessment: student-centered evaluation assessment.
According to Student Evaluation: A Teacher Handbook (Saskatchewan Education, 1991), student evaluation should focus on the collection and interpretation of data that indicate student progress. This, in combination with teacher self-evaluation and program evaluation, provides a full evaluation. Chapter one states that "assessment and evaluation are best addressed from the viewpoint of selecting what appears most valid in allowing students to show what they have learned." In general, the main phases are the following: preparation, assessment, evaluation (formative, diagnostic, and summative), and reflection. Each is briefly described below:
Preparation: decide what is to be evaluated, the type of evaluation (formative, summative, or diagnostic) to be used, the criteria against which student learning outcomes will be judged, and the most appropriate assessment strategies with which to gather information on student progress. Decisions made during this phase form the basis for planning during the remaining phases. In the Spanish Intermediate and Conversation Program, the criteria and strategies are guided by an instructor (a graduate student) from the University of Ottawa.
Assessment: identify information-gathering strategies, construct or select instruments, administer them to the students, and collect the information on student learning progress. The identification and elimination of bias (such as gender and culture bias) from the assessment strategies and instruments, and the determination of where, when, and how assessments will be conducted, are important considerations. Performing an evaluation assessment of the "Adult General Interest" Spanish Intermediate and Conversation Program at the Community Center, City of Ottawa, requires an appropriate approach. Stake's "responsive" approach seems an adequate way of reporting the "success and failure" of that program. Stake (1975, p. 19) recommended the "clock" model to reflect the prominent recurring events in a responsive evaluation: talk with clients, program staff, and audiences; identify program scope; overview program activities; discover purposes and concerns; conceptualize issues and problems; identify data needs regarding issues; select observers, judges, and instruments, if any; observe designated antecedents, transactions, and outcomes; thematize and prepare portrayals and case studies; validate, confirm, and attempt to disconfirm; winnow for audience use; and assemble formal reports, if any. In this sense, Stake's model helps in reporting the evaluation assessment of the Intermediate & Conversation Spanish Program, in which not only questionnaires but also specific tests and sample work portfolios were assessed. See examples of past questionnaires.
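To make Stake's recurring "clock" events easier to work through, here is a minimal sketch that treats them as an ordered checklist (the checklist structure is my own invention; the event names paraphrase the list above):

# The recurring events of Stake's responsive-evaluation "clock",
# paraphrased from the description above.
CLOCK_EVENTS = [
    "talk with clients, program staff, audiences",
    "identify program scope",
    "overview program activities",
    "discover purposes and concerns",
    "conceptualize issues and problems",
    "identify data needs regarding issues",
    "select observers, judges, instruments",
    "observe antecedents, transactions, outcomes",
    "thematize: prepare portrayals, case studies",
    "validate: confirm, attempt to disconfirm",
    "winnow for audience use",
    "assemble formal reports",
]

done = set()
done.add(CLOCK_EVENTS[0])  # e.g., initial talks with clients completed

# Print the checklist; responsive evaluation may revisit any event.
for i, event in enumerate(CLOCK_EVENTS, start=1):
    status = "x" if event in done else " "
    print(f"[{status}] {i:2d}. {event}")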
Evaluation: the information gathered during the assessment phase is used to make judgments about student progress. Based on these judgments, decisions about student learning programs are made and reported to students, parents, and appropriate school personnel.
Reflection: allows pondering the successes and shortfalls of the previous phases. Specifically, evaluate the utility and appropriateness of the assessment strategies used, and make decisions concerning improvements or modifications to subsequent teaching and assessment. The instruments contain questions that encourage reflection on student assessment, teachers' planning, and the structure of the curriculum. Among the successes of the Intermediate & Conversational Spanish program we can mention the following: excellent audio CDs, and an exciting vacation with a great learning opportunity, offered in combination with similar programs that complement the study of Spanish with other activities, such as tennis, golf, games, and Latin dance.
Until now, no failures have been reported. On the contrary, students are looking for more "living Spanish" programs.
Sources: Program Evaluation, Particularly Responsive Evaluation (Occasional Paper No. 5, p. 19) by R. E. Stake, 1975b, Kalamazoo, MI: Western Michigan University Evaluation Center. Adapted by permission. Cited on p. 138 of Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines. White Plains, NY: Longman.
Online: www.wmich.edu/evalctr/pubs/ops/ops05.pdf
"Student Evaluation: A Teacher Handbook" Retrieved September 24th, 2009 Online:
http://www.saskschools.ca/~ischool/Drafting10/curr/part25.htm
Flowcharts:
http://www.scribd.com/doc/21010132
http://www.scribd.com/doc/21010006
Saturday, 26 September 2009
ECUR 809 Assignment # 2 - Model
Assignment # 2: A model or approach to evaluate the ECS Programming for Children with Severe Disabilities
Summary: Children with severe/profound disabilities are eligible for Program Unit Funding from Alberta Education. According to the Medicine Hat Catholic organization, the “ECS Programming for Children with Severe Disabilities” evaluates and selects eligible children and then offers educational programs that must meet the individual child’s needs. The educational programming combines center-based programs and in-home programs. The teacher develops an Individual Program Plan with goals and objectives reflective of the child’s needs. The center-based programming takes place in settings such as preschools, kindergartens, and day cares.
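To make the structure of such a plan concrete, here is a minimal sketch (hypothetical Python; the field names and example data are invented, not taken from the actual ECS forms) of how an Individual Program Plan could be represented:

from dataclasses import dataclass, field

@dataclass
class Goal:
    description: str
    objectives: list[str] = field(default_factory=list)

@dataclass
class IndividualProgramPlan:
    # Needs, goals, and setting reflective of the individual child.
    child_needs: list[str]
    goals: list[Goal]
    setting: str  # e.g. "preschool", "kindergarten", "day care", "in-home"

# Invented example data for illustration only.
ipp = IndividualProgramPlan(
    child_needs=["expressive communication"],
    goals=[Goal("increase expressive vocabulary",
                ["use ten new words during play sessions"])],
    setting="preschool",
)
print(ipp.setting, "-", ipp.goals[0].description)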
What approach is appropriate to evaluate this program? In order to effectively evaluate ECS programming, I suggest using qualitative methods to conduct a “naturalistic” evaluation, in light of Emil J. Posavac and Raymond G. Carey’s theory (2003), combined with the “participant-oriented evaluation” approach wisely described by Jody L. Fitzpatrick, James R. Sanders, and Blaine R. Worthen (2004).
In the naturalistic evaluation model, “the evaluator becomes the data gathering instrument, not surveys or records. By personally observing all phases of the program and holding detailed conversation with stakeholders, the evaluator seeks to gain a rich understanding of the program, its clients, and the social environment” (Posavac and Carey, 2003, p. 28). In other words, personal observations and detailed reports are necessary to explain information about the home visits, which should be carefully planned and documented. This model is also useful in explaining the child’s instruction in a classroom setting at a center or school. The steps in preparing to conduct an evaluation comprise identifying the program and its stakeholders, becoming familiar with information needs, planning the evaluation, and evaluating the evaluation itself.
In the participant-oriented evaluation approach, evaluators should not be distracted from what is really happening to the participants in the program by focusing only on “stating and classifying objectives, designing an elaborate evaluation system, developing technically defensible objective instrumentation, and preparing long detailed technical reports” (Fitzpatrick, Sanders and Worthen, 2004, p. 130). The participant-oriented evaluation stresses firsthand experience with program activities and settings, and the involvement of program participants in the evaluation. This approach is “aimed at observing and identifying all (or as many as possible) of the concerns, issues, and consequences integral to the human services enterprise” (p. 131). Evaluators need to avoid focusing only on the results, or on isolated comments, numbers, charts, figures, and tables, and thereby missing important individual facts.
In short, a combined “naturalistic & participant-oriented evaluation” approach will provide the plurality of judgments, criteria, and methods of inquiry that will help evaluators portray the different values and needs of the individuals and groups served by the educational programming. This model requires the active involvement of participants. By involving participants in determining the criteria and boundaries of the evaluation, evaluators serve an important educative function by creating “better-informed” program participants.
Sources:
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines. White Plains, NY: Longman.
Posavac, E., & Carey, R. (2003). Program evaluation: Methods and case studies (6th ed.). New Jersey: Prentice Hall.
Medicine Hat Catholic, Separate Regional Division # 20.
Retrieved Sept 13th, 2009 from
http://www.mhcbe.ab.ca/cec/
Wednesday, 23 September 2009
A Revised Version of Assignment # 1
Assignment # 1: A completed evaluation case: Client Satisfaction Survey Analysis, Quebec Regional Office, Human Resources Development Canada (HRDC), Quebec, Canada, 2001.
The study is an analysis of requirements and a proposal for a client satisfaction measurement program. Circum Network Inc. is listed as the author of the study, which was prepared for the Evaluation Services, Information and Strategic Planning Directorate, Quebec Regional Office, Human Resources Development Canada.
Model or process used in the evaluation: an improvement-focused model (Posavac & Carey, 2003, p. 29). While providing the essential methodological foundations, the “self-directed” training document takes a pragmatic and “integrated” approach to conducting client satisfaction surveys, including devices such as decision trees and checklists. The project was carried out in three phases: first, the team developed the standardized questionnaires; then the researchers developed an operational framework for the client satisfaction measurement program; finally, they developed the analytical support tools and mechanisms. According to the report, there is a consensus within the HRDC-Quebec Region that systematic, rigorous measurement of client satisfaction with the products and services offered by the Region is essential to building the fifth pillar of the region’s vision: delivering services of the highest quality. There is also a consensus that the primary responsibility for improving the region’s services lies with the HRCCs and other operational centres, because it is they that control the daily delivery of services.
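To make the decision-tree idea concrete, here is a minimal sketch in Python of the kind of decision aid the guide describes. This is my own illustration: the function name, thresholds, and survey modes are hypothetical, not taken from the Circum Network document.

def suggest_survey_mode(population_size: int, has_phone_numbers: bool,
                        budget_per_client: float) -> str:
    """Suggest a data-collection mode for a client satisfaction survey.
    All thresholds below are invented for illustration."""
    if population_size < 200:
        # A small client base can be surveyed exhaustively.
        return "telephone census" if has_phone_numbers else "in-person interviews"
    if has_phone_numbers and budget_per_client >= 10.0:
        return "telephone survey of a random sample"
    # Large population, limited budget: fall back to mail.
    return "mail-out questionnaire to a random sample"

print(suggest_survey_mode(1500, True, 4.0))
# -> mail-out questionnaire to a random sample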
Strengths: (1) The goals were clear: to analyze requirements and propose a client satisfaction measurement program, and the team produced a self-directed training document on the implementation of client surveys. (2) The report presents clearly the development of the standardized questionnaires, the operational and implementation framework, and the pre-testing. (3) The report is a guide for human resources development staff (employees who do not necessarily have the knowledge required to conduct formal, systematic surveys). The tools offered in the guide are to be used for measuring, sampling, collecting data, analyzing data, interpreting results, and implementing recommendations; a sketch of one such sampling computation follows below. The results included standardized questionnaires for various types of clients and various service conditions. (4) The researchers/evaluators helped program staff learn how to discover discrepancies between program objectives and the needs of the target population, between program implementation and program plans, between expectations of the target population and the services actually delivered, or between outcomes achieved and outcomes projected (Posavac & Carey, 2003, p. 29).
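The sampling step, for instance, turns on a standard calculation: how many clients must respond before the satisfaction estimate can be trusted? The following Python sketch applies the usual sample-size formula for a proportion, with a finite-population correction; the defaults (95% confidence, plus or minus 5% margin, worst-case p = 0.5) are conventional assumptions of mine, not figures from the report.

import math

def sample_size(population: int, margin: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Simple-random-sample size needed to estimate a proportion
    (e.g., the share of satisfied clients) within +/- margin."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

print(sample_size(5000))  # -> 357 respondents for a client base of 5,000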
In this sense, I think an approach similar to Stake's countenance model, as explained by Jay Wilson (2009), was used in this case, because there was a need for formalized evaluation. It was not just anecdotal; descriptive data was necessary. It included description and judgment, intents, and observations, which were compared to standards before a judgment was made. In short, there was a mixture of quantitative and qualitative elements and, as a result, it was an “artistic” evaluation (creative thinking in the minds of the evaluators).
Regarding possible weaknesses, I see a couple of things that could be considered in future program evaluations: (1) The report does not compare or discuss all sides of the program evaluation, both positive and negative; deliberation of pros and cons is not evident in the survey evaluation of the program. It would be useful if this discussion or deliberative process could take place. (2) It does not describe participants’ “reactions” to and “learning” from the innovative program evaluation, “behavior” changes in real job performance, or other “results.” A round table or discussion of these issues could help to enlighten the program evaluation.
Sources:
Circum Network Inc. (2001). An integrated approach to conducting client satisfaction surveys: Analysis of requirements and proposal for a client satisfaction measurement program. Prepared for Evaluation Services, Information and Strategic Planning Directorate, Quebec Regional Office, Human Resources Development Canada. Retrieved September 7, 2009, from http://www.circum.com/textes/program_hrdc_quebec_2001.pdf
Crawford, D. (2009). Evaluation exploration. Retrieved September 4, 2009, from http://www.ag.ohio-state.edu/~brick/evalexpl.htm
Miller, R., & Butler, J. (2008). Using an adversary hearing to evaluate the effectiveness of a military program. The Qualitative Report, 13(1), 12-25. Retrieved September 5, 2009, from http://www.nova.edu/ssss/QR/QR13-1/miller.pdf
Posavac, E. J., & Carey, R. G. (2003). Program evaluation: Methods and case studies (6th ed.). New Jersey: Prentice Hall.
Wednesday, 9 September 2009
ECUR 809 Assignment #1 (First Version)
ECUR 809.3-83551 Assignment #1 Case: Client Satisfaction Survey Analysis, Quebec Regional Office, Human Resources Development Canada (HRDC), Quebec, Canada, 2001.
Summary: Circum Network Inc. is listed as the author of this study. The program evaluation was prepared for Evaluation Services, Information and Strategic Planning Directorate, Quebec Regional Office, Human Resources Development Canada. It is an analysis of requirements and a proposal for a client satisfaction measurement program. It is a guide for human resources development staff (employees who do not necessarily have the knowledge required to conduct formal, systematic surveys), that is, a self-directed training document on the implementation of client surveys. The tools offered in the guide are to be used for measuring, sampling, collecting data, analyzing data, interpreting results, and implementing recommendations; a small example of the analysis step appears just below. The document presents clearly the development of standardized questionnaires for various types of clients and various service conditions, the operational and implementation framework, and the pre-testing.
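As a concrete illustration of the “analyzing data” step, the following Python sketch summarizes one questionnaire item in two ways such guides commonly use, a mean score and a “top-two-box” percentage. The responses are invented for the example.

from statistics import mean

# Hypothetical answers to one item on a 1-5 satisfaction scale.
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

avg = mean(responses)
top_two_box = sum(r >= 4 for r in responses) / len(responses)

print(f"mean satisfaction: {avg:.2f} / 5")                 # 3.90 / 5
print(f"satisfied or very satisfied: {top_two_box:.0%}")   # 70%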
The model or process used in the evaluation is the improvement-focused model.
The purpose is to help program staff learn how to discover discrepancies between program objectives and the needs of the target population, between program implementation and program plans, between expectations of the target population and the services actually delivered, and between outcomes achieved and outcomes projected, among others (Posavac & Carey, 2003, p. 29). While providing the essential methodological foundations, the “self-directed” training document takes a pragmatic and “integrated” approach to conducting client satisfaction surveys, including devices such as decision trees and checklists. The project was carried out in three phases, comprising the development of (1) the standardized questionnaires, (2) the operational framework for the client satisfaction measurement program, and (3) the analytical support tools and mechanisms. According to the report, there is a consensus within the HRDC-Quebec Region that systematic, rigorous measurement of client satisfaction with the products and services offered by the Region is essential to building the fifth pillar of the region’s vision: delivering services of the highest quality. There is also a consensus that the primary responsibility for improving the region’s services lies with the HRCCs and other operational centers, because it is they that control the daily delivery of services.
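One of those discrepancies, between what the target population expects and what is actually delivered, can be quantified directly from paired survey items. The sketch below is a minimal, hypothetical gap analysis in Python; the service aspects and ratings are invented, not drawn from the HRDC study.

# Hypothetical mean ratings (1-5) from the same clients:
# what they expected versus what they perceived they received.
expectations = {"wait time": 4.5, "staff courtesy": 4.2, "information clarity": 4.0}
perceptions  = {"wait time": 3.1, "staff courtesy": 4.4, "information clarity": 3.6}

gaps = {aspect: round(perceptions[aspect] - expectations[aspect], 2)
        for aspect in expectations}

# Report the largest shortfalls first; negative gaps flag services
# falling short of client expectations.
for aspect, gap in sorted(gaps.items(), key=lambda item: item[1]):
    status = "falls short" if gap < 0 else "meets expectations"
    print(f"{aspect}: gap {gap:+.2f} ({status})")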
I see a couple of things that could be considered in future program evaluations: (1) The report does not compare or discuss all sides of the program evaluation, both positive and negative; deliberation of pros and cons is not evident in the survey evaluation of the program. It would be useful if this discussion or deliberative process could take place. (2) It does not describe participants’ “reactions” to and “learning” from the innovative program evaluation, “behavior” changes in real job performance, or other “results.” A round table or discussion of these issues could help to improve the process of program evaluation. Nelson
Sources:
Circum Network Inc. (2001). An integrated approach to conducting client satisfaction surveys: Analysis of requirements and proposal for a client satisfaction measurement program. Prepared for Evaluation Services, Information and Strategic Planning Directorate, Quebec Regional Office, Human Resources Development Canada. Retrieved September 7, 2009, from http://www.circum.com/textes/program_hrdc_quebec_2001.pdf
Crawford, D. (2009). Evaluation exploration. Retrieved September 4, 2009, from http://www.ag.ohio-state.edu/~brick/evalexpl.htm
Miller, R., & Butler, J. (2008). Using an adversary hearing to evaluate the effectiveness of a military program. The Qualitative Report, 13(1), 12-25. Retrieved September 5, 2009, from http://www.nova.edu/ssss/QR/QR13-1/miller.pdf
Posavac, E. J., & Carey, R. G. (2003). Program evaluation: Methods and case studies (6th ed.). New Jersey: Prentice Hall.
Please see above for a more complete version of my Assignment #1 (revised after Dr. Wilson's comments, September 25th, 2009).