Wednesday, 9 September 2009

ECUR 809 Assignment #1 (First Version)

ECUR 809.3-83551 Assignment #1 Case: Client Satisfaction Survey Analysis, Quebec Regional Office, Human Resources Development Canada (HRDC), Quebec, Canada, 2001.

Summary: Circum Network Inc. is listed as the author of this study. The program evaluation was prepared for the Evaluation Services, Information and Strategic Planning Directorate, Quebec Regional Office, Human Resources Development Canada. It is an analysis of requirements and a proposal for a client satisfaction measurement program. It is a guide for human resources development staff (employees who do not necessarily have the knowledge required to conduct formal, systematic surveys); that is, a self-directed training document on the implementation of client surveys. The tools offered in the guide are to be used for measuring, sampling, collecting data, analyzing data, interpreting results, and implementing recommendations. The document clearly presents the development of standardized questionnaires for various types of clients and various service conditions, an operational and implementation framework, and pre-testing.

The model or process used in the evaluation is the improvement-focused model.
The purpose is to help program staff learn how to discover discrepancies between program objectives and the needs of the target population, between program implementation and program plans, between the expectations of the target population and the services actually delivered, and between outcomes achieved and outcomes projected, among others (Posavac, 2003, p. 29). While providing the essential methodological foundations, the "self-directed" training document takes a pragmatic and "integrated" approach to conducting client satisfaction surveys. It includes devices such as decision trees and checklists. The project was carried out in three phases comprising the development of (1) the standardized questionnaires, (2) the operational framework for the client satisfaction measurement program, and (3) the analytical support tools and mechanisms. According to the report, there is a consensus within the HRDC-Quebec Region that systematic, rigorous measurement of client satisfaction with the products and services offered by the Region is essential to building the fifth pillar of the region's vision: delivering services of the highest quality. There is also a consensus that the primary responsibility for improving the region's services lies with the HRCCs and other operational centers, because it is they that control the daily delivery of services.

I see a couple of things that could be considered in future program evaluations: (1) The evaluation does not compare or discuss all sides of the program, both positive and negative. Deliberation of pros and cons is not evident in the survey evaluation of the program. It would be useful if this discussion or deliberative process could take place. (2) It does not describe participants' "reactions" to and "learning" from the innovative program evaluation, nor "behavior" changes in real job performance and other "results." A round table or discussion of these issues could help to improve the process of program evaluation. Nelson

Circum Network Inc. (2001). An integrated approach to conducting client satisfaction surveys: Analysis of requirements and proposal for a client satisfaction measurement program. Prepared for the Evaluation Services, Information and Strategic Planning Directorate, Quebec Regional Office, Human Resources Development Canada. Retrieved September 7, 2009, from:

Crawford, D. (2009). Evaluation exploration. Retrieved September 4, 2009, from

Miller, R., & Butler, J. (2008). Using an adversary hearing to evaluate the effectiveness of a military program. The Qualitative Report, 13(1), 12-25. Retrieved September 5, 2009, from

Posavac, E., & Carey, R. (2003). Program evaluation: Methods and case studies (6th ed.). New Jersey: Prentice Hall.

Please see above a more complete version of my Assignment # 1 (after Dr. Wilson's comments, Sept 25th, 2009)


  1. Nelson

    You have outlined the approach for this particular model of evaluation. Your analysis is clear and your suggestions valid. It has a number of areas that could be improved to increase its effectiveness. You might have added more information about the future uses of this tool. Was the evaluation summative or formative?

  2. Hello Dr Wilson,

    The evaluation was both summative and formative. They applied statistical tools to obtain the data, but the evaluation program also aimed to train and educate participants. Thanks, :)

  3. In other words, the aim was to teach participants.

  4. Hello Dr Wilson,

    I have reviewed your PowerPoint again, and I think that in this case an approach similar to Stake's countenance model was used. There was a need for formalized evaluation. It was not just anecdotal; descriptive data was necessary. It included description and judgment, intents, and observations, which were compared to standards, and then a judgment was made. In short, there was a "mix" of quantitative and qualitative pieces and artistic evaluation.