The Development of a Self-Assessment Instrument to Evaluate Selected Competencies of Continuing Education Practitioners.

Adult and continuing education practitioners must engage in continuing professional education as a means of developing and maintaining their professional competence. This paper reviews the literature on competencies for continuing educators and examines the strengths of existing instruments for continuing educators to self-assess their competencies. One of the challenges in this process has been finding a means to systematically and comprehensively assess existing knowledge and skills and to identify where competencies are not adequate. This paper describes an assessment instrument that addresses these challenges and builds on the strengths of existing tools. It describes the development of a tool through a consultative process that identified the range of practitioner competencies required of adult and continuing educators. The assessment tool incorporates a behaviourally based approach to assessing identified competencies. It is unique because it addresses many of the needs identified in the literature, including well-articulated criteria for making judgments about a person's current knowledge and skills, a format that allows for clear identification of gaps in learning so that meaningful professional development can take place, the clear structure needed to develop a "portfolio" of skills, and a format that requires the provision of evidence to support claims of proficiency in the identified areas. Though the tool was initially developed to support Prior Learning Assessment and Recognition processes, it is anticipated that it will also be useful for continuing educators undertaking self-assessments as a basis for hiring, career laddering, and planning their ongoing professional development activities and goals. In each case it provides a framework for meaningful reflection, assessment, gap identification, and planning.

Revue canadienne de l'éducation permanente universitaire, Vol. 31, No 2, automne 2005

INTRODUCTION
It has been frequently reported that continuing education practitioners lack training in the field of Adult and Continuing Education, despite the widespread availability of such programs. Not surprisingly, this situation applies to continuing education practitioners in CAUCE member institutions as well (Brooke & Morris, 1987; Morris & Potter, 1996; Thompson & Archer, 2003). Many continuing educators develop the knowledge and skills they require through on-the-job learning. Accordingly, they have a pressing need for a systematic and comprehensive way to identify and assess both the knowledge and skills they have already acquired on the job and any gaps they need to fill through professional development opportunities. However, continuing educators who want to extend their professional competence face several problems.
The first problem arises from the common experience of continuing educators who are so busy developing educational programs that promote the knowledge and skills of other professionals that they are unable to attend to their own professional development needs. Jackson (1997) drew upon the analogy of "the cobbler's children having no shoes" to demonstrate the paradox in this situation. She emphasized the need for promoting staff empowerment as one important strategy for dealing with this. Of course, the problem of insufficient time is not unique to continuing educators; it is a challenge for all professionals wanting to maintain their professional competence. Brockett (1991) proposed that we make our professional development a priority, which involves developing and articulating a professional development plan.
Brockett's proposal leads us to a second problem. How do we determine our professional development needs and priorities? Doubtless, we can easily identify the problems and challenges we face on a day-to-day basis and the types of knowledge and skills that might help us deal more effectively with them. But will addressing only these shortcomings be enough to advance our careers and achieve our larger professional objectives? Not likely. To achieve these more ambitious goals, we need to engage in a reflective process that allows us to step outside our immediate environment and assess our current knowledge and skills within a broader context. How do we do this? How do we identify the knowledge and skills we need to make us more proficient as continuing educators? How do we go about assessing our current level of competence?
Self-assessment is a strategy advocated by Klevans, Smutz, Shuman, and Bershad (1992) to help professionals identify their learning needs and tailor their professional development planning. They described a self-assessment system for architects and other design professionals that was developed by Pennsylvania State University and concluded that ". . . self-assessment stimulates most participants to plan for and engage in both informal and formal continuing education activities" (p. 25). Similarly, Marienau (1999) reported that self-assessment ". . . serves as a powerful instrument for experiential learning, strengthens commitment to competent performance in the workplace, enhances higher order skills for functioning in the workplace, and fosters self-agency and authority" (p. 135). Thus, the process of self-assessment appears to contribute to enhancing motivation for professional development.
A number of professions have developed self-assessment instruments to determine levels of practitioner competency. For example, Heath (2002) described the use of a professional portfolio as an instrument to promote reflective self-assessment for teacher-librarians. Weddle, Himburg, Collins, and Lewis (2002) described the use of professional development portfolios as self-assessment instruments for dieticians. Self-assessment has also been documented for principals (White, Crooks, & Melton, 2002), psychologists (Belar et al., 2003), and trainers (Newman, 2002).
Closer to home, a number of university-based co-operative extension units have developed competency-based self-assessment instruments. For example, Stone and Coppernoll (2004) described such a system for extension professionals in Texas. But are there self-assessment instruments and processes appropriate to the needs of university continuing educators? This paper examines the resources currently available and finds many with a wide variety of strengths. Indeed, we believe that the assessment tool we have developed and present in this paper combines the strengths of other instruments and goes a step further in addressing the needs identified in the literature.
The tool was originally developed to support Prior Learning Assessment and Recognition (PLAR) processes for the Certificate in Adult and Continuing Education (CACE) program offered by a consortium, consisting of the University of Alberta, the University of Manitoba, the University of Saskatchewan, and the University of Victoria. A broad-based consultation process was used to identify the knowledge and skills that adult and continuing educators need to possess. It was guided by an outcomes-based approach to identifying the knowledge and skills that program graduates require to be successful adult and continuing educators. The process is described further in "The Program Outcomes Process" section of this paper. After working with the tool, we became confident that the results of this process, especially the self-assessment instrument that we developed, would benefit many CAUCE members. We believe the tool will assist university continuing educators not only to evaluate their current competencies but also to identify gaps, which can lead to concrete planning that addresses professional development needs. We begin by examining the literature on the competencies required by continuing educators. Emerging from this literature is a reasonable degree of concurrence on the generic set of competencies they require. Moreover, some work has already been done on developing self-assessment instruments for continuing educators.

REVIEW OF RELATED LITERATURE
This review addresses four specific areas: competencies required by continuing educators; previous efforts to create self-assessment instruments for continuing educators; portfolios and their assessment; and outcome/competency-based assessment.

Determining Competencies Required by Continuing Educators
Over the past 40 years, numerous reports and studies have attempted to delineate a practice description for the field of Continuing Education or to identify the competencies required by its practitioners. Freedman (1987) proposed that continuing educators require four broad areas of professional competence: curriculum building; determining methods, formats, and learning resources; marketing; and administration. He also identified a set of personal qualities that he concluded were important for continuing education administrators: entrepreneurship, judgment, energy, and self-confidence. Knox (1979) reviewed previous studies and reports and proposed that all categories of continuing education practitioners required three broad areas of proficiency: a comprehensive perspective on the field of continuing education; an understanding of adults as learners; and a set of personal qualities that included a commitment toward lifelong learning, effective interpersonal relations, and an approach to practice that emphasizes innovativeness. He allowed that the ways in which these three areas of proficiency are acquired and employed will vary considerably, depending upon the area of professional practice. For example, administrators, teachers and counsellors, and policy-makers may vary in this regard, but each area of proficiency will be important to all areas of practice.
By far the most frequently adopted approach to identifying practitioner competencies is the use of survey methodology. Four relatively recent examples of such studies are Amunson and Ebbers (1997), Cookson and English (1997), Dufour and Queeney (2004), and Gerity (1999). Earlier reports of similar survey studies were reviewed by Campbell (1977), Knox (1979), and Rossman and Bunning (1978).
Amunson and Ebbers (1997) identified competencies needed by future community services/continuing education directors. They concluded that such directors required 43 specific competencies and provided definitions for each of them. Cookson and English (1997) constructed behaviourally anchored rating scales for two types of positions: director of continuing education and area representative. This resulted in 10 areas of responsibility being identified for the director's position and 13 for the area-representative position. They generated a set of behavioural statements to represent examples of effective performance associated with each position's areas of responsibility. This was an especially significant contribution of this study. Almost all previous attempts to delineate competencies required by continuing educators failed to provide more than a brief definition of what each competency comprised. This was particularly problematic for self-assessment instruments. Dufour and Queeney (2004) employed a modified Delphi technique to create a practice description for the field of continuing higher education. They identified 12 areas of practice; associated with each area of practice was a set of representative responsibilities, which provided some elaboration of what elements comprised each broad area of practice. They concluded that the areas of practice, responsibilities, and tasks of continuing higher education practitioners are fluid and continue to change over time. Gerity (1999) created a self-assessment competency instrument for community college education professionals. He began with the competency model developed by Cookson and English (1997) and revised it through consultation with an expert panel of community college education professionals. The result was a model comprised of 89 task statements, categorized into 14 competency areas.
As Lewis (2001) noted, one of the challenges in reviewing these studies is the inconsistent use of terms identifying areas of competence. Nonetheless, she identified a number of competencies that had been specifically identified by multiple studies. She reported that four studies identified "communication skills" as a needed competency, while three others identified leadership, supervision skills, staff development, marketing, grantsmanship, and a positive attitude.
Some researchers have suggested it is important to examine the actual work behaviours of continuing educators rather than focusing on the competencies they require to perform those behaviours. For example, Donaldson (1993) and Donaldson and Kuhne (1994) observed the work activity of continuing educators, whereas English (1992) examined job descriptions to identify assigned duties. Griggs and Morgan (1988) surveyed continuing education administrators to determine the amount of time they devoted to various administrative tasks and to learn if tasks requiring the greatest expenditure of time also had the highest priority. Of particular interest to this study is their observation that these administrators spent a minimum amount of time on a number of tasks associated with staffing, staff development, and evaluation. For example, the task on which these administrators spent the least amount of time each year was conducting professional development needs assessments for faculty and staff.

Self-Assessment Instruments
A number of studies and reports have produced self-assessment instruments that allow continuing educators to determine those areas in which they might most profitably focus their professional development activities. Examples of self-assessment instruments are described in reports by Cookson and English (1997), Gerity (1999), Knowles (1980), and Lund and McGechaen (1981).
The studies by Cookson and English (1997) and by Gerity (1999) produced self-assessment instruments that provide a useful articulation of a broad range of competency elements. Moreover, the competencies identified in each study were determined through a broadly based consultative process. However, both used self-evaluation approaches that were superficial and unlikely to promote self-reflection on the part of the respondent. For example, Gerity directed respondents to allocate a score for the perceived importance they would assign to each of 89 competency statements. The scores were to be selected from a scale of 1 (representing lowest importance) to 5 (representing highest importance). In addition, respondents were directed to identify a developmental need score for each statement. These scores were also to be selected from a scale of 1 (lowest developmental need) to 5 (highest developmental need). The "importance" score was then multiplied by the "developmental need" score to produce a "training priority" score for each competency statement. The complexity of this process is impressive, but what does it all mean? The scores may be arbitrarily selected, and the process is not likely to promote self-reflection. Knowles (1980) created a self-diagnostic rating scale for the competencies associated with the role of adult educator, but gave no indication as to how these competencies had been selected. The scale included three role categories: learning facilitator, program developer, and administrator. Each category had a number of associated competency statements, and respondents were asked to identify both their current and their desired level of competence. The discrepancies identified through this process were expected to provide a useful guide to planning one's professional development. However, the competency descriptions were very brief and provided minimal guidance to respondents.
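The multiplicative scoring Gerity is reported to have used can be sketched in a few lines of code. This is our illustrative reconstruction, not code from the study; the function name and the range check are our own additions:

```python
# Illustrative sketch of the scoring scheme attributed to Gerity (1999):
# each of the 89 competency statements receives an "importance" rating
# and a "developmental need" rating, both on a 1-5 scale, and their
# product yields a "training priority" score for that statement.

def training_priority(importance: int, developmental_need: int) -> int:
    """Multiply the two 1-5 ratings to give a priority score (1-25)."""
    for rating in (importance, developmental_need):
        if not 1 <= rating <= 5:
            raise ValueError("ratings must be on a 1-5 scale")
    return importance * developmental_need

# A statement rated highly important (5) with a moderate developmental
# need (3) yields a priority of 15.
assert training_priority(5, 3) == 15
```

The sketch makes the critique in the paragraph above concrete: the product compresses two subjective ratings into a single number, so a score of 15 could equally reflect (5, 3) or (3, 5), and nothing in the computation prompts the respondent to reflect on either judgment.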
Lund and McGechaen (1981) created a CE programmer/coordinator self-appraisal profile, but gave no indication as to how these competencies had been selected. The profile consisted of seven general areas of competency: assess community needs; plan courses and programs; promote and market courses and programs; manage courses and programs; manage personnel; practise communication skills; and conduct evaluations. Each competency area included between four and eight skill items, and respondents were asked to rate each item's level of importance according to their effective performance and their present skill level. Again, the skill items were very brief and provided minimal guidance to respondents.
One of the major strengths of these self-assessment instruments is their potential responsiveness to the different roles and circumstances of continuing education practitioners. This is important because these professionals can have widely variable roles, from administrators to classroom teachers and from program planners to organizational development specialists. Moreover, even people with similar responsibilities can find that the nature of their work varies, depending upon the size and type of institutional employer. Yet, most of these self-assessment instruments are not sufficiently behaviourally based to allow practitioners to be confident about their assessments. For example, Lund and McGechaen (1981) listed "Evaluate Programs" as a self-assessment skill item under the general competency area of "Conduct Evaluation." But what exactly does that mean? How does one determine his or her current level of proficiency without more information about the nature of this competency? Moreover, individuals undertaking this self-assessment are not required to reflect on how they developed this competency and what evidence they could assemble to demonstrate that they possess it. The self-assessment instrument we present in this study overcomes, or at least reduces, these deficiencies.

Portfolios
The use of portfolios is finding growing acceptance as a tool for assessing professional competence and for promoting professional development. Portfolios have been used to evaluate the competence of doctors (Wilkinson et al., 2002), nurses (Cook, Kase, Middleton, & Monsen, 2003), and teachers (Tucker, Stronge, Gareis, & Beers, 2003). Tillema (1998) identified three approaches for promoting professional development: self-assessment, peer assessment, and portfolio assessment. He discussed the advantages and disadvantages of each of these approaches. For example, portfolios are very useful in promoting personal reflection on one's strengths and weaknesses and are especially valuable in assembling evidence to support assessment. However, they can be extremely time-consuming to complete and it is not always clear what to include and how to structure the content. Further, they may be more useful in highlighting what we already know and do well and less so in highlighting areas in which improvement is warranted. By contrast, a structured self-assessment can provide a better indication of areas in which improvement is needed. But how confident can we be about our own judgments? As Tillema noted, "Self-assessment clearly differs from portfolio assessment in that no concrete functional evidence is being collected about actual performance levels" (p. 266). The self-assessment tool presented in our study does require such evidence and provides a clear framework for developing a portfolio. That is, the compilation of this evidence and the articulation of current knowledge and skills and a career and education plan will form the basis for a meaningful professional development portfolio.

Needs Not Currently Being Met
Currently, two identified needs are not being met by the instruments described in the literature. First, there is a need to "unpack" competencies into appropriately identified sub-competencies. For example, if we are invited to self-assess our knowledge and skills in regard to program evaluation, how do we do so? We might be more confident about our self-assessment judgments if we were asked whether we know how to design student evaluation forms or conduct a cost-effectiveness evaluation. But many of the self-assessment instruments do not provide sufficient differentiation of such competency elements. The instrument we have developed has subdivided practitioner competencies into underlying components. The second identified need involves the issue of evidence-based judgments. A number of self-assessment instruments invite respondents to select a Likert-scale response for each designated competency, but provide little or no guidance for making this determination. For example, respondents might be asked to select a number from 1 to 5, where 1 represents a competency level that fails to meet minimum performance expectations and 5 represents a level that exceeds performance expectations. But on what are we to base these judgments? Is it possible to undertake competency assessment in a way that provides a comprehensive and structured examination of our existing competencies and allows us to evaluate our current levels of competence through an evidence-based approach? We believe it is.
In the self-assessment instrument presented in this study, the criteria for making these judgments are articulated. Respondents can reflect upon how they learned the knowledge or skill associated with the competency; moreover, they are invited to document how they could prove they possess that knowledge or skill. This approach to self-assessment is critical to fostering self-reflection (Peters, 1991). These assessment methods draw upon the literature associated with PLAR, outcome-based evaluation, and portfolio assessment. The concepts and applications associated with these areas provide the framework needed to reflect, assess, provide evidence, and plan future professional development (Porter, 2002).
We use rich and complex knowledge and skills every day in our jobs as continuing educators. As stated earlier, to effectively reflect on and assess such knowledge and skills, we need clearly defined outcomes or expectations. Lipman (1991) stated that in order to think critically when assessing, we need to think about and clearly articulate the criteria against which judgments are being made. In PLAR implementation, this takes the form of thinking about and articulating clear learning outcomes and assessment criteria. This often means explicitly stating what has previously been implicit. Grant and Kohli (1979) stated that the identification of "explicit criteria also make[s] possible the assessment of previous learning, whereas much of traditional higher education provides no means of assessing prior learning other than evaluation of credit earned in courses" (p. 146). Assessment or self-assessment based on clearly defined outcomes and assessment criteria helps us become more critically reflective about our practice. Outcome-based assessment requires continuing education practitioners to think about and critically assess what they need to know and be able to do in their job role. Our assessment tool allows practitioners to do this. Schalock (1995) stated that outcome-based evaluation requires performance-based assessment, and this is what makes it so appropriate for our purposes. It provides a basis for assessing competency levels that is built upon objective assessment of actual performance. The outcome-based paradigm guided our work in two ways. First, it was the basis for the consultation we undertook in the first phase of our work to determine what knowledge and skills are required of adult and continuing education practitioners. Second, it contributed to the second phase of our work in which we assembled the self-assessment instrument that incorporates those knowledge and skill elements. In particular, it influenced our decisions about how we would ask respondents who utilized the self-assessment instrument to document evidence of their competencies.

Benefits of Outcome/Competency-based Assessment for Continuing Educators
A number of benefits are associated with the clear identification and articulation of competency statements.
• Outcome statements can provide a common understanding of what knowledge and skills are required in a particular job role.
• Clearly identified outcomes, with assessment criteria, capture the type of complex learning that we require of our practitioners. That is, they describe the integration of the knowledge, skills, and capacity to make expected appropriate judgments.
• Clearly articulated outcomes will accommodate the assessment of prior learning. Many practitioners already possess the knowledge and skills they need for their current or future position. Much of their learning will have been acquired on the job or through informal training events. Recognizing this learning demonstrates respect for practitioners, avoids duplication of learning, and can save time and money.
• Clearly identified outcomes and assessment criteria allow for an expedient and accurate gap analysis of an individual's learning needs, which provides an easily negotiable path for engaging in professional development.
• Clearly articulated outcomes facilitate dialogue that allows learning acquired in one job to be linked with other positions to ensure laddering opportunities.
• Clearly articulated outcomes can create a transparent, flexible system for performance review.
In summary, our assessment tool is based on the requirements of an efficient and rigorous PLAR process. Because of this, it provides a transparent framework that combines clear outcomes and the criteria for assessing them, meaningful self-reflection, a requirement for evidence of knowledge and skills, and a clear identification of gaps. All of these elements are needed for meaningful professional development.

THE PROGRAM OUTCOMES PROCESS
The University of Manitoba has initiated a Prior Learning Assessment and Recognition (PLAR) Project that will allow learners in the Certificate in Adult and Continuing Education (CACE) Program to receive credit for the knowledge and skills they bring to the program. As stated earlier, clearly articulating what a learner needs to know and be able to do at the end of a course or program is critical to creating a transparent and efficient PLAR process. To this end, learning outcomes were defined at the program and course levels as part of this PLAR project. The core course learning outcomes provided a refined description of the overall program outcomes. The detail provided by these competencies provides an excellent tool not only for curriculum review and development but also for practitioner self-assessment, performance review, and professional development.

The Process
Forty-two adult education practitioners from Manitoba (n=11), Saskatchewan (n=12), British Columbia (n=9), and Alberta (n=10) took part in a DACUM process to define the core competencies for a CACE graduate. Although a few of the participants were CACE students or faculty, the majority were not. In fact, participants who were CACE graduates or faculty were specifically directed to leave those "hats" by the door during the DACUM process. Every effort was made to include a cross-section of adult education practitioners from our communities. These individuals worked with industry, with Aboriginal, immigrant, and rural populations, with government, and with post-secondary and community-based programming. A one-day, facilitated brainstorming session, similar to a DACUM process, was held to answer the question: What will graduates of the CACE program know and be able to do when they graduate? The Project Team Leader (and the first author of this paper) facilitated the process in which the participants identified what learners need to know and be able to do, as well as the criteria for assessing those skills.
The process consisted of several stages. As noted above, the first was task analysis, a day-long brainstorming process during which participants identified broad areas of learning and the necessary tasks that relate to them. In the second stage, the team leader took this raw material and crafted it into learning outcome statements and more complete statements of competency. The outcomes were then circulated to program committee members who reviewed them for accuracy in terms of content, level of expected learning to be demonstrated, and assessability, followed by revision.
Once all four sites had completed this process, the outcomes were reviewed and consolidated by the team leader and reviewed by consortium members. The end result differed slightly from a DACUM chart. Like DACUM, what the learner needs to know and be able to do was articulated, but the learning outcome development process also identified assessment criteria for the outcomes. In short, we not only asked "What should they know and be able to do?" but also "How would you know they know?" This process was beneficial for two reasons. First, thinking through the assessment forced participants to clearly articulate what was wanted and ensured that their expectations were reasonable. Second, clear articulation of outcome and criteria for assessment allowed practitioners to continuously self-assess.

Required Competencies
After all four partner institutions had defined their program outcomes, the outcomes were integrated into one set. To arrive at the final 13 outcomes, the team leader first reviewed the documents for outcomes that were the same. The second step was to identify those that were worded differently but had the same intent or meaning. The team leader's ability to do this was enhanced by the fact that she had facilitated the original sessions and done the word-smithing, reviews, and revisions. The final step was to collapse several broader categories into the 13 common ones now found in the final document. This document, entitled "Certificate in Adult and Continuing Education (CACE) Program Learning Outcomes," is found in Appendix 1.
As noted earlier, these 13 competencies are very similar to those reported by previous studies, which provides a measure of validation for the competencies identified in our process. The unique contribution of the present study, however, is the identification and articulation of a comprehensive set of performance elements that are associated with each of these competencies and the criteria needed to make judgments. Associated with the 13 competencies are 229 performance elements. A sample of the resulting instrument is presented in Appendix 2. For the full instrument, please go to http://www.extension.usask.ca/ExtensionDivision/credit/Certificate/CACEself-assessmentCJUCE.doc. Detailed instructions for one way to complete the self-assessment tool are provided in Appendix 2, but the tool may be used in a variety of ways for various purposes. The authors hope that the model will be taken and modified to suit the unique purposes of the individuals and organizations using it.
How the assessment tool is used will depend largely on the purpose for which it is being used. Practitioners using the document for their own self-assessment purposes can quickly review and check the performance statements to get an idea of the broad areas where they are strong or where they may need to enhance their skills. If the document is used for a performance review, employees may want to go through the document more carefully and make notes in the documentation section about how they could prove their knowledge and skills to their employer, while employers may want to "sign off" on an employee's proven skills and make notes in the areas that need improving. This document could then travel with the employee and be used for career laddering or to ensure skills are present when fulfilling multiple job roles. The document could also be used as a tool to articulate learning and provide verification for post-secondary credit or for external assessors (previous employers, other departments, volunteer supervisors) to evaluate the relevant knowledge and skills employees may bring from outside the workplace.

DISCUSSION AND CONCLUSION
In this paper, we have argued the need for an assessment instrument that would assist continuing educators to compare their current competencies against a comprehensive set of competencies required by continuing educators. In addition, we have suggested that for such an instrument to be useful, especially in the case of self-assessment, it must identify a carefully and fully articulated set of performance elements that comprise the competencies. Previous reports related to competencies required by continuing educators, and especially those that developed instruments for assessing these competencies, were reviewed.
The assessment instrument we present in this study was developed in consultation with a broadly based and representative group of adult educators. The instrument was developed by identifying competencies needed by participants in a program designed to train adult and continuing education practitioners, rather than by studying current continuing education practitioners. This may be a limitation of our study and the instrument we generated. Nonetheless, the competencies developed in this process correspond very closely with those identified by previous studies. This study's unique contribution is the identification of an extensive set of performance elements that comprise those competencies and a framework for providing evidence of learning. Moreover, our approach to assessment (and self-assessment) is based upon the portfolio-assessment and outcome-based evaluation literature. Thus, it is behaviourally based and focused upon the assessment of knowledge and skills. It requires individuals whose competencies are being assessed to identify and reflect on how they learned the knowledge and/or skills they believe they possess and to explicitly declare how they could document their conclusions. Accordingly, we believe that our instrument will prove to be very useful for continuing education practitioners, although we recognize there is no single common set of competencies that are uniformly required by all continuing educators. Rather, a range of contextual factors will contribute to defining a particular set of competencies for each practitioner. In addition, we recognize that competent performance also requires a supportive and adequately resourced workplace environment. 
Nonetheless, we believe that our instrument is sufficiently comprehensive to include the primary competencies required for a wide range of continuing education practitioners and will allow those practitioners to identify where their competencies are not sufficient for them to perform at the desired level of competence.
Although we have not undertaken a systematic analysis of validity and reliability measures for our instrument, it has been used very successfully in the context for which it was created. We are confident that it is a valid tool for assessing the competencies of continuing educators and that it will serve as a useful guide in planning professional development. It is with these purposes in mind that we have submitted this research. It has obvious face validity, and the competencies we have identified bear a robust relationship to those identified in other studies. In addition, the broadly consultative process by which the competencies and performance elements were established lends further support to its validity. Nonetheless, the reliability of the instrument does warrant further research.
We trust this assessment instrument will make a significant contribution to helping continuing education practitioners plan their professional development activities, thereby raising the level of professionalism of those in the field.

Certificate in Adult and Continuing Education (CACE) Program Learning Outcomes
Adult and Continuing Educators will be able to utilize effective methods and techniques to plan, conduct, and evaluate adult education programs. Practitioners will be able to promote lifelong learning through the integration of knowledge of adult learning theory, issues, concepts, and ideas into their professional practice.

THEORY
1. The practitioner will be able to describe theoretical principles of adult education and apply them in their practice.
A. Describe principles of adult learning.
B. Describe the Canadian historical context of adult education.
C. Examine major historical events, movements, programs, and institutions and their impact on current and future practice.

3. The practitioner will be able to incorporate principles of adult education in the planning and development of programs/courses that meet organizational and adult learner needs.
A. Define principles and practices for program/project management.
B. Integrate theories of adult education into program design.
C. Recognize trends that affect adult education.
D. Create a model for program planning or work with existing models.
E. Describe process and steps in community development.
F. Provide rationale for using principles of adult education in educational design.
G. Conduct a needs assessment to determine training/education needs with all stakeholders; identify stakeholders, target audience.
H. Collaborate with all program stakeholders in design and delivery of program.
I. Help learners to articulate their needs.
J. Link learning to strategic goals of an organization by demonstrating practical value of course/program.
K. Determine when it is appropriate to buy or build course curriculum to meet identified needs.
L. Develop program goals and objectives.
M. Determine scope of the program.
N. Develop a plan to reach defined outcomes.
O. Develop a budget.
P. Contribute to the development of a strategic process of learning in an organization.
Q. Negotiate politics: establish "buy-in"; develop and maintain project "champions."
R. Choose and use various strategies appropriate to contextual factors, such as organization, community, culture, and individuals, when planning programs.
S. Recognize business constraints.
T. Ensure learning provided is relevant to the learners in their own environment.
U. Correlate resource (time, money, staff) acquisition with program goals, objectives, and needs.

Q. Refer learners to appropriate resources.
R. Identify factors that impede performance.
S. Analyze enablers, inhibitors, and forces that contribute to performance gaps.
T. Apply and assess learning management systems.
U. Ensure learning outcomes are met.
V. Identify challenges and successes as follow-up.
W. Facilitate learners evaluating their own learning.
X. Develop intervention strategies (e.g., mentoring on the job).
Y. Confer with supervisors, colleagues, and other community resources if special assessment is required.
Z. Respect integrity of assessment tools and use them effectively.

INSTRUCTION
7. The practitioner will be able to deliver adult learning experiences that address learners' needs through the integration of adult learning philosophy and principles into practice.
A. Accommodate diverse learning styles, abilities, cultures, and experiences, including learners who have disabilities or other special needs.
B. Adapt educational experience to support diversity.

APPENDIX 2
Instructions for Use of the Self-assessment Tool

Assessing your learning
Each competency describes a complex task required of adult and continuing educators. Several Elements of Performance are listed under each competency statement to indicate the particular knowledge and skills you need to successfully perform the task described in the competency. To self-assess your knowledge and skills in these areas, begin by reading the competency statement and then each one of the performance elements. Next, reflect on whether or not you could successfully demonstrate each of these elements and to what extent. That is, check "Yes" if you feel you could successfully demonstrate the element; check "Partially" if you still need to acquire some knowledge and skills in this area; and check "Need to learn" if you have very little or no knowledge and skill that would be needed to demonstrate that element of performance.

How did you learn it?
If you checked "Yes" or "Partially," then fill out the second column, Learning Method. In this section, you need to think about and make brief notes on where and how you acquired the knowledge and skills needed to demonstrate the element of performance. (You could also use this section to describe how you will acquire the needed knowledge and skills to fill a knowledge gap. That is, answer the question "How will I learn it?")

How could you prove it?
In the third section, Documentation/Assessment Tool, you need to think about and make brief notes on how you might demonstrate or prove that you have the necessary knowledge and skills. There are as many different ways to prove your learning as there are elements, individuals, and contexts. For example, you might describe a process, provide something that you have produced yourself, or demonstrate the skill. (You could also use this section to describe the assessment tool that you will use if you have any gaps to fill.)

Example Competency 2: Program Administration/Management
The practitioner will be able to develop, align, and manage organizational direction, group goals, and individual objectives to achieve an organization's goals and objectives.

BIOGRAPHIES
Sherry Sullivan has worked as an adult educator for 14 years, with the last seven devoted to developing prior learning assessment (PLA) systems. This work included developing assessment systems and facilitating job, course, and program outcomes in community settings, colleges, universities, and industry. She has presented nationally and internationally on the subject of recognizing learning and has developed and delivered numerous courses and workshops on PLA and assessment. She has developed and delivered portfolio courses for Brandon University, the University of Saskatchewan, and the University of Manitoba. Sullivan's thesis research studied faculty members' experience of PLA development and implementation processes. Sullivan was the project team leader for the PLA Project in the Certificate in Adult and Continuing Education Program and is currently the director of recognition of prior learning at the University of Manitoba.