Received 09/14/12
Revised 10/29/12
Accepted 11/09/12
DOI: 10.1002/j.1556-6676.2014.00134.x
Assessment Use by Counselors in the
United States: Implications for Policy
and Practice
Christina Hamme Peterson, Gabriel I. Lomas,
Edward S. Neukrug, and Matthew W. Bonner
Although assessment use is a professional activity recognized by every major counseling organization, little is known
about which assessments are used in counseling. In this study, 926 respondents from a random national sample of
counselors reported their use of personality, projective, career, intelligence/cognitive, educational/achievement, clinical/behavioral, and environmental/interpersonal tests. Test rankings by frequency of use and comparisons by type of
counselor and type of test are reported. Implications for policy and practice are discussed.
Keywords: assessment instruments, counselors, testing, measurement
The appropriate administration and interpretation of assessment instruments in all realms of counseling have been highlighted in no fewer than 13 documents and standards (Association for Assessment and Research in Counseling, 2013).
In fact, assessment is one of eight common core curricular
areas mandated by the Council for Accreditation of Counseling and Related Educational Programs (CACREP; 2009);
is one of the main content areas assessed by the National
Counselor Exam (National Board for Certified Counselors
[NBCC], 2012b); and is highlighted in all counseling-related
codes of ethics, including those of the American Counseling
Association (ACA; 2005) and NBCC (2012a).
In their Standards for Qualifications of Test Users, ACA
(2003) identified that competence in testing is acquired
through education, training, and experience and that master's-level counselors with course work in assessment are qualified
to use objective measures. They further argued that, with
additional specialized training, counselors can administer
projective tests, intelligence tests, and clinical diagnostic
tests. Such tests are sometimes called Level C or advanced
(American Psychological Association [APA], 1954; Turner,
DeMers, Fox, & Reed, 2001). This argument is consistent
with the Standards for Educational and Psychological Testing (American Educational Research Association, APA, &
National Council on Measurement in Education, 1999), which
stipulates that qualifications for assessment use should stem
from experience, training, and credentials and should be in
compliance with the code of ethics of the individual’s professional organization. Accordingly, test publishers typically
require verification of a potential test user’s level of education,
training, and credentials. Such practices have largely been recognized by ACA as acceptable (Naugle, 2009) and relatively
consistent with guidelines set by APA (Turner et al., 2001).
Despite the aforementioned standards that suggest that
counselors should be able to give and interpret a wide range
of assessment instruments, there are significant roadblocks
to assessment use by counselors (Naugle, 2009; Watson &
Sheperis, 2010). Psychologists generally have considered only
their training as suitable for the administration of some types
of tests (Society for Personality Assessment, 2006; Turner et
al., 2001), and some state licensure boards have attempted to
define competency as synonymous with psychology licensure
(Association of Test Publishers, 2007; Watson & Sheperis,
2010). Multiple states, including Alaska, Nebraska, Tennessee, and California, do not allow counselors to administer
intelligence tests, whereas additional states (e.g., Alabama,
Alaska, Arkansas, California, Tennessee, and Texas) do not allow counselors to use projective tests (California Association
for Licensed Professional Clinical Counselors, n.d.; Licensed
Professional Counselor Act, 1999; Naugle, 2009). In fact, a
survey of U.S. states and Canadian provinces found that 67%
of states and provinces had restrictions on the administration of psychological testing by nonpsychologists (Dattilio,
Tresco, & Siegel, 2007).
Unfortunately, the establishment of competency and the
right to use tests has also been hampered by counselors and
counselor educators themselves. For instance, a random
sample of 641 counselors and counselor educators rated
CACREP’s common core curricular area of assessment as one
of the least beneficial of all eight CACREP core standards
(McGlothlin & Davis, 2004). In addition, counseling students
have expressed dismay and fear about assessment courses (Davis, Chang, & McGlothlin, 2005; Wood & D’Agostino, 2010),
whereas counselor educators have expressed a lack of desire to
teach such courses (Davis et al., 2005). Similarly, researchers
suggest that practicing counselors do not view assessment as
Christina Hamme Peterson, Department of Graduate Education, Leadership and Counseling, Rider University; Gabriel I. Lomas,
Department of Education and Educational Psychology, Western Connecticut State University; Edward S. Neukrug and Matthew W.
Bonner, Department of Counseling and Human Services, Old Dominion University. Correspondence concerning this article should
be addressed to Christina Hamme Peterson, Department of Graduate Education, Leadership and Counseling, Rider University,
2083 Lawrenceville Road, Lawrenceville, NJ 08648 (e-mail: cpeterson@rider.edu).
© 2014 by the American Counseling Association. All rights reserved.
Journal of Counseling & Development ■ January 2014 ■ Volume 92
a main focus of counseling and feel inadequate and poorly
trained in this area (Ekstrom, Elmore, Schafer, Trotter, &
Webster, 2004; Fischer & Chambers, 2003; Mellin, Hunt, &
Nichols, 2011; Villalba, Latus, & Hamilton, 2005). Despite
these expressions of ambivalence toward assessment, many
counselors reported involvement with assessment (Hood,
2001). Surveys of school counselors found assessment usage,
including interpretation and synthesis with other sources of
data in counseling, occurring as often as three times a week
(Blacher, Murray-Ward, & Uellendahl, 2005) and by as many
as 91% of respondents (Ekstrom et al., 2004).
In this era of data-driven reform, high assessment use by
counselors is not surprising. School counselors are increasingly called on to analyze standardized test scores to identify areas of need and respond with appropriate interventions to address these concerns (American School Counselor Association, 2012). School counselors are also often involved in
child study teams where the need to understand and interpret
educational and psychological tests is critical. In addition,
clinical mental health counselors are increasingly being asked
to provide evidence of positive treatment outcomes (Marotta
& Watts, 2007). Evidence provided by psychometrically sound instruments can demonstrate to funding agencies, insurance companies, and others the effectiveness of client treatment (Studer, Oberman, & Womack, 2006).
Of course, the use of assessment by counselors to increase
student and client self-awareness and for case conceptualization is equally important (Neukrug & Fawcett, 2010; Rudy
& Levinson, 2008). There is some evidence that clinical
hypotheses from client interviews may be subject to confirmation bias, given that counselors seek additional information
to confirm their working hypothesis but do not actively seek
alternative explanations (Owen, 2008; Strohmer & Shivy,
1994). Objective measures can be one mechanism of countering such bias.
Although research to understand the types of assessment
instruments used by psychologists in a variety of specialty
areas has been conducted (Archer, Buffington-Vollum, Stredny, & Handel, 2006; Demaray, Schaefer, & Delong, 2003;
Hogan, 2005; McCloskey & Athanasiou, 2000; Shapiro &
Heick, 2004; Watkins, Campbell, Nieberding, & Hallmark,
1995), there is little parallel research for counselors. Evidence
suggests that school counselors frequently use assessments
(Blacher et al., 2005; Ekstrom et al., 2004); however, studies
have not identified the specific tools they use. Furthermore,
although Juhnke, Vacc, Curtis, Coll, and Paredes (2003)
investigated assessment instruments used by addictions counselors and Hogan and Rengert (2008) looked at instruments
counselors use in research, no broad-based investigation of
counselor use of assessment instruments has occurred. Thus,
to inform counselors and counselor educators about counselor
assessment use, this study sought to determine (a) which tests
are used most frequently by counselors overall and within
counseling specialty areas, (b) which test categories (e.g.,
personality, projective, career) are the most heavily used, and
(c) how usage of the test categories and of tests overall differs
by type of counselor (school, clinical mental health, and all
other counselors combined).
Method
Survey Development
To build the survey, we developed a list of 174 commercial
standardized tests by systematically examining assessment
instruments identified in four textbooks frequently used by
counselor educators (i.e., Drummond & Jones, 2010; Erford,
2007; Hood & Johnson, 2007; Neukrug & Fawcett, 2010)
and three articles that surveyed psychologists’ and addictions
counselors’ use of assessment instruments (i.e., Hogan, 2005;
Hogan & Rengert, 2008; Juhnke et al., 2003). To reduce the
length of the survey and increase response rate (Edwards et
al., 2002), we organized the survey into seven test categories:
personality, projective, career, intelligence/cognitive, educational/achievement, clinical/behavioral, and environmental/
interpersonal. Excluded were tests of specific aptitudes (e.g.,
Mechanical Aptitude Test, Meier Art Test) because the list
of these was extensive and would have rendered the survey
too long.
Within each category, assessment instruments were sorted
according to the number of sources in which they appeared.
Instruments that were mentioned in four or more sources were
retained, and the remaining instruments were retained if any
author indicated that they were of high importance or if at least
two authors indicated that they were of moderate importance.
This resulted in a final list of 98 assessment instruments,
including 14 in personality, 10 in projective, 12 in career, 20
in intelligence/cognitive, 15 in educational/achievement, 22
in clinical/behavioral, and five in environmental/interpersonal
(see Table 1). For each assessment instrument, respondents
were asked to rate frequency of use on a scale ranging from
1 (never) to 5 (frequently), with use being defined as “in
any capacity, ranging from administering the test to simply
reviewing the results.” A demographic section preceded the
list of assessments and included questions about respondents’
professional background (including education, population
served, type of counseling practice, and number of years
practicing) and personal characteristics (including gender,
age, and race/ethnicity).
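The instrument-retention rule described above (kept if mentioned in four or more sources, rated of high importance by any author, or rated of moderate importance by at least two authors) can be sketched as follows. This is a minimal illustration with hypothetical instrument data and invented names, not the authors' actual screening procedure:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    source_count: int      # number of the 7 sources mentioning the instrument
    high_ratings: int      # authors rating it of high importance
    moderate_ratings: int  # authors rating it of moderate importance

def retain(c: Candidate) -> bool:
    # Keep if mentioned in 4+ sources, or any high-importance rating,
    # or at least two moderate-importance ratings.
    return (c.source_count >= 4
            or c.high_ratings >= 1
            or c.moderate_ratings >= 2)

# Hypothetical candidates to exercise each branch of the rule.
candidates = [
    Candidate("Test A", 5, 0, 0),   # retained: mentioned in 4+ sources
    Candidate("Test B", 2, 1, 0),   # retained: one high-importance rating
    Candidate("Test C", 1, 0, 1),   # dropped: fails all three criteria
]
final_list = [c.name for c in candidates if retain(c)]
print(final_list)  # ['Test A', 'Test B']
```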
Participants and Procedure
The survey was sent via e-mail to a sample of 5,000 national
certified counselors randomly selected by NBCC. The study
and its purpose were briefly described in the e-mail, and
respondents were asked to click a link to SurveyMonkey in
order to read the informed consent form, submit the informed
consent, and complete the survey. All procedures were approved by the appropriate institutional review board. Of the
initial sample, 268 responses were returned as nondeliverable and an additional 19 were returned by participants who indicated that they were no longer practicing, for a final sample of 4,713. Following Edwards et al.'s (2002) findings on increasing survey response rates, we sent three follow-up e-mails to the sample over a 6-week interval. A total of 926 usable responses were returned, for a response rate of 19.6%.

Table 1
All Tests With Pooled Mean Usage Frequencies, Percentage of Counselors Who Use the Test, and Rankings by Combined Sample and Different Types of Counselors

For each of the 98 assessment instruments, Table 1 reports the pooled mean usage frequency (M), the percentage of counselors who use the test, and the usage rank, for the combined sample (All) and separately for school counselors (SCs), clinical mental health counselors (CMHCs), and other counselors (OCs). Listed in order of their rank for the combined sample, with tied ranks repeated, the instruments are as follows:

1. Beck Depression Inventory (a)
2. Myers–Briggs Type Indicator (b)
3. Strong Interest Inventory (c)
4. ACT (d)
5. SAT/PSAT (d)
6. Self-Directed Search (c)
7. Wechsler Intelligence Scale for Children (e)
8. Conners' Rating Scales (a)
9. Beck Anxiety Inventory (a)
10. Substance Abuse Subtle Screening Inventory (a)
11. Wechsler Adult Intelligence Scale (e)
12. Woodcock–Johnson Tests of Cognitive Abilities (e)
13. O*NET System and Career Exploration Tools (c)
14. Wide Range Achievement Test (d)
15. Minnesota Multiphasic Personality Inventory (MMPI) (b)
15. Woodcock–Johnson Tests of Achievement (d)
17. House-Tree-Person Test (f)
18. Stanford–Binet Intelligence Scale (e)
19. Wechsler Individual Achievement Test (d)
19. Mini-Mental State Examination (a)
21. Human Figure Drawing (f)
22. Child Behavior Checklist (a)
23. Children's Depression Inventory (a)
24. Values Scale (c)
25. Symptom Checklist (a)
26. Stanford Achievement Test (d)
26. Attention Deficit Disorders Evaluation Scale (a)
28. Behavior Assessment System for Children (a)
29. Iowa Tests of Basic Skills/Iowa Test of Educational Development (d)
30. Beck Scale for Suicide Ideation (a)
31. Armed Services Vocational Aptitude Battery (c)
32. MMPI–Adolescent (b)
32. Wechsler Abbreviated Scale of Intelligence (e)
34. Trauma Symptom Checklist (a)
35. Draw-a-Man/Draw-a-Woman (f)
35. Bender Visual Motor Gestalt Test (a)
37. Career Occupational Preference System (c)
38. Sixteen Personality Factor Questionnaire (b)
39. Campbell Interest and Skill Survey (c)
40. Thematic Apperception Test (f)
41. Michigan Alcoholism Screening Test (a)
42. Kuder Career Search (c)
43. Wechsler Preschool and Primary Scale of Intelligence (e)
44. Vineland Adaptive Behavior Scales (e)
45. Kaufman Assessment Battery for Children (e)
45. Kaufman Brief Intelligence Test (e)
47. Kinetic Drawing System for Family and School (f)
47. Millon Clinical Multiaxial Inventory (a)
49. Peabody Picture Vocabulary Test (d)
50. Cognitive Abilities Test (e)
51. Sentence Completion Series (f)
52. Eating Disorder Inventory (a)
53. Personality Assessment Inventory (b)
54. Rotter Incomplete Sentences Blank (f)
55. Basic Achievement Skills Inventory (d)
55. Parenting Stress Index (g)
57. Rorschach Inkblot Test (f)
57. Otis–Lennon School Ability Test (e)
59. Differential Aptitude Tests (c)
60. Wechsler Memory Scale (e)
61. Marital Satisfaction Inventory (g)
62. Parent-Child Relationship Inventory (g)
63. Slosson Intelligence Test (e)
64. State–Trait Anxiety Inventory (a)
65. Test of Nonverbal Intelligence (e)
66. Millon Adolescent Personality Inventory (b)
67. Millon Index of Personality Styles (b)
67. Career Maturity Inventory (c)
67. Kaufman Test of Educational Achievement (d)
70. Children's Apperception Test (f)
71. Quality of Life Inventory (a)
72. Harrington–O'Shea Career Decision-Making System (c)
73. Family Environment Scale (g)
74. Coopersmith Self-Esteem Inventory (b)
75. Piers–Harris Children's Self-Concept Scale (b)
76. Family Assessment Measure (g)
77. Millon Adolescent Clinical Inventory (a)
78. Metropolitan Achievement Test (d)
79. Kaufman Adolescent and Adult Intelligence Test (e)
80. California Psychological Inventory (b)
81. NEO Personality Inventory (b)
81. Kindergarten Readiness Test (d)
83. Reynolds Adolescent Depression Scale (a)
84. Trail Making Test (e)
85. Halstead–Reitan Neuropsychological Test Battery (e)
86. Achenbach System of Empirically Based Assessment (a)
87. Eysenck Personality Questionnaire (b)
87. Raven's Progressive Matrices (e)
89. Tennessee Self-Concept Scale (b)
90. NEO Five Factor Inventory (b)
90. Jackson Vocational Interest Survey (c)
90. KeyMath (d)
93. Luria–Nebraska Neuropsychological Battery (e)
94. Forer Structured Sentence Completion Test (f)
94. Gesell Developmental Observation (d)
96. Devereux Scales of Mental Disorders (a)
97. Boston Process Approach (e)
98. Metropolitan Readiness Test (d)

Note. All = all counselors combined; SCs = school counselors; CMHCs = clinical mental health counselors; OCs = other counselors; O*NET = Occupational Information Network.
(a) Clinical/behavioral test. (b) Personality test. (c) Career test. (d) Educational/achievement test. (e) Intelligence/cognitive test. (f) Projective test. (g) Environmental/interpersonal test. Differences in overall mean usage frequency among the three types of counselors are not significant, F(2, 923) = 0.49, p > .05.
Demographics of respondents are presented in Table
2. Of the respondents, 68.1% were clinical mental health
counselors; 17.0% were school counselors; and 14.9% were
college counselors, rehabilitation counselors, career counselors, family/marriage counselors, counselor educators, or
other (nonspecified) counselors. Because of lower numbers
of respondents, the last six categories were combined into
a single counselor category called “other,” leaving three
categories of counselors (school, clinical mental health, and
other) for all subsequent analyses. As is consistent with the
field, women were heavily represented at 79.4%, and 82.6%
of the respondents had a master's degree as their highest degree obtained. Respondents' age ranged from 24 to 80 years, with an average of 47.4 years (SD = 13.07).

Table 2
Demographic Characteristics of Respondents

Variable                                        n     Valid %
Gender
  Men                                          190     20.6
  Women                                        732     79.4
Race/ethnicity(a)
  Hispanic                                      34      3.7
  African American/Black                        73      8.0
  Asian                                          6      0.7
  Native American                               10      1.1
  Caucasian/White                              795     86.6
Highest degree attained
  Master's                                     380     41.5
  Master's and additional graduate credits     376     41.1
  EdS/CAS                                       41      4.5
  Doctorate                                    118     12.9
Type of counselor
  School                                       157     17.0
  Clinical mental health                       631     68.1
  College                                       51      5.5
  Rehabilitation                                14      1.5
  Career                                        14      1.5
  Family/marriage                               11      1.2
  Counselor educator                             7      0.8
  Other (nonspecified)                          41      4.4
Type of population served(b)
  Preschool children                            47      5.1
  Elementary school children                   212     22.9
  Middle school children                       206     22.2
  Adolescents                                  389     42.0
  Adults                                       618     66.7
  Families                                     223     24.1

Note. N = 926. EdS = education specialist; CAS = certificate of advanced study.
(a) Percentages do not total 100 because of rounding. (b) Responses do not total 100 because respondents could select all that apply.

Results

Overall mean frequency of test use did not differ significantly among the three types of counselors, F(2, 923) = 0.49, p > .05, indicating that, overall, the three types of counselors were using tests at about the same frequency.
Pooled mean usage frequency by category of test for the
entire sample is provided in Table 3. Results of a repeated
measures ANOVA indicated that frequency of use differed
across the categories and that these differences were statistically significant with a small effect, F(4.07, 3761.25) = 45.85, p < .006, η2 = .05. Descriptive statistics in Table 3
demonstrate that, for the overall sample, usage of clinical/
behavioral tests was highest and usage of environmental/
interpersonal, educational/achievement, and intelligence/
cognitive tests was lowest.
Results of one-way ANOVAs comparing different types of counselors on mean frequency of use in each test category are also presented in Table 3. Results were statistically significant for all test categories except for projective and intelligence/cognitive tests, suggesting that usage of tests within most categories differs by type of counselor. Post hoc comparisons suggest that clinical mental health counselors use clinical/behavioral tests at higher rates than do school and other counselors, school counselors use career and educational/achievement tests at higher rates than do clinical mental health counselors, and school counselors use personality tests at lower rates than do clinical mental health and other counselors. Although these differences were statistically significant, the effect sizes were small to moderate (η2 = .02–.10). This suggests that use of certain test categories may differ by type of counselor but that all counselors use tests in all categories to a degree.

Table 3
Pooled Mean Frequency of Use of Each Test Category by Different Types of Counselors and for All Counselors Combined

                               School          Clinical Mental     Other
                               Counselors      Health Counselors   Counselors                                           All Counselors(a)
Test Category                  M      SE       M      SE           M      SE     Univariate F(b)  M Diff(c)    η2      M      SE
Personality                    1.23   0.39     1.40   0.56         1.45   0.49    8.95(d)         S < MH, O    .02     1.38   0.53
Projective                     1.30   0.54     1.48   0.75         1.38   0.66    4.83                                 1.43   0.70
Career                         1.50   0.60     1.28   0.51         1.82   0.78   52.46(d)         MH < S < O   .10     1.40   0.61
Intelligence/cognitive         1.46   0.57     1.31   0.57         1.35   0.62    4.41                                 1.34   0.57
Educational/achievement        1.70   0.65     1.23   0.49         1.42   0.59   49.49(d)         MH < O < S   .10     1.34   0.56
Clinical/behavioral            1.36   0.44     1.70   0.70         1.48   0.65   20.90(d)         S, O < MH    .04     1.61   0.67
Environmental/interpersonal    1.09   0.34     1.42   0.83         1.27   0.74   12.85(d)         S < MH, O    .03     1.34   0.76

Note. S = school counselors; MH = clinical mental health counselors; O = other counselors.
(a) Results of a repeated measures analysis of variance. Differences between categories for all counselors combined were statistically significant, F(4.07, 3761.25) = 45.85, p < .006, η2 = .05. (b) df between, df within = (2, 923). (c) Games–Howell post hoc comparisons (p < .05). (d) Significant at p < .006.
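As a rough illustration of the kind of group comparison reported above, a one-way ANOVA with an eta-squared effect size can be computed as follows. The usage scores here are synthetic (random draws with hypothetical group means), with group sizes only loosely mirroring the sample; this is a sketch of the general technique, not the authors' analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic per-respondent usage scores (1 = never ... 5 = frequently)
# for three counselor groups; the scores themselves are hypothetical.
school = rng.normal(1.50, 0.60, 157)
clinical = rng.normal(1.28, 0.51, 631)
other = rng.normal(1.82, 0.78, 138)

# One-way ANOVA across the three groups.
f_stat, p_value = stats.f_oneway(school, clinical, other)

# Eta-squared: between-group sum of squares over total sum of squares.
all_scores = np.concatenate([school, clinical, other])
grand_mean = all_scores.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2
                 for g in (school, clinical, other))
ss_total = ((all_scores - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total

print(f"F = {f_stat:.2f}, p = {p_value:.4g}, eta-squared = {eta_squared:.2f}")
```

With group means this far apart relative to their spread, the ANOVA is significant but the effect size remains modest, mirroring the pattern of small-to-moderate effects reported in Table 3.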
Discussion
Despite the political and legislative opposition described by
Dattilio et al. (2007) and Naugle (2009), results of this study
demonstrate that counselors are using assessments in their
practice, with some of the more frequently used tests being the Beck Depression Inventory, Myers–Briggs Type Indicator, Strong Interest Inventory, ACT, SAT/PSAT, Self-Directed Search, Wechsler Intelligence Scale for Children, Conners' Rating Scales, Beck Anxiety Inventory, and Substance Abuse Subtle Screening Inventory (see Table 1). In addition, counselors are using assessment instruments in patterns that are appropriate given
the nature of their practice. For example, school counselors
use career and educational/achievement assessments more frequently than do clinical mental health counselors, and clinical
mental health counselors use clinical/behavioral assessments
more frequently than do school and other counselors.
Although the use of projective tests by counselors is prohibited in some states and generally requires additional training if
counselors are to administer them ethically (Licensed Professional Counselor Act, 1999; Naugle, 2009), these instruments
were found to be used more frequently than intelligence/
cognitive and environmental/interpersonal tests. The sparse
use of environmental/interpersonal tools may be due to their
more systemic nature. The Marital Satisfaction Inventory, for
example, can be used only with couples and is not useful for
single individuals. However, such low usage may also be due
to a failure on the part of counselor educators to cover such
assessment instruments in their courses. In fact, although
all other types of assessments were covered in several of the
counseling assessment textbooks we reviewed, environmental/
interpersonal tests were covered in only one (Erford, 2007).
It is interesting that clinical mental health counselors identified a number of Level C or advanced instruments, such as
projective tests, as among their most commonly used tests. For
example, the Minnesota Multiphasic Personality Inventory
and the Wechsler Adult Intelligence Scale were ranked within
their top 15 assessments. Such instruments are identified by
test publishers as requiring course work and/or experience
that are generally beyond what is offered at the master’s level.
Clinical mental health counselors’ high rankings of these
tools suggest that despite the restrictions, counselors are using them. Given that use in our survey was defined as “in any
capacity, ranging from administering the test to simply reviewing the results,” it is possible that these counselors are merely
reviewing the results of tools administered by psychologists
or more advanced-level practitioners. Alternatively, clinical
mental health counselors may be actively seeking advanced
course work and supervision to meet administration qualifications. Whatever the case, these findings suggest that there
may be a need to prepare counselors for tools that in the past
many counselors had not used.
Although the highest ranked tools in this study were used
by the majority of respondents, average frequency of test use
across all test categories was low. In fact, counselors reported
average use as falling between rarely and never (see Table
3). In today’s climate of accountability, the use of objective
measures is a critical component of counseling work (Neukrug
& Fawcett, 2010; Rudy & Levinson, 2008). Many instruments
can be useful tools to identify client concerns, create treatment
plans, and quantify progress. However, these low frequency
ratings indicate that counselors are not heavy users of psychological tests. Lack of usage may impede counselors’ ability
to collect evidence of effective practice, which may make it
more difficult for them to advocate for funding to sustain or
enhance their services (Studer, Oberman, & Womack, 2006).
Possible explanations as to why counselors are not using
assessment instruments more frequently include legislative
restrictions and a lack of interest in assessment by counselors
and counselor educators (e.g., McGlothlin & Davis, 2004;
Naugle, 2009; Wood & D’Agostino, 2010). However, other
explanations may also include a lack of awareness about available assessments, test costs, or a lack of training and relatively
little guidance from CACREP regarding which tests should
be covered (Neukrug, Peterson, Bonner, & Lomas, 2013).
The behaviors involved in assessment administration may also feel fundamentally different from the interpersonal interaction and connection that may have drawn counselors to the field in the first place, resulting in a reluctance to use tools that may be seen as creating distance between counselor and client.
Although CACREP’s 2009 Standards suggest that students
should know “basic concepts of standardized and nonstandardized testing and other assessment techniques, including
norm-referenced and criterion-referenced assessment, environmental assessment, performance assessment, individual
and group test and inventory methods, psychological testing,
and behavioral observations” (p. 13), there is little guidance
for university faculty regarding which specific tests should
be taught and in how much depth. With many faculty having
little or no interest in assessment (Davis et al., 2005), this lack
of direction may leave faculty who do teach testing unclear
about which assessment instruments should be highlighted.
The present study offers a first step in providing counselor educators with knowledge regarding which instruments are commonly used by practitioners and may suggest instruments that students should learn. In addition, counselor education
programs may want to consider whether training in a wider
range of assessment instruments might benefit counselors,
especially in terms of how counselors can enhance their
work with clients and how assessment instruments can be
used for accountability purposes and to advocate for future
funding of services.
Additional research should also investigate other explanations for limited assessment use among counselors. If lack of
knowledge about the wide range of instruments available is
one cause, this study offers currently practicing counselors a
vehicle to identify tests that may be of help in their own work.
Counselors could explore the utility of a variety of the heavily used instruments identified herein and, when necessary,
consider additional training to increase their confidence and
skills in competently and ethically using instruments. Counselors who are qualified to use assessments might advocate
for assessment use through work with their state counseling
associations to remove current barriers faced by counselors.
Limitations of the study include the low response rates from
counselors in specialty areas other than clinical mental health
counseling and school counseling. Although our findings provide a general overview of tests used by counselors, it would
be helpful to gain more information from additional counselor
specialty areas (e.g., career counselors, addictions counselors,
couple and marriage counselors, rehabilitation counselors),
because assessment use may vary with the population served.
Second, because of concerns about the survey length, our
list of assessment instruments was not exhaustive and some
assessment categories, such as tests of mechanical aptitude,
were not represented. Additional research should explore use
of other tools and assessment categories not included herein.
Finally, findings from this study reveal which instruments
counselors are using in their practice. However, it is likely to
be of significant value to determine how counselors are using
these instruments. For example, the cohort of school counselor
respondents indicated that the Woodcock–Johnson Tests of Cognitive Abilities is the fifth most commonly used assessment in their
practice. However, how school counselors use the instrument is
not known. Future studies on the use of assessment instruments
by all counselors might address how they use instruments in
practice and may focus on specific test categories.
References
American Counseling Association. (2003). Standards for qualifications of test users. Alexandria, VA: Author.
American Counseling Association. (2005). ACA code of ethics.
Alexandria, VA: Author.
American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing.
Washington, DC: American Educational Research Association.
American Psychological Association. (1954). Technical recommendations for psychological tests and diagnostic techniques.
Washington, DC: Author.
American School Counselor Association. (2012). The ASCA National Model: A framework for school counseling programs (3rd
ed.). Alexandria, VA: Author.
Archer, R. P., Buffington-Vollum, J. K., Stredny, R. V., & Handel,
R. W. (2006). A survey of psychological test use patterns among
forensic psychologists. Journal of Personality Assessment, 87,
84–94. doi:10.1207/s15327752jpa8701_07
Association for Assessment and Research in Counseling. (2013).
Resources. Retrieved from http://aarc-counseling.org/resources
Association of Test Publishers. (2007, Summer). Restricted test list
repealed in Indiana. Test Publisher, 14, 1.
Blacher, J. H., Murray-Ward, M., & Uellendahl, G. E. (2005).
School counselors and student assessment. Professional School
Counseling, 8, 337–343.
California Association for Licensed Professional Clinical Counselors. (n.d.). FAQ: Scope of practice for LPCCs. Retrieved from
http://calpcc.org/scope-of-practice-for-lpccs
Collins, L. M., Schafer, J. L., & Kam, C. M. (2001). A comparison of
inclusive and restrictive strategies in modern missing-data procedures.
Psychological Methods, 6, 330–351. doi:10.1037/1082-989X.6.4.330
Council for Accreditation of Counseling and Related Educational
Programs. (2009). 2009 CACREP accreditation manual and
application. Alexandria, VA: Author.
Dattilio, F. M., Tresco, K. E., & Siegel, A. (2007). An empirical survey
on psychological testing and the use of the term psychological: Turf
battles or clinical necessity? Professional Psychology: Research
and Practice, 38, 682–689. doi:10.1037/0735-7028.38.6.682
Davis, K. M., Chang, C. Y., & McGlothlin, J. M. (2005). Teaching assessment and appraisal: Humanistic strategies and activities for counselor educators. Journal of Humanistic Counseling, Education and
Development, 44, 94–101. doi:10.1002/j.2164-490X.2005.tb00059.x
Demaray, M. K., Schaefer, K., & Delong, L. K. (2003). Attention-deficit/hyperactivity disorder (ADHD): A national survey of
training and current assessment practices in the schools. Psychology in the Schools, 40, 583–597. doi:10.1002/pits.10129
Drummond, R. J., & Jones, K. D. (2010). Assessment procedures
for counselors and helping professionals (7th ed.). Upper Saddle
River, NJ: Pearson Education.
Edwards, P., Roberts, I., Clarke, M., Diguiseppi, C., Pratap, S.,
Wentz, R., & Kwan, I. (2002). Increasing response rates to
postal questionnaires: Systematic review. BMJ: British Medical Journal, 324, 1183–1185. doi:10.1136/bmj.324.7347.1183
Ekstrom, R. B., Elmore, P. B., Schafer, W. D., Trotter, T. V., & Webster, B. (2004). A survey of assessment and evaluation activities
of school counselors. Professional School Counseling, 8, 24–30.
Erford, B. (2007). Assessment for counselors. Boston, MA: Houghton
Mifflin.
Fischer, J. M., & Chambers, E. (2003). Multicultural counseling
ethics and assessment competencies: Directions for counselor
education programs. Journal of Applied Rehabilitation Counseling,
34, 17–21.
Graham, J. W. (2009). Missing data analysis: Making it work in
the real world. Annual Review of Psychology, 60, 549–576.
doi:10.1146/annurev.psych.58.110405.085530
Hogan, T. P. (2005). Widely used psychological tests. In G. P.
Koocher, J. C. Norcross, & S. S. Hill (Eds.), Psychologists’
desk reference (2nd ed., pp. 101–104). New York, NY: Oxford
University Press.
Hogan, T. P., & Rengert, C. (2008). Test usage in published research
and the practice of counseling: A comparative review. Measurement and Evaluation in Counseling and Development, 41, 51–56.
Hood, A. B. (2001). Revitalizing the assessment course in the counseling curriculum. Retrieved from ERIC database. (ED457429)
Hood, A. B., & Johnson, R. W. (2007). Assessment in counseling:
A guide to the use of psychological assessment procedures.
Alexandria, VA: American Counseling Association.
Juhnke, G. A., Vacc, N. A., Curtis, R. C., Coll, K. M., & Paredes,
D. M. (2003). Assessment instruments used by addictions counselors. Journal of Addictions & Offender Counseling, 23, 66–72.
doi:10.1002/j.2161-1874.2003.tb00171.x
Licensed Professional Counselor Act, Texas Stat. §§ 503-001-003
(1999).
Marotta, S. A., & Watts, R. E. (2007). An introduction to the best
practices section in the Journal of Counseling & Development. Journal of Counseling & Development, 85, 491–503.
doi:10.1002/j.1556-6678.2007.tb00617.x
McCloskey, D., & Athanasiou, M. S. (2000). Assessment and
intervention practices with second-language learners among
school psychologists. Psychology in the Schools, 37, 209–225.
doi:10.1002/(SICI)1520-6807(200005)37:33.3.CO;2-R
McGlothlin, J. M., & Davis, T. E. (2004). Perceived benefit of CACREP
(2001) core curriculum standards. Counselor Education and Supervision, 43, 274–285. doi:10.1002/j.1556-6978.2004.tb01852.x
Mellin, E. A., Hunt, B., & Nichols, L. M. (2011). Counselor professional identity: Findings and implications for counseling and
interprofessional collaboration. Journal of Counseling & Development, 89, 140–147. doi:10.1002/j.1556-6678.2011.tb00071.x
National Board for Certified Counselors. (2012a). Code of ethics. Retrieved from http://www.nbcc.org/assets/ethics/nbcccodeofethics.pdf
National Board for Certified Counselors. (2012b). NBCC examinations. Retrieved from http://www.nbcc.org/Exams
Naugle, K. A. (2009). Counseling and testing: What counselors need
to know about state laws on assessment and testing. Measurement
and Evaluation in Counseling and Development, 42, 31–45.
doi:10.1177/0748175609333561
Neukrug, E. S., & Fawcett, R. C. (2010). Essentials of testing and
assessment: A practical guide for counselors, social workers,
and psychologists. Belmont, CA: Brooks/Cole.
Neukrug, E., Peterson, C. H., Bonner, M., & Lomas, G. I. (2013). A
national survey of assessment instruments taught by counselor
educators. Counselor Education and Supervision, 52, 207–221.
Owen, J. (2008). The nature of confirmatory strategies in the initial assessment process. Journal of Mental Health Counseling, 30, 362–374.
Rubin, D. B. (1976). Inference and missing data. Biometrika, 63,
581–592. doi:10.1093/biomet/63.3.581
Rudy, H. L., & Levinson, E. M. (2008). Best practices in the multidisciplinary assessment for emotional disturbances: A primer for
counselors. Journal of Counseling & Development, 86, 494–504.
doi:10.1002/j.1556-6678.2008.tb00537.x
Schafer, J. L. (1999). Multiple imputation: A primer. Statistical Methods in Medical Research, 8, 3–15.
doi:10.1191/096228099671525676
Schafer, J. L., & Graham, J. W. (2002). Missing data: Our view
of the state of the art. Psychological Methods, 7, 147–177.
doi:10.1037/1082-989X.7.2.147
Schafer, J. L., & Olsen, M. K. (1998). Multiple imputation for
multivariate missing-data problems: A data analyst’s perspective.
Multivariate Behavioral Research, 33, 545–571. doi:10.1207/
s15327906mbr3304_5
Shapiro, E. S., & Heick, P. F. (2004). School psychologist assessment practices in the evaluation of students referred for social/
behavioral/emotional problems. Psychology in the Schools, 41,
551–561. doi:10.1002/pits.10176
Society for Personality Assessment. (2006). Standards for education
and training in psychological assessment: Position of the Society
for Personality Assessment. Journal of Personality Assessment,
87, 355–357. doi:10.1207/s15327752jpa8703_17
Sterner, W. R. (2011). What is missing in counseling research?
Reporting missing data. Journal of Counseling & Development,
89, 56–62. doi:10.1002/j.1556-6678.2011.tb00060.x
Strohmer, D. C., & Shivy, V. A. (1994). Bias in counselor hypothesis testing: Testing the robustness of counselor confirmatory bias. Journal of Counseling & Development, 73, 191–197.
doi:10.1002/j.1556-6676.1994.tb01735.x
Studer, J. R., Oberman, A. H., & Womack, R. H. (2006). Producing
evidence to show counseling effectiveness in schools. Professional School Counseling, 9, 385–391.
Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics. Boston, MA: Pearson.
Turner, S. M., DeMers, S. T., Fox, H. R., & Reed, G. M. (2001).
APA’s guidelines for test user qualifications: An executive summary. American Psychologist, 56, 1099–1113. doi:10.1037/0003-066X.56.12.1099
Villalba, J. A., Latus, M., & Hamilton, S. (2005). School counselors’
knowledge of functional behavioral assessments. Behavioral
Disorders, 30, 449–455.
Watkins, C. E., Campbell, V. L., Nieberding, R., & Hallmark, R.
(1995). Contemporary practice of psychological assessment by
clinical psychologists. Professional Psychology: Research and
Practice, 26, 54–60. doi:10.1037/0735-7028.26.1.54
Watson, J. C., & Sheperis, C. J. (2010). Counselors and the right
to test: Working toward professional parity. Alexandria, VA:
American Counseling Association.
Wood, C., & D’Agostino, J. V. (2010). Assessment in counseling:
A tool for social justice work. In M. J. Ratts, R. L. Toporek, &
J. A. Lewis (Eds.), ACA Advocacy Competencies: A social justice framework for counselors (pp. 151–159). Alexandria, VA:
American Counseling Association.