In the late 1990s, The Education Trust, Inc. introduced the National Transforming School Counseling Initiative (TSCI; House & Martin, 1998). Their work paved the way for the future of school counseling to include leadership, advocacy, and systemic change through the use of data. In 2003, the American School Counselor Association (ASCA) incorporated many of these Education Trust initiatives to develop the ASCA National Model.
The ASCA National Model was created as a tool for building comprehensive school counseling programs that focused on foundation, delivery, management, and accountability (ASCA, 2003). In it, ASCA recommended school counselors engage in designing standards-based programs focusing on increasing student expectations while supporting students in the academic, career, and personal/social domains. The framework was designed to assist school counselors and school counseling program teams in utilizing various data and accountability tools as they design, coordinate, implement, manage, and evaluate the school counseling program’s efficiency and effectiveness with regard to student success (ASCA, 2003). ASCA indicated that the primary role of the school counselor is to enhance and promote student achievement and identified the school counselor as a leader, an advocate, and a catalyst for systemic change. The framework was intended to unify the profession, providing a blueprint for program development along with the flexibility to fit every school counseling program, and was designed to be implemented in conjunction with the district’s and school’s mission and goals.
Since the ASCA National Model was released in 2003, hundreds of research studies and dissertations specifically related to its implementation have been conducted. These include studies about school counselor job satisfaction (Pyne, 2011), models of data-based decision making (Poynton & Carey, 2006; Stone & Dahir, 2007), school counselor accountability practices (Topdemir, 2010), school counselors’ readiness to deliver school counseling programs (Dahir, Burnham, & Stone, 2009), obstacles and successes in implementation (Studer, Diambra, Breckner, & Heidel, 2011), the current state of school counseling models (Martin, Carey, & DeCoster, 2009), and educating future principals (Bringman, Mueller, & Lee, 2010; Light, 2005). These studies reflect the amount of attention the model has garnered over the years, and the potential the model holds to influence the direction of the school counseling profession.
In addition, the introduction of the ASCA National Model was followed by the creation of ASCA’s Recognized ASCA Model Program (RAMP) designation. RAMP components reflect the implementation of a comprehensive school counseling program based on the ASCA National Model. Since 2003, more than 500 schools throughout the country have been awarded this designation (ASCA, 2015a). Although there is little research to date regarding the impact on students’ achievement in RAMP schools, ASCA National Model programs at the elementary level have been shown to significantly affect students’ attendance rates and reading achievement (Ward, 2009). Wilkerson, Pérusse, and Hughes (2013) found that schoolwide proficiency rates in English/Language Arts and Math were significantly higher in RAMP-designated schools compared with those that were not. These studies suggest that a fully implemented ASCA National Model program may be viewed as one factor of the entire school system that positively affects student success.
Zagelbaum, Kruczek, Alexander, and Crethar (2014) analyzed the content of 413 articles published in the
In 2008, Hatch and Chen-Hayes published an article about school counselors’ beliefs with regard to the ASCA National Model School Counseling Program Components using the School Counseling Program Component Scale (SCPCS). In that article, the authors presented results of a survey administered in 2002 to more than 1,200 ASCA members, to establish the psychometric properties of the SCPCS and collect national baseline data on school counselor beliefs about certain program components in the ASCA National Model prior to its release in 2003.
The results of the first SCPCS survey administered in 2002 revealed that when the ASCA National Model was first released, participants reported that program component activities involving data and accountability were less important than other components, such as mission, goals and competencies, and administrator support (Hatch & Chen-Hayes, 2008). Respondents also ranked the use of data for the purposes of identifying achievement gaps and monitoring student progress as the least important relative to other activities (Hatch & Chen-Hayes, 2008). The authors suggested the low rankings for accountability measures may result from a lack of training in how to collect, measure, analyze, and interpret data. Hatch and Chen-Hayes (2008) recommended that successful implementation of the ASCA National Model would begin when school counselors “understand the importance of developing data skills and then using data in both program management and accountability” (p. 40).
Dahir et al. (2009) identified the “gaps in the school counselors’ ability to embrace and implement the new vision of comprehensive school counseling during the initial stages of implementation” (p. 182). Their research focused on school counselors’ “readiness to deliver comprehensive programs by assessing their attitudes, beliefs, and priorities for key program elements affirmed in the ASCA National Model” (p. 182). They discussed the impact the ASCA National Model has had on the school counseling profession, citing studies that reveal how students benefit from participating in comprehensive school counseling programs, and noted the lack of training opportunities on how to implement a comprehensive program. Dahir and Stone (2009) also expressed concern that, despite the development of the ASCA National Model and increased political attention to eliminating the achievement gap, many school counselors continue to adhere to ineffective methods of accountability.
Hatch and Chen-Hayes (2008) recommended future research be conducted to document whether professional beliefs and attitudes regarding the ASCA National Model might change over time. This study was designed to measure and document any shifts in school counselor beliefs about ASCA National Model School Counseling Program Components between 2002 and 2009, 6 years after the release of the ASCA National Model in 2003.
Research Questions
The primary focus of this study was to examine how the introduction of the ASCA National Model may have influenced school counselors’ beliefs. This was accomplished by comparing data collected in 2009 with data collected in 2002 as reported by Hatch and Chen-Hayes (2008). Specifically, we sought to answer the following research questions:
Method
Information about the participants and procedures in the 2002 study can be found by consulting the original manuscript (Hatch & Chen-Hayes, 2008). The participants and procedures below describe the current study, with data collected in 2009.
Participants
The sample included 12,953 ASCA members who indicated they were school counselors or administrators working in K-12 schools and who had opted to make their email address publicly available to other ASCA members in the membership database. Of the 1,021 respondents who started the online survey by answering at least one question, 617 provided usable data, yielding an overall response rate of 4.8%.
Of the 617 participants who provided usable surveys, 27% (
Procedures
An invitation to participate in the study was sent via email to the 12,953 ASCA members. The email included information about how to participate, described informed consent, and provided a link to the online survey. A reminder email was sent 2 weeks later. The initial email invitation yielded 285 usable surveys, whereas the remaining 332 participants completed the survey after the reminder email was sent. The online survey was a multipage survey, which contained a page with informed consent, a page with a modified version of the SCPCS, and a page with demographic questions.
Instrument
The SCPCS was modified slightly for use in the 2009 study, based on feedback provided by Professional School Counseling reviewers (Hatch & Chen-Hayes, 2008). The original five-point scale, which ranged from 1 to 5 with named anchors at 1 (
Data Analyses
To assess the possible impact of changing the scale on the structure, reliability, and validity of the SCPCS, several analyses were performed in a manner identical to those reported by Hatch and Chen-Hayes (2008). Specifically, replication analysis (Osborne & Fitzpatrick, 2012) of the principal components analysis (PCA) and internal consistency estimates were calculated in a manner identical to the initial SCPCS study (Hatch & Chen-Hayes, 2008). To facilitate analysis of changes in the perceived importance of the items over time, individual items were rank-ordered based on observed mean scores. The use of rank-ordered lists of the 2002 and 2009 data facilitates comparison between the two administrations in spite of the change in the scale, and allows conclusions to be drawn about the perceived importance of each item relative to all other SCPCS items.
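The rank-order comparison described above can be sketched in code. The item labels and mean ratings below are hypothetical placeholders, not the study's actual SCPCS data; the sketch only illustrates how items can be ranked by observed means within each administration and compared across years despite the change in scale.

```python
# Sketch of rank-order comparison between two survey administrations.
# Item names and means are illustrative, not the study's data.

def rank_by_mean(item_means):
    """Return items rank-ordered from highest to lowest mean
    (rank 1 = perceived most important)."""
    ordered = sorted(item_means, key=item_means.get, reverse=True)
    return {item: rank for rank, item in enumerate(ordered, start=1)}

def rank_shifts(means_2002, means_2009):
    """Positive shift = item moved toward 'very important' in 2009."""
    r02, r09 = rank_by_mean(means_2002), rank_by_mean(means_2009)
    return {item: r02[item] - r09[item] for item in means_2002}

# Hypothetical three-item example
means_2002 = {"use_data": 3.1, "mission": 4.5, "admin_support": 4.0}
means_2009 = {"use_data": 4.6, "mission": 4.4, "admin_support": 4.1}
shifts = rank_shifts(means_2002, means_2009)
```

Because only within-year ranks are compared, the approach sidesteps the change in response scale between administrations.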
Results
To compare the component structure derived from the 2002 SCPCS data with the 2009 SCPCS data and the relationship of subscales to the instrument as a whole, a replication PCA was performed. A forced four-component PCA with a Promax rotation and Kaiser normalization revealed that 18 of the 19 SCPCS items loaded on the same factors in 2009 as in 2002. The exception was the item not retained in the PCA of the 2002 data (Item 19), which loaded on the Administrator Support component. Squared differences in the factor loadings ranged from <.001 to .031. The similarity of the factor structure and small differences in item factor loadings indicate that the original PCA was replicated with our sample (Osborne & Fitzpatrick, 2012). Factor loadings for the 2009 data and squared differences of the factor loadings from 2002 are provided in Table 1. The observed component saturation and sample size exceed the recommendations of Guadagnoli and Velicer (1988) and indicate that the component patterns in the sample are stable with respect to the parameters of the larger population.
Table 1. Factor Loadings for the SCPCS.
Note. Squared differences are between the 2002 and 2009 factor loadings. Factors: 1 = Use of Data for Program Planning; 2 = Use of Data for Accountability; 3 = Administrator Support; 4 = Mission, Goals, and Competencies. Item 19 was not retained in the analysis of the 2002 data.
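The replication check reported above rests on element-wise squared differences between the two loading matrices. A minimal sketch, using small hypothetical loading matrices rather than the published values:

```python
import numpy as np

# Sketch of a replication check on factor loadings: compute the
# element-wise squared difference between two loading matrices
# (rows = items, columns = factors). Loadings are illustrative,
# not the published SCPCS values.

def squared_loading_differences(loadings_a, loadings_b):
    """Squared differences of corresponding loadings; small values
    across the matrix suggest the factor structure replicated."""
    a, b = np.asarray(loadings_a), np.asarray(loadings_b)
    return (a - b) ** 2

# Two hypothetical 3-item x 2-factor loading matrices
l_2002 = [[0.80, 0.10], [0.75, 0.20], [0.15, 0.85]]
l_2009 = [[0.82, 0.12], [0.70, 0.18], [0.17, 0.88]]
sq = squared_loading_differences(l_2002, l_2009)

# A small maximum squared difference (threshold here is arbitrary)
# is consistent with replication of the component structure
replicated = bool(sq.max() < 0.04)
```

The 0.04 threshold is an assumption for illustration; in the study, observed squared differences ranged from <.001 to .031.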
The eigenvalues of three of the four factors in the 2009 data were greater than 1.00, and the amount of variance each factor explained differed from the 2002 data. The eigenvalues for each factor and the amount of variance explained were 9.94 (Use of Data for Accountability, 52.60%), 1.35 (Use of Data for Program Planning, 7.10%), 1.14 (Administrator Support, 5.99%), and 0.89 (Mission, Goals, and Competencies, 4.71%). The majority of the variance in the 2002 data was explained by the Use of Data for Program Planning factor (43.50%), whereas the majority of the variance in the 2009 data was explained by the Use of Data for Accountability factor (52.60%).
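The relationship between eigenvalues and percent of variance explained follows from the fact that, in a PCA of k standardized items, the total variance equals k. A sketch with hypothetical eigenvalues (not the study's values):

```python
# Sketch: in a PCA of k standardized items, each component's percent
# of variance explained is its eigenvalue divided by k (total variance).
# Eigenvalues below are illustrative, not the study's values.

def percent_variance(eigenvalues, n_items):
    """Convert eigenvalues to percent of total variance explained."""
    return [100.0 * ev / n_items for ev in eigenvalues]

# Hypothetical: four components extracted from a 20-item scale
pcts = percent_variance([10.0, 1.5, 1.2, 0.9], n_items=20)
```

Note that after an oblique rotation such as Promax, components are correlated and reported variance percentages may not correspond exactly to this unrotated arithmetic.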
Internal consistency estimates were calculated for each of the four subscales using Cronbach’s (1951) alpha, and are presented with the internal consistency estimates obtained from the 2002 data in parentheses: Use of Data for Program Planning = .87 (.82), Use of Data for Accountability = .91 (.80), Administrator Support = .85 (.78), and Mission, Goals, and Competencies = .84 (.86). All subscales evidenced acceptable reliability characteristics using the commonly accepted .7 criterion provided by Nunnally (1978).
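Cronbach's (1951) alpha, used above for the subscale reliability estimates, can be computed directly from raw item responses. The response matrix below is hypothetical, not the study's data:

```python
# Sketch of Cronbach's alpha for one subscale, computed from raw item
# responses (rows = respondents, columns = items). Data are hypothetical.

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)),
    using sample (n-1) variances throughout."""
    n = len(scores)          # number of respondents
    k = len(scores[0])       # number of items
    item_vars = []
    for j in range(k):
        col = [row[j] for row in scores]
        mean = sum(col) / n
        item_vars.append(sum((x - mean) ** 2 for x in col) / (n - 1))
    totals = [sum(row) for row in scores]
    tmean = sum(totals) / n
    total_var = sum((t - tmean) ** 2 for t in totals) / (n - 1)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 4-respondent x 3-item subscale
responses = [[4, 4, 3], [3, 3, 3], [2, 2, 1], [4, 3, 4]]
alpha = cronbach_alpha(responses)
```

Values at or above the .7 criterion (Nunnally, 1978) are conventionally read as acceptable internal consistency.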
To assess the perceived importance of each individual item in a comprehensive school counseling program, descriptive statistics (mean and standard deviation) were calculated. The SCPCS item means and standard deviations are presented in rank order in Table 2, along with the rank of the items based on the 2002 data. The three items revealing the greatest rank order shifts in a positive direction (toward very important) were all items about the use of data. The item
Table 2. SCPCS Items Rank Ordered in Importance With Mean Ratings, Standard Deviations, and the Rank From the 2002 Data.
Subscale means were computed and subjected to a repeated measures ANOVA, which was statistically significant,
Although Hatch and Chen-Hayes (2008) noted statistically significant differences in subscale scores between groups based on level of practice (e.g., elementary, middle), the effect sizes were very small; all observed η2 values were less than .01 and do not hold practical significance (Sink & Stroh, 2006). Therefore, reporting of the means and standard deviations by level is not provided for this study.
Discussion
The first purpose of this study was to reassess the factor structure, reliability, and validity of the SCPCS (Hatch & Chen-Hayes, 2008) instrument. Overall, despite the differences between the two studies (the scale change and online administration), our analyses revealed four distinct and internally consistent factors evidencing construct validity. The factor structure of the SCPCS remained intact in the second study. Although the amount of variance each factor explained differed from the 2002 data, differences in factor loadings were small, and internal consistency estimates exceeded accepted thresholds for the scale and each of the four subscales. Thus, the four subscales were found to be consistent and reliable.
Of particular interest is the notable shift from the Use of Data for Program Planning factor accounting for the majority of the variance in 2002 to the Use of Data for Accountability factor accounting for the majority in 2009. Administrator Support and Mission, Goals, and Competencies loaded in the same order (third and fourth, respectively) in both years. Consistent with the 2002 findings, the Use of Data for Program Planning factor “included five items related to using data to target interventions and identify program foci” (Hatch & Chen-Hayes, 2008, p. 38). The Use of Data for Accountability factor included items relating to monitoring program implementation and measuring and reporting outcomes. One explanation for these findings may be that the introduction of the ASCA National Model, with its focus on accountability, contributed to a shift in school counselor beliefs from prioritizing the act of reviewing data as a means to design the program to prioritizing the importance of being held accountable for measuring the impact of program activities and communicating results to school counseling program stakeholders. This shift aligns with persistent requests for school counselors to do more than use data to design and prepare program activities (Dahir & Stone, 2009; Gysbers, 2004; Sink, 2009). Progressive and evolutionary behaviors for school counselors have been encouraged through calls for purposeful program improvement models designed to improve efficiency and effectiveness and to promote school counselors’ value as worth the cost or resources invested (Dahir & Stone, 2003; Dimmitt, Carey, & Hatch, 2007; Hatch, 2008). Future research is needed utilizing a confirmatory factor analysis approach to further assess the SCPCS factor structure, and to assess whether wording, order of items on the survey, or other factors are influencing the factor structure.
Rank Order Findings
The second purpose of this study was to compare the baseline data, in rank order form, on school counselor beliefs about specific components of the ASCA National Model collected before its release (in 2002; Hatch & Chen-Hayes, 2008) with data collected 6 years after its release (in 2009), to determine the extent to which school counselor beliefs have shifted since the introduction of the ASCA National Model. The discussion begins with top rank order findings, followed by the largest positive rank shifts, the largest negative rank shifts, and finally an item that did not shift.
The top ranking item on both the 2002 and 2009 surveys was developing goals for the program. This finding aligns with calls in the last few decades for business, education, and the school counseling profession to prioritize goal setting (Campbell & Dahir, 1997; Dahir, Sheldon, & Valiga, 1998; Doran, 1981; Haycock, 2001; Marzano, 2010; Meyer, 2003). Setting goals within the school counseling program is a required component of the RAMP (ASCA, 2015b), and SMART goals are included in ASCA’s
Develop goals for the counseling program
Write a mission statement or philosophy
Utilize schoolwide and student data to design new counseling activities
Identify specific student competencies to which the school counseling program, curriculum, or activities contribute or align
Use data to demonstrate the impact of the school counseling program on student success in school
Use data to measure the outcome results of the school counseling program
Greatest positive rank order shifts
The findings in this study suggest school counselors have begun to prioritize the value, importance, and necessity of using data in their school counseling programs. All three items with the greatest rank order shifts in the positive direction (toward very important) were items focusing on the use of data. Two items,
Encouraging counselor educators to teach the use of data to identify and eliminate achievement gaps and to measure the impact of interventions has been a focus of the TSCI within The Education Trust since 1996 (House & Martin, 1998). Skills in using data, teamwork, collaboration, technology, advocacy, and leadership were essential elements of the Education Trust’s recommendations for preparing school counselors (House & Martin, 1998). The ASCA National Model incorporated these same essential tenets in its text, its themes, and the Closing the Gap Action Plan (ASCA, 2003, 2005, 2012a).
Current and former national reform efforts by The College Board’s National Office for School Counselor Advocacy (NOSCA), TSCI, and the National Association for College Admission Counseling (NACAC) promote the urgency and necessity of school counselors using data to ensure all students are college and career ready upon graduating from high school (Hatch, 2012; Hatch & Bardwell, 2012; Hines & Lemons, 2011). The newest edition of the ASCA National Model contains its strongest language to date regarding the use of data, which is mentioned 124 times in the 161-page document (ASCA, 2012a). The newly revised model incorporates language from the most recent revision of the ethical guidelines (ASCA, 2010) stressing the responsibility of school counselors to use equity-based data to identify, address, and resolve attainment, achievement, and opportunity gaps. The term “gap[s]” is mentioned 60 times. Finally, the ASCA Model’s focus on results (124 mentions) and accountability (48 mentions) is irrefutable. School counselors who responded to the current survey are members of ASCA who have received a firm directive from the professional association to answer the following question:
Greatest negative rank order shifts
The two largest rank order shifts in the negative direction (toward not important) were for items related to non-counseling activities. In the findings from the survey administered in 2002,
In the 2002 survey, the item
Minimal rank order shift
One item that experienced only a minimal shift from the previous survey,
Finally, little research exists on the demographics of ASCA members. Therefore, it warrants discussion that 83% of the respondents in both surveys identified as female. The sample in 2009 was slightly more diverse (86% White compared with 92% in 2002). The largest increase in respondents identifying as non-White was the percentage of Hispanic/Latino respondents, which doubled from 2% in 2002 to 4% in 2009. African American respondents increased minimally from 3% to 4%. In 2012, the College Board released the results of a national survey containing a “Profile of America’s School Counselors” in which the respondents identified themselves as 78% female, 10% African American, and 15% Hispanic or Latino (Bruce & Bridgeland, 2012, p. 79). The College Board study surveyed only secondary school counselors, whereas our study included counselors from all grade levels, which may be one explanation for the observed race/ethnicity and gender differences between the samples. Given this discrepancy, future research is needed to compare school counselors’ race/ethnicity and gender with membership in professional associations, as associations frequently provide information through email, newsletters, and conferences that is likely to influence school counselor beliefs.
Implications and Recommendations
Rank ordered shifts found in this study and recent research suggest school counselors are increasingly willing to collect data to create comprehensive school counseling programs (Wilkerson et al., 2013). Professional school counselors are encouraged to participate in collegial discussions at their school sites and within their professional community on the importance and value of analyzing and presenting results data to strengthen their school counseling program. Counselor educators are encouraged to revise their pre-service training to ensure graduates are prepared to use data in schools and to continue researching the impact of implementing data-driven school counseling programs, which continue to yield positive results on student achievement (Wilkerson et al., 2013).
Multiple publications are available to teach school counselors how to collect and analyze data, create measurable action plans, and analyze and utilize results for program improvement (e.g., Hatch, 2014; Young & Kaffenberger, 2013). All of this work is central to pre-service training programs and the professional development of today’s school counselor. Sharing program results with school administrators, school district officials, and other school stakeholders is the essential next step to garnering the political and organizational support necessary for sustaining and promoting school counseling programs (Hatch, 2008; Sink, 2009).
Limitations and Conclusion
Although a focus of the current study was on the comparison of findings from data obtained in 2009 with data obtained in 2002, the data are not longitudinal, as the samples were not related to each other. Similar limitations to those reported by Hatch and Chen-Hayes (2008) regarding the data collected in 2002 also apply to the data collected in 2009. Namely, the SCPCS assesses self-reported beliefs, not actual behaviors, and the sample is likely representative only of ASCA members, not school counselors in general.
The method and SCPCS instrument used to collect data in 2009 differed from those used in 2002. Specifically, email was used to recruit participants and facilitate participation in 2009, whereas recruitment and participation in 2002 were completed using postal mail and paper surveys. The online tools used for recruitment and participation yielded a relatively low response rate and did not account for invalid or undeliverable email addresses. If failed email deliveries had been effectively documented, the reported response rate would likely have been slightly higher and, more important, more accurate. A problem unique to studies using email-based recruitment is that there is currently no effective way to determine how many invitations are actually received by potential respondents. For example, because the emailed message came from an unfamiliar source and contained a hyperlink to the online survey, it may have been flagged as junk email by the school district’s server or the individual’s email service provider.
The generalizability of our findings is inherently affected by the limitations noted above. Although our sample was similar to the sample in the original study, it consisted only of ASCA members. Our response rate was quite low; although we believe the reported rate underestimates the actual response rate among those who received our invitation to participate, self-selection bias still likely exists and influenced our findings. Finally, our sample lacked gender and racial diversity. Although it evidenced more racial diversity than the 2002 sample, we are unsure how our sample compares with the demographic makeup of ASCA members and the larger school counseling community.
The results of this study provide significant and valuable feedback on the shifts in school counselors’ beliefs since the 2003 introduction of the ASCA National Model. The Third Edition of the ASCA (2012a) National Model contains additional important changes and a stronger focus on the use of data. Further research is needed to measure the impact of the new edition on the beliefs and behaviors of practicing school counselors.
