Examples of our research on the practice of evaluation include studies of (a) the degree to which the participation of program personnel in evaluations affects evaluation methods and results and (b) the degree to which programs are implemented as intended.
Our research on the methods of evaluation has examined the development, validation, and use of evaluation methods such as classroom observations, teacher logs, student self-report instruments, student assessments, and other data collection instruments. As a result of this research, CRDG faculty and staff have published in the professional evaluation literature both reports on the development of these instruments and several of the instruments themselves.
Our research on the theory of evaluation has addressed fundamental issues, revisited regularly in new contexts over the years, that undergird how and why evaluations are conducted. Topics have included the extent to which evaluations should respect indigenous populations and the use of evaluation findings by program personnel, among others.
Our research on the profession of evaluation has included reviews of the extent to which research-on-evaluation studies have been reported in the professional evaluation literature and a study of evaluators’ perceptions of research on evaluation. Recent scholarship on these issues, as well as on other aspects of research on evaluation, was published in Issue No. 148 (Winter 2015) of New Directions for Evaluation, an American Evaluation Association journal.
The results of CRDG faculty and staff’s work on these four broad evaluation topics have been presented at national evaluation conferences, published in several national refereed journals, and discussed in books. They have provided some of the background necessary to win grants from the National Science Foundation and the U.S. Department of Education, and they helped lay the foundation for the selection of a CRDG faculty member as the 2013–2017 editor-in-chief of New Directions for Evaluation. CRDG faculty and staff intend to broaden their body of research on evaluation practice, methods, the profession, and theory and thereby continue to enhance CRDG’s stature and contributions nationally and internationally.
Evaluation Resources
- Appendix to “Research-on-Evaluation Articles Published in the American Journal of Evaluation, 1998–2014” (Vallin, Philippoff, Pierce, & Brandon, 2015)
- Appendix to “Evaluators’ Perspectives on Research on Evaluation” (Lewis, Harrison, Ah Sam, & Brandon, 2015)
- Appendix to “The State of the Empirical Research Literature on Stakeholder Involvement in Program Evaluation” (Brandon & Fukunaga, 2014)
- SAS Code for Calculating Intraclass Correlation Coefficients and Effect Size Benchmarks for Site-Randomized Education Experiments (as cited in Brandon, Harrison, & Lawton, 2013); see the illustrative sketch below
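For readers without SAS, the following sketch illustrates the kind of intraclass correlation coefficient (ICC) that the program above estimates for planning site-randomized experiments. This is a minimal, hypothetical illustration in Python, not the cited SAS code: the function name, variable names, and toy data are invented here, and the estimator is the standard one-way random-effects ANOVA formula.

```python
# Hypothetical sketch (not the cited SAS program): estimating the ICC(1)
# from a one-way random-effects ANOVA, the quantity used when planning
# site-randomized education experiments.

import numpy as np

def one_way_icc(groups):
    """Estimate ICC(1) from site-level score arrays.

    `groups` is a list of 1-D arrays, one array of outcome scores per site.
    Returns the ANOVA estimate (MSB - MSW) / (MSB + (n0 - 1) * MSW),
    where n0 is the average site size adjusted for unbalanced sites.
    """
    k = len(groups)                                   # number of sites
    sizes = np.array([len(g) for g in groups], dtype=float)
    N = sizes.sum()                                   # total sample size
    grand_mean = np.concatenate(groups).mean()
    means = np.array([g.mean() for g in groups])

    ssb = np.sum(sizes * (means - grand_mean) ** 2)   # between-site SS
    ssw = sum(((g - m) ** 2).sum() for g, m in zip(groups, means))
    msb = ssb / (k - 1)                               # between-site mean square
    msw = ssw / (N - k)                               # within-site mean square

    # Adjusted average site size for unbalanced designs
    n0 = (N - (sizes ** 2).sum() / N) / (k - 1)
    return (msb - msw) / (msb + (n0 - 1) * msw)

# Toy example: three hypothetical sites with 25 students each
rng = np.random.default_rng(0)
sites = [rng.normal(loc=m, scale=10.0, size=25) for m in (50, 55, 60)]
print(f"Estimated ICC: {one_way_icc(sites):.3f}")
```

Larger ICC values mean that more of the outcome variance lies between sites, which increases the number of sites a site-randomized experiment needs for adequate statistical power.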
Recent Scholarship
Harrison, G. M., Duncan Seraphin, K., Philippoff, J., Vallin, L. M., & Brandon, P. R. (2015). Comparing models of nature of science dimensionality based on the Next Generation Science Standards. International Journal of Science Education, 37, 1321–1342. doi: 10.1080/09500693.2015.1035357
Lewis, N. R., Harrison, G. M., Ah Sam, A. F., & Brandon, P. R. (2015). Evaluators’ perspectives on research on evaluation. In P. R. Brandon (Ed.), Research on Evaluation. New Directions for Evaluation, 148, 89–102. doi: 10.1002/ev.20159
Vallin, L. M., Philippoff, J., Pierce, S., & Brandon, P. R. (2015). Research-on-evaluation articles published in the American Journal of Evaluation, 1998–2014. In P. R. Brandon (Ed.), Research on Evaluation. New Directions for Evaluation, 148, 7–15. doi: 10.1002/ev.20153
Yin, Y., Olson, J., Olson, M., Slovin, H., & Brandon, P. R. (2015). Comparing two versions of professional development for teachers using formative assessment in networked mathematics classrooms. Journal of Research on Technology in Education, 47, 41–70.
Brandon, P. R. (2014). Book review: J. Bradley Cousins and Jill C. Chouinard, Participatory evaluation up close: An integration of research-based knowledge. American Journal of Evaluation, 35, 291–297. doi: 10.1177/1098214013503202
Brandon, P. R., & Fukunaga, L. (2014). The state of the empirical research literature on stakeholder involvement in program evaluation. American Journal of Evaluation, 35, 26–44. doi: 10.1177/1098214013503699
Brandon, P. R., Lawton, B. E., & Harrison, G. M. (2014). Issues of rigor and feasibility when observing the quality of educational program implementation: A case study. Evaluation and Program Planning, 44, 75–80. doi: 10.1016/j.evalprogplan.2014.02.003
Brandon, P. R., Smith, N. L., Ofir, Z., & Noordeloos, M. (2014). African Women in Agricultural Research and Development: An exemplar of managing for impact in development evaluation. American Journal of Evaluation, 35, 128–143. doi: 10.1177/1098214013509876
Brandon, P. R., & Lawton, B. E. (2013). The development, validation, and potential uses of the Student Interest-in-the-Arts Questionnaire. Studies in Educational Evaluation, 39, 90–96. doi: 10.1016/j.stueduc.2013.01.001
Brandon, P. R., Harrison, G. M., & Lawton, B. E. (2013). SAS code for calculating intraclass correlation coefficients and effect size benchmarks for site-randomized education experiments. American Journal of Evaluation, 34, 78–83. doi: 10.1177/1098214012466453
Brandon, P. R., Smith, N. L., & Grob, G. F. (2012). Five years of HHS home health care evaluations: Using evaluation to change national policy. American Journal of Evaluation, 33, 251–263.
Young, D. B., Pottenger, F. M., Brennan, C. A., Brandon, P. R., & Nguyen, T. T. (2012). Science and engineering in early education: A conceptual framework for improving teaching and learning. In Proceedings of the International Scientific and Practical Conference Science Education in the School of the Information Age (translated from Russian). Moscow, Russia.
Brandon, P. R. (2011). Reflection on four multisite evaluation case studies. In J. A. King & F. Lawrenz (Eds.), Multisite evaluation practice: Lessons and reflections from four cases. New Directions for Evaluation, 129, 87–95.
Brandon, P. R., Smith, N. L., & Hwalek, M. (2011). Aspects of successful evaluation practice at an established private evaluation firm. American Journal of Evaluation, 32, 295–307.
Smith, N. L., & Brandon, P. R. (2011). If not to predict, at least to envision, evaluation’s future. American Journal of Evaluation, 32, 565–566.
Brandon, P. R., & Smith, N. L. (2010). Exemplars editorial statement. American Journal of Evaluation, 31, 252–253.
Brandon, P. R., Smith, N. L., Trenholm, C., & Devaney, C. (2010). Evaluation exemplar: The critical importance of stakeholder relations in a national, experimental abstinence education evaluation. American Journal of Evaluation, 31, 517–531.
Smith, N. L., Brandon, P. R., Lawton, B. E., & Krohn-Ching, V. (2010). Evaluation exemplar: Exemplary aspects of a small group-randomized local educational program evaluation. American Journal of Evaluation, 31, 254–265.
Selected Presentations
Harrison, G. M. (2016, September). Using a validity argument to plan better surveys. Invited half-day workshop at the annual meeting of the Hawai‘i-Pacific Evaluation Association, Kāne‘ohe, HI.
Harrison, G. M. (2016, April). Switching between models to measure metacognition. Paper presented at the biennial International Objective Measurement Workshop, Washington, D.C.
Harrison, G. M., & Vallin, L. M. (2016, April). Empirical evidence about the factor structure of the Metacognitive Awareness Inventory. Paper presented at the annual meeting of the American Educational Research Association, Washington, D.C.
Harrison, G. M., & Vallin, L. M. (2016, January). Empirical evidence about the factor structure of the Metacognitive Awareness Inventory. Paper presented at the annual meeting of the Hawai‘i Educational Research Association, Honolulu, HI.
Harrison, G. M., Lewis, N., Ah Sam, F., & Brandon, P. R. (2015, November). Evaluators’ perceptions of published research on evaluation. Paper presented at the annual meeting of the American Evaluation Association, Chicago, IL.
Lewis, N., Ah Sam, F., Harrison, G. M., & Brandon, P. R. (2015, November). An increasing use of and involvement in research on evaluation: Preliminary report of evaluators’ suggestions. Paper presented at the annual meeting of the American Evaluation Association, Chicago, IL.
Harrison, G. M., Lewis, N., & Brandon, P. R. (2015, September). Applying the theory of planned behavior to model evaluators’ research-on-evaluation beliefs and involvement. Paper presented at the annual meeting of the Hawai‘i-Pacific Evaluation Association, Kāne‘ohe, HI.
Lewis, N., Harrison, G. M., Ah Sam, F., & Brandon, P. R. (2014, September). Evaluators’ perspectives on research on evaluation. Paper presented at the annual meeting of the Hawai‘i-Pacific Evaluation Association, Kāne‘ohe, HI.
Harrison, G. M. (2014, April). The effects of intrajudge consistency feedback in an Angoff standard-setting procedure. Paper presented at the annual meeting of the National Council on Measurement in Education, Philadelphia, PA.
Harrison, G. M. (2014, April). Modeling the dimensionality of nature-of-science understanding. Paper presented at the biennial International Objective Measurement Workshop, Philadelphia, PA.
Harrison, G. M., Vallin, L. M., Philippoff, J., Brandon, P. R., & Seraphin, K. (2014, April). Balancing development and measurement needs in an evaluation of a program under development. Paper presented at the annual meeting of the American Educational Research Association, Philadelphia, PA.
Philippoff, J., Seraphin, K., Nguyen, T. T. T., Harrison, G. M., Vallin, L. M., & Brandon, P. R. (2014, April). Aquatic science, hybrid structure, and metacognitive strategies: Innovative aspects of a science professional development program. Paper presented at the annual meeting of the American Educational Research Association, Philadelphia, PA.
Harrison, G. M., Vallin, L. M., Philippoff, J., & Brandon, P. R. (2013, September). Balancing outcomes measurement demands with the complexities of program development. Paper presented at the annual meeting of the Hawai‘i-Pacific Evaluation Association, Kāne‘ohe, HI.
Seraphin, K., Philippoff, J., Harrison, G. M., Vallin, L. M., Kaupp, L., Lawton, B. E., & Brandon, P. R. (2013, September). Effects of the Teaching Science as Inquiry (TSI) Aquatic professional development course for middle- and high-school teachers. Poster presented at the semi-annual meeting of the Society for Research on Educational Effectiveness, Washington, D.C.
Vallin, L. M., & Philippoff, J. (2013, October). A closer look at research on evaluation: Common areas of research, design and methods. Paper presented at the annual meeting of the American Evaluation Association, Washington, D.C.
Brandon, P. R. (2012, April). Ruminations on research on evaluation. Paper presented at the annual meeting of the American Educational Research Association, Vancouver, B. C.
Harrison, G. M., & Vallin, L. M. (2012, October). Developments in measuring teachers’ knowledge and practice in teaching science as inquiry. Paper presented at the annual meeting of the American Evaluation Association, Minneapolis, MN.
Harrison, G. M. (2012, October). Charting L1 vocabulary growth with the Vocabulary Size Test: Rasch-based validation for L2 interpretation. Paper presented at the Second Language Research Forum, Pittsburgh, PA.
Harrison, G. M., Vallin, L. M., & Brandon, P. R. (2012, September). Scoring teachers’ responses to mini-vignettes for measuring professional-development effects. Roundtable paper presented at the annual meeting of the Hawai‘i-Pacific Evaluation Association, Kāne‘ohe, HI.
Harrison, G. M., Brandon, P. R., & Lawton, B. E. (2012, April). Observing the quality of integrating the arts into elementary school reading instruction. Poster presented at the annual meeting of the American Educational Research Association, Vancouver, Canada.
Vallin, L. M., & Harrison, G. M. (2012, October). Lessons learned in using vignettes to assess teachers’ knowledge of inquiry-based teaching practices. Paper presented at the annual meeting of the American Evaluation Association, Minneapolis, MN.
Brandon, P. R., Harrison, G. M., & Lawton, B. (2011, November). Intraclass correlation coefficients and effect sizes for planning school-randomized experiments. Paper presented at the annual meeting of the American Evaluation Association, Anaheim, CA.
Harrison, G. M., & Brandon, P. R. (2010, September). Statistics for planning school-randomized experiments in Hawai‘i. Poster presented at the annual meeting of the Hawai‘i-Pacific Evaluation Association, Honolulu, HI.
Brandon, P. R. (2001, March). An overview of test standard-setting methods. Paper presented at the Hawai‘i Assessment Conference, Honolulu, HI.