Follow-up to AEA Think Tank session: Identifying Unintended Consequences of Development Programs
Azenet Book Club – Life Cycles, Rigid Evaluation Requirements, and Implementation Theory
Discussion in the first session of the Azenet Tucson book club -- the Theory: Using Explanatory Power section, with the introduction to life cycle behavior (p. 49). The most common evaluation activity among our members is evaluation of state or federally funded programs (DOE, SAMHSA, OJJDP, BJA). Common characteristics: programs have a few years to implement an 'evidence-based … Continue reading Azenet Book Club – Life Cycles, Rigid Evaluation Requirements, and Implementation Theory
Arizona Evaluation Book Group – Reading The Book
Our book group is part of the Tucson, Arizona contingent of Azenet, an AEA affiliate. Reading Evaluation in the Face of Uncertainty, chapters 1-4, has stimulated the rich discussion and experience sharing that we had hoped for among new and experienced evaluators. As JM anticipated in the intro, some read cases as they were cited, while … Continue reading Arizona Evaluation Book Group – Reading The Book
Surprise in Evaluation: Values and Valuing as Expressed in Political Ideology, Program Theory, Metrics, and Methodology (AEA 2011 – Think Tank Proposal)
Submitted by Jonny Morell, Joanne Farley, and Tarek Azzam. Abstract: How does political ideology affect program theories, methodologies, and metrics? Participants will be randomly assigned to groups and asked to sketch an evaluation based on one of three positions. 1) Government has an obligation to alleviate social inequities and thereby promote the public good. 2) Government’s … Continue reading Surprise in Evaluation: Values and Valuing as Expressed in Political Ideology, Program Theory, Metrics, and Methodology (AEA 2011 – Think Tank Proposal)
Miscellaneous thoughts on complexity in Evaluation
How do concepts of "complexity" play out in evaluation?
Empirical test of Jonny’s beliefs about evaluation surprise
Technical section of proposal to the National Science Foundation
Unexpected program outcomes as a function of the ideologies driving an evaluation – Some questions for the AEA
I want to spark a discussion of what evaluation might look like if it were practiced by people working from different ideological frameworks.
Efficiently (on the cheap) and effectively evaluating training with uncertain outcomes
Effective workshop evaluation on the cheap.
Looking for examples of unexpected program outcomes
Looking for examples of unexpected program outcomes.