Categories: Program Design, Program Evaluation, Speaking

Recap of my AEA 2015 (#eval15) presentations on #ProgDes

AEA 2015 was a special one for me: it was the first time the concept of program design gained real traction. I presented alongside fellow design-minded evaluators in two sessions.

In the first session, I reported on my experience of embedding design principles into a developmental evaluation. The presentation was entitled "Lessons learned from embedding design into a developmental evaluation: The significance of power, ownership, and organizational culture." Here's the abstract:

Recent attempts at developmental evaluation (DE) are incorporating human-centered design (HCD) principles (Dorst, 2011; IDEO, n.d.) to facilitate program development. HCD promotes a design-oriented stance toward program development and articulates a set of values that focuses the evaluation beyond those ideals expressed by stakeholders. Embedding design into DE promises a more powerful means of promoting program development than either approach alone. Yet embedding design into DE also introduces additional challenges. Drawing on a case study of a design-informed DE, this panelist discusses the tensions and challenges that arose as one developmental evaluator attempted to introduce design into a DE. Insights from the case study point to the importance of:

– Attending to power dynamics that could stifle or promote design integration; and,

– Evaluator sensitivity to the deep attachment program developers had to program decisions.

These findings allude to the significance of organizational culture in enabling a design-informed DE.

In the second presentation, Chithra Adams (@ChithraAdams), John Nash (@jnash), Beth Rous (@bethrous), and I discussed how principles of human-centered design could be applied to the development of programs.

Specifically, we introduced two design exercises, Journey Mapping and User Archetyping, as a means of bringing human-centered design principles into program design and evaluation.

In an upcoming post, we’ll take a deep dive into these design exercises and examine their application to program design.

Are you curious about program design? Do you have any particular questions about its methods and methodologies that you'd like us to write about? Drop me a note below or find me on Twitter @chiyanlam, where I curate tweets on evaluation, design, social innovation, and creativity.

Until next time. Onwards!

Categories: Conference, Program Evaluation

Highlights from Michael Quinn Patton’s #eval13 talk on the ‘State of Developmental Evaluation’

Michael Quinn Patton gave a great talk today at AEA 2013 on the State of Developmental Evaluation. Here are some highlights.

1. The ‘Doors’ to Discovering Developmental Evaluation.

Patton observed that developmental evaluators and clients typically arrive at DE through one of four doors: some are engaged in innovation; some are seeking systems change; some are dealing with complexity; and some are working in unstable, changing contexts.

Driving this 'search for an alternative' is evaluation users' desire for a compatible evaluation framework.

2. DE is becoming a bona fide approach.

AEA 2013 features more than 30 sessions on developmental evaluation.

The Australasian Evaluation Society recently awarded its Best Policy and Evaluation Award to a team of developmental evaluators.

(The CES awarded its best student essay prize to an empirical study on the capacity of DE to develop innovative programs.)

3. DE is best enabled by clients who are willing to explore and experiment.

4. DE is methods-agnostic, and in fact, defies prescription.

Patton emphasized the importance of operating from the principles of DE, applying and adapting them to the situation at hand when conducting a DE. (Another way of looking at this is to frame DE as engaging in inquiry… this might actually make a nice blog post.)

Some observations…

Participants raised some great questions during the Q&A session. Part of the confusion, it seems to me, lies in the subtler aspects of how and why developmental evaluation might be more appropriate or useful in some contexts. This confusion arises because developmental evaluation is, by design, necessarily responsive. The on-ramp for someone who hasn't done DE but wants to try it can be steep. So I wonder if there might be a place for a clearinghouse of sorts for frequently asked questions, the kind often asked by newcomers.

Categories: Speaking

Developmental Evaluation and the Graduate Student Researcher

This presentation was delivered on February 22, 2012 as part of the EGSS ScholarShare series.

This presentation introduces the discipline of program evaluation and offers a glimpse into how developmental evaluation responds to the call for an evaluation approach suited to complex contexts, such as social innovation. I conclude by introducing the notion of design and design thinking as a way of approaching the problems we face in today's complex world.

I also briefly discuss some of the strategies I employed to manage my own thesis research as a graduate student researcher in education.