Monitoring, Evaluation and Learning Systems

NPF TIG Week: Patrick Germain on Powerful Evaluation on Limited Resources

Hello from Patrick Germain! I am an internal evaluator, professor, blog writer, and the President of the New York Consortium of Evaluators. Working as a nonprofit internal evaluator teaches you a few things about evaluating with very few resources. Even as our sector gets better at using validated evidence for accountability and learning, the resources to support evaluative activities remain elusive. I have written elsewhere about how nonprofits should be honest with funders about the true costs of meeting their evaluation requirements, but here I want to share some tips and resources for evaluators who are expected to do more evaluation than they are funded for.

Hot Tip #1: Don’t reinvent the wheel.

  1. Use existing data collection tools: ask your funder for instruments they recommend, or check out sites like PerformWell, OERL, The Urban Institute, and others that compile existing measurement instruments.
  2. The internet is your friend. Websites like SurveyMonkey, D3.js (for fancy data viz), Chandoo.org (for Excel tips), and countless others offer tools and information evaluators can use. And places like Twitter and AEA365 help you stay on top of emerging resources and ideas.
  3. Modify existing forms or processes to collect data; this can be much more efficient than creating entirely new data collection processes.

Hot Tip #2: Use cheap or free labor.

  1. Look into colleges and universities to find student interns, classes that need team projects, or professors looking for research partners.
  2. Programs like ReServe and your local RSVP group place older adults who are looking to apply their professional skills in part-time or volunteer opportunities.
  3. Crowdsourcing or outsourcing through websites like Skillsforchange, HelpFromHome, or Mechanical Turk can be a cheap way of accomplishing some of the more mundane and time-consuming aspects of your projects.
  4. Organize or join a local hackathon, or find data analysts to volunteer time.

Hot Tip #3: Maximize the value of your efforts.

  1. Use resources allocated for evaluation as an opportunity to build the evaluation capacity of your organization – leverage your investment to help the organization improve its ability to conduct, participate in, and use evaluations.
  2. Focus your efforts on what is needed, be deliberate about eliminating as much unnecessary work as you can, and be very efficient with your time.

What other tools or resources do you use when you have limited resources?

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


NPF TIG Week: Trina Willard on Moving from Measurement Strategy to Implementation in Small Nonprofits

American Evaluation Association 365 Blog - Wed, 10/01/2014 - 01:15

My name is Trina Willard and I am the Principal of Knowledge Advisory Group, a small consulting firm that provides research and evaluation services to nonprofits, government agencies and small businesses. I’ve worked with a variety of nonprofit organizations over the years, many of which have limited staff and financial resources.

Such organizations sometimes have the opportunity to secure a small grant from a funder, awarded with good intentions to “nudge” their evaluation capacity in the right direction. These dollars may be adequate to create a measurement strategy or evaluation plan, but support is rarely provided for implementation. Consequently, many recipients leave these efforts with the feeling that they’ve accomplished little. So how do we effectively guide these organizations, but avoid leaving them in the frustrating position of being unable to take next steps? These three strategies have worked well for me in my consulting practice. 

Hot Tip #1: Discuss implementation capacity at the outset of measurement planning. Get leadership engaged and put the organization on notice early that the evaluation plan won’t implement itself. Help them identify an internal evaluation champion who will drive the process, provide oversight and monitor progress.

Hot Tip #2: Leave behind a process guide. Provide clear written guidance on how the organization should move forward with data collection. The guide should answer these questions, at a minimum:

  • Who is responsible for collecting the data?
  • What are the timelines for data collection?
  • How and where will the data be stored?
  • What does accountability for data collection look like?
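For teams that track such a guide electronically, the questions above could even be captured in a small script that flags unanswered items before data collection starts. This is only a toy sketch, and the field names are hypothetical placeholders, not a standard template:

```python
# Toy sketch: represent the process-guide questions as required fields
# of a plan and flag any that are still blank or missing.
# Field names are hypothetical placeholders, not a standard template.
REQUIRED_FIELDS = [
    "who_collects",    # Who is responsible for collecting the data?
    "timeline",        # What are the timelines for data collection?
    "storage",         # How and where will the data be stored?
    "accountability",  # What does accountability for collection look like?
]

def unanswered(plan: dict) -> list:
    """Return the required fields the plan leaves blank or missing."""
    return [f for f in REQUIRED_FIELDS if not plan.get(f)]

plan = {
    "who_collects": "Program coordinator",
    "timeline": "Monthly, first week",
    "storage": "",  # still undecided
}
print(unanswered(plan))  # → ['storage', 'accountability']
```

Anything as simple as this keeps the guide actionable: the organization can see at a glance which parts of its data collection process are still undefined.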

Hot Tip #3: Create an analysis plan. Great data is useless if it sits in a drawer or languishes in a computer file, unanalyzed. Spend a few hours coaching your client on the key considerations for analysis, including assigning responsibilities, recommending procedures, and identifying no/low-cost analysis resources.
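As an illustration of the kind of no-cost analysis such a plan might call for, here is a minimal sketch using only Python’s standard library, so no paid software is needed. The data, file layout, and column names are hypothetical placeholders:

```python
# Minimal sketch of a pre/post outcome summary using only Python's
# standard library. The survey data and column names below are
# hypothetical placeholders.
import csv
import io
import statistics

# In practice this would come from a file, e.g. open("outcomes.csv").
SAMPLE_CSV = """participant_id,pre_score,post_score
1,52,70
2,61,66
3,45,58
4,70,69
5,55,72
"""

def summarize(rows):
    """Return basic pre/post summary statistics for a list of CSV rows."""
    pre = [int(r["pre_score"]) for r in rows]
    post = [int(r["post_score"]) for r in rows]
    improved = sum(1 for p, q in zip(pre, post) if q > p)
    return {
        "n": len(rows),
        "mean_pre": statistics.mean(pre),
        "mean_post": statistics.mean(post),
        "pct_improved": 100 * improved / len(rows),
    }

rows = list(csv.DictReader(io.StringIO(SAMPLE_CSV)))
summary = summarize(rows)
print(summary)
```

Even a short script like this, left behind with the process guide, gives the organization’s evaluation champion a concrete starting point for analysis responsibilities.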

Below are a few of our favorite go-to resources for small nonprofits that need support implementing evaluation strategies.

Rad Resources: Creating and Implementing a Data Collection Plan by Strengthening Nonprofits. Try this if you need a quick overview to share with staff.

Analyzing Outcome Information by The Urban Institute. This resource, referenced in the above-noted overview, digs into more details. Share it with the organization’s evaluation champion as a starting point to build analysis capacity.

Building Evaluation Capacity by Hallie Preskill and Darlene Russ-Eft. I’ve recommended this book before for nonprofits and it bears repeating. The tools, templates and exercises in the Collecting Evaluation Data and Analyzing Evaluation Data sections are particularly valuable for those that need implementation support.

What tips and resources do you use to prepare small nonprofits for implementing measurement strategies with limited resources?



NPF TIG Week: Kamilah Henderson on Working with Nonprofits to Maximize Evaluation Capacity

American Evaluation Association 365 Blog - Tue, 09/30/2014 - 01:15

Hi, I’m Kamilah Henderson, Evaluation Fellow at Skillman Foundation in Detroit. I work with Foundation staff and partners to create learning opportunities that inform the work of improving conditions for Detroit kids.

Skillman provided a social innovation grant to the Detroit Bus Company to develop the Youth Transit Alliance (YTA), creating a long-term transportation solution for youth in Southwest Detroit. YTA’s work has required nimbleness and creative agility to respond to shifts in the volatile ecosystem in which the project is embedded. As an internal evaluator, I used rapid learning to complement the spirit and energy of YTA’s work to 1) highlight and track tangible changes in program strategy, 2) develop a rigorous data collection system, and 3) surface solutions in a way that fosters continued mutual responsiveness and collaboration.

Lesson Learned:

Social innovators work fast to solve seemingly intractable problems. Rapid learning allows foundations to match the pace of social innovators, who need data to inform their swift responses to systems-level changes.

Hot Tip #1: Demonstrate Values of Collaboration through Action. Developing evaluation relationships early in project planning ensures that rapid learning addresses the concerns of the grantee and Foundation. Starting with this value has made for stronger learning questions. As implementers of the work, YTA learned from the rapid learning cycles about moving key levers in systems change for kids, and Skillman’s Social Innovation team learned about providing technical assistance resources for core grantees.

Hot Tip #2: Use Tried and True Tools. Beverly Parsons developed a framework to assess program development as it moves toward sustainability and scaling. The framework helped me identify strategy changes the YTA employed during their pilot year. Parsons’ tool was especially beneficial in the absence of a logic model, which is often the case with social innovation projects, as opposed to traditional nonprofit programs.

Hot Tip #3: Faster is Better. Instead of year-end reports, YTA has appreciated getting the results of data analyses within months so that they could more quickly shift the direction of their work toward better outcomes for kids. Skillman has valued learning as the work progresses rather than after a grant cycle has ended. Melanie Hwalek’s memo format is a helpful tool for presenting critical analyses without the long wait.

Rad Resource: Evaluating Social Innovation, by Preskill and Beer.

Rad Resource: The Real-Time Evaluation Memo, by Melanie Hwalek.

Rad Resource: Developing a Framework for Systems-Oriented Evaluation, by Beverly Parsons.

Get Involved: I would love to hear from others who are doing similar work. I will be presenting with a panel of colleagues at the AEA Conference. Please join Marie Colombo, Sara Plachta Elliott, Nancy Latham and me at Learning about Rapid Learning: Identifying Approaches that Increase Evaluation Use in System-Building.


 


NPF TIG Week: Patti Patrizi on Using Existing Data in New Ways

American Evaluation Association 365 Blog - Mon, 09/29/2014 - 01:15

I am Patti Patrizi, an evaluation consultant working primarily with foundations, helping them develop evaluation and learning systems. After working at The Pew Charitable Trusts, I founded The Evaluation Roundtable. My tip is an approach I used to help a large foundation develop a learning system that fosters internal learning about its strategies, as an antidote to years of producing reports about results and outcomes.

Hot Tips:

  • Assessing the current reporting system: We used a modified “post action review” (http://www.fireleadership.gov/documents/Learning_AAR.pdf) with a 16-person representative staff group, asking them to describe their experience with the current system (including its audience, process, questions, actual use and by whom, gaps, and positives) and to describe their hopes. The process took two meetings of 1.5 hours each.
  • Providing quick feedback: We quickly compiled their comments into a single Excel sheet and sent it back to them for review.
  • Plotting out the new system: Using this information, we generated a rough outline of the major elements of a new reporting system, which they reviewed in one group meeting and then via email. We then selected four members of the larger group to help detail the mechanics, rules, and flows of the new system.
  • The core of the process: The system builds exchange between officers and their directors on each strategy. The exchange is teed up by responses to a set of questions developed to stimulate thinking and discussion. Each officer writes a note; their director reads it, convenes the group of officers working on the strategy, and then writes his or her own note. Each note represents that person’s own perspective; there are no “corrections” in the process. The group then meets with their program VP to discuss implications.
  • Developing good learning questions: The old system focused on listing accomplishments. The new system centers on questions that challenge officers to think critically about the strategy, and about why something happened or did not. Using data of some kind (qualitative or quantitative) is a requirement. For example:

“Are you finding that you need to rethink the assumptions behind your theory of change, including:

  • Time needed to achieve outcomes envisioned
  • The extent of partnership and interest delivered by key stakeholders
  • Availability or nature of resources needed to make a difference
  • Levels of interest from external stakeholders—such as policy makers, NGOs etc.
  • Unanticipated changes in policy
  • The level of capacity that exists within the relevant field(s) to carry out the work or its key approaches
  • Other assumptions that have not materialized as you hoped?”

Last thought: This process will be only as good as the thinking it produces in the organization.



NPF TIG Week: Gretchen Shanks on What Do We Mean by “Evaluation Resources”?

American Evaluation Association 365 Blog - Sun, 09/28/2014 - 01:15

Hi, I’m Gretchen Shanks with the Bill and Melinda Gates Foundation’s Strategy, Measurement and Evaluation (SM&E) team. Our team works to ensure the foundation’s leadership, program teams and partners have the necessary capacity, tools and support to measure progress, to make decisions and to learn what works best to achieve our goals.

Before joining the foundation, I supported teams at nonprofits that were eager to evaluate their projects in the field; however, financial resources were inevitably scarce. Now that I work for a grant-maker that prioritizes the generation of evidence and of lessons about what works, what doesn’t and why, I think about issues of resourcing measurement and evaluation a bit differently.

In particular, I think less about whether we have enough financial resources in our budget for M&E and more about whether we have “enough” technical resources for measurement available (both to our internal teams and to our partners), or “enough” appropriately targeted and utilized evaluations. Some of the questions I ask about grantee evaluation include:

  • Are we investing sufficient resources, both time and technical support, in our work with partners to articulate the logical framework of measurable results for a project?
  • Have we adequately planned for measurement of those results and any possible evaluation(s)?
  • Do we know if we really need an evaluation, and if so, towards what end?
  • Does the design of the evaluation appropriately match the purpose and audience?
  • Do we know how (and by whom) the evaluation results will be used?

Planning results and measurement up front, supporting M&E implementation, and facilitating the use of data and lessons learned from evaluation all require resourcing: some financial, some technical, and (perhaps most importantly) temporal, since the time needed from relevant stakeholders, internal and external, is critical. As you likely know well from your own work, there are no magic solutions to these challenges. Here at the foundation we’re working on getting smarter about how to utilize scarce resources to support actionable measurement and evaluation.

Hot Tips: Here are a few examples of ways we’re tackling these challenges:

  • Check out this blog post by SM&E’s Director, Jodi Nelson. She introduces the foundation’s evaluation policy, which aims to “help staff and partners align on expectations and focus scarce evaluation resources where they are most likely to produce actionable evidence.”
  • Read this PowerPoint deck, which describes the foundation’s approach to designing grants with a focus on measurable results.
  • Listen to an Inside the Gates podcast to hear from NPR’s Kinsey Wilson and Dan Green, BMGF’s deputy director of strategic partnerships, as they discuss measurement in the field of media communications and some of the related challenges. (The segment runs from 8:55 to 15:38.)



Sheila B Robinson on Engaging Your Audience and a New Learning Opportunity!

American Evaluation Association 365 Blog - Sat, 09/27/2014 - 07:13

Hello All! Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor, here with even more good news about audience engagement! Last Saturday, I wrote this post introducing the Audience Engagement Workbook, a new Potent Presentations (p2i) tool featuring the WHY, WHAT and HOW of audience engagement, along with 20 specific strategies any presenter can use with a limited investment of time or money. Look for the workbook to be posted on the p2i site any minute now!

In just a moment, I’ll share another strategy from the book, but in the meantime, I want to let you know about another opportunity to learn about audience engagement. Are you excited? Raise your hand if you want to learn more! (Are you feeling engaged now?)

Hot Tip: Join me for an AEA Coffee Break Webinar* - Audience Engagement Strategies for Potent Presentations - on Thursday, October 9 at 2:00 p.m. Eastern, where I’ll preview several key strategies appropriate for a variety of presentation types. Click here to register.

Cool Trick: Try a quote mingle. This requires some preparation: gather quotes about a topic and print them out on cards, enough for each participant to have one (either print a few quotes on cardstock, or print on paper, cut apart, and paste to index cards). Use this activity as an icebreaker in which participants introduce themselves, or during or at the end of the session to have them make a connection to your content. Distribute cards randomly, and ask each participant to stand and pair up with a partner. Partners take turns reading their quotes, saying briefly what the quotes mean to them, and then introducing themselves, answering your question, or relating the quote to their situation. Once the exchange is over, call time and ask partners to swap quotes and find a different partner. Do as many exchanges as time permits.

Quick tip: You don’t need to gather as many quotes as participants. You can repeat quotes two or three times to produce larger sets of cards.
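The card-set arithmetic above can be sketched in a few lines: cycle a short quote list until every participant has a card. The quotes and counts here are placeholders:

```python
# Sketch of building a quote-card deck by cycling a short quote list,
# as described above: fewer quotes than participants, with each quote
# repeated as needed. The quotes below are placeholders.
import itertools

def build_deck(quotes, n_participants):
    """Repeat quotes in round-robin order until every participant has a card."""
    return list(itertools.islice(itertools.cycle(quotes), n_participants))

quotes = ["Quote A", "Quote B", "Quote C"]
deck = build_deck(quotes, 8)  # 8 participants, only 3 distinct quotes
print(deck)
```

With 8 participants and 3 quotes, each quote appears two or three times, which is exactly the duplication the tip describes.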

Caution: You will need a microphone or loud projecting voice to be able to call time to switch partners and to call an end to the activity. This activity will likely be very challenging with a group larger than 60-70 people.


Rad Resource: The p2i family of tools and resources to polish your presentation to perfection!

Hot Tip: Type “p2i” in the search box (just look to your right…see it?) and read some great aea365 posts from people who have used p2i tools to spice up their presentations.

*Coffee Break Webinars are free for AEA members. Not a member? Why not join now? Click here for more information.



LAWG Week: Mariana Enríquez on Reaching Minority Communities

American Evaluation Association 365 Blog - Fri, 09/26/2014 - 01:15

Hello, my name is Mariana Enríquez and I am a Program Evaluation Consultant based in Denver, Colorado. My work focuses on the evaluation of education and public health programs across Colorado.

Immigration has been in the news recently because of the large number of unaccompanied children arriving in the country from Central America. Colorado, and especially the Denver metropolitan area, is home to a large number of immigrants from all corners of the world; many are refugees from Africa, Asia and the Middle East. In fact, close to 10% of Colorado residents were born in countries other than the USA.

Although more than one-third of immigrants in Colorado are naturalized U.S. citizens, many maintain their own language and culture. For example, almost 17% of the Colorado population speaks a language other than English at home, and Denver Public Schools students collectively speak more than 120 languages. This diversity makes evaluation work very challenging when crossing languages and cultures to reach these communities. As AEA’s Statement on Cultural Competence in Evaluation indicates, “The diversity of cultures within the United States guarantees that virtually all evaluators will work outside familiar cultural contexts at some time in their careers.” Additionally, “Cultural competence is fluid. An evaluator who is well prepared to work with a particular community is not necessarily competent in another.”

Hot Tips:

  • Learn as much as possible about participants’ cultural identity and background.
  • Use cultural brokers, cultural translators, bridge builders, and interpreters to access and get to know your participants.
  • Do not assume that a shared language means shared worldviews. Language can be a barrier, but it is not the only one.
  • Adapt to the participants’ needs; do not expect them to adapt to yours.
  • Ensure that participants’ intentions are understood and their voices are heard.
  • Use advisory committees, and involve representatives of all stakeholders in all phases of the evaluation.

Rad Resources: Things to Do in Denver during Evaluation 2014

  • Get out at sunset and don’t miss Chihuly Nights, illuminated glass sculptures by renowned glass artist Dale Chihuly on display at the Denver Botanic Gardens.
  • Getting around downtown Denver is easy and FREE. Reach downtown restaurants, museums and shops on the FREE 16th Street MallRide, and get around quickly during rush hour on the FREE MetroRide on 18th Street.
  • Visit the newly renovated Union Station, where you can connect to many metropolitan areas via bus.
  • Shop or eat at historic Larimer Square, five blocks west of the Colorado Convention Center.
  • Head west to Boulder: rent a car, visit the Celestial Seasonings tea factory, and take a hike among the famous Flatirons in beautiful Chautauqua Park.
  • Check the Westword magazine for other ideas and activities.

We’re looking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.


Assess Impact in Complex Environments

Networking Action - Thu, 09/25/2014 - 12:31

A big challenge people are always asking me about is how to measure impact when many organizations and people are involved and when success depends on flexibly changing course in response to learning what works. These are characteristics of complex

LAWG Week: Maggie Miller on Teaching Evaluation in the Denver Area

American Evaluation Association 365 Blog - Thu, 09/25/2014 - 01:15

Hello, I am Maggie Miller, the principal of Maggie Miller Consulting. I conduct program evaluation for nonprofits in the Denver/Boulder area. Welcome to Colorado! We Coloradoans tend to be very friendly; when you meet us at Evaluation 2014, we will be very happy to share any information about Colorado that we can.

Coloradoans also like to learn about evaluation. When I’m not consulting, I teach various evaluation classes and workshops in the greater metro Denver area. There are many opportunities for program staff at nonprofits (and in the private sector) to learn about evaluation. These are a few organizations I’ve taught for: the Colorado Nonprofit Association, the Nonprofit Cultivation Center, Mountain States Employers Council, the Nonprofit Management program at Regis University, and the Denver Evaluation Network (DEN), which serves Denver-area museums and cultural institutions. The staff at the Denver Public Library system were very receptive to a series of evaluation planning classes I gave, and once I even presented a logic modeling workshop for the HR department at New Belgium Brewery. Hey, everyone can benefit from thinking about outcomes!

(P.S.: While I’ve never taught for them, I should mention that there are some large evaluation firms in town that offer excellent training to our evaluation-oriented Coloradoans.)

Lessons Learned: Anyone can learn about evaluation and improve their skills. It’s important to keep these teaching tips in mind.

  • Assess where your students “are at” in terms of their experience and existing skills (which may include evaluation-related things like teaching, research, project management, and facilitation).
  • For any given teaching opportunity, figure out what’s most important to teach. Keep your lesson focused on a few important ideas which they will remember and use, rather than giving them an overwhelming smorgasbord.
  • Facilitate hands-on interactive activities to help people engage deeply with new ideas.
  • Use examples that are relevant to your students, and encourage them to apply what they learn to their professional (and even personal) lives.
  • Whenever possible, get them to review what they learned. This is easier in multi-session workshops or classes, but you can still do it before and after breaks in a one-time workshop.

Hot Tip: Some of the places I’ve taught are great resources for you when you are in town! Check out Denver’s wonderful DEN-participating museums, our fabulous public library, and taste some great New Belgium beer at many restaurants and bars in the Denver area.


 


LAWG Week: Helen Holmquist-Johnson on Using Process Evaluation to Capture the Diversity of Colorado

American Evaluation Association 365 Blog - Wed, 09/24/2014 - 01:15

Greetings from Colorado – Home of the Southern Rocky Mountains and the edge of the Great Plains. My name is Helen Holmquist-Johnson and I am the Assistant Director of the Social Work Research Center at Colorado State University in Fort Collins, Colorado. Colorado is not only geographically diverse, but incredibly diverse in terms of the demographic characteristics of its communities and the individuals who live here. Some of my recent work focuses on evaluating evidence-based programs that strengthen families and keep children safe and healthy.

In Colorado, the Division of Child Welfare is a state-supervised, county-administered system. In fact, Colorado is one of only nine states in the U.S. where the administrative structure of child welfare can be described this way. Each county, while held to the same state and federal requirements, can individually decide how to operate and deliver child welfare services to families. This arrangement increases the autonomy counties have over everything from which programs they implement to the models of leadership and supervision they use. In a state with 64 counties (I know some of you are already ahead of me here), this structure introduces some unique evaluation challenges.

This is where process evaluation becomes useful and important. What might be missed or overlooked in an outcome evaluation can be captured by asking process evaluation questions. In general, these questions focus on whom the program reaches and whether the program was carried out as planned. Because we are evaluating evidence-based models, our interest shifts somewhat from asking whether the program works to asking how, why, and in what context or conditions it works. As you can see, these are important questions to ask when working across 64 different counties throughout the state. Answers to these questions will help county administrators and other stakeholders make policy and practice decisions that consider the contextual factors unique to their communities and families.

Rad Resource: For specifics about how to design and conduct a process evaluation, read Steckler, A., and Linnan, L. (Eds.), Process Evaluation for Public Health Interventions and Research.

Hot Tip for Denver: If you have children and family coming with you to Evaluation 2014 in Denver you might want to check out The Children’s Museum of Denver and The Butterfly Pavilion.

We’re looking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Related posts:

  1. CP TIG Week: Melissa Strompolis and Suzanne Sutphin on Measuring Well-Being in Child Welfare
  2. LAWG Week: Marley Steele-Inama on Collaboratively Building Evaluation Capacity
  3. LAWG Week: Antonio Olmos, Stephanie Fuentes, and Sheila Arens on Welcoming You to LAWG Week and Inviting You to Come to Evaluation 2014 in Denver this October

LAWG Week: Erik Mason on Learning to Become an Evaluator as a Non-Evaluator

American Evaluation Association 365 Blog - Tue, 09/23/2014 - 01:15

Hi – I’m Erik Mason, the Curator of Research at the Longmont Museum and Cultural Center, located in Longmont, Colorado, about 35 miles northwest of Downtown Denver. I am not an evaluator – in fact, the word “evaluation” does not appear in my job description.  I have come to believe, however, that evaluation is critical to the success of my work as a museum curator.  Much of that realization is the result of my participation in the Denver Evaluation Network (DEN), a collection of 15 museums across the Denver metro area that have made a commitment to learn about, and do, evaluation on a regular basis.

Only two members of DEN have full-time evaluators on staff. The rest of us are a mix of educators, exhibit developers, administrators, and curators. Our daily work is filled with school tours, fundraising, label writing, and all the other stuff that goes into making museums fun and interesting places to visit. As a result, evaluation can get short shrift. We fall back on anecdote and what we think we know.

Over the last two years, the members of DEN have been presenting at museum conferences about our work bringing evaluation to a broader community. It has been fascinating to watch people who always thought evaluation was scary, hard, and required a large supply of clipboards realize that it can be done in many ways.

Within my workplace, I have been pleasantly surprised as we have begun incorporating evaluation into more and more of what we do. Data gathered from iPad surveys provides a baseline understanding of our audience demographics and allows us to compare the changes in our audience as our special exhibits change. Evaluation is now a part of the development of all our exhibits. In the course of doing evaluation, I’ve seen attitudes change from “Why are we wasting our time doing this?” to “When are we doing another evaluation?”

Rad Resource: Check out this video of testimonials from members of DEN.

Hot Tip for Evaluation 2014 Attendees: Denver really is the “Mile High City” and you can take home proof of this fact with a short jaunt and a camera. A free shuttle and brief walk away from the Colorado Convention Center is the Colorado State Capitol building, a Neoclassical building that sits at the eastern end of Denver’s Civic Center Park. The Capitol sits exactly one mile above sea level, and the official marker can be found on the 13th step. The Capitol is emerging from a multi-year restoration effort with a shiny new coat of gold on its dome, in honor of Colorado’s mining heritage. Free tours of the Colorado Capitol Building are offered Monday through Friday.


Related posts:

  1. LAWG Week: Antonio Olmos, Stephanie Fuentes, and Sheila Arens on Welcoming You to LAWG Week and Inviting You to Come to Evaluation 2014 in Denver this October
  2. LAWG Week: Marley Steele-Inama on Collaboratively Building Evaluation Capacity
  3. LAWG Week: Arens, Fuentes, and Olmos on the Local Arrangements Activities for Evaluation 2014

LAWG Week: Valerie Williams on Connecting with Teachers for Front-End Evaluation

American Evaluation Association 365 Blog - Mon, 09/22/2014 - 01:15

My name is Valerie Williams and I am Senior Program Evaluator at the University Corporation for Atmospheric Research (UCAR) located in Boulder, CO. UCAR’s Center for Science Education is currently developing a new climate exhibit for the Mesa Laboratory Visitor Center and I am providing front-end evaluation support to help them test concepts and ideas about climate change with different audiences.

Teachers and their K-12 students are a primary audience for this exhibit, so much of my work involves conducting focus groups with teachers. It can be difficult to reach teachers during the summer months when most schools are closed. Yet this is often an opportune time to schedule focus groups without having to squeeze time from their busy school day.

Lessons Learned: Professional development and skill-building workshops are a great way to identify local teachers who may be willing to participate in a focus group during the summer months. Boulder and the surrounding Front Range community are home to many universities and science research centers that host teacher workshops on climate-related topics during the summer. Working with local workshop coordinators can be an effective way of connecting with teachers.

Tokens of appreciation can go a long way toward expressing gratitude. Despite working within limited budgets, I always try to provide teachers with something they can use for their classrooms, such as posters or hands-on manipulatives to let them know I value their time.

Rad Resources for Front-End Evaluation: Most of my experience is in evaluating formal science education programs, so moving to informal science and museum evaluation has been a bit challenging. However, I’ve found many resources that have helped to smooth this transition.

Hot Tip: Advice for Evaluation 2014 in Denver. Not surprisingly, my hot tip is to take a short trip to Boulder to decompress from Evaluation 2014. Only 25 miles from Denver, Boulder offers an amazing array of activities for connecting with nature. From nature hikes with breathtaking views of the Flatirons to people watching and all-around entertainment, a visit to Boulder is a great way to end an exciting and intellectually stimulating conference!


Related posts:

  1. Climate Ed Eval Week: Susan Lynds on Considering Scientific Jargon to Avoid Communication Barriers
  2. EPE Week: Tracy Dyke-Redmond on Evaluating Adaptation Plans
  3. Climate Ed Eval Week: Nicole Holthuis on Lessons Learned from Measuring Intermediary Outcomes

LAWG Week: Arens, Fuentes, and Olmos on the Local Arrangements Activities for Evaluation 2014

American Evaluation Association 365 Blog - Sun, 09/21/2014 - 01:15

We are Sheila A. Arens, Stephanie Fuentes, and Antonio Olmos, members of the Colorado Evaluator’s Network (COEN) and the Local Arrangements Work Group (LAWG). We are honored to be hosting the upcoming 2014 AEA Conference in Denver! The LAWG comprises members from the COEN community who are working on a number of subcommittees to make your conference professionally and personally gratifying. When you are in the registration area, stop by the information desk and say hello. Volunteers will be available to provide information about the area and advice about nearby restaurants (for fast Mexican, try Colorado’s own Chipotle), coffee shops (Dazbog is a local favorite), and the like. They may also share some fascinating Denver facts (Denver brews over 200 different beers daily, more than any other city in the nation) or warn you about “zombies” roaming downtown October 18th, when Denver hosts the world’s largest Zombie Crawl (we couldn’t make this up if we tried!).

Hot Tip: Downtown Denver is teeming with places to stay and things to do. Check out the searchable Colorado Convention Center site for housing options. With the support of COEN, the LAWG created guides about the Denver area with information on Denver restaurants, public transportation, and suggestions for activities. Electronic versions can be accessed via the AEA 2014 Conference website or directly via the following links: Guide to Denver, Dining in Denver, Events in Denver, and Visiting Boulder. If you have questions at the Conference, look for COEN members (and our local friends), identifiable by their “Denver” badges.

First Time Attendee Tip: In the registration area pick up a First Time Attendee ribbon for your name tag! This is a great way to meet veteran and fellow first-time attendees. And be on the lookout for evaluators wearing a “Conference Ambassador” ribbon who have volunteered to answer your questions.

International Attendee Tip: The LAWG International subcommittee is working with the International and Cross Cultural Evaluation (ICCE) TIG to connect evaluators from other parts of the globe with U.S.-based evaluators (buddies). There will be an informal opportunity for buddies to share a meal immediately after the ICCE TIG business meeting on Thursday, October 16th.

Need More Information? The AEA Conference website has other useful information about the conference, including a conference schedule, searchable program, information on pre-sessions, accommodations, and even more conference activities.


Related posts:

  1. LAWG Week: Antonio Olmos, Stephanie Fuentes, and Sheila Arens on Welcoming You to LAWG Week and Inviting You to Come to Evaluation 2014 in Denver this October
  2. LAWG Week: David J. Bernstein and Valerie Caracelli on the Local Arrangements Activities for Evaluation 2013
  3. LAWG Week: Marley Steele-Inama on Collaboratively Building Evaluation Capacity

Sheila B Robinson on a new p2i tool: The Audience Engagement Workbook!

American Evaluation Association 365 Blog - Sat, 09/20/2014 - 06:28

Hello! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. Evaluation is my newer career. I’m actually an educator, having taught in K12 schools and at a university. I’m also a professional development provider, having offered PD courses, workshops, coaching, and mentoring to educators and evaluators for more than 15 years, so I’m no stranger to presentation design.

Lessons Learned: Check out p2i tools before designing any presentation! I’ve learned so much from AEA’s Potent Presentations Initiative (p2i), AEA’s effort to help members improve their presentation skills, particularly for conference presentations. p2i offers specific advice on making presentations more potent by focusing on three things: message, design, and delivery. I have incorporated these principles and strategies into my own work.

Rad Resource: Coming soon! The new p2i Audience Engagement Workbook. I’m honored to be able to share my experience in designing and facilitating presentations and professional learning opportunities as we add to the family of p2i tools with the Audience Engagement Workbook, featuring the WHY, WHAT and HOW of audience engagement, along with 20 specific strategies any presenter can use with limited investment of time or money.

Each strategy is described and rated on a number of dimensions such as ease of application, materials needed, cost, and the degree of movement for participants. There’s even a special section on engaging audiences in a webinar environment!

Hot Tip: One strategy to try now!

Four Corners: Choose just about any topic or question that has three or four positions or answers (e.g., In your family, are you a first born, only child, oldest child, or in the middle? In your evaluation work, do you mainly use qualitative, quantitative, or mixed methods? Do you consider yourself a novice, experienced, or expert evaluator?) and ask participants to walk to the corner of the room that you specify. Once there, give them 3-5 minutes to discuss this commonality, then have them return to their seats. If time permits, call on volunteers to share insights from their brief discussion.

Variation: Ask participants a question that requires them to take sides (usually two sides, but could be three or more). Ask them to walk to the side of the room assigned to that position, and discuss with others who share their views. You can ask them to form two lines facing each other and have a debate with participants from each side presenting support for their position.

Stephanie Evergreen, information designer, dataviz diva, and p2i lead is putting the finishing touches on the layout and design of the workbook and we’ll have it up and ready for you well ahead of Evaluation 2014! In the meantime, look for Stephanie to preview additional strategies in the next AEA Newsletter!

Do you want your audience doing this? (Image credit: zenobia_joy via Flickr)

 

Or this? (Image credit: Chris Hacking via Flickr)
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. Sheila B Robinson on Engaging Your Audience and a New Learning Opportunity!
  2. p2i: Stephanie Evergreen on Michael Quinn Patton’s Fab Five Reboot
  3. June Gothberg on How Potent Presentations Changed my Presentation Worldview

YFE Week: Felicia Sullivan on Exploring Game Analytics

American Evaluation Association 365 Blog - Fri, 09/19/2014 - 01:08

My name is Felicia Sullivan and I research youth civic engagement at the Center for Information and Research on Civic Learning and Engagement (CIRCLE), a non-partisan research center at Tufts University’s Tisch College of Citizenship & Public Service. Building youth-focused evaluation strategies and working with practitioners are important ways CIRCLE links academic research to what is happening on the ground. Recently, we have been exploring what game analytics and data captured by interactive learning systems can tell us about hard-to-measure civic engagement processes like deliberation, perspective taking, and collaboration.

Lessons Learned

Two recent projects involve games called Civic Seed and Discussion Maker that we developed in collaboration with the Engagement Game Lab at Emerson College and Filament Games, an interactive learning game studio in Madison, WI.

Measuring concrete knowledge in learning environments is essential, but capturing processes and interactions is also important. Civic literacy is more than knowing about government and history; it is about having the skills to act and behave within a civic culture. For schools and national youth programs, capturing growth and development in civic literacy is hard to do. Increasingly, we have looked to learning games and interactive technologies to provide insights into these complex developmental processes.

These forays into gaming and technology-enabled learning have us thinking about new approaches to evaluation that are dynamic, formative, and adaptive. We are by no means experts in this arena, but here are some of the things we are currently looking at in game-based evaluation:

Hot Tip: Finishing the Game is the Assessment

If designed well, a game can embed the assessment of an outcome within the game play itself in a “stealthy” way. Achieving game missions or completing tasks can be thought of as “tests” or “benchmarks” in the learning process. Most of the projects we have been involved with focus on learning related to civic literacy, but we believe that other domains dealing with hard-to-grasp complex systems or dynamics could also benefit from games.

Cool Trick: User Created Content

When game users type in a chat box, share a resource, or select text to support an argument, a content analysis can later provide insights about what users are thinking and experiencing.
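As a minimal illustration of the kind of content analysis this describes, the sketch below tallies word frequencies across a set of chat-box messages; the messages themselves are invented for the example:

```python
from collections import Counter
import re

# Hypothetical chat-box messages captured during game play
messages = [
    "I think we should vote on the budget first",
    "The park cleanup needs more volunteers",
    "Let's vote after everyone shares their idea",
]

def word_frequencies(texts):
    """Lowercase and tokenize each message, then count words across all of them."""
    counts = Counter()
    for text in texts:
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

freqs = word_frequencies(messages)
# Words tied to deliberation (e.g., "vote") surface in the counts
print(freqs["vote"])  # 2
```

A real analysis would go further (stopword removal, coding categories, sentiment), but even a simple frequency count can flag which deliberation-related terms players actually use.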

Cool Trick: User Analytics

How players engage with a game — the choices they make, the path they take or where they get stuck – is a digital “observation” that can be analyzed.
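A toy sketch of analyzing such a digital “observation”, assuming a hypothetical event log where each record notes the furthest level a player reached; tallying these records shows where players tend to get stuck:

```python
from collections import Counter

# Hypothetical play-session records: (player_id, furthest_level_reached)
sessions = [
    ("p1", 3), ("p2", 1), ("p3", 3), ("p4", 2), ("p5", 3), ("p6", 1),
]

def drop_off_points(records):
    """Count how many players stopped at each level."""
    return Counter(level for _, level in records)

stuck = drop_off_points(sessions)
# The most common stopping point in this toy data
print(stuck.most_common(1))  # [(3, 3)]
```

The same pattern extends to richer logs (choices made, paths taken), which is where game analytics starts to resemble structured observation data.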

Rad Resource: Games, Learning, and Assessment

This chapter from a much larger edited volume on assessment in game-based learning captures some of the issues related to assessment with some concrete examples.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. Climate Ed Eval week; Rachel Becker-Klein on Using Embedded Assessment Tools for Evaluating Impact of Climate Change Education Programs on Youths
  2. Sheila B Robinson on the Olympics – Visualized!
  3. YFE Week: Krista Collins on Supporting Positive Developmental Outcomes

YFE Week: Julie Poncelet, Catherine Borgman-Arboleda, and Jorge Arboleda on Using Participatory Video to Engage Youth in Evaluation in a Creative and Empowering Way

American Evaluation Association 365 Blog - Thu, 09/18/2014 - 01:54

We are Julie Poncelet, Catherine Borgman-Arboleda, and Jorge Arboleda of Action Evaluation Collaborative, independent consultants who use evaluation to strengthen social change. We want to share our experiences using participatory video (PV) in evaluations with youth.

PV is a dynamic, powerful approach whereby youth use video to capture everything from their stories of change, to issues that affect their everyday lives, to ideas they have for effecting change in their communities. We recently used PV with a group of teens from a community-based NGO in Yucatan, Mexico; the youth produced videos about their dreams and sense of identity. PV is a compelling approach for exploring these themes, which emerged from a Theory of Action process with the NGO that specifically identified the need for youth to critically analyze their communities and find their voices.

PV positions youth as researchers and evaluators of their own communities and supports them in contributing creatively and critically to issues. With PV, youth design, direct, film, and edit videos. They experience empowerment, ownership, and self-esteem rarely garnered from other evaluation approaches. Adults provide technical assistance, build capacity, and facilitate a process for PV to unfold (without taking over the process!). For evaluations, PV creates a space for community members and stakeholders to see the interests and needs of youth in the community, and it provides a unique platform for reflecting collaboratively on meaning and implications.

Lessons Learned: Focus on teaching the technology and the video storytelling process, and on providing an appropriate way for young people to collectively reflect on themselves and their realities. Give youth time to feel comfortable with the equipment and with engaging others in conversation. And remain aware of group dynamics! We often find that boys are more comfortable with, and will take leadership of, the technology, so consider breaking groups up by gender.

Consider using a short set of questions that youth can ask the stakeholders who appear in their videos. The insights can help contextualize the analysis and overall sense-making. The PV process is as much an outcome as the product; engaging in PV is transformative, so don’t worry about getting ‘perfect’ videos.

Hot Tips: Although building a participatory video kit is not cheap, all you really need is a small camera (preferably with projection capabilities) and a good-quality handheld microphone. We have found that when young people hold the microphone they feel more empowered to speak, which helps them find their voice.

Rad Resources: The PV approach aligns nicely with other qualitative methods, such as Most Significant Change, and with different types of evaluation, including Monitoring & Evaluation.


Related posts:

  1. YFE Week: Mary Arnold on Getting Youth Participatory Evaluation Projects off to a Solid Start
  2. YFE Week: Kim Sabo Flores and David White on New Perspectives and Voices in Youth Focused Evaluation
  3. YFE Week: Mariah Kornbluh on Addressing Adultism & Being an Ally in Youth-Focused Evaluations, Part 2

YFE Week: Jessica Jerney on Coaching Youth Workers in Evaluation

American Evaluation Association 365 Blog - Wed, 09/17/2014 - 01:49

Hello, this is Jessica Jerney from the Extension Center for Youth Development at the University of Minnesota. I recently served as Project Coordinator for the Innovations on Youth Roles in Evaluation and Assessment project. This initiative included a learning cohort, symposia speakers’ series (featuring Kim Sabo Flores and Katie Richards-Schuster), and applied research.

As a result of this project, we gained new insights into the benefits and challenges of using a cohort model for professional development in youth-focused evaluation. Over nine months, 25 youth program practitioners met and engaged in dialogue, activities, and reflection to explore, test, and create new aspirations for engaging youth in evaluation.

Hot Tips for Engaging Adult Practitioners in Evaluation with Youth

Tackling the task of building youth worker capacity in youth-focused evaluation was a bigger challenge than we originally imagined. If you are considering implementing a learning cohort, here are some Hot Tips for engaging adults in evaluation with youth:

  • Create a safe space for participants to grow and ideas to flourish. I often say that evaluation can be like therapy for participants. Involvement in qualitative evaluations has created new realizations for many; so did membership in the Innovators Learning Cohort. We did not expect that the experience would challenge adult practitioners’ ideas about the roles of young people in evaluation. As Mariah Kornbluh pointed out, we need to be prepared to “address adultism and be an ally.” When engaging a cohort in a potentially controversial issue, allow space for learning, change, and surprises. Be ready to help participants peel back the unconscious ideas our society holds about youth roles in activities like evaluation, and to practice authentic youth-adult partnerships.
  • Establish an on-going group dedicated to learning and growing their practice. We created an application process to identify candidates who had the time, interest, and skills to participate at a high level.
  • Develop skills and share experiences. Adult learning theory tells us that adults want to share their experiences and learn from others. Introduce activities that are challenging and hands-on. We found it was important for participants to be pushed outside their comfort zone in both theoretical and practical ways. In some instances, cohort members tried out new skills and ideas in their programs and shared the results with the group. In others, local youth leaders participated in activities with the cohort and reflected on the experience the next day. These opportunities to develop working theories, test them out, and reflect were critical for growth and change in the settings where participants work.


Related posts:

  1. YFE Week: Kim Sabo Flores and David White on New Perspectives and Voices in Youth Focused Evaluation
  2. YFE Week: Mariah Kornbluh on Addressing Adultism & Being an Ally in Youth-Focused Evaluations, Part 2
  3. YFE Week: Kim Sabo Flores and David White on A New Youth Focused Topical Interest Group

YFE Week: Anne Gleason and Miranda Yates on Practical Tools for a Youth Research Camp

American Evaluation Association 365 Blog - Tue, 09/16/2014 - 01:46

Hi, we are Anne Gleason and Miranda Yates from the Program Evaluation and Planning Department at Good Shepherd Services in New York City, and we would like to share some tools we put together for a youth research camp. Two summers ago, we partnered with youth in one of our afterschool programs to conduct research on what youth think it means to be successful, a topic the students selected, which ultimately culminated in a student-produced documentary. Drawing on techniques we learned at the Critical Participatory Action Research (CPAR) Summer Institute offered by CUNY’s Public Science Project, we facilitated a series of research camp days with a group of twenty 10-to-14-year-olds. The days were organized as follows: What Is Research, Survey Design Parts I and II, Data Entry, and Data Analysis. Check out the camp schedule for more details.

The project provided an enriching learning experience for everyone involved. Youth gained unique first-hand experience conducting research by playing a lead role in the design and implementation of the study and the data analysis. In turn, their insider perspective helped us form more meaningful questions and interpret results. For example, one survey question presented a list of resources and asked respondents to rate their importance to achieving life goals. For the goal of attending college, older students rated having a supportive family and supportive teachers as less important than younger students did. We were initially perplexed as to why older students would place less value on supportive adults. The youth participants posited that older youth may feel more independent and, thus, be more confident in their own ability to achieve success. This insight underscored the benefit of partnering with youth in research.

Lessons Learned:

  • If you plan to conduct a full youth participatory project, allow yourself plenty of time. Ideally, we would have liked a few extra days to delve deeper into research techniques and data analysis.
  • If you’re short on time or resources, you don’t have to give up on participatory techniques. We have found ways to incorporate youth voice into our evaluation activities that are less time intensive but inspired by a participatory approach. For example, we routinely conduct focus groups throughout our programs to gather feedback on surveys and other evaluation tools and to develop action plans.

Rad Resources: Our camp curriculum included activities, role playing and group discussions. Here are two handouts that might be useful to those considering a camp of their own: Survey Development 101 and Survey Administration 101.


Related posts:

  1. YFE Week: Rob Shumer on Involving Youth in School-Based Participatory Evaluation
  2. Ann Gillard on Measuring Fun in Summer Camp
  3. Rob Shumer on Conducting Youth Led Evaluation

YFE Week: Jessica Manta-Meyer, Jocelyn Atkins, and Saili Willis on Creative Ways to Solicit Youth Feedback

American Evaluation Association 365 Blog - Mon, 09/15/2014 - 01:36

Our names are Jessica Manta-Meyer, Jocelyn Atkins and Saili Willis and we are evaluators at Public Profit, an evaluation firm with a special focus on out-of-school time programs for youth.

We usually evaluate networks of after school programs as a whole (some of which serve more than 20,000 youth, where a survey is indeed one of the best approaches). However, we particularly enjoy opportunities to build the capacity of youth programs to solicit feedback in creative ways that align with youth development best practices.

Here are some of the methods that have been most popular with these programs:

Cool Trick – Journals: At the start of a program, provide journals for all youth and ask them to write something related to the program goals. Is one of the program’s goals to develop leadership skills? Staff can ask the youth to respond to this question: “In what ways are you a leader?” Is one of the goals to increase enjoyment of reading? “What do you like about reading?” Then, at the end of the program, youth can read what they wrote on the first day and answer “How would you answer that question differently now?” or some other question that gets them to reflect on how they’ve changed in the program.

Cool Trick – Candy Surveys: Ask youth to answer survey questions by putting certain colors of candy in a cup, then tallying the candy colors to get your responses. Have the youth tally the results themselves. They can even make a bar chart on chart paper by taping the actual candy to the paper. The youth can then eat the candy after they’ve tallied the results.

Hot Tip – use wrapped candy! Starburst works well and is what one summer program used.
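For programs that later want to move the candy tallies into a spreadsheet or report, the counting step is easy to script. This is a minimal sketch; the color-to-answer mapping and the sample cup below are made-up assumptions, not from any actual program:

```python
from collections import Counter

# Hypothetical mapping: each candy color stands for one answer choice.
color_to_answer = {
    "red": "Strongly Agree",
    "yellow": "Agree",
    "green": "Disagree",
    "purple": "Strongly Disagree",
}

# Candies pulled from one question's cup (example data).
cup = ["red", "red", "yellow", "green", "red", "yellow"]

# Tally responses by translating each candy to its answer choice.
tally = Counter(color_to_answer[color] for color in cup)

for answer, count in tally.most_common():
    print(f"{answer}: {count}")
```

The same tally dictionary can feed a bar chart, mirroring the chart-paper version the youth make by hand.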

Cool Trick – 4 Corners Activity: Youth leadership programs do this all the time. They ask youth to “take a stand” next to signs that are marked Strongly Agree, Agree, Disagree or Strongly Disagree in response to a statement like “youth should be able to vote at age 16.” Once the youth stand next to one of the signs, the group can talk out their different perspectives. Programs can also use this to collect both quantitative (how many stand where) and qualitative (what they say about why they are standing where they are) data.

Hot Tip: For more Creative Ways, come to our Skill-Building Workshop Saturday at 8am. Yes, it’s early, but we promise to have you moving, interacting and creating. Plus, there will be candy.



YFE Week: Kim Sabo Flores and David White on New Perspectives and Voices in Youth Focused Evaluation

American Evaluation Association 365 Blog - Sun, 09/14/2014 - 01:31

Hello, we are Kim Sabo Flores and David White. We are honored to serve as co-chairs of the Youth Focused Evaluation Topical Interest Group (YFE TIG). As we prepare for the 2014 conference and our annual business meeting, we have to acknowledge how truly remarkable it is that a loosely knit group of like-minded individuals could grow into a burgeoning group of over 300 members who have begun to define and unify the field and practice of youth focused evaluation within the Association. As a group, we are exploring evaluations focused on youth and positive youth development in a variety of settings. At our core, however, many of us are interested in the practice and outcomes of youth participation in evaluation. This focus has been a key part of our history as a TIG because we understand youth participation to be a pillar of positive youth development.

Hot Tip: Youth-Adult Partnerships

When youth are equal partners with adults in the evaluation process, they share decision-making power and responsibility equally. What does this look like? Here are a few key considerations:

  • Evaluation questions are jointly developed.
  • Evaluation activities are performed by youth and adults.
  • Data are analyzed by youth and adults.
  • Youth and adults receive significant benefit from involvement and from the evaluation findings.

Rad Resource: Youth-Adult Partnerships in the Evaluation Process

This 2005 chapter from the Innovation Center for Community and Youth Development outlines best practices in youth-adult partnerships in evaluation.

Our TIG is taking the first of several steps necessary to provide a genuine, inclusive, and participatory space for all evaluators, regardless of age. The TIG will host two sessions at AEA this year designed to ignite conversation about how to best include youth in our yearly conference.

See you in Denver!

