Monitoring, Evaluation and Learning Systems

AaEA Affiliate Week: Lindsey Stillman on Creating a System of Care for Vulnerable Populations

Hello, my name is Lindsey Stillman and I work at Cloudburst Consulting Group, a small business that provides technical assistance and support for a number of federal agencies. My background is in clinical-community psychology, so providing technical assistance around evaluation and planning is my ideal job! Currently I am working with several communities across the country on planning and implementing comprehensive homeless service systems. Much of our work with communities focuses on system change: helping various service providers come together to create a coordinated and effective system of care, rather than each individual provider working alone.

Lesson Learned:

  • The new HEARTH legislation includes a focus on system-level performance versus program-level performance. This has required communities to visualize how each program’s performance feeds into the overall performance of the system in order to identify how to “move the needle” at the system level. Helping communities navigate between the system-level goals and the program-specific goals – and the connections between them – is critical.
  • Integrating performance measurement into planning can help communities see the value of measuring their progress. All too often, grantees or communities are given performance measures to report on without understanding the links between their goals and activities and those measures. Presenting performance measurement as a feedback loop can help remove the stigma around the use of evaluation results and focus stakeholders on continuous quality improvement.
  • Working with agencies or communities to create a visual representation of the links between processes, program performance, and system performance can really help to pull all of the pieces together – and also shine a light on serious gaps. Unfortunately, many federal grantees have had negative experiences with logic models, so finding creative ways to visually represent all of the key processes, outcomes, and outputs can help to break those negative stereotypes. In several communities we have developed visual system maps that help the various stakeholders come together to focus on the bigger picture and see how all of the pieces fit together. Oftentimes we have them “walk” through the system as if they were a homeless individual or family, to test out the model and to identify any potential barriers or challenges. This “map” not only helps the community plan system change but also identifies places within the system and processes where measuring performance can help them stay “on track” toward their ultimate goals.

The American Evaluation Association is celebrating Atlanta-area Evaluation Association (AaEA) Affiliate Week with our colleagues in the AaEA Affiliate. All contributions to aea365 this week come from AaEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. PD Presenters Week: Mindy Hightower King and Courtney Brown on A Framework for Developing High Quality Performance Measurement Systems of Evaluation
  2. GOVT Week: David Bernstein on Top 10 Indicators of Performance Measurement Quality
  3. MN EA Week: Alex Chan on Adding Value to Value-Added

AaEA Affiliate Week: Maureen Wilce and Sarah Gill on Using the Evaluation Questions Checklist to Improve Practice

American Evaluation Association 365 Blog - Mon, 10/20/2014 - 01:15

Hi, I’m Maureen Wilce, a founding member of the Atlanta-area Evaluation Association, and I’m Sarah Gill, president elect of AaEA. We’re both “true believers” in the power of evaluation to guide organizational learning. We’ve seen how good evaluation questions can help uncover important information to improve programs. We’ve also seen the opposite: how bad evaluation questions can waste time and resources – and increase distrust of evaluation in general.

Asking the right evaluation questions is critical to promoting organizational learning. Answers to good evaluation questions direct meaningful growth and build evaluation capacity. But what makes an evaluation question “good”? To get our answer, we reviewed the literature and then collected the practice wisdom of AaEA members and members of AEA’s Organizational Learning & Evaluation Capacity Building TIG. As we organized our thoughts, a checklist began to form. After more great discussions with our colleagues in AaEA and the TIG, we decided to structure the checklist around the standards. A few more refinements came as we used the resource in our work in CDC’s National Asthma Control Program, and finally, Good Evaluation Questions: A Checklist to Help Focus Your Evaluation was born!

Rad Resource: The Good Evaluation Questions Checklist, at http://www.cdc.gov/asthma/program_eval/AssessingEvaluationQuestionChecklist.pdf, is a tool to help ensure that the evaluation questions we create will be useful, relevant, and feasible. In keeping with the new accountability standard, it also provides a format for documenting our decisions when selecting evaluation questions.

Lesson Learned: Articulating what makes an evaluation question “good” requires thinking through several dimensions and assessing it against multiple criteria. A checklist can help us review evaluation questions to anticipate potential weaknesses and can also support communication with stakeholders during the question development process.

Rad Resource: While at the National Asthma Control Program website, check out our other evaluation resources, including our guides and webinars.

Get Involved: We received some great feedback from folks who attended our demonstration at AEA; thanks to all who joined us! If you have additional suggestions about how to improve the checklist, please leave them in the comments below.


Related posts:

  1. AaEA Affiliate Week: Travis Tatum on an Introduction to the Atlanta-area Evaluation Association Local Affiliate
  2. Deepa Valvi on the Strategic Evaluation Planning Process
  3. SCEA Week: Leslie Fierro & Deanna Rossi on Evaluating State Asthma Programs

AaEA Affiliate Week: Travis Tatum on an Introduction to the Atlanta-area Evaluation Association Local Affiliate

American Evaluation Association 365 Blog - Sun, 10/19/2014 - 13:26

Hi, my name is Travis Tatum and I currently work as an independent evaluator through my company Creative Research Solutions, LLC.  As President of the Atlanta-area Evaluation Association it is my pleasure to welcome you to our affiliate’s AEA365 week! This week, our members will be contributing AEA365 posts with advice, best practices, and new tools based on their experiences in evaluation.

Located in Atlanta, Georgia, the Atlanta-area Evaluation Association serves evaluators with a wide variety of backgrounds and areas of focus.  Of course, since the CDC is based in Atlanta, we have a particularly large population of public health evaluators among our members.  In recent years, AaEA has grown substantially. This year we have been working to develop our organizational processes to ensure that we can support and sustain our continued growth.

AaEA typically provides monthly events for our members, which alternate between professional development events and social activities.  We are volunteer based, and have several committees focused on different aspects of our activities:

  • The Programming and Professional Development Committee, led by Karen Anderson and Ayana Perkins, creates and organizes our monthly professional development and social activities.
  • The Membership and Networking Committee, led by William Moore and Tekla Evans, handles new member recruitment and registration.
  • The Communications Committee, led by Linda Baffo and Linda Vo-Green, develops our email newsletter and manages our website, http://atl-eval.org.
  • The Finance Committee, led by Brandy Peterson and Judy Gibson, develops the budget, monitors the financial position of the organization, and helps identify ideas for fund raising.

In addition to the committee chairs, our board includes a President (myself), a President-Elect (Sarah Gill), and a Past President (Kari Cruz); we work together to guide the overall direction of the organization and support each of our committees however we can.

Hot Tip: Each of these committees is often in need of additional volunteers, so if you are a member in the Atlanta area, we welcome your participation!

Hot Tip: Being part of a local affiliate can carry a lot of benefits for evaluators.

  • As an independent evaluator, I personally have made connections through AaEA that have led to new contracts and other business opportunities.
  • Members who work for larger companies can benefit from networking and professional development through AaEA.
  • Having a local community of people who care about evaluation makes it much easier to find partners to collaborate with on larger projects.

Rad Resource: The fastest way to find your local affiliate is to visit the local affiliates page on the main AEA website: List of local affiliates.

I am both grateful and excited for our members to share their insights.  I hope that you will also find our members’ contributions helpful every day this week!


Related posts:

  1. AaEA Affiliate Week: Maureen Wilce and Sarah Gill on Using the Evaluation Questions Checklist to Improve Practice
  2. CEA Affiliate Week: Leah Christina Neubauer on Chicago-Based Evaluators and Including AEA Local Affiliates in Your 2014 Evaluator Learning Resolution Plans
  3. WE Week: Ladel Lewis and Bernadette Wright on Members Wanted: Recruitment, Retention and Reclamation

Sheila B Robinson on #Eval14 – Visionary Evaluation and the Big Blue Bear

American Evaluation Association 365 Blog - Sat, 10/18/2014 - 01:23

Hi, I’m Sheila Robinson, aea365’s Lead Curator and sometimes Saturday contributor with reflections from Denver as we wrap up Evaluation 2014. I’ve enjoyed five AEA conferences now, each one as exciting a learning and community-building opportunity as the last. I spent time thinking deeply about our conference theme and discovering the connections among the various presentations to those ideas and ideals.

Beverly Parsons, our 2014 AEA president, kicked off the conference with an inspiring opening plenary, Visionary Evaluation for a Sustainable, Equitable Future, during which she described three key areas and how they apply to evaluation:

Systems thinking: emphasizes seeing interconnections especially related to competing values and ripple effects of various actions.

Building relationships: emphasizes working across disciplines and partners in new ways.

Equitable and sustainable living: draws attention to matters such as the interface between human justice and the use of natural resources.

John Gargani, AEA president-elect for 2016, helped close out the conference in the final plenary by asking participants to consider three key questions:

1.) What should AEA’s role be in supporting a sustainable equitable future?

2.) How might AEA support your plans for visionary evaluation?

3.) How should AEA contribute to the global evaluation community?

Lesson Learned: Many sessions were standing room only, and some presenters were surprised and honored that their sessions drew such interest. Handouts were in short supply, and I heard many, many participants ask for presenters’ slides.

Get Involved: With that in mind, Evaluation 2014 presenters: Please upload your materials – Slides, handouts, etc. – to the AEA Public eLibrary. It’s easy to do and not only will your Evaluation 2014 participants appreciate it, but your reach will be extended to those who could not be at the conference.

Cool Trick: To extend your learning and enjoy a variety of perspectives, start looking in the coming days and weeks for evaluation bloggers’ reflections on their conference experiences. Here’s a link to our AEA member blogs.

Hot Tip: This same link will get you a list of evaluators on Twitter. Be sure to search the hashtag #eval14 for conference tweets. I maintain a Twitter list of evaluators as well, and it grew substantially during the conference, closing in on 300 members. You can subscribe to that list through me – @SheilaBRobinson. Be sure to follow some of the newest #eval tweeters too, to continue to build community among evaluators.

And finally, many evaluators had the opportunity to enjoy all that Denver offers, while others stayed close to the conference sites – the Hyatt Regency and the Denver Convention Center. We were perplexed and amused by the friendly but imposing 40-foot blue bear that peers curiously into the Convention Center as if to say, “Who are all these evaluators and what are they about?”

“I See What You Mean” (2005) Sculpture by Lawrence Argent. Photo by Billy Hathorn


Related posts:

  1. Beverly Parsons on Visionary Evaluation for a Sustainable, Equitable Future and Welcome to Evaluation 2014!
  2. AEA Conference Week: Beverly Parsons on Linking the 2013 and 2014 AEA Conferences
  3. Sheila B. Robinson on the Annual Call for Proposals – Evaluation 2014

Leigh M. Tolley on Taking a Chance and Getting Involved at Evaluation 2014

American Evaluation Association 365 Blog - Thu, 10/16/2014 - 01:15

Hello everyone! My name is Leigh M. Tolley, and I am an advanced doctoral student in the department of Instructional Design, Development and Evaluation at Syracuse University, and a Research Assistant at Hezel Associates, LLC in Syracuse, NY. I realized on the way to Denver that this is my fifth AEA conference!

Lesson Learned: I feel like AEA is my professional organization home, and also a professional branch of my family. Each year, I am thrilled to catch up with colleagues and friends in person and to continue to learn more about the field. At my first conference, I was amazed by the many aspects of evaluation that exist at AEA. As a graduate student new to the field, I decided to start by exploring sessions and visiting business meetings.

At the 2011 annual conference, I raised my hand right away when members of the PreK-12 Educational Evaluation TIG asked for volunteers interested in serving as Members-at-Large for the next year. The following year, I served as the TIG’s Program Chair-elect, and have been the Program Chair for 2014.

It can be scary as a student or as someone new to the field to jump in, but for me the initial fear dissipated quickly. I loved the opportunity to get more involved with AEA, and those little steps have evolved into something huge for me, both professionally and personally.

Get Involved: I would like to encourage those thinking about getting more involved to jump in. There are many opportunities to get involved, including volunteering to help a specific TIG at their business meeting, helping to review proposals for next year’s program, or even writing a blog post for aea365. Even just chatting with those seated around you in a session can be a great way to start a network or add to those you already have.


Related posts:

  1. NPF Week: Charles Gasper on Getting Involved in the NPF TIG
  2. GSNE Week: A Rae Clementz on Planning Your Best AEA Annual Conference Experience
  3. Teaching Tips Week: Bonnie Stabile on Making the Most of the AEA Annual Conference

Jayne Corso on Early Perspectives from #eval14

American Evaluation Association 365 Blog - Wed, 10/15/2014 - 17:30

Hi! I’m Jayne Corso, Community Manager for AEA with some early perspectives from Day 1 of Evaluation 2014.

The twittersphere lit up this afternoon as Beverly Parsons, our current AEA president, gave her plenary talk, Visionary Evaluation for a Sustainable, Equitable Future. Evaluators were especially impressed with young Xiuhtezcatl Roske-Martinez of Earth Guardians.

Thanks @X4Earth @earthguardianz for an inspiring start to the AEA annual conf. #eval14 @AEAamp pic.twitter.com/29hmLvxWCK

— Lovely Dhillon (@ldhillon) October 15, 2014

“We have to educate ourselves about the problems AND the solutions.” Xiuhtezcatl Martinez, 14 years old @earthguardianz #eval14

— Lisa Frantzen (@LisaFrantzen) October 15, 2014

Beverly Parsons: mentions of complex adaptive systems in AEA keynote. Still inspiring ever since I heard the idea from @owenbarder #eval14

— Kevin Skolnik (@datadrivenMandE) October 15, 2014

Young people are more impactful and knowledgable that we adults give them credit for #eval14

— Nicole Clark, LMSW (@NicoleClarkLMSW) October 15, 2014

Inspired by @earthguardianz and the dynamic young Xiuhtezctal Martinez! #eval14 #Colorado #sustainability

— kaye boeke (@kayebear) October 15, 2014

We are all connected #eval14 > Xiuhtezcatl Martinez pic.twitter.com/QmVjWIX0IA

— Chris Lysy (@clysy) October 15, 2014

One way to create a sustainable, equitable future – uplift youth voices. Blown away by Xiuhtezcatl Martinez at #eval14.

— Sammi Greenberg (@Eval_Revolution) October 15, 2014

Inspiring Xiuhtezcatl Martinez speaking to @aeaweb “adults had a party & now we are left with the clean-up” #Eval14 pic.twitter.com/0jGjBnV7v1

— Eugenia Boutylkova (@EBoutylkova) October 15, 2014

Be the change! Rap comes to the AEA plenary #eval14

— BetterEvaluation (@BetterEval) October 15, 2014

#eval14 #BevParsons #systemsthinking help us understand connections, boundaries & perspectives. http://t.co/YQ34H24j3x

— Allison Titcomb (@AllisonTitcomb) October 15, 2014 

Lesson Learned: Evaluators are passionate about learning!


Related posts:

  1. Sheila B Robinson on #Eval14 – Visionary Evaluation and the Big Blue Bear
  2. Beverly Parsons on Visionary Evaluation for a Sustainable, Equitable Future and Welcome to Evaluation 2014!
  3. Sheila B. Robinson on Making the Most of Evaluation 2014, Even if You Cannot Attend

Dan McDonnell on Using Twitter to Enhance Your Conference Experience

American Evaluation Association 365 Blog - Tue, 10/14/2014 - 14:25

My name is Dan McDonnell and I am the Community Manager for the American Evaluation Association.

Ever wanted to be a professional conference insider? Social media offers a fascinating way to add an extra level of experience to any conference you attend: extra content, fellow attendees’ recommendations, and conference secrets or ‘life-hacks,’ provided you know your way around Twitter and hashtags. Read on to see how you can enhance your experience at the next conference you attend (Evaluation 2014 for many of you!).

Hot Tip: Know Your Conference Hashtag

First things first. Check out the conference website or marketing materials to find out what hashtag will be used. In the case of Evaluation 2014, the official hashtag is #Eval14. Using the Twitter search client (or one of the many third-party Twitter management tools out there), search for the official conference hashtag and start reading. Consider this your conference command center! Whenever you have some downtime, or are interested in hearing your fellow attendees’ reactions to certain presentations or sessions, search the hashtag to see what people have to say.

Hot Tip: Share Your Experience

Part of getting the most value out of social media is by being, well, social. Use Twitter as your personal digital notepad by:

  • Tweeting out neat data points or insightful thoughts from speakers
  • Sharing your own reflections on the content and topics being discussed
  • Joining the conversation by @replying to other users Tweeting on the hashtag
  • Posting photos from the event

With all of the above, be sure to include the conference hashtag to join in with the larger conversation. Not only will you have a digital record of some of your experiences from the event to review later, but you’ll open up opportunities to connect and meet with your fellow conference attendees, and give those who are unable to attend the conference a small taste of the experience they are missing.

Hot Tip: Connect with Others

Don’t miss an opportunity to expand your network and learn more from conference attendees and speakers. Follow people on Twitter who are using the conference hashtag; chances are, you’ll have a lot in common. Search for conference speakers and presenters on Twitter (or just ask for their Twitter handle in person) to give them a shout-out, especially if you enjoyed their session. You can also ask them follow-up questions via Twitter, or simply ‘subscribe’ to their feed and read up on more great evaluation content that interests you.

By following along with the conference hashtag, you may also uncover great recommendations on sightseeing, local cuisine and the best place to grab a coffee near the hotel or convention center.


Related posts:

  1. Dan McDonnell on Using Twitter to Add Value to Your Evaluation 2013 Experience
  2. Dan McDonnell on Making New Friends and Mastering Lesser-Known Twitter Features Without Third Party Apps
  3. Sheila B. Robinson on Making the Most of Evaluation 2014, Even if You Cannot Attend

Sheila B Robinson on Perspectives from First-Time #Eval14 Attendees

American Evaluation Association 365 Blog - Mon, 10/13/2014 - 15:58

Hello! I’m Sheila B Robinson, aea365’s Lead Curator and frequent contributor with a woman-on-the-street (well, woman lurking in conference room) report from first time attendees at #eval14.

This afternoon, I caught up with participants from Presenting Data Effectively: Practical Methods for Improving Evaluation Communication.

Tanya Hills, Evaluation Manager with the USTA Foundation wanted to learn more about technology to better display data she has to report. “I wanted to learn more about different options for graphs and charts.” She shared that she learned a lot of useful and applicable information about how to visually present data using visual learning theory.

Tanya decided to take full advantage of professional development sessions and is looking forward to Evaluation-Specific Methodology and Leading Through Evaluation later this week. Like many attendees, she chose sessions based on their descriptions.

Tanya downloaded the new conference app and is interested in using it as a networking tool. “I’m excited about being among a group of people who have similar interests who are excited about data and are trying to make a positive impact on the world.”

Next, I met Diane Mashburn, Instructor for Program Planning, Evaluation, and Accountability at the University of Arkansas Division of Agriculture. Before I could even ask her impressions of the workshop, she shared a “small world” story with me. Diane sat at a table with Mark Parman, Evaluation Outcomes Measurement Specialist of the Cherokee Nation. Conversing about bookstores and restaurants in California, Diane and Mark discovered that Diane was born in the same town as Mark’s wife. They then figured out that Mrs. Parman graduated high school with Diane’s mother and Mark remembers having been at their house in years past! Talk about networking!

Diane chose today’s workshop because she’s in charge of federal reporting. “I’m new in the position so I’m always looking for ideas for how to present all the data I’m in charge of collecting.” As for what she’s learned thus far: “I have so many ideas I’m going to have to make a priority list for what I’m going to tackle first!”

To choose sessions for the rest of the week, Diane found the TIG most closely related to what she does – the Extension Evaluation TIG – and decided to attend a number of their sponsored sessions. She started with the online catalog, but then downloaded the conference app to add more to her schedule.

As for networking, she jokes, “There’s no telling who else I might meet who knows my family!” She’s excited about professional networking, too. “I’ve already met a couple of other people that do extension work. I can tell that networking will be really good with this conference.”

Stay tuned this week for more #eval14 action!


Related posts:

  1. Susan Kistler on Tips for First Time Conference Attendees
  2. GSNE Week: Alice Walters on The Art and Science of Networking
  3. GSNE Week: A Rae Clementz on Planning Your Best AEA Annual Conference Experience

Beverly Parsons on Visionary Evaluation for a Sustainable, Equitable Future and Welcome to Evaluation 2014!

American Evaluation Association 365 Blog - Sun, 10/12/2014 - 08:55

Hi everyone! I’m Beverly Parsons, 2014 AEA president. I’m also executive director of InSites, a research, evaluation, and planning organization.

Evaluation 2014 is finally here! I’d like to kick off this week of conference-focused AEA365 posts by highlighting the conference theme, Visionary Evaluation for a Sustainable, Equitable Future. The graphic summarizes the key message. It was created by our fabulous conference chair, Matt Keene, and his amazing friend and colleague, Chris Metzner. 

Here’s the basic idea.

Behind the “Visionary Evaluation Kaleidoscope” is a representation of a desired version of the Denver area—one in which natural and human systems are in a sustainable balance.

By thinking in terms of a desired future, evaluators are not trying to predict the future. Rather, having a picture in one’s mind of a desired future encourages us to use our professional capacities and personal commitments differently. Our world is experiencing sobering trend lines of unjust social conditions. They may be in areas such as health, education, and the economy. They may be related to diminishing natural resources such as clear air and water. We want to use evaluation to shift those trend lines in the direction of a sustainable and equitable future for many generations to come.

Here’s where Visionary Evaluation comes into play.

Visionary evaluation is not a particular method but rather is shorthand for encouraging evaluators to support movement toward a desired future. You might think of it as three creative turns of the Visionary Evaluation Kaleidoscope in the graphic:


Systems thinking: emphasizes seeing interconnections especially related to competing values and ripple effects of various actions.

Building relationships: emphasizes working across disciplines and partners in new ways.

Equitable and sustainable living: draws attention to matters such as the interface between human justice and the use of natural resources.

My desire is that we all end this week with a renewed sense of what a sustainable, equitable future is and how we can use a visionary approach to evaluation to contribute to that future.

Hot Tip: Check out AEA Evaluation 2014 to review the program, the conference theme, and much more. Even if you are not attending the conference, you can be involved. Join the Twitter conversation at #Eval14. Check out the e-library as presenters post their materials. Contact the presenters whose work is of interest to you.

Rad Resources: There’s a new feature this year. Recordings of the Presidential Strand and Plenaries from the conference will be available for purchase after the conference! Information will be available online at AEA Evaluation 2014.

See you in Denver!


Related posts:

  1. Sheila B Robinson on #Eval14 – Visionary Evaluation and the Big Blue Bear
  2. AEA Conference Week: Beverly Parsons on Linking the 2013 and 2014 AEA Conferences
  3. Sheila B. Robinson on the Annual Call for Proposals – Evaluation 2014

Jayne Corso on using Riffle to Enhance your Twitter Experience

American Evaluation Association 365 Blog - Sat, 10/11/2014 - 06:00

Hello, my name is Jayne Corso and I am a Community Manager for the American Evaluation Association (AEA). As the voice behind AEA’s Twitter presence (@aeaweb), I am always looking for new ways to connect with evaluators and surface online conversations focused on evaluation. I recently came across a tool called Riffle that has been instrumental in helping me identify Twitter users who are interested in evaluation and spot trending topics.

Riffle turns your browser into a pop-up Twitter analytics platform that lets you quickly research users and read up on their Twitter habits. When you download Riffle (it’s probably easiest to add as a Google Chrome browser extension), a small triangular icon will appear on the right side of your browser. When you click this icon, Riffle opens a sidebar that allows you to search individual Twitter users. For example, if you search @aeaweb in Riffle, here’s what you’ll see:

When you’re looking at your home feed on Twitter, you’ll notice the Riffle icon now appears next to each Twitter user’s handle. Simply click that icon and the sidebar will pull up their Riffle information.

Rad Resource: Expand your Twitter community

If you type a user name into the search bar, such as @aeaweb, you can easily see who AEA is mentioning in posts. As you can see, AEA often mentions @clysy (Chris Lysy), @evalu8r (Stephanie Evergreen), and @bettereval (Better Evaluation). All of these users are highly focused on evaluation best practices and are folks that we’d recommend you follow! Take Riffle one step further and search one of the usernames in AEA top mentions to find out who they follow.

Rad Resource: Uncover new trends

Similar to top mentions, the tool also returns the top hashtags used in Tweets by the Twitter user you search. Let’s look at another example: when you search for @evalu8r in Riffle, you can see that Stephanie Evergreen commonly uses these hashtags: #dataviz #eval #p2i #eval14. You’ve just found four evaluation-focused hashtags that you can begin following on Twitter, opening the door to great new content and information you may have been missing. Reading these hashtags makes it easier to stay up-to-date on new trends in evaluation, and using them in your own tweets lets you participate in online conversations.

Hot Tip: Get ready for Evaluation 2014

If you are joining us at Evaluation 2014, keep an eye on the #eval14  hashtag for the most up-to-date information about the event. Also use the hashtag in your Tweets to share your experiences, conference photos, or to connect with other attendees. We look forward to seeing you in Denver!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Dan McDonnell on Even More Ways to Analyze Tweets with Twitonomy
  2. Dan McDonnell on Using Lists to Become a Twitter Power User
  3. Innovative #Eval Week: Schwalbe, Miller, and Gensinger on #EvalAHA! What Can Twitter Tell Us about the AEA Conference?

Rick Stoddart on Leveraging Libraries for Evaluation Success

American Evaluation Association 365 Blog - Fri, 10/10/2014 - 01:23

My name is Rick Stoddart and I am Head of User & Research Services at the University of Idaho. Libraries offer evaluators many useful tools including access to community data, methodological resources for evaluation, research expertise, and even public spaces to present findings. Here are some tips to get you started:

Rad Resource: Library Card – This might seem a no-brainer, but your library card is the key to accessing a bunch of resources both online (ebooks & databases) and in print (periodicals & books). Besides your public library card, some academic libraries offer community user cards so you can check out their materials. More information at http://atyourlibrary.org/how-get-library-card.

Rad Resource: Evaluation and Assessment Methodology Sources – Whether you need to consult the Handbook of Evaluation: Policies, Programs, and Practices or to access an article in the journal Evaluation and Program Planning — a library is a good starting point for locating a plethora of evaluation-related resources to help you plan your next project. See more evaluation methodology sources available in a library near you: http://bit.ly/EvalMethod.

Rad Resource: Statistical Sources - Libraries contain various handbooks, databases, and expertise in locating statistical and marketing data that may inform your evaluation practices. Whether it is utilizing librarian expertise in accessing demographic statistics about a community you are studying at Census.Gov or consulting the ProQuest Statistical Abstract of the United States for specific data on the amount of money spent weekly by families on food – your library has a statistical resource for you.

Rad Resource: Online Databases – Many libraries purchase access to online databases that include articles, peer-reviewed research, and other data of interest to evaluation experts. You can access these resources by visiting your local library or even directly from your own computer if you are an authorized user. In addition, many states purchase statewide access to online materials for their citizens. For example, Oregon offers access through their Libraries of Oregon website and Idaho via their Libraries Linking Idaho website. Check your own state library for more information.

Hot Tip: Ask a librarian - Seeking a book on participatory evaluation (http://bit.ly/ParEval) or access to a resource in this blog post? Ask a librarian! Most libraries have chat, text, and email reference services — so you don’t even have to leave your office. More information at http://www.atyourlibrary.org/how/expert-staff.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Rick? He’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.

Related posts:

  1. Internal Eval Week: Sue Hunter and Cindy Olney on Using Appreciative Inquiry in Evaluation
  2. Lindsey Dunn and Lauren Fluegge on Organizing Data
  3. Harlan Luxenberg on When to Use a Database Solution instead of Excel

JT Taylor and Emily Drake on Participatory Qualitative Data Analysis

American Evaluation Association 365 Blog - Thu, 10/09/2014 - 01:21

Hi folks! I’m JT Taylor, Director of Research and Evaluation at Learning for Action (LFA), and I’m here with my colleague Emily Drake, Senior Consultant and Director of Portfolio Alignment. LFA is a San Francisco-based firm that enhances the impact and sustainability of social sector organizations through evaluation, research, strategy development, and capacity-building services. Emily and I can’t wait to share an easy and reliable approach to facilitating participatory, collaborative qualitative analysis processes at this year’s AEA conference.

Lessons Learned: Effective facilitation is essential for leading participatory and collaborative evaluation processes: (1) it helps us to surface and integrate a multitude of perspectives on whether, how, and to what extent a program is working for its intended beneficiaries; (2) it is necessary for building and maintaining trust among stakeholders: trust that they are being heard, that their perspectives are weighted equally among others, and that their participation in the evaluation process is authentic and not tokenized; and (3) it is important for producing the buy-in of stakeholders and relevance of results that ensure evaluation findings will inform real action.

Engaging a variety of stakeholders, including program beneficiaries, in the analysis and interpretation of data in a way that authentically includes their perspective and contributions is important—and takes a set of facilitative skills and tools that go beyond evaluators’ typical training in technical analysis. In our work implementing collaborative evaluations, we have found that the same facilitation techniques that produce great meetings and brainstorming sessions can also be used to elicit great insights and findings from a participatory qualitative analysis process.

Hot Tip: Use participatory analysis techniques when you want to synthesize qualitative data from multiple perspectives and/or data collectors—whether those data collectors are part of your internal team, evaluation partners, or members of the community your work involves.

  • Do the work of “meaning-making” together, so that everyone is in the room to clarify observations and themes, articulate important nuances, and offer interpretation.
  • Use a 1-2 hour working meeting with all data collectors to summarize themes and pull out key insights together. Have each participant write observations from their own data collection, each on a large sticky note. Then group all observations by theme on the wall, having participants clarify or re-organize as needed.
  • Save reporting time later by asking participants to annotate their sticky note observations with references to specific interviews, transcript page numbers, and even quotes from their data collection to make it easy to integrate examples and quotes into your report.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from JT and Emily? They’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.

Related posts:

  1. Kylie Hutchinson on Group Facilitation
  2. Viengsavanh Sanaphane and Katie Moore on Dynamic Data Presentation and Recording for an Empowerment Evaluation Approach
  3. Video in #Eval Week: Paul Barese on the Value Added of Video

Rebecca Woodland on Evaluating and Measuring Teacher Collaboration

American Evaluation Association 365 Blog - Wed, 10/08/2014 - 01:14

Hello friends in evaluation, my name is Rebecca Woodland and I’m an associate professor of educational leadership at UMass Amherst. I’ve been a contributor to AEA in a variety of ways on the topic of evaluating and improving organizational and inter-professional collaboration. I’m especially passionate about using evaluation to cultivate meaningful teacher collaboration in PreK – 12 school settings. In this post I’d like to share some tips and tools for assessing teacher collaboration. Evaluators can use these tools to help stakeholders avoid “collaboration lite,” whereby mere congeniality and imprecise conversation are confused with the serious professional dialogue vital to instructional change, student learning, and school improvement.

Hot Tip – K-12 educators are passionate about teacher collaboration, and know that it is the vehicle to instructional improvement. Unfortunately, the term collaboration, although ubiquitous, persists as a messy (under-empiricized, under-operationalized) construct. Fortunately, evaluators are uniquely positioned to help stakeholders make sense – to raise shared literacy – about what teacher collaboration ideally looks and feels like.

Rad Resources – 1) Evaluating and Improving the Quality of Teacher Collaboration: A Field-Tested Framework for School Leaders ©2008 NASSP Bulletin

and 2) Evaluating the Imperative of Inter-Personal Collaboration: A School Improvement Perspective ©2007 American Journal of Evaluation. (http://aje.sagepub.com/content/28/1/26.short)

Both articles, co-authored with my colleague Chris Koliba, present a theoretical frame for inter-professional collaboration and specific suggestions for how evaluators can facilitate shared stakeholder understanding of collaboration.

Hot Tip – Collaboration can be operationalized (and measured)! Teacher collaboration entails an ongoing cycle of dialogue, decision-making, action, and evaluation, through which teachers build their knowledge and skills and make targeted changes to classroom practice – the primary factors attributed to improvements in student learning.

Rad Resource – The Teacher Collaboration Assessment Survey (TCAS). The TCAS is a validated instrument for the systematic assessment and targeted improvement of teacher collaboration. Evaluators can use this tool in a variety of ways to evaluate the process and outcomes of teacher collaboration. Access the TCAS in: Woodland et al. (2013), A Validation Study of the Teacher Collaboration Assessment Survey, in Educational Research and Evaluation: An International Journal of Theory and Practice.

The evaluation of teacher collaboration can help build educator capacity to recognize and strengthen attributes of teacher teaming, and to make systematic, evidenced-based improvements to instructional practice that lead to greater student learning.

See you in Denver for AEA 2014!

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Rebecca? She’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.

Related posts:

  1. Siobhan Cooney on Effectively Evaluating Teacher Professional Development
  2. CREATE Week: James Stronge and Xianxuan Xu on Lesson Learned Regarding Teacher Effectiveness and Teacher Evaluation
  3. PD CoP Week: Donna Campell on Evaluating Professional Development: Guskey Level 3 – Organizational Support and Learning

Ian Patrick and Anne Markiewicz on Establishing Outcomes to Impacts

American Evaluation Association 365 Blog - Tue, 10/07/2014 - 01:55

Greetings from Ian Patrick and Anne Markiewicz, in Melbourne, Australia – evaluators active in evaluation design, implementation and training for a range of domestic and international clients. We’ve been reflecting on a tortured area of evaluation practice – that being expectations frequently placed on evaluators to identify the IMPACT of a program.

Every evaluator breathes a sigh of relief when their clients or stakeholders are knowledgeable about evaluation and hold reasonable expectations about what it can and can’t do. But how many evaluators instead have felt the heavy weight of expectations to establish high level results demonstrating a program has made a big difference to a region, country or the world! Or in a related scenario, an eagerness to establish longer term results from a program which has only been operating for a limited duration! Other unrealistic expectations can include adopting a program-centric focus which sees all results as attributable to the program, minimizing the contribution of stakeholders and partners to change. Or in another context, adopting a limited lens on the perceived value of different types of results.

Such situations call for cool-headedness and a calm educative approach from the evaluator. Where possible, the evaluator has much to gain from open discussion and exchange of views, tempering unrealistic aspirations and negotiating realistic expectations from an evaluation. Here are some of the strategies that we have found productive in such contexts:

HOT TIPS:

Reflect on Impact: As an upfront strategy, become clear with clients/stakeholders about what is meant by ‘impact’. Be aware that the term is used loosely, and lazily, often to support sweeping expectations. Introduce other helpful terminology to identify and demarcate different categories of results, such as intermediate outcomes. A sense of realism in discussions may well clarify that these types of results can be realistically identified within the program time frame. Intermediate results, once identified and understood, are often highly valued, and stand in contrast to more elusive, longer term impact.

Decompress Time: Proactively address a tendency for time factors associated with a program’s results to become compressed. A fixation on end states can obscure the important intermediary stages through which change evolves. Utilisation of program theory and program logic approaches can provide a means to identify expected changes over realistic time frames.

Remember Others: Resist a tendency for change to be unilaterally attributed to a program. Recognise and focus on the contribution made by related stakeholders/partners to change.

Adopt Pluralist Approaches: Promote the application of various perspectives and ways of identifying and measuring change rather than using a single method. Use of mixed methods approaches will promote a more subtle and nuanced view of change, particularly how it manifests and is experienced during a program’s life cycle.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Ian and Anne? They’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.

Related posts:

  1. Steve Mumford on Defining Advocacy Champions and Evaluating Their Impact
  2. Cultural Competence Week: Asma Ali and Anthony Heard on Beyond the Findings: Reflections on the Culturally Competent Evaluator’s Role
  3. APC TIG Week: Carlisle Levine on Using Contribution Analysis to Explore Causal Relationships in Advocacy and Policy Change

Christine Frank on Style Matters

American Evaluation Association 365 Blog - Mon, 10/06/2014 - 01:52

My name is Christine Frank and I am an independent Canadian evaluator. I have a couple of questions for you. Do your reports intrigue your audience or send them for coffee? Do people grasp your message easily?

Although I am best known as a program evaluator, I have also taught courses on business communications and co-authored a textbook on that subject. Experts in business communications focus on dynamic, readable writing. Plain writing experts promote a similar style. Both areas of expertise afford simple strategies to make functional documents more inviting and compelling.

Evaluators sometimes hinder their effectiveness by writing in an overly academic style. For instance, in journal articles, you often find sentences 60 words in length or more. One of the pivotal rules of both business writing and plain writing is to limit sentence length. Even if readers have excellent reading skills and are grounded in the subject matter, you can construct your text to propel them forward, not slow them down. My own frustration in reading unnecessarily lengthy, wordy text drives me to strive for instant clarity.

Hot Tip: For evaluators, I suggest a maximum of 20 words per sentence. You might stretch this limit when a short sentence just won’t convey the message. However, another fundamental rule is to check your text to see if you have used the fewest words possible. If you do this, you may find you can achieve the limit. Many strategies can be applied to maximize clarity. One is to avoid an over-abundance of nouns, especially in sequence. In the following sentence, adapted from an actual Request for Proposals, you will see eight nouns, five of them in a row.

  • Our first task is the development of a best practice guideline implementation evaluation plan.

Better

  • First we will develop a plan for evaluating the implementation of best practice guidelines.

Hot Tip: A strategy that reduces sentence length and makes the text more compelling is to use the active voice.

  • The top three reasons given by students for choosing a career were successfully predicted by teachers.

Better

  • Teachers successfully predicted students’ top three reasons for choosing a career.

Rad Resource: Federal Plain Language Guidelines (2011)

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Christine? She’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.

Related posts:

  1. DOVP Week: John Kramer on Universal Design Principle 3: Making Language Understandable for Everyone
  2. DVR Week: Rakesh Mohan and Margaret Campbell on Making Evaluation Reports Reader Friendly
  3. Susan Kistler on How to Win a Copy of Visualize This

Ricardo Wilson-Grau on Outcome Harvesting

American Evaluation Association 365 Blog - Sun, 10/05/2014 - 01:48

I am Ricardo Wilson-Grau, an evaluator based in Rio de Janeiro but working internationally. Over the past nine years, co-evaluators and I have developed the Outcome Harvesting tool while performing two dozen developmental, formative and summative evaluations. Half of these evaluations were of international social change networks; the other half were of the programmes of international development funding agencies.

The two dozen evaluands had in common that they did not have plans that could be conventionally evaluated: either the original definitions of what they aimed to achieve, and what they would do to achieve it, were not sufficiently specific and measurable to compare what was planned with what was done and achieved, or they had to cope with dynamic, uncertain circumstances. In part this complexity arose because all were attempting to influence changes in the behaviour of social actors over whom they had no control in order to make progress towards improvements in people’s lives, the conditions of society or the state of the environment.

An August 2013 discussion paper from the United Nations Development Program summarized: Outcome Harvesting is “an evaluation approach that — unlike some evaluation methods — does not measure progress towards predetermined outcomes, but rather collects evidence of what has been achieved, and works backward to determine whether and how the project or intervention contributed to the change”.

Hot Tip: You can periodically demonstrate and be accountable for concrete, verifiable and significant results of your work, negative as well as positive, even if the outcomes were not planned or are unintended, and even if your contribution, direct or indirect, has been one amongst that of others.

One instrument that will support you: see the Outcome Harvesting Brief.

Rad Resources:

Here are links to three diverse examples of Outcome Harvesting use:

The summative evaluation of Oxfam Novib’s €22 million program to support 38 grantees working on sustainable livelihoods and social and political participation, which documents outcomes from 111 countries.

A report on the evaluation experience of identifying and documenting 200 emergent outcomes of the BioNET global network.

A World Bank booklet: after ten World Bank Institute teams piloted a customized version of Outcome Harvesting last year, in June 2014 the WB published a booklet of the cases and now lists the tool amongst its resources for monitoring and evaluation.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Ricardo? He’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.

Related posts:

  1. PD Presenters Week: Kaia Ambrose and Simon Hearn on A Whistle-Stop Tour of Outcome Mapping
  2. APC Week: Rhonda Schlangen on Joint Evaluation Strategies for Advocacy and Services
  3. Ann Zukoski on Participatory Evaluation Approaches

Zachary Grays on the Evaluation 2014 Agenda Builder

American Evaluation Association 365 Blog - Sat, 10/04/2014 - 01:15

Greetings! My name is Zachary Grays and I am a part of the staff here at AEA. Evaluation 2014 is less than two weeks away, and we know you are in the process of planning your schedule to maximize the value from the 700+ sessions, workshops, and business meetings in the Mile High City! An agenda you can take on the go is incredibly useful, and building your agenda on our agenda builder is easy and a great way to narrow down what you’re interested in attending at the conference.

Hot Tip: If you are logged in, build your personal conference agenda for Evaluation 2014 using the agenda builder tool here! If you are not logged in, click the link and click ‘View Your Registration Details’ to log in and complete your agenda.  Please note that you must be registered for Evaluation 2014 before you can build your agenda. Next, click ‘My Agenda’, ‘Edit Agenda’ and start adding sessions by date and time. You have a few options in adding sessions to your agenda:

  1. Date and Time: For each date and time slot you may add an individual session by clicking ‘Click to add a Session’. This will bring up the title of each session scheduled for this time slot and their location. You may add as many sessions as you want in a particular date/time slot.
  2. Track: You may also filter sessions by track. Simply select a track from the drop down and proceed to each date and time using the above instructions to see what presentations are happening at a particular time under that track.
  3. Search: Click ‘Search’ to take you to the Evaluation 2014 searchable program. This comprehensive function allows you to search by track, time slot, room, session type, presenter name, organization, and level. You may then ‘Add’ the session of your choice to your agenda.

 Hot Tip: Import your agenda to your calendar. In the true theme of going green, forego the printable options and import your agenda to your mobile calendar. Import your agenda to your preferred calendar by following these instructions found through the ‘Download My Agenda’ button.

Cool Trick: Add A Personal Appointment. While in edit mode, click ‘Click to add a personal appointment’ to add a personal note that you don’t want to forget while dashing between sessions.

Caution:  Please keep in mind that the mobile app and the agenda builder do NOT sync with one another. Be sure to take the time to build your agenda separately on the mobile app (coming soon) prior to visiting Denver.

Rad Resource: Have questions about building your agenda? Contact the AEA staff at info@eval.org. See you in Denver!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Lars Balzer on The Evaluation Portal Event Calendar
  2. Eric Graig on Requall
  3. Susan Kistler on Getting More From Google

NPF TIG Week: Claire Sterling on Packing Up This Week’s Lessons…and Bringing Them to Evaluation 2014!

American Evaluation Association 365 Blog - Fri, 10/03/2014 - 01:15

Hello, everyone!  I’m Claire Sterling, Program Co-Chair of the AEA Nonprofit & Foundations TIG and Director, Grant Strategies at the American Society for the Prevention of Cruelty to Animals (ASPCA), the oldest American animal welfare organization.  For nearly 150 years, the ASPCA has been saving and improving the lives of companion animals through a wide variety of approaches, with grants being officially added to our toolkit in 2008.  Although the ASPCA has a long history in New York City, its impact is also national, leveraged in part by grants to animal shelters, rescue groups, humane law enforcement agencies, sanctuaries, spay/neuter providers, and other organizations all across the country.  Last year alone, the ASPCA gave close to $17.5 million.

Hot Tip:  One of the many perks of my job (apart from having foster cats as co-workers) is having the opportunity to see things from the perspective of both a nonprofit and a foundation since, as a grantmaking public charity, the ASPCA is a hybrid of both.  But even if you work at an organization that is purely one or the other, this week’s AEA365 posts provide a glimpse of both perspectives as well. On behalf of the Nonprofit & Foundations TIG’s leadership, many thanks to this week’s contributors for their pearls of wisdom!

Lesson Learned:  As evaluators at nonprofits and foundations, we often find ourselves at the crossroads where the biggest challenges in direct-service and philanthropic work can converge in overwhelming ways:  urgent community needs that must be addressed with limited resources, mandates to operate with incomplete information, speedy priority shifts, and disconnects between theory and practice.  But as this week’s posts so succinctly demonstrate, where there’s a will, there’s always a way forward.

Rad Resource:  We hope these posts inspire conversations with your TIG peers at Evaluation 2014 October 15-18 in Denver.  Session information for our TIG’s track is now live.  There’s simply no substitute for face-to-face connection!

Rad Resource:  And speaking of good opportunities for connection, while you’re at Evaluation 2014, we hope you’ll attend the Nonprofit & Foundations TIG’s business meeting on Thursday, October 16 from 3:00-4:30pm, which will include a panel discussion of the 2nd Edition of Empowerment Evaluation by David M. Fetterman, Shakeh J. Kaftarian, and Abraham Wandersman.  The book presents assessments by notable evaluators from academia, government, nonprofits, and foundations on how empowerment evaluation has evolved since the previous edition’s publication in 1996.

See you in Denver!

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Related posts:

  1. Sue Hoechstetter on Resources for Evaluating Community Organizing
  2. NPF TIG Week: Trina Willard on Moving from Measurement Strategy to Implementation in Small Nonprofits
  3. NPF TIG Week: Gretchen Shanks on What Do We Mean by “Evaluation Resources”?

NPF TIG Week: Patrick Germain on Powerful Evaluation on Limited Resources

American Evaluation Association 365 Blog - Thu, 10/02/2014 - 01:15

Hello from Patrick Germain! I am an internal evaluator, professor, blog writer, and President of the New York Consortium of Evaluators.  Working as a nonprofit internal evaluator teaches you a few things about evaluating with very few resources. Even as our sector gets better at using validated evidence for accountability and learning, the resources to support evaluative activities remain elusive.  I have written elsewhere about how nonprofits should be honest with funders about the true costs of meeting their evaluation requirements, but here I want to share some tips and resources for evaluators who are trying to meet higher evaluation expectations than they are receiving funding for.

Hot Tip #1: Don’t reinvent the wheel.

  1. Use existing data collection tools: ask your funder for tools that they might use or check out sites like PerformWell, OERL, The Urban Institute, or others that compile existing measurement instruments.
  2. The internet is your friend. Websites like surveymonkey, d3js (for fancy data viz), chandoo (for excel tips), and countless others have valuable tools and information that evaluators might find useful.  And places like Twitter or AEA365 help you stay on top of emerging resources and ideas.
  3. Modify existing forms or processes to collect data; this can be much more efficient than creating entirely new data collection processes.

Hot Tip #2: Use cheap or free labor.

  1. Look into colleges and universities to find student interns, classes that need team projects, or professors looking for research partners.
  2. Programs like ReServe and your local RSVP group place older adults who are looking to apply their professional skills to part time or volunteer opportunities.
  3. Crowdsourcing or outsourcing through websites like Skillsforchange, HelpFromHome, or Mechanical Turk can be a cheap way of accomplishing some of the more mundane and time-consuming aspects of your projects.
  4. Organize or join a local hackathon, or find data analysts willing to volunteer their time.

Hot Tip #3: Maximize the value of your efforts.

  1. Use resources allocated for evaluation as an opportunity to build the evaluation capacity of your organization: leverage your investment to help the organization improve its ability to conduct, participate in, and use evaluations.
  2. Focus your efforts on what is needed, be deliberate about eliminating as much unnecessary work as you can, and be very efficient with your time.

What other tools or resources do you use when you have limited resources?


 


NPF TIG Week: Trina Willard on Moving from Measurement Strategy to Implementation in Small Nonprofits

American Evaluation Association 365 Blog - Wed, 10/01/2014 - 01:15

My name is Trina Willard and I am the Principal of Knowledge Advisory Group, a small consulting firm that provides research and evaluation services to nonprofits, government agencies and small businesses. I’ve worked with a variety of nonprofit organizations over the years, many of which have limited staff and financial resources.

Such organizations sometimes have the opportunity to secure a small grant from a funder, awarded with good intentions to “nudge” their evaluation capacity in the right direction. These dollars may be adequate to create a measurement strategy or evaluation plan, but support is rarely provided for implementation. Consequently, many recipients leave these efforts with the feeling that they’ve accomplished little. So how do we effectively guide these organizations, but avoid leaving them in the frustrating position of being unable to take next steps? These three strategies have worked well for me in my consulting practice. 

Hot Tip #1: Discuss implementation capacity at the outset of measurement planning. Get leadership engaged and put the organization on notice early that the evaluation plan won’t implement itself. Help them identify an internal evaluation champion who will drive the process, provide oversight and monitor progress.

Hot Tip #2: Leave behind a process guide. Provide clear written guidance on how the organization should move forward with data collection. The guide should answer these questions, at a minimum:

  • Who is responsible for collecting the data?
  • What are the timelines for data collection?
  • How and where will the data be stored?
  • What does accountability for data collection look like?
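One lightweight way to make such a guide concrete is to capture the answers in a simple structured record the organization can review and update. The sketch below is purely illustrative; the field names mirror the questions above, and all of the example values are invented:

```python
# A hypothetical sketch: capturing the process-guide answers above as a
# small structured record that staff can review and update.
from dataclasses import dataclass

@dataclass
class DataCollectionPlan:
    measure: str          # what is being collected
    owner: str            # who is responsible for collecting the data
    timeline: str         # when / how often the data are collected
    storage: str          # how and where the data will be stored
    accountability: str   # how follow-through is checked

plan = DataCollectionPlan(
    measure="Client intake survey",
    owner="Program coordinator",
    timeline="At each intake; compiled monthly",
    storage="Shared spreadsheet on the agency drive",
    accountability="Evaluation champion reviews completeness quarterly",
)
print(plan.owner)  # → Program coordinator
```

Even this simple a format forces the organization to answer each question explicitly, and it gives the evaluation champion one artifact to keep current.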

Hot Tip #3: Create an analysis plan. Great data is useless if it sits in a drawer or languishes in a computer file, unanalyzed. Spend a few hours coaching your client on the key considerations for analysis, including assigning responsibilities, recommending procedures, and identifying no/low-cost analysis resources.
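To illustrate how small that first analysis step can be, the sketch below summarizes a stored outcomes file using only the Python standard library. The file layout and column names are hypothetical, chosen just for the example; in practice the data would come from wherever the process guide says it is stored:

```python
# A minimal, no-cost analysis step: summarizing a CSV of program
# outcomes with only the Python standard library. The column names
# ("participant_id", "completed") are hypothetical.
import csv
import io

# In practice this would be open("outcomes.csv"); inline sample data
# keeps the sketch self-contained.
sample = io.StringIO(
    "participant_id,completed\n"
    "1,yes\n"
    "2,no\n"
    "3,yes\n"
    "4,yes\n"
)

def completion_rate(csv_file):
    """Return the share of participants marked as completing the program."""
    rows = list(csv.DictReader(csv_file))
    completed = sum(1 for r in rows if r["completed"] == "yes")
    return completed / len(rows)

rate = completion_rate(sample)
print(f"Completion rate: {rate:.0%}")  # → Completion rate: 75%
```

A script this short can still anchor an analysis plan: it names the input, the calculation, and the output, which is exactly what the client needs assigned to a responsible person on a schedule.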

Below are a few of our favorite go-to resources for small nonprofits that need support implementing evaluation strategies.

Rad Resources: Creating and Implementing a Data Collection Plan by Strengthening Nonprofits. Try this if you need a quick overview to share with staff.

Analyzing Outcome Information by The Urban Institute. This resource, referenced in the above-noted overview, digs into more details. Share it with the organization’s evaluation champion as a starting point to build analysis capacity.

Building Evaluation Capacity by Hallie Preskill and Darlene Russ-Eft. I’ve recommended this book before for nonprofits and it bears repeating. The tools, templates and exercises in the Collecting Evaluation Data and Analyzing Evaluation Data sections are particularly valuable for those that need implementation support.

What tips and resources do you use to prepare small nonprofits for implementing measurement strategies with limited resources?

