Monitoring, Evaluation and Learning Systems

Lisa Kohne on Developing Evaluation Newsletters to Disseminate Results To Multi-Site Participants

American Evaluation Association 365 Blog - Fri, 07/04/2014 - 01:14

Hello! I’m Lisa Kohne, an independent evaluator and I work for SmartStart Consulting in Orange County, California. We specialize in conducting project evaluations for federally funded grants, primarily from the National Science Foundation. Most of our clients are math, science, and engineering professors from four-year universities. One of our big challenges is that some of our projects are multi-institutional, multi-state, and multi-country. It’s very difficult to bring multiple partners together at the same time to discuss evaluation findings – and most don’t have the time, inclination, or enough evaluation knowledge to read lengthy reports.

Hot tip:
To overcome these challenges we began to develop Evaluation Newsletters. They are usually two pages with lots of graphs, maps, and tables. We try to make them colorful, high-interest, and eye-catching. Some are wordier than others, and our “skills” have evolved over the years. You can see the evolution from one of our earlier versions (very wordy) to our more recent version (less wordy, more white space).

We only offer these to our larger, multi-site projects.  The reactions and feedback have been extremely positive.  No PI has ever turned down the offer to create a newsletter.  They are also great to distribute at advisory board meetings and project conferences.

Rad Resources:

  • Google Images works great for the simple clipart needed for newsletters. Simple is better. Just be careful not to use copyrighted images.
  • Microsoft Publisher is our current choice of software.  We’ve tried Word but Publisher lines up the information much better.  Also, the new online subscriptions to MS Office 365 include Publisher.

  • SmartArt is our go-to graphic developer. It is only available in MS Word, not Publisher, so you need to create the graphic in Word and paste it into Publisher.

Lessons Learned:

  • Less is more.  Less words, more pictures, lots of bullet points.
  • Make it personal and make it positive. Add university and project logos, project goals, maps that indicate locations of participating institutions, funders’ logos, and anonymous quotes from participants.
  • Newsletters take a lot of time so build the cost into the budget.
  • Recruit your most artistic employee to create your newsletters – someone who understands color, balance, and brevity of words.
  • Send a sample to stakeholders and ask whether it would be a helpful way to get evaluation results out.
  • Get commitment from your principal investigator to email the newsletters out to all stakeholders and/or project participants and to post them on the project webpage. Here is a webpage containing our newsletters on an NSF PIRE project.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Gwen Fariss Newman on Using Newsletters to Connect with Others
  2. Susan Kistler on Leaving Wordle for Tagxedo
  3. Tarek Azzam on Geographic Information Systems (GIS)

Bethany Laursen on How to Evaluate the Situation, Not Just the Program: It’s Complex!

American Evaluation Association 365 Blog - Thu, 07/03/2014 - 01:15

I’m Bethany Laursen, Evaluation Outreach Specialist with the Solid & Hazardous Waste Education Center (SHWEC) at the University of Wisconsin. I’m also principal consultant at Laursen Evaluation and Design, LLC. At SHWEC, I help staff design programs that engage opportunities to achieve our mission. Opportunity hunting requires a form of situation assessment, which has not been widely or deeply discussed in evaluation—especially when it comes to evaluating opportunities in complex, dynamical situations.

Rad Resource: AEA’s EvalTalk and TIG group listservs as peer learning communities.

Through EvalTalk, several colleagues helped me distinguish among three approaches/tools that all claim to be useful in developing programs in complex situations: needs assessment (NA), developmental evaluation (DE), and strengths, weaknesses, opportunities and threats (SWOT) analysis.

Lesson Learned: NA, DE and SWOT are all necessary parts of evaluating complex situations and program responses.

To summarize this discussion so far, we have the following options, where parentheses mean “as a part of” (e.g., SWOT(NA) means the NA is done as a part of the SWOT):

  1. NA → SWOT → DE
  2. SWOT(NA) → DE
  3. NA → DE(SWOT)
  4. DE(NA, SWOT)

Any of these combinations is logical, although #4 might be difficult without one of the others occurring first. What is not logical is leaving one of the triumvirate out (NA, DE, and SWOT). Here’s why:

SWOT is inherently evaluative: it assigns data a value label (S, W, O, or T) based on the criterion “effect on our organization’s goals.” Clearly, we need data to do a reality-based SWOT, and this is why we must include a needs assessment. But a NA per se is not going to provide enough data, because many clients think a NA is just about external stakeholders’ needs (Os), not internal capacity (Ss and Ws) or larger system realities (often Ts). (If preferred, one could also frame a NA as an ‘asset assessment.’) These external and internal ‘lessons learned’ from our situation should inform developmental program evaluation.

In complex situations, needs assessment is more usefully framed as ongoing situation assessment. This is what I see as the main evaluation task in the Creative Destruction phase of the adaptive cycle. Once we have the lay of the land (situation assessment) and we’ve evaluated the best path to start down (SWOT analysis), then we can jump into developmental evaluation of that path. Of course, what we find along the way might cause us to re-assess our situation and strategy, which is why #4 above is a logical choice.

Lesson Learned: Listen to the language your clients are using to identify relevant evaluation approaches and tools. In SHWEC’s case, our connection to the business sector led me to SWOT analysis, strategic planning, and Lean Six Sigma, all of which are evaluative without necessarily marketing themselves as evaluation approaches.

Figure 1: Augmenting a traditional logic model, this is a metaphorical picture of how SHWEC understands our complex, dynamical situation and our potential evaluation questions. (Each sailboat is a staff member.) Next, I had to find evaluation approaches that would fit.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Michael Quinn Patton on Developmental Evaluation
  2. DE Week: Kate McKegg and Nan Wehipeihana on Talking to Clients and Communities about Developmental Evaluation
  3. MNEA Week: Pat Seppanen on Evaluating Complex Adaptive Systems

Heather King on Tips for Working With School District Research Review Boards

American Evaluation Association 365 Blog - Wed, 07/02/2014 - 01:15

Hi!  I’m Heather King, an Associate Project Director at Outlier Research & Evaluation at the University of Chicago. I’d like to share some tips for applying to conduct research in school districts.

Research review boards (RRBs) and institutional review boards (IRBs) are tasked with ensuring that research and evaluation projects meet the requirements for protecting human subjects. If you are collecting interview, questionnaire, focus group, or any other data directly from human subjects, you are required to obtain IRB/RRB approval. You’ll need to apply separately for IRB/RRB approval at your own institution and for each school district in which you’ll collect data.

I’ve completed successful IRB/RRB applications for some of the largest school districts in the United States and I’d like to share some tips for success.

Lessons Learned:

Start early. Earning district IRB approval is a prerequisite for each of our research and evaluation projects, so we do everything we can to ensure that our IRB applications are well received. The first step is beginning applications early, at least 2 months before the deadline. This gives you time to collect or create the necessary documents, such as instruments and consent forms, and to ensure that your own institution’s IRB approval is in place.

Know the deadlines. Many districts meet only a few times a year to read and approve IRB applications, so meeting the deadline is critical. You might not have another chance to submit your application for another 6 months! Knowing the deadlines can help you plan your evaluation too. For example, if your project begins after a district IRB application deadline has already passed, you can plan in advance to begin data collection around the next IRB deadline.

Read everything. After you’ve done a few IRB applications, it can start to feel like they’re all the same, and generally they are. But each district has its own nuances; don’t wait until you get a rejection letter to learn that! In particular, read the details about consent, compensation/incentives, and data collection timing, because policies vary widely from district to district. For example, the Chicago Public Schools RRB requires that your instruments be physically stamped with approval from your home institution’s IRB.

Make some friends in the IRB office. Navigating the IRB process for each district takes a lot of time, and you’ll undoubtedly have questions. It helps to have a contact in the IRB office who can explain the process and answer any questions you might have. In my experience, IRB offices appreciate being asked detailed questions because they get so many applications that have not been carefully prepared.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

 

Related posts:

  1. Felix Blumhardt on Simplifying the Data Collection Process with User-Friendly Workbooks
  2. Ed Eval TIG Week: Tara Donahue on Avoiding Surprises: Collaborating with the District’s Research Departments
  3. MNEA Week: Katherine Drake on Requesting Data from School Districts

Tamara Young on New Directions in Evaluation Based on Trends in Education

American Evaluation Association 365 Blog - Tue, 07/01/2014 - 01:15

I’m Tamara Young and I am an associate professor in Educational Evaluation and Policy Analysis at North Carolina State University. I teach evaluation theory and practice to school principals, superintendents, and college and university administrators. Today, I am sharing a few hot tips about future directions in evaluation that I have gleaned from trends in education.

Hot Tip:

1. States are developing data systems for schools (e.g., North Carolina’s Home Base®) that not only integrate different types of data that have traditionally been accessed through different software platforms (attendance, transcripts, student achievement scores, and teacher evaluations), but also create data warehouses that facilitate interagency sharing of records across, for example, school districts, human and health services, and juvenile justice (e.g., Stanford’s John W. Gardner Center for Youth and Their Communities Youth Data Archive and the University of Pennsylvania’s Kids Integrated Data System). These integrated systems can allow evaluators to gather data on the same persons across time and agencies, include a wider range of data on the state and conditions of the target population, and conduct longitudinal analyses that were previously impeded by bureaucratic constraints and costs.
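To make the data-linkage idea concrete, here is a minimal sketch in R. Everything in it (the agency extracts, field names, and values) is hypothetical and invented for illustration; real integrated data systems require far more careful record matching, privacy protection, and governance. The point is only that a shared person identifier lets an evaluator follow the same individuals across agencies and years.

    # Hypothetical agency extracts sharing a common person identifier.
    attendance <- data.frame(
      student_id  = c(101, 101, 102),
      year        = c(2012, 2013, 2012),
      days_absent = c(4, 11, 2)
    )
    services <- data.frame(
      student_id        = c(101, 102),
      year              = c(2013, 2012),
      received_services = c(TRUE, FALSE)
    )

    # Link records on the shared identifier and time period;
    # all.x = TRUE keeps students with no matching service record.
    linked <- merge(attendance, services,
                    by = c("student_id", "year"), all.x = TRUE)

    # The longitudinal, cross-agency view: one student across years.
    linked[order(linked$student_id, linked$year), ]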

Rad Resource: Interesting ideas about the legal, ethical, and data quality issues associated with integrated data systems can be found at Actionable Intelligence for Social Policy.

Hot Tip:

2. Information and communication technology is becoming increasingly pervasive and is transforming who, what, when, where, and how we teach and students learn. Massive open online courses (MOOCs), game-based learning, online learning management systems, online assessments, and virtual schools are becoming widespread. These and other technologies have built-in features that are rich sources of data. Evaluators need to become more familiar with how usage data and archived content can be used in evaluation.

Hot Tip:

3. In this era of increased accountability, rising higher education costs, growing concerns about student debt, and the massive budget cuts that many states and localities initiated during the recession and have kept during the recovery, educational institutions are increasingly concerned not only about “what works” but also about “how much it costs.” As such, cost analysis, which was seldom done in the past, is increasingly being considered. Evaluators may want to develop a more sophisticated understanding of cost analysis and promote the creation of guidelines and principles for cost analysis in specific domains.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. BLP Week: Ellen Steiner on Energy Efficient Evaluation
  2. Kristina Mycek on Spatial Analysis
  3. Systems Week: Glenda Eoyang on Complexity Demands Simplicity

Shannon Griswold, Alexandra Medina-Borja, and Kostas Triantis on Linking Scientific Discoveries to Societal Needs

American Evaluation Association 365 Blog - Mon, 06/30/2014 - 01:15

We are Shannon L. Griswold, Ph.D., a scientific research evaluator and member of AEA’s Research Technology and Development TIG, Alexandra Medina-Borja, Ph.D., Associate Professor of Industrial Engineering at University of Puerto Rico-Mayaguez, and Kostas Triantis, Ph.D., Professor of Systems Engineering at Virginia Tech. We are thinking about new ways to envision and evaluate impacts from discovery-based scientific research. Tracing dollars spent on funding research in universities to societal impacts is very difficult due to the long time lag between experimentation and commercialization, and the serendipitous nature of discovery.

Lesson Learned: Even though we can’t predict every outcome of scientific research, we can apply a general framework that allows us to envision the complex system of scientific discovery and identify areas of inquiry that could lead to major breakthroughs.

Hot Tip: Gather your research community and ask them to think backwards from societal needs (e.g., in transportation research this might be a solution for traffic congestion). This can be HARD for fundamental researchers; they are accustomed to letting curiosity drive their research questions. From societal needs, ask them to map several enabling technologies that could meet that need. Enabling technologies should be things that could solve that need but that don’t exist yet (e.g., teleportation). Finally, from enabling technologies, ask your research community to map out knowledge gaps. These are the things that we don’t know yet, which prevent us from developing enabling technologies (e.g., how do you convert all the mass in a human body into energy without blowing things up? How do you reassemble that energy at the destination into a human body?). It can be helpful to frame knowledge gaps as questions.

Hot Tip: Use societal needs, enabling technologies, and knowledge gaps to perform a content analysis of your research portfolio. How many of the topics are already funded? How many topics are not yet represented in the portfolio? This analysis should be performed in the context of a portfolio framework, which may help you envision the scope of your funding program’s discipline and relation to other funding streams.

Rad Resource: When mapping societal needs, enabling technologies, and knowledge gaps, it can be helpful to place them in a hierarchical framework to track their relationships. In this diagram, dotted lines show the direction in which the logic framework is generated, working backwards from societal needs. Solid arrows show the flow of scientific knowledge, from discoveries (knowledge gaps) to technologies that meet societal needs.

Rad Resource: The flow of knowledge and information in the scientific process is rarely linear. It is probably more accurately represented as a “ripple effect”. We can predict some discoveries and technologies (darker polygons), but others are emergent, and knowledge flows in all directions.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Leigh Wang on Database Design
  2. NA Week: Roger Kaufman on Needs Assessment
  3. Andrea Velasquez on Using the TPACK Framework for Evaluating Tech-Mediated Instruction

Sherry Campanelli and Laura Newhall on Facilitation Not Dictation: How a QI Team Succeeds

American Evaluation Association 365 Blog - Sun, 06/29/2014 - 06:01

Hi.  We’re  Sherry Campanelli, Program Compliance Manager and Laura Newhall, Clinical Training Coordinator, from the University of Massachusetts Medical School’s Disability Evaluation Services (DES). Although DES conducts evaluations regarding whether an applicant for public benefits can be found disabled, evaluation as a research endeavor is not our primary focus. Nevertheless, as an organization, we are committed to ongoing quality improvement efforts to enhance our services for people with disabilities. We use a team-based iterative approach to define and address problem functions and processes.

For example, we used the process described herein to develop Quality Assurance systems for our clinical, clerical, and technical support processes. We have also used this method to tackle caseload backlogs and to improve the processing of incomplete applications.

We’ve discovered over time, regardless of the issue or problem involved, that there are common techniques that help a quality improvement (QI) team be successful. We would like to share some of these lessons learned with you.

Lessons Learned:

  • Determine and clearly state the issues to be solved and team goals.
  • Involve key staff (line staff doing the work and managers supervising the work) in the development of any QI initiative. They are in “the know” about areas that may be problematic.
  • Incorporate non-judgmental facilitation to keep up the momentum. Key components include:

      o Involving all participants in decision making/discussion;
      o Keeping meeting minutes and agendas;
      o Keeping track of and sharing “to do” lists, “next steps,” and progress toward goals;
      o Meeting on a regular and ongoing basis (don’t cancel meetings unless absolutely necessary);
      o Seeking management decisions and input as needed; and
      o Making sure you hear from the quiet folks in the room – they may need a little encouragement to speak up, but they often offer great insights.

  • Utilize team members/subcommittees to perform specific tasks between meetings.
  • Utilize available qualitative and quantitative data.
  • Collect specific data, as necessary, to help define the problem and suggest solutions.
  • Do fact finding to support decision-making.
  • Maintain a “living” working document(s) as decisions are made, to be incorporated into a final product.
  • Utilize pilot testing to determine feasibility and make changes (i.e., “fix bugs”) prior to full implementation.
  • Provide periodic communication to the rest of the department or organization during the project and at its conclusion.
  • Train all impacted staff on process improvements.
  • Conduct periodic assessments after implementation to assess success of the project.
  • Refine processes as new issues and changes occur.

Hot Tips:

  • Sometimes QI processes take longer than expected. “Keep going even when the going is slow and uncertain.”  G.G. Renee Hill
  • “To discover new ways of doing something – look at a process as though you were seeing it either for the first or last time.” Mitchel Martin

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. AKEN Week: Vanessa Hiratsuka on Continuous Quality Improvement, Quality Assurance, Evaluation, and Research: Where does my project fit?
  2. Karen Widmer on Knowledge Flow: Making Evaluation the Reference Point
  3. CP TIG Week: Helen Singer, Sally Thigpen, and Natalie Wilkins on Understanding Evidence: CDC’s Interactive Tool to Support Evidence-Based Decision Making

Linda Cabral and Jillian Richard-Daniels on Using Qualitative Data Collection to Better Understand Measure Development and Reporting

American Evaluation Association 365 Blog - Sun, 06/29/2014 - 05:02

Hello, we are Linda Cabral and Jillian Richard-Daniels from the Center for Health Policy and Research at University of Massachusetts Medical School. Collecting and reporting data on a set of predefined measures is something that many evaluators are asked to do. This typically quantitative process involves gathering data, often from multiple sources and stakeholders, to assess the amount, cost, or result of a particular activity. But have you ever thought about what goes into measure development, collection and reporting? A recent evaluation that we completed included interviews with the people involved in this process.

A federal grant to test a set of child health quality measures was awarded to a group of stakeholders in Massachusetts. An example of such a quality measure is the percentage of infants who reached age 15 months during the measurement year and who had six or more well-infant visits during their first 15 months of life. Given that this was the first time this particular set of measures was being collected, there was interest in learning about the feasibility, relevance, and usefulness of the measures. Qualitative data collection in the form of interviews and focus groups was conducted with a variety of stakeholders, ranging from the people who defined and calculated the measures to the providers who were seeing measure results specific to their site.

Lessons Learned:

  • Do your homework ahead of time – When talking to people involved in the nitty-gritty of data measurement and calculation, you need to have a solid understanding of the technical aspects involved so that you don’t spend the entire interview asking background questions. Be comfortable with the subject matter.
  • Be flexible – The measure development process takes time. There can be unanticipated challenges or circumstances that can delay any component of the project. If interviews are planned for the end of a particular component, be flexible with the timing.
  • Orient interviewees, if necessary – Not all stakeholders, particularly consumers, will have a strong understanding of what a measure is. In order to get the desired feedback, you may need to spend time providing some background and education before you start asking questions.

Hot Tips:

  • In a project that takes place over the course of a number of years with several different components, try to complete interviews when the information will be the most current for the interviewee and you.
  • Have a tracking mechanism set up to help you stay organized with your data collection. For us, this takes the form of an Excel spreadsheet containing fields such as interviewee contact information, dates of contact and data collection, and staff responsible for transcription and quality assurance.
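The authors keep their tracker in an Excel spreadsheet; the sketch below is only a hypothetical illustration of the same idea expressed as an R data frame, with made-up column names and entries, so the log can also be queried directly.

    # Hypothetical interview-tracking log; column names and entries are
    # illustrative, not the authors' actual fields.
    tracker <- data.frame(
      interviewee    = c("Site A provider", "Measure analyst"),
      contact_email  = c("providerA@example.org", "analyst@example.org"),
      date_contacted = as.Date(c("2014-03-02", "2014-03-05")),
      date_collected = as.Date(c("2014-03-20", NA)),  # NA = not yet collected
      transcribed_by = c("LC", NA),
      qa_by          = c("JRD", NA),
      stringsAsFactors = FALSE
    )

    # Quick status check: who still needs to be interviewed?
    tracker[is.na(tracker$date_collected), "interviewee"]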

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Linda Cabral and Jillian Richard-Daniels on Using Qualitative Data Collection to Better Understand Measure Development and Reporting
  2. Alice Hausman on Measuring Community -Defined Indicators of Success
  3. GOVT Week: David Bernstein on Top 10 Indicators of Performance Measurement Quality

Dan McDonnell on 4 Recent Social Media Changes You May Have Missed

American Evaluation Association 365 Blog - Sat, 06/28/2014 - 07:21

Hello, my name is Dan McDonnell and I am a Community Manager at the American Evaluation Association (AEA). In the fast-paced world of social media, things are always changing. Just as soon as you stop to take a breath, Facebook has tweaked its algorithm again. Or Twitter has updated its design. Recently, there have been a slew of changes that hit just about every major social network, and I thought I’d take the opportunity to give a few quick hits on what’s new and how it affects evaluation professionals.

Hot Tip: Facebook Design Changes & More

Do you run a Facebook fan page? Facebook took a cue from Twitter and recently gave the fan page layout a facelift. As of June 5th, fan page cover photos are now required to be 851 x 315 pixels. In addition, Facebook removed what used to be called ‘page tabs’, replacing them with a simple menu of the major sections of your page – Timeline, About, Photos, Likes and More.

If the above changes weren’t enough, Facebook also gave users more freedom to customize the leftmost column of their fan page. Want your page ‘Reviews’ to be front and center? You can do that now! The left sidebar can be ordered entirely to your liking.

Hot Tip: Google + Authorship Limited

Remember that post I made a few months ago about Google Authorship? Well, it turns out, Google changed the game when it rolled out some major changes this week. While Google Authorship still exists, the biggest benefits have now been removed. Unfortunately, pictures are no longer supported in Google search results, nor will the author’s Google + circle information be shared. Now, if Authorship is correctly implemented into a blog post, only the name of the author will be added to the search result. Bummer.

Hot Tip: LinkedIn Premium Gets an Update

LinkedIn has become an essential tool for job seekers these days, as well as an excellent way to network. For the power users of the world, LinkedIn has a service called LinkedIn Premium, which I would highly recommend to anyone actively in job search mode. It’s a bit pricey, though, with options at $23.99 or $47.99 a month, but with the addition of LinkedIn Premium Spotlight, a starter package that runs just $7.99 a month, evaluation professionals can enjoy many of LinkedIn’s enhanced benefits without breaking the bank.

With Premium Spotlight, LinkedIn will make your profile stand out more among search listings, offer you suggestions for keywords to include in your profile to make yourself more visible to hiring managers, and more. Check it out!

Hot Tip: Twitter Changes Fonts

Ok, so unless you’re a Helvetica purist, this one isn’t too big of a deal. Back on May 30th, Twitter angered (or delighted, depending on who you ask) font geeks around the world by changing the default typeface of Tweets from Helvetica Neue to Gotham. Some users have reported that the new font makes it more difficult to read, while others have embraced it fully. What’s your take?

Twitter Font Change

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Dan McDonnell on Google + | Dan McDonnell on Twitter

 

Related posts:

  1. Dan McDonnell on Improving Your Google + Authorship Efforts
  2. Dan McDonnell on Upcoming Changes to Twitter
  3. Dan McDonnell on Adding a Photo to Your Blog Post in Google Search Results

DVR TIG Week: Ann K. Emery and Stephanie Evergreen on the Data Visualization Checklist

American Evaluation Association 365 Blog - Fri, 06/27/2014 - 01:15

Hey friends! We are Ann Emery, Co-Chair of the DVRTIG, and Stephanie Evergreen, Founder of the DVRTIG – two evaluators who are crazy about data visualization.

“I love your examples, but how do I know what I should do next time I’m creating a graph?” We heard comments like this when we talked with evaluators about good graph design. We saw that evaluators had a thirst for better graphs and a clear understanding of why better graphs were necessary, but they lacked efficient guidance on how, exactly, to make a graph better.

Rad Resource: Introducing the Data Visualization Checklist

Take the guesswork out of your next graph. Download our 25-item checklist for clear guidelines on graph text, color, arrangement, lines, and overall messaging. Read about what makes a memorable graph title (spoiler alert: it’s not “Figure 1”). Learn how to arrange your bar chart based on whether your categories are nominal or ordinal. Decide which default settings in your software program to keep and which ones to toss.

Not familiar with the terminology? The last page is a Data Visualization Anatomy Chart. Watch that example’s before-and-after remake in Stephanie’s training on Ignite presentations.

Hot Tip: How can you use the Checklist?

Get in the habit of producing several drafts before sharing final graphs with your clients. Draft, score, edit, repeat!

In this example, we printed an existing graph (page 6 here) in both color and black and white to see how the final chart looked for viewers. We scribbled notes all over the graph and the checklist as we scored. Overall, the graph earned 91% of the possible points—just above the cutoff that enables viewers to read, interpret, and retain the content.

Cool Trick: What’s next for the Checklist?

We are publishing examples to illustrate the 25 items as well as before-and-after remakes. Check out the growing galleries at http://annkemery.com/tag/data-visualization-checklist/ and http://stephanieevergreen.com/tag/data-visualization-checklist/. And we’re taking requests: Which checklist items would you like examples for?

We’re also hoping to present the checklist at the American Evaluation Association’s annual conference in October. Let’s high five there! Please please please can you take a picture of your existing data visualization, apply the checklist, and then take another picture? Tweet your redesigns to @annkemery and @evalu8r. Email them to annkemery@gmail.com and stephanie@evergreenevaluation.com. Fold them into paper airplanes and fly them to us! Send your redesigns and show people how awesome you are!

Big thanks to our pilot reviewers: James Coyle, Amy Germuth, Chris Lysy, Johanna Morariu, Jon Schwabish, David Shellard, Rob Simmon, Kate Tinworth, Jeff Wasbes, and Trina Willard.

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. DVR Week: Amy Germuth on Using Visual Design Principles
  2. DVR TIG Week: Ann K. Emery on Dataviz2: Visualization and Reporting about the DVR TIG
  3. DVR TIG Week: Johanna Morariu on the DVR Logic Model & Introducing the DVR Week

DVR TIG Week: Gretchen Biesecker on Applying Structures of Good Stories to Reporting Data

American Evaluation Association 365 Blog - Thu, 06/26/2014 - 15:54

Greetings AEA365! My name is Gretchen Biesecker, and I am the Vice President of Evaluation at City Year. City Year is an education-focused, nonprofit organization founded in 1988 that partners with public schools and teachers to keep students in school and on track to succeed.

This year I competed in my first storytelling slam—an event where people tell five-minute, first-person, true stories. Constructing and telling my story was really fun. I started thinking about new ways our staff at City Year could think about incorporating numbers or data into our communications and reporting.

Lessons Learned:

  • Sharing findings in context is important, especially for audiences that may be unfamiliar with the data. To someone within a school, improving the average daily attendance rate by 2% may be a huge win, but to someone outside education, without context, that increase may sound miniscule.
  • Taking a look at some resources on how to organize good stories was really helpful to me. Reviewing the ways to construct a good story: 1) helped me generate ideas for using different kinds of numbers and data to share our story; and 2) emerged as a foundational step to take before I think about data visualizations and creating reports or presentations.

Rad Resource: Nancy Duarte’s book, Resonate, is now available for free! Duarte shares multiple examples of effective story structures, which may inspire you.

Resonate by Nancy Duarte


Hot Tips: Here are some additional ideas you might take from storytelling. You can use numbers to:

  • Create Drama—good stories may be formatted as a sweeping saga, and you can use numbers to convey scale (e.g., City Year serves in 242 schools, reaching 150,000 students). Pairing that with a personal story and results from one child or case is powerful.
  • Set the Stage—numbers can be used to share the problem or give the context for results (e.g., fewer than 40% of students in the nation’s schools score at or above proficiency in English/Language Arts and math).
  • Share the Transformation—good stories have a beginning, middle, and end, or conflict and resolution—so we can share the “before and after” through numbers. You can also “show your work” and the effort or conflict that it took to achieve the results (e.g., in 2012-2013, students in City Year schools spent over 589,100 hours in extended learning time programming).
  • Catch the eye or ear with Repetition—sometimes good stories or speeches have repeating rhymes, words, or numbers, so think about when repeating a particular number may be effective or impactful.

I encourage you to find inspiration and new ideas from the things you love. They may not be within evaluation, but translating them into our field will help us reach more people to put results into action.

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Sharon Smith-Halstead on Storytelling in Evaluation
  2. René Lavinghouze on Sharing Your Program’s Story
  3. Sherry Boyce on Using Photos in Evaluation Reports

DVR TIG Week: Tony Fujs on Dataviz with the Grammar of Graphics: R you Ready?

American Evaluation Association 365 Blog - Wed, 06/25/2014 - 01:15

Hello! I’m Tony Fujs, Director of Evaluation at the Latin American Youth Center, a DC-based non-profit organization. Today, I want to share my experience using R and ggplot2 for data visualization. ggplot2 is a great tool from the R toolbox (a package, in R lingo). It relies on the powerful Grammar of Graphics framework, which helps “shorten the distance from mind to paper” (Hadley Wickham).

I started using R three years ago, and it has now become my main tool for data analysis and visualization. R is known to have a steep learning curve though, so before getting started, it’s probably a good idea to do a little bit of “cost-benefit” analysis and check whether R is a good fit for you.

Lessons Learned: Benefits

  • More flexibility: R currently meets 99.9% of my dataviz needs. From simple bar charts, to maps, to social networks… you name it! I can do it directly from R. No need to learn multiple tools anymore.

  • Increased productivity:  I often need to generate the same charts on multiple data sets, or multiple subsets of the same dataset. With R it takes almost no effort to do this.

  • Transparency: Who has never opened an Excel file with numbers that seemed to come straight out of a magic hat? Coding forces you to become more transparent, and makes your analyses and dataviz easier to replicate. Your colleagues and your future self will thank you for this!

Lessons Learned: Costs

By the way, did I mention that R is free? Your cost-benefit analysis is looking pretty good! You still have some time investment ahead though… In my experience, there are two main barriers when starting dataviz with R and ggplot2:

  • Understanding the Grammar of Graphics: There is a strong logic behind the Grammar of Graphics framework. Taking some time to understand it will make your learning experience much smoother.

  • Learning to code: If you don’t have any prior programming experience, moving from a point-and-click environment to writing code may entail some frustration.

Rad Resources: Here is a list of resources to help minimize learning costs:

1)    Get familiar with the grammar of graphics:

  • Read this paper by ggplot2 creator Hadley Wickham.
  • Check out this slide deck from my dataviz workshop using R and ggplot2. Pay special attention to the exercise on slide #25.

2)    Get familiar with the R environment:

3)    Start playing with ggplot2
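If you want something runnable right away, here is a minimal sketch using the built-in mtcars data set (the variables and chart are illustrative only). It shows the grammar at work (data, aesthetic mappings, then a geometry layer) and how one extra line repeats the same chart across subsets of the data, which is the productivity point above.

    # Minimal ggplot2 sketch with the built-in mtcars data.
    # install.packages("ggplot2")   # if not already installed
    library(ggplot2)

    p <- ggplot(mtcars, aes(x = wt, y = mpg)) +   # map data columns to aesthetics
      geom_point() +                              # add a geometry layer
      labs(x = "Weight (1,000 lbs)", y = "Miles per gallon",
           title = "Heavier cars get fewer miles per gallon")

    print(p)

    # The same chart for every subset of the data, with one extra line.
    p + facet_wrap(~ cyl)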

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. DVR TIG Week: Johanna Morariu on the DVR Logic Model & Introducing the DVR Week
  2. DVR Week: Stephanie Evergreen on the Data Visualization and Reporting TIG
  3. Manny Straehle on the ICA Data Visualization Competition

DVR TIG Week: Rakesh Mohan, Lance McCleve, Tony Grange, Bryon Welch, and Margaret Campbell on Sankey Diagrams: A Cool Tool for Explaining the Complex Flow of Resources in Large Organizations

American Evaluation Association 365 Blog - Tue, 06/24/2014 - 01:15

We are Rakesh Mohan, Lance McCleve, Tony Grange, Bryon Welch, and Margaret Campbell of the Office of Performance Evaluations, an independent agency of the Idaho State Legislature.

Last year, the Legislature asked us to explain how funds move through the Department of Health and Welfare—the agency with the state’s largest budget. Legislators, including budget committee members, had difficulty understanding the department’s financial information. Given the agency’s complexity and the sheer number of its financial transactions (over two million in 2013), no one in recent history had brought together the separate parts of its financial records to explain how those parts function.

Communicating the flow of dollars was the trickiest piece of the study. We considered narratives, tables, and traditional flowcharts. Ultimately, we used Sankey diagrams, which helped stakeholders visualize funds moving through the department and became the most useful features of our report.

One diagram shows how dollars flow through a program over a year.

The second diagram illustrates funds flowing into, out of, and within a program.

Sankey diagrams depict resources within a system by mapping their flow from an initial set of values to a final set of values. They were especially useful because the lines have widths proportional to their value throughout the system. Legislators could easily see the proportion of each fund throughout the flow. Initially we tried to create the diagrams in Visio, but manually sizing each line was extremely laborious.
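For readers who want to experiment without purchasing software, here is a minimal sketch using the networkD3 package in R, one freely available alternative. It is not the e!sankey tool the authors used, and the fund names and dollar values are invented for illustration. Because line widths are drawn in proportion to the value column, the proportional sizing that was so laborious in Visio is handled automatically.

    # Minimal Sankey sketch with the networkD3 R package (illustrative data).
    # install.packages("networkD3")   # if not already installed
    library(networkD3)

    nodes <- data.frame(name = c("Federal funds", "State general fund",
                                 "Program A", "Program B"))

    # Links reference node rows by zero-based index; widths follow `value`.
    links <- data.frame(
      source = c(0, 0, 1, 1),
      target = c(2, 3, 2, 3),
      value  = c(40, 25, 15, 20)   # e.g., millions of dollars
    )

    sankeyNetwork(Links = links, Nodes = nodes,
                  Source = "source", Target = "target",
                  Value = "value", NodeID = "name",
                  units = "M USD", fontSize = 12)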

Department officials were surprised at how clearly our report presented complex aspects of their fund management. They are now using our report for in-house training. They also had us train them on creating Sankey diagrams for future reporting. The feedback from three key stakeholders further illustrates the usefulness of the Sankey diagrams:

 

“This…really, really was a fine body of work.”

Representative Maxine Bell, Cochair, Joint Finance-Appropriations Committee

 

“The study does much to further enhance the transparency and understandability of the largest budget in state government.”

State Controller Brandon Woolf

 

“[The report] will provide valuable assistance for future planning.”

Governor Butch Otter

Hot Tips:

  1. We used 11×17 paper so the diagram would be big enough to show detail and keep the same orientation as the text.
  2. Subtle colors within similar families were more effective than contrasting colors. We also made colors transparent to clarify intersecting lines.
  3. We wanted to edit text during publishing, so we imported the diagram as an image and added text within the publishing software.

Rad Resources:

  1. We purchased e!sankey, which comes in two versions. The pro version allowed us to link to data in Excel.
  2. Google Charts Sankey
  3. Sankey Diagram History

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Sheila B. Robinson on Delightful Diagrams from Design Diva Duarte
  2. Rakesh Mohan and Lance McCleve on Why Evaluators Should Respond to the Interests of Sponsors and Stakeholders
  3. Dan Jorgensen on Effective Data Management

DVR TIG Week: Ann K. Emery on Dataviz2: Visualization and Reporting about the DVR TIG

American Evaluation Association 365 Blog - Mon, 06/23/2014 - 08:08

Welcome! I’m Ann Emery, Co-Chair of the Data Visualization and Reporting Topical Interest Group (“DVR TIG”). It seems only fitting to tell you more about the DVR TIG through, well, data visualization and reporting! Some highlights:

  • AEA has 48 topical interest groups. They specialize in everything from quantitative methods to advocacy evaluation to data visualization.
  • The Nonprofits and Foundations TIG is the largest, with 1,290 members in 2014. The Lesbian, Gay, Bisexual, and Transgender Issues TIG is the smallest, with 115 members. The DVR TIG falls in the middle, with 821 members this year.
  • 3 out of 4 DVR TIG members are women.
  • Most of the DVR TIG members–92%–have a master’s degree or higher.
  • 28 different countries are represented in the DVR TIG. Roughly 9 out of 10 members are from the United States.
  • The DVR TIG members work in a variety of settings: in nonprofits (33%), private businesses (26%), colleges and universities (26%), federal, state, and local agencies (12%), and school systems (4%).

What about demographic patterns over time? And how does the DVR TIG compare to other TIGs, or to AEA as a whole? This analysis is just the beginning. If you dig deeper into the data, let me know! I look forward to seeing what you find.

Have you displayed evaluation findings through a dataviz video like this? How’d it go? I storyboard portions of my workshops and webinars to grab the audience’s attention (especially at the beginning and end of a presentation) and to offer step-by-step explanations of complex charts and diagrams. Do you have questions about how I designed this video? Want to make your own? Share your questions and comments below, or connect with me through @annkemery or annkemery.com.

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Susan Kistler on Learning From DVR Innovation
  2. DVR Week: Stephanie Evergreen on the Data Visualization and Reporting TIG
  3. Susan Kistler on Suggesting Data Visualizations to Win a Copy of Evaluation Strategies for Communicating and Reporting

DVR TIG Week: Johanna Morariu on the DVR Logic Model & Introducing the DVR Week

American Evaluation Association 365 Blog - Sun, 06/22/2014 - 01:15

Hey there! I’m Johanna Morariu, a Director of Innovation Network and the Co-Chair of the DVRTIG. DVR is the Data Visualization and Reporting TIG, and we work within the AEA community and through our evaluation work to improve the quality of communications through better data visualization and improved approaches to reporting evaluation findings.

You could say the first DVRTIG meeting was in 2010, when Stephanie Evergreen convened a small, rowdy group of us to discuss interest in founding a new TIG to advance issues related to data and information design and reporting within the evaluation community. Since then, we’ve been a fast-growing TIG with more than 800 members. (Thanks Stephanie!)

Since those earliest days, TIG members and leaders have worked hard to develop a knowledge and resource base for members and the broader AEA community. One of the newest resources that we’d like to share is a DVRTIG logic model. Yes, our very own logic model!

Rad Resource: The DVRTIG Logic Model was developed by DVRTIG leadership coming out of the Evaluation 2013 DVRTIG business meeting. The logic model reflects the ideas and ambitions suggested by TIG members at the business meeting. There are more activities than we can hope to achieve, but this is our record of ideas and possibilities for the TIG for the next two years. Check it out and leave a comment in the AEA resource library.

Rad Resource: Another great resource if you’re interested in learning more about the DVRTIG is our website. Anne Worthington is our TIG webmaster and maintains an excellent hub of dataviz resources, DVRTIG videos and content, and much more!

Hot Tip: Did you know there is a forum for chatting directly with AEA members interested in data visualization and reporting? Through the DVRTIG website, you can access the DVRTIG eGroup to start a conversation, get feedback on your visualization or report, or share new resources!

We’ve got a great week of posts lined up for you! First, Ann Emery (my colleague at Innovation Network and DVRTIG Co-Chair) will wow you with some stats and visualizations about the DVRTIG. Up next will be Rakesh Mohan and his colleagues from the Idaho Legislature’s Office of Performance Evaluations to explain Sankey diagrams. Then we’ll hear from Tony Fujs of the Latin American Youth Center about using R and ggplot2 for data visualization. From there we’ll turn it over to Gretchen Biesecker, Vice President of Evaluation at City Year, to explore storytelling as an effective communication method. And we’ll wrap up DVR week with Ann Emery and Stephanie Evergreen and a test drive of their Data Visualization Checklist.

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. DVR Week: Amy Germuth and Johanna Morariu on Kicking off the DVR TIG Week
  2. DVR Week: Stephanie Evergreen on the Data Visualization and Reporting TIG
  3. DOVP Week: Aimee Sickels on the Two Minute Logic Model

Best of aea365: Sheila B Robinson on Being an AEA365 Sponsored Weeks Archaeologist!

American Evaluation Association 365 Blog - Sat, 06/21/2014 - 01:15

Hello! Sheila B. Robinson here, aea365’s Lead Curator and sometimes Saturday contributor. I originally composed this post in May of 2013, but feel it deserves another go as I’m feverishly queuing up some fabulous sponsored weeks for summer. Keep up with aea365 through weekly Headlines and Resources, the new and improved AEA website, and AEA newsletters.

I work in PK12 education at Greece Central School District, and in higher education at the University of Rochester’s Warner School of Education. As aea365’s current Lead Curator, I’ve had the pleasure of working with a number of groups – American Evaluation Association Topical Interest Groups (AEA TIGs), AEA Affiliates, and other groups that are united by evaluation practice in various contexts.

Hot Tip: Leave no stone unturned! In other words, don’t skip entire weeks. You can learn a lot even when a sponsored week’s group name doesn’t resonate with you. During sponsored weeks, you can read about how evaluators in different contexts from your own have grappled with evaluation challenges, learned something from working in diverse communities, or tried new technologies to enhance their evaluation practice and are now willing to share their experiences with all of us.

Hot Tip: Dig for enticing artifacts! Look for posts with content that transcends the focus of the sponsored week. For example, while I am not an environmental program evaluator, nor do I evaluate extension education programs, I found these two gems during sponsored weeks:

  • In this post, Sara El Choufi shared resources for learning Excel during the Environmental Program Evaluation (EPE TIG) sponsored week.
  • In this post, Melissa Cater shared information on creating a Community of Practice during Extension Education Evaluation (EEE TIG) week.

Lesson Learned: While our sponsored week authors may share evaluation foci with each other, they offer Hot Tips, Cool Tricks, Lessons Learned, and Rad Resources that appeal to and can be educative for a broad range of evaluators.

 

Cool Trick: Get your hands dirty! Sift through the archive and unearth your own gems in sponsored (and non-sponsored!) weeks.

Lesson Learned: Many sponsored weeks have themes that cut across evaluation contexts. In addition to TIG-sponsored weeks, we’ve hosted Cultural Competence Week, Innovative #Eval Week, Video in #Eval Week, AEA affiliate weeks, Bloggers Series Weeks, and Local Area Working Group Weeks, among others.

Rad Resource: History in the making: Check out aea365 and our archive for a list of over 1000 nuggets of evaluation wisdom from hundreds of authors. With about 70 sponsored weeks on aea365, there’s a lot to learn! So, get into comfortable clothes, get your virtual trowel, sieve, and brush and get your read on!

 

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. Sheila B. Robinson on Being an AEA365 Sponsored Weeks Archaeologist!
  2. Cultural Competence Week: Dominica McBride on Cultural Competence and aea365
  3. Jeremy Jewell on Using Wait List Control Groups in Evaluation

CLEAR Week: Claudia Maldonado and Alicia López on Creating “Enabling Environments” for Monitoring & Evaluation

American Evaluation Association 365 Blog - Fri, 06/20/2014 - 01:15

We are Claudia Maldonado and Alicia López from the CLEAR Center for Spanish-speaking Latin America, proud members of the CLEAR Initiative.

Would you recognize an “enabling environment” if you “saw” it? The enabling environment concept is very popular among professionals in evaluation. But what does it mean? Creating and cultivating this kind of environment requires some out-of-the-box thinking.

Last November the CLEAR Centers of Latin America and Anglophone Africa organized a South-South roundtable in Pretoria, South Africa, that tried to do something different. We brought together members of government, members of parliament, and technical experts from eight developing countries (Benin, Colombia, Ghana, India, Mexico, Peru, South Africa, and Uganda). We had them role-play, discuss, and reflect informally on the role of evidence in development. The idea was to provide a neutral space for people in high-level decision-making positions, with the ability to push legislation, and technical experts to share experiences and knowledge in a very open format. And what did we get out of it? A fascinating working group, specific country-level commitments, and a lot of fun.

Lessons Learned: Plan, plan, plan ahead! The selection of participants and a well-thought-out agenda are essential.

Hot Tip: We held bimonthly meetings with a steering group of senior specialists from Mexico, South Africa and Ghana and the facilitation team to plan the round-table’s content and goals. In-depth, participatory work for the construction of the agenda and the selection of adequate strategic participants (combining enough experience, decisiveness and expertise) was crucial for success.

Lessons Learned: A traditional lecture format was just not going to cut it! We needed participants to get to know each other and be willing to openly share the frustrations and challenges they face. A flexible format and a facilitation process that enabled collaboration and engagement across institutional roles and national boundaries helped.

Hot Tip: Carefully select well-prepared and experienced facilitators. Group dynamics help to create an atmosphere of trust.  At first these activities may seem silly or a waste of time. They are not!

Lessons Learned: OK, meeting our colleagues is rewarding, but we came here for results!  Aim for written commitments.

Hot Tip: Set aside enough time to establish agreements. The last three sessions of the event were dedicated to drafting country and regional action plans, taking into consideration learning and insights from the interactions.

Country action plans included: designing a legal/administrative framework to promote compliance and learning through evaluation; effectively supplying relevant information; ensuring that recommendations and outputs are evidence-based, timely, clear, and politically and economically feasible; and creating a parliamentary forum on M&E.

Rad Resource: South-South Round Table on the Demand and Use of Evidence in Policy Making and Implementation.

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. CLEAR Week: Tim Clynick on Beyond Rhetoric: Africa’s National Evaluation Movement
  2. CLEAR Week: Diva Dhar and Urmy Shukla on Designing Courses for M&E Capacity Building: Lessons from South Asia
  3. FIE/MME Week: Donna Podems on Applying Feminist Evaluation for Non-feminist Evaluators

CLEAR Week: Boubacar Aw on Evaluation Capacity Building in Francophone Africa: Development of teaching materials through training of trainers (TOT)

American Evaluation Association 365 Blog - Thu, 06/19/2014 - 01:15

I am Boubacar Aw, Coordinator of the Regional Centers for Learning on Evaluation and Results (CLEAR) for Francophone Africa hosted at the Centre Africain d’Etudes Superieures en Gestion (CESAG) in Dakar, Senegal. I am writing today to offer practical tips on how to develop teaching materials through a Training of Trainers (ToT) model. These tips are especially helpful when you are trying to develop teaching materials adapted to different contexts.

Lessons Learned Through Experience:

There are numerous teaching materials on M&E in English. The main challenge for Francophone Africa is to develop materials in French – there is work to do! It is not just about translation; it is about adapting materials to the Francophone African context with “real example” case studies that make them useful to practitioners in the field. A great way to develop such materials is through a ToT approach.

Before a ToT program begins, teaching materials are prepared by a team of master trainers. During a ToT event, trainers use these materials for the training. At the same time, trainees are asked to divide themselves into groups according to the modules that interest them and to provide feedback on the teaching materials. Moreover, trainees share their own experiences in M&E and provide “real examples.” Such examples are incorporated into the teaching materials as case studies.

During the ToT event, a mock-training is organized so that trainees can test the materials and case studies right away. When trainees go back to their own countries and workplaces, they can test the materials further and suggest any necessary adjustments to the trainers.

Hot Tips:

  • Involving trainees in developing teaching materials turns out to be a very effective way to adapt the materials to a Francophone African context.
  • Organizing a mock-training during a ToT event is a good way to make necessary modifications to teaching materials. Trainees also feel more at ease using case studies they suggested themselves during a mock-training.
  • It is important to have one trainer responsible for harmonizing and finalizing the teaching materials!

Rad Resources:

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Seth Kaplan & Sue Sarpy on Organizational Factors When Trying To Evaluate Training Initiatives
  2. Marybeth Neal on Using a Wall to Engage Stakeholders
  3. Ed Eval TIG Week: Amy Gaumer Erickson on Evaluating the Quality of Professional Development

CLEAR Week: Gemma Stevenson on Building M&E Skills and Knowledge: Lessons from Pakistan

American Evaluation Association 365 Blog - Wed, 06/18/2014 - 01:15

Hello – My name is Gemma Stevenson. I am Associate Director for the Center for Economic Research in Pakistan (CERP) where we run rigorous research projects and deliver evaluation trainings as part of CLEAR South Asia.

So what have we learnt over the last three years delivering trainings on M&E to the Pakistani government and NGO community? What are their most pressing constraints to conducting quality evaluations, and what do they need in the way of training?

Cool Trick: Taking the time to conduct a demand assessment is a great way of answering such questions. CERP conducted an assessment at the end of last year through a brief survey and in-depth interviews with our partners. The exercise unearthed a number of interesting findings for the Pakistani context.

Lesson Learnt: First, there remain a number of conceptual hurdles in M&E among many government and NGO partners. A common confusion is mixing up inputs with outputs, and outputs with outcomes. For example, in a project to build a library, the outcome was seen as the completion of the physical building and the purchase of all the books rather than, say, an improvement in literacy or an increase in IT skills. Good to know, so we can tackle these fundamental issues head-on when engaging with certain partners during our training activities.

Lesson Learnt: Another interesting finding was that our partners in Pakistan are less immediately focused on developing skills for collecting data and more concerned about up-skilling when it comes to analysing data sets. In fact, our partners expressed an overwhelming level of interest in developing their skills with statistical software such as STATA.

But here is something really telling: when asked about the most significant challenge in conducting more frequent monitoring and evaluation activities, partners pointed not to a lack of infrastructure or a shortage of personnel, but to gaps in the specific technical capacity of the personnel they have. So CLEAR still has a very important role to play in Pakistan! We’ll continue to roll out further training and other capacity building initiatives to try to meet this demand.

Rad Resources: Did you know that if you are teaching a short course using STATA, you can contact StataCorp to arrange a free temporary license for you and your students to load on their laptops?  It’s not advertised, so call their Texas offices.

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. CLEAR Week: Diva Dhar and Urmy Shukla on Designing Courses for M&E Capacity Building: Lessons from South Asia
  2. CLEAR Week: Tim Clynick on Beyond Rhetoric: Africa’s National Evaluation Movement
  3. Pedro Mateu on Evaluation Meta-analysis

CLEAR Week: Tim Clynick on Beyond Rhetoric: Africa’s National Evaluation Movement

American Evaluation Association 365 Blog - Tue, 06/17/2014 - 01:15

Tim Clynick, Acting Director of the Centre for Learning on Evaluation and Results in Anglophone Africa (University of Witwatersrand in South Africa, Ghana Institute of Public Administration, and the Kenya School of Government)

Greetings from CLEAR Anglophone Africa!

We are often asked where our programme – now in its third year – is having the greatest impact.

Lesson Learned: Growing the Supply. Since 2012, we have trained nearly 1,150 M&E practitioners and public servants, some in advanced methods and the majority in M&E and Results-based Management. But our training programmes also need to align better with national systems and standards. Supply constraints hamper efforts to deepen and improve national evaluation system building, and significantly more effort is required to grow more skilled practitioners.

Responding to Demand. Our 11-country M&E studies have allowed us to identify opportunities to play a convening role so that national stakeholders can be mobilized around a common diagnostic, such as the need for a national evaluation policy or framework. CLEAR’s growing body of knowledge of national M&E systems has also been important in enabling the mobilization of resources to meet specific local gaps, e.g. technical advisory services to sectoral departments to manage or conduct impact evaluations, or support for policy making and guidelines for ministries responsible for coordinating government programmes.

Receptive Environment. There is no lack of appreciation amongst African governments or stakeholders of the need for evidence-based learning and decision making. But we now understand the political economy of this demand and where it is real and meaningful – as opposed to symbolic or merely rhetorical – and act or respond accordingly.

Beyond Rhetoric: A successful partnership in South Africa. CLEAR recently participated in a session with the South African Presidency reflecting on successes in consolidating the National Evaluation System. The system is coalescing around national norms, standards, and guidelines. Thirty-eight national evaluations have been completed, are underway, or have been procured since 2011. A further 60 national – and between 50 and 100 provincial and departmental – evaluations are planned by 2016. M&E officers and programme managers are now demanding support to deepen their own professional evaluative practice. That public service managers are responding in this way can be considered a huge success for the evaluation movement in South Africa, and the implications are galvanizing government and service providers across the country. The high point reached in South Africa is, however, part of a larger groundswell across the African continent – in Kenya, Ghana, Zambia, Uganda, and elsewhere – where we can look for similar results.

Rad Resources: In addition to following CLEAR Anglophone Africa, keep on top of what’s happening in African evaluation through the African Evaluation Association (AfrEA).

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Frank Meintjies on Embedding M&E Within Organisations
  2. CLEAR Week: Diva Dhar and Urmy Shukla on Designing Courses for M&E Capacity Building: Lessons from South Asia
  3. Frank Meintjies on How Evaluation and Strategic Planning Go Together

CLEAR Week: Diva Dhar and Urmy Shukla on Designing Courses for M&E Capacity Building: Lessons from South Asia

American Evaluation Association 365 Blog - Mon, 06/16/2014 - 01:15

We are Diva Dhar (Program Director) and Urmy Shukla (Capacity Building Manager) at the CLEAR South Asia Regional Center, based at J-PAL South Asia at the Institute for Financial Management and Research.

We work to strengthen M&E skills in South Asia, focusing on India, Pakistan, Bangladesh, Sri Lanka, and Nepal. As part of our work, we conduct custom workshops and M&E technical advisory services for government, NGOs, donor organizations, and associations of professional evaluators. Recent examples include capacity building workshops with Population Council (Bangladesh), the Sri Lankan Evaluation Association, USAID/India, and the Indian Administrative Services.

In this post, we put together a few helpful lessons for designing customized M&E courses and technical advisory services.

Lessons Learned: Know your audience! Many different organizations seek out M&E capacity building services, each with different needs, skills, and interests. Needs assessments are a useful tool to better understand your partners’ M&E background and goals. The results of these needs assessments will help you design customized courses and services that target your partners’ specific M&E needs and interests. Organizations also appreciate the effort you put into getting to know them better.

Needs assessments can be conducted in a variety of ways:

  • Online forms and surveys
  • Structured or semi-structured interviews
  • Diagnostic tools to review organizational systems and processes

Hot Tip: While online surveys are faster and easier, they often produce inaccurate results. Respondents tend to under- or over-estimate their M&E skills and knowledge, and we often do not get enough information on specific challenges in implementing good M&E practices. Interviews are time-consuming, but helpful in getting a better understanding of M&E practices and abilities.

Lessons Learned: Needs assessment interviews need to be framed correctly to be useful. Organizations should be informed that these exercises are for learning and training purposes, and are not meant to be an appraisal of their M&E capabilities. This also helps in getting buy-in for the needs assessment exercise, as well as ensuring that employees are available for interviews.

Cool Trick: When planning workshops, use the needs assessments to divide participants into more uniform groups for break-out sessions or facilitator-led group exercises. This can be done based on their M&E skills, interests, or focus areas, and it ensures that each group can be taught at its level and with relevant examples. For example, participants working on health projects with a basic level of M&E understanding can be grouped together. Similarly, facilitators or trainers can be assigned to groups based on their own levels and interests.
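
If needs-assessment responses are kept in a spreadsheet or simple database, this grouping step can be scripted so it is quick to rerun as registrations change. Below is a minimal sketch in Python; the participant records, field names, and skill categories are hypothetical illustrations, not CLEAR South Asia’s actual data or tooling. It buckets participants by focus area and self-rated skill level, splits any oversized bucket, and prints the resulting break-out groups; facilitators can then be assigned to the group labels in the same way.

    from collections import defaultdict

    # Hypothetical needs-assessment records; field names and skill
    # categories are illustrative only.
    participants = [
        {"name": "A. Khan",     "focus": "health",    "skill": "basic"},
        {"name": "B. Perera",   "focus": "health",    "skill": "basic"},
        {"name": "C. Rahman",   "focus": "education", "skill": "advanced"},
        {"name": "D. Singh",    "focus": "health",    "skill": "advanced"},
        {"name": "E. Fernando", "focus": "education", "skill": "basic"},
    ]

    def group_participants(records, max_group_size=4):
        """Bucket participants by (focus area, skill level) so each
        break-out group is fairly uniform, then split oversized buckets."""
        buckets = defaultdict(list)
        for person in records:
            buckets[(person["focus"], person["skill"])].append(person["name"])

        groups = {}
        for (focus, skill), names in buckets.items():
            # Split each bucket into chunks of at most max_group_size.
            for start in range(0, len(names), max_group_size):
                label = f"{focus}-{skill}-{start // max_group_size + 1}"
                groups[label] = names[start:start + max_group_size]
        return groups

    for label, members in group_participants(participants).items():
        print(label, "->", ", ".join(members))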

Rad Resource: Check out CLEAR South Asia’s Interactive Course Guide – a quick and easy-to-read manual on conducting effective and interactive training events.

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Cindy Banyai on Social research and evaluation in an Asian context
  2. PD Presenters: Michele Tarsilla on a new Framework for Evaluation Capacity Development
  3. IC TIG Week: Susan Wolfe on Networking to Enhance Your Business or Career