Monitoring, Evaluation and Learning Systems

David Fetterman on Google Glass Part I: Redefining Communications

“Ok, glass.” That’s how you activate Google Glass. I’m David Fetterman and that’s me to the right wearing Google Glass. I’m an empowerment evaluation synergist and consultant, busy father and spouse, and owner of Fetterman & Associates.

Rad Resource – Google Glass: Google Glass is a voice and gesture activated pair of glasses that lets you connect with the world through the internet. You can take a picture, record a video, send a message, listen to music, or make a telephone or video call – all hands free.

Hot Tips – Redefining Communications: Google Glass is not just another expensive (currently about $1500) gadget. It can free us up to do what we do best – think, communicate, facilitate, and, in our case, assess. Here is a brief example.

I said “Ok, Glass,” then “make a call to Kimberly James.” She is a Planning and Evaluation Research Officer I am working with at the W.K. Kellogg Foundation.

Kimberly asked how the evaluation capacity building webinar was coming along. Via Glass, I took a screenshot and emailed it to her so we could discuss it. When a colleague was mentioned, with a few swipes of my finger on the frame, I found a picture on the web and miraculously remembered who we were talking about.

Mid-conversation, Kimberly needed to step away briefly. While on hold, I sent a note to colleagues in Arkansas to ask them to check on the data collection for our tobacco prevention empowerment evaluation.

Kimberly returned to the call and we discussed a recent survey. With a simple request, the display of our results appeared, reminding me what the patterns look like.

Did I mention that I did all of these things while making lunch, picking up my son’s clothes off the floor, letting the dogs out, and emptying the dishwasher?

Later in the day, with a tap on the frame, I confirmed our scope of work with Linh Nguyen, the Vice President of Learning and Impact at the Foundation, while dropping my son off for piano lessons.

Later in the week I plan to use Google Hangout to videoconference with another colleague using Glass. When she connects during a project site visit, she will be able to take pictures and stream video of her walk around the facilities, bringing me closer to the “hum and buzz” of site activities.

Lessons Learned:

Respect people’s privacy – do not wear Google Glass where it is not wanted, will put people off, or will disrupt activities. Do not take pictures without permission. Remove it when you enter a bathroom.


Hot Tip: Stay tuned for Part II tomorrow when I will cover using Google Glass as an evaluation tool.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. David Fetterman on Participation and Collaboration
  2. David Fetterman on Google
  3. CPE Week: David Fetterman on Empowerment Evaluation

Emergent Design Webinar

Networking Action - Wed, 04/16/2014 - 11:41

Topic:  Emergent Design

Traditional planning strategies do not work for complex challenges: too much is changing, and there is continual learning about appropriate action. Another approach relevant for such situations is emergent design. This is iterative design that crystallizes as …

Susan Kistler on Innovative Reporting Part II: Book Giveaway and #altreporting Videos

American Evaluation Association 365 Blog - Wed, 04/16/2014 - 01:16

My name is Susan Kistler and I am on a crusade to expand our reporting horizons. Earlier this month, we looked at little chocolate reports. Today, let’s consider adding videos to your evaluation reporting toolbox.

Get Involved: But first, a little incentive for you to share your best alternative reporting ideas. And possibly get a reward for doing it. In the notes to this blog, or via twitter using the hashtag #altreporting, share either (a) your best unique evaluation reporting idea, or (b) a link to a great alternative evaluation report, and in either case note why you love it. I’ll randomly draw one winner from among the commenters/tweeters and send you a copy of “How to Shoot Video That Doesn’t Suck,” a book that can help anyone create video that isn’t embarrassing. Contribute as often as you like, but you will be entered only once in the random drawing on May 1.
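For the data-minded, the "entered only once" rule amounts to deduplicating entrants before the random draw. Here is a minimal sketch in Python; the entrant names are made up for illustration:

```python
import random

# Hypothetical entrants: the same person may comment or tweet several times.
entries = ["alice", "@bob", "alice", "carol", "@bob", "alice"]

# Each person is entered only once, no matter how often they contribute.
unique_entrants = sorted(set(entries))

# Draw one winner at random from the unique entrants.
winner = random.choice(unique_entrants)
print(f"Winner of the book giveaway: {winner}")
```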

Back to our programming. If you are reading this via a medium that does not allow you to view the embedded videos, such as most email, please click through to the blog now by clicking on the title of the post.

Rad Resource – Unique Reporting Videos: Kate Tinworth, via a post on her always thought-provoking ExposeYourMuseum blog, recently shared three wonderful short video reports made by her audience insights team when she was working at the Denver Museum of Nature and Science. Each uses everyday objects to help visualize evaluation findings in an engaging way.

This video is my favorite of the three. It introduces the evaluators, reports demographics via a stacked bar chart built from jellybeans, and is at once professional and accessible.

Cool Trick: To get people to view the video, Kate’s team met museum volunteers and staff at the door with small bags of jellybeans that included a cryptic link to the report.

Rad Resource – Unique Reporting Videos: This video from a team in Melbourne, Australia, shares findings from an evaluation of a primary school kitchen gardening program. It introduces the key stakeholders and deepens our understanding of the program without listing its components.

Rad Resource – Unique Reporting Videos: I wrote before on aea365 about getting this mock reporting video made for $5. I can still envision it embedded on an animal shelter’s website, noting how the shelter is using its evaluation findings. My favorite part is that it talks about evaluation use – how things are changing because of the evaluation at a small business.

Rad Resource: Visit the Alternative Reporting – Videos Pinterest Page I’m curating for TheSmarterOne.com for more reporting video examples and commentary.


Related posts:

  1. Video in #Eval Week: Kas Aruskevich on Telling the Story Through Video
  2. Courtney Heppner and Sarah Rand on Producing Online Evaluation Reports
  3. Susan Kistler on Learning From DVR Innovation

Ann Price on The (Evaluation) Road Less Traveled

American Evaluation Association 365 Blog - Tue, 04/15/2014 - 01:40

Hello fellow evaluators! My name is Ann Price and I am President of Community Evaluation Solutions, near Atlanta, Georgia. A few weeks ago a friend and I spent the weekend in the Georgia Mountains at the Hike Inn, an inn in a state park accessible only via a 5-mile “moderate” hike. There is no cell phone, no TV, no internet. It was nice to disconnect and reflect on life and work. This blog is about my reflections from that weekend as an external evaluation consultant.

My friend and I have found over the years that even though we work in different areas, our processes and our relationships with clients are quite similar. We both have a penchant for metaphor so we had fun over the weekend applying metaphors to our clients and our work.

The first thing we did was spend a half hour just trying to find the trailhead. I told my friend this was similar to programs not doing the groundwork for an evaluation (i.e., failing to design a program logic model or a strategic plan, or in our case, having the map but not following it). When all else fails, read the directions….

The hike was a lovely, albeit up-and-down, trek. So the second thing we learned was something my son’s scout leader once said: “Everyone is on their own hike.” We reminded ourselves of that as folks of all ages passed us by (that was a bit discouraging). But the main point is to start on the path. Similarly, you may not have the biggest, most well-funded program. But it is important to start the evaluation journey or you will never “get there.” You do this by building your program’s organizational and evaluation capacity.

Tips and Tricks:
The hike was pretty steep at times, so we had to stop every once in a while and catch our breath. We kept ourselves motivated by setting goals (Let’s just make it to the next tree! Think benchmarks and indicators). Evaluation work is the same way. It’s important to take a break and look at your data. If you don’t, you might miss some pretty awesome sights (or findings). So stop every once in a while and see where you are. Is your program where it needs to be? If not, make an adjustment. And if you need help, here are a few great resources to guide you on your way.

Rad Resources:
Start with baby steps if you must. There are plenty of free resources out there to help you on your journey:


Related posts:

  1. Dan McDonnell on Twitter Etiquette and Data Archiving
  2. Stephanie Evergreen on Project Management Tools
  3. GEDI Week: D. Pearl Barnett on Cultural Responsiveness in a Representative Bureaucracy

Jordan Slice on How Being a Creator Informs Being an Evaluator

American Evaluation Association 365 Blog - Mon, 04/14/2014 - 01:58

Greetings to my fellow #DataNerds! My name is Jordan Slice. I am a Research Specialist at Richland One, an urban school district in Columbia, South Carolina. In addition to being a full-time evaluator, I create handmade pieces for my Etsy shop, resliced.

As a handmade business owner, I find that many of my sales are custom orders. People really appreciate when something is tailored to meet their needs. The same is true for evaluation stakeholders: your results are much more likely to be appreciated (and used!) if they answer the questions your stakeholders need answered.

Lesson Learned: Whether I’m making a custom purse (that’s one of my bags to the right) or designing a program evaluation, clear communication is key. For example, if a customer sends me her grandfather’s favorite shirt and requests that I make her a purse using the fabric, it is imperative that we come to a clear agreement about the design of the purse before I start constructing. Similarly, when evaluating a program, it is imperative that you consult with the stakeholders before developing your evaluation if you expect the results to be utilized.

Hot Tip: Keep it simple. While you and I may love geek speak, flooding your stakeholders with evaluation jargon may impair their ability to understand your results. Whether you are talking with stakeholders, constructing a presentation, or writing a report, commit to the mantra that less is more. Once I have my summary in writing, I use a two-step revision process. First, I focus on organizing the content for better flow. Second, I put on my minimalist cap and cut out all the excess fluff (usually repetitive statements or unnecessary detail). Before finalizing any report, always ask a colleague (or a stakeholder when appropriate) to proof it and provide feedback. I employ the same technique when I am building newsletters (Rad Resource: MailChimp – free & user-friendly!) or item listings on Etsy.

Rad Resource: Stephanie Evergreen has some really great posts (like this one!) on her blog with tips for creating better visualizations with your data.

Another Hot Tip: Allow yourself time to focus on something creative (even just a daydream) several times a week. This can give your mind the break it needs to process information and improve your focus. Pursue a new hobby or build on an existing interest. You may be surprised at how this new skill can help you grow as an evaluator.


Related posts:

  1. Paul Watkins on Itemized Report-Writing Template
  2. Judith Kallick Russell on Translating Findings Into Action
  3. Stephanie Evergreen on Project Management Tools

Kylie Hutchinson on The Ever Expanding Evaluator’s Toolbox

American Evaluation Association 365 Blog - Sun, 04/13/2014 - 05:34

My name is Kylie Hutchinson.  I am an independent evaluation consultant with Community Solutions Planning & Evaluation.  In addition to evaluation consulting and capacity building, I tweet at @EvaluationMaven and co-host the monthly evaluation podcast, Adventures in Evaluation along with my colleague @JamesWCoyle.

When I started out in evaluation 26 years ago, I was focused on being a good methodologist and statistician.  After deciding to work primarily with NGOs I learned the importance of being a good program planner.  Employing a participatory approach required me to become a competent facilitator and consensus-builder.  These days, the increased emphasis on utilization and data visualization is forcing me to upgrade my skills in communications and graphic design.  New developments in mobile data collection are making me improve my technical skills.  A recent foray into development evaluation has taught me the important role that a knowledge manager plays in evaluation. Finally, we are starting to understand evaluation capacity development as a process rather than a product, so now I need expertise in organizational development, change management, and the behavioral sciences.  Whoa.

Don’t get me wrong, I’m not complaining. Every day I wake up and think how lucky I am to have picked such a diverse career as evaluation. But with all these responsibilities on my plate, my toolbox is starting to get full and sometimes keeps me awake at night. How can I manage to be effective at all of these things? Should I worry about being a Jack of all trades, master of none?

Hot Tip:  You don’t have to do it all.  Determine your strengths and outsource your weaknesses. Pick several areas of specialization and ask for assistance with the others.  This help may come in the form of other colleagues or departments.  For example, if you think you need help with change management, sub-contract an organizational development consultant to your team.  If you work in an organization with a communications or graphic design department, don’t forget to call on their expertise when you need it.

Hot Tip:  Take baby steps.  If you want to practice more innovative reporting, don’t assume you have to become an expert in communication strategies overnight. Select one or two new skills you want to develop annually and pick away at those.

Hot Tip:  If you can, strategically select evaluations that will expose you to a desired new area, e.g., mobile data collection or a new piece of software.

Rad Resource:  Even if you’re not Canadian, the Canadian Evaluation Society’s Competencies for Canadian Evaluation Practice provide a great basis from which to reflect on your skills.


Related posts:

  1. Stephanie Evergreen on Graphic Design
  2. Cheryl Poth on Articulating Program Evaluation Skills Using the CES Competencies
  3. Kerry Bruce on Getting Started with Mobile Phones

Sheila B. Robinson on Delightful Diagrams from Design Diva Duarte

American Evaluation Association 365 Blog - Sat, 04/12/2014 - 05:45

Hello! I’m Sheila B. Robinson, aea365’s Lead Curator and sometimes Saturday contributor, with a new cool tool to spice up your evaluation presentations and reports!

Do you know the feeling you get when you stumble upon something so good you want to share it, but then again, part of you wants to keep it all to yourself? It will be apparent from this post which side won out for me.

Lesson Learned: Based on advice from respected presentation and information designers, I now shy away from canned, clichéd, or clip art images, including charts and diagrams. I’m no designer though, and I often find it challenging to start with a blank page when I have something to share that calls for a good visual representation of a relationship.

I’ve enjoyed Microsoft’s SmartArt graphics that come with Office, and they are quite customizable. But with only about 185 choices, I find I start recognizing them in other people’s presentations, especially when they are not customized, and they begin to remind me of the overused, 20th-century clip art we’ve all come to loathe.

Rad Resource: Turns out, one of my favorite presentation designers, Nancy Duarte, has offered her expertise in a fabulous resource she has made available to all of us, and it’s FREE! Diagrammer™ is “a visualization system” featuring over 4,000 downloadable, customizable diagrams. Duarte, Inc. makes it easy to search for exactly what you need by allowing you to search all diagrams, or filter by relationship (flow, join, network, segment, or stack), style (2D or 3D), or number of nodes (1-8) needed.

Once you choose a diagram (and “shopping” for one is half the fun!), you simply download it as a PowerPoint slide, and fill in your text, or customize the various components. You can change shapes, colors, sizes and more. Diagrams range from the very simplest to somewhat complex. Here are just a few examples:

Most diagrams come in a variety of configurations. Each of the above examples is also available with different numbers of nodes.

Hot Tip: Duarte’s diagrams come in a gorgeous color palette, if you ask me, but often you will want to customize the colors to match your report style or your organization’s colors. Here’s a before and after with the original diagram and my redesign.

Cool Trick: Take some time searching diagrams as you’re thinking about the relationship you want to communicate. This added reflection time will give you the opportunity to dig a little deeper into your data and you may be rewarded with new insights.


Related posts:

  1. John Nash on Creating Outstanding Presentation Slides
  2. Michelle Landry and Judy Savageau on No Need to Reinvent the Wheel: Project Management Tools for Your Evaluation Projects
  3. Best of aea365 week: John Nash on Creating Outstanding Presentation Slides

EEE Week: Cheryl Peters on Measuring Collective Impact

American Evaluation Association 365 Blog - Fri, 04/11/2014 - 01:08

My name is Cheryl Peters and I am the Evaluation Specialist for Michigan State University Extension, working across all program areas.

Measuring collective impact of agricultural programs in a state with diverse commodities is challenging. Many states have an abundance of natural resources like fresh water sources, minerals, and woodlands. Air, water and soil quality must be sustained while fruit, vegetable, crop, livestock and ornamental industries remain efficient in yields, quality and input costs.

Extension’s outreach and educational programs operate on different scales in each state of the nation: individual efforts, issue-focused work teams, and work groups based on commodity types. Program evaluation efforts contribute to statewide assessment reports demonstrating the value of Extension Agricultural programs, including public value. Having different program scales allows applied researchers to align to the same outcome indicators as program staff.

Hot Tip: Just as Extension education has multiple pieces (e.g., visits, meetings, factsheets, articles, demonstrations), program evaluation has multiple pieces (e.g., individual program evaluation about participant adoption practices, changes in a benchmark documented from a secondary source, and impact assessment from modeling or extrapolating estimates based on data collected from clientele).

Hot Tip:  All programs should generate evaluation data related to identified, standardized outcomes. What differs in the evaluation of agriculture programs is the evaluation design, including sample and calculation of values. Impact reports may be directed at commodity groups, legislature, farming groups, and constituents. State Extension agriculture outcomes can use the USDA impact metrics. Additionally, 2014 federal requirements for competitive funds now state that projects must demonstrate impact within a project period. Writing meaningful outcomes and impact statements continues to be a focus of USDA National Institute of Food and Agriculture (NIFA).

Hot Tip: Standardizing indicators into measurable units has made aggregation of statewide outcomes possible. Examples include pounds or tons of an agricultural commodity, dollars, acres, number of farms, and number of animal units. Units are then reported by the practice adopted. Dollar values estimated by growers/farmers are extrapolated from research values or secondary data sources.
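The aggregation described here is essentially summing standardized units by practice adopted, then multiplying by a per-unit value drawn from research or secondary sources. A minimal sketch in Python; the practice names, acreages, and per-acre dollar values are hypothetical:

```python
# Hypothetical program records: (practice adopted, acres affected).
records = [
    ("cover_crops", 120),
    ("drip_irrigation", 45),
    ("cover_crops", 80),
    ("drip_irrigation", 30),
]

# Hypothetical per-acre dollar values extrapolated from research literature.
value_per_acre = {"cover_crops": 25.0, "drip_irrigation": 60.0}

# Sum standardized units (acres) by practice adopted.
acres_by_practice = {}
for practice, acres in records:
    acres_by_practice[practice] = acres_by_practice.get(practice, 0) + acres

# Extrapolate statewide dollar impact from the research values.
impact = {p: a * value_per_acre[p] for p, a in acres_by_practice.items()}
print(acres_by_practice)  # {'cover_crops': 200, 'drip_irrigation': 75}
print(impact)             # {'cover_crops': 5000.0, 'drip_irrigation': 4500.0}
```

The same spreadsheet-style formula (units summed by practice, times a shared extrapolation value) can be reused across programs so statewide totals stay comparable.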

Hot Tip: Peer-learning with panels to demonstrate scales and types of evaluation with examples has been very successful. There are common issues and evaluation decisions across programming areas. Setting up formulas and spreadsheets for future data collection and sharing extrapolation values has been helpful to keep program evaluation efforts going. Surveying similar audiences with both outcomes and program needs assessment has also been valuable.

Rad Resource: NIFA provides answers to frequently asked questions such as when to use program logic models, how to report outcomes, and how logic models are part of evaluability assessments.

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Lisa Townson on Tailoring Evaluation to Your Audience
  2. EEE Week: Suzanne Le Menestrel on Developing Common Measures
  3. EEE Week: Melissa Cater on Extension Evaluation

EEE Week: Laura Downey on Community-Based Participatory Research Logic Models

American Evaluation Association 365 Blog - Thu, 04/10/2014 - 01:04

Hi! This is Laura Downey with Mississippi State University Extension Service. In my job as an evaluation specialist, I commonly receive requests to help colleagues develop a program logic model. I am always thankful when I receive such a request early in the program development process. So, I was delighted a few weeks ago when academic and community colleagues asked me to facilitate the development of a logic model for a grant proposing to use a community-based participatory research (CBPR) approach to evaluate a statewide health policy. For those of you who are not familiar with CBPR, it is a collaborative research approach designed to ensure participation by communities throughout the research process.

As I began to assemble resources to inform this group’s CBPR logic model, I discovered a Conceptual Logic Model for CBPR available on the University of New Mexico’s School of Medicine, Center for Participatory Research, website.




Rad Resource:

What looked like a simple conceptual logic model at first glance was actually a web-based tool complete with metrics and measures (instruments) to assess CBPR processes and outcomes. Over 50 instruments related to the most common concepts in CBPR, such as organizational capacity, group relational dynamics, empowerment, and community capacity, are profiled and available through this tool. Each profile includes the instrument name, a link to the original source, the number of items in the instrument, the concept(s) originally assessed, reliability, validity, and the population the instrument was created with.

With great ease, I was able to download surveys to measure the CBPR concepts in the logic model that were relevant to the group I was assisting. Given the policy focus of that specific project, I explored the measures related to policy impact.

Hot Tip:

Even if you do not typically take a CBPR approach to program development, implementation, and/or evaluation, the CBPR Conceptual Logic Model website might have a resource relevant to your current or future evaluation work.


Related posts:

  1. Laura Myerchin Sklaroff on Community Based Participatory Research
  2. Sally Honeycutt on Developing Logic Models
  3. Michael Duttweiler on Talking Your Way Into a Logic Model

EEE Week: Siri Scott on Conducting Interviews with Youth

American Evaluation Association 365 Blog - Wed, 04/09/2014 - 01:02

Hi, I’m Siri Scott, and I work as a Graduate Assistant with the University of Minnesota Extension.

Conducting interviews with youth is one way to gather information for program improvement, especially when you want to bring youth voice into your improvement process. While more time-intensive than a survey, interviews will provide you with rich contextual information. Below is a brief overview of the planning process as well as tips for conducting interviews.

Hot Tips: Planning Phase

One main decision is whether or not you will need IRB approval for conducting interviews. Even when done for program improvement purposes, it is a good idea to comply with IRB regulations for data practices and protection for youth. Other major decision points in the planning process include how many individuals you will interview, how you will choose your participants, and how you will collect and analyze your data. In addition, you must decide what type of interview you want to conduct, the purpose of the interview, and then create an interview protocol (if appropriate).

Hot Tips: Conducting interviews

Here are some tips for conducting interviews:

  • Practice: Test the use of the protocol with a colleague (or a young person who you know well) and ask for feedback about the questions themselves and how they fit together.
  • Space: Find a quiet, secluded space for the interview in a public setting (or in a private home if the young person’s guardian can be present). You don’t want other people overhearing your conversation or your participant being distracted by anything.
  • Warm up: Start the interview with some informal chit chat. This will build your rapport with the participant and ease the participant’s (and your) nerves.
  • Probe: If you are not doing a structured interview, make sure to ask participants to clarify or elaborate on their responses. This will provide you with much better data.
  • Notes: If you are using an audio recorder, don’t trust it (fully). Jot down some quick notes during the interview. You can elaborate on these later if the audio recorder malfunctions.
  • Relax! If you mess up, that’s okay. Also, if you’re nervous and tense, the participant will sense that. Do whatever you can to put the participant (and yourself) at ease.
  • Learn More: A good resource for learning how to conduct interviews is this newly released, comprehensive overview of the interviewing process: InterViews: Learning the Craft of Qualitative Research Interviewing (3rd Edition) by Brinkmann and Kvale (2014).


Related posts:

  1. Lisanne Brown on Involving Youth in Interviewing
  2. Dreolin Fleischer on Organizing Quantitative and Qualitative Data
  3. Nicole Jackson on Improving Interview Techniques During Formative Evaluations

EEE Week: Kevin Andrews on Connecting Students with Extension

American Evaluation Association 365 Blog - Tue, 04/08/2014 - 01:59

Howdy! I am Kevin Andrews, a program specialist at Texas A&M AgriLife Extension Service. In addition to my Extension duties, I co-teach a graduate evaluation course at Texas A&M University.

I came across a post from March about students partnering with community agencies to apply their evaluation skills. I’d like to build upon Dr. Brun’s idea for evaluators who have ties to a university, especially those in Extension.

Many of our students have no idea what Extension (or any other agency) is. An engaged university seeks to tie together the scholarships of teaching, research, and service, and hands-on evaluations are a perfect way to accomplish this.

Lessons Learned: When students partner with us on evaluations, they not only receive practical experience and make an impact, they also get to learn who we are. This can aid in recruiting talented students to work for the agency; we’ve had several ask about careers in Extension.

Hot Tip: Students are going to ask a lot of questions. We can get pretty set in our ways and think we know our agency well. When you have to pause to explain why we do what we do in basic terms, you are forced to reflect on exactly why it is we have been doing things a certain way all these years!

Hot Tip: Our employees just want their voices heard. With students conducting interviews, we get far more coverage than a single evaluator using a sample, and employees are able to feel their opinions matter. Our staff is also much more likely to be open with a student than with a peer.

Lessons Learned: I like to be in total control over my projects, but part of delegating work is letting others do their own thing. By developing goals together early in the project, I can ensure the outcome is as I intended while allowing students to experiment and develop their own processes.

Hot Tip: Often, when a class is over, the student-teacher relationship ends. Keep contact information and follow up with students a year later to let them know the impact of their work. No matter where life takes them, they are your stakeholders and you want them to hold you in high esteem.

Lessons Learned: I’m lucky to get to straddle teaching and Extension. For those who don’t, simply reach out and ask! I’ve been approached by others with projects for students, and I’ve approached others with projects of my own. Everyone has something they need done!

Two years ago, I was the student participating in a class evaluation. Three from my class, including myself, now work for Extension, and our report generated $200,000 in funding – the model works!


Related posts:

  1. Aubrey Perry on Ensuring a Positive Practica Experience
  2. GSNE Week: Kristin Woods on Gaining Practical Experience as a New Evaluator
  3. PD CoP Week: David Brewer on Evaluating Professional Development: Guskey Level 5: Student Learning Outcomes

EEE Week: Suzanne Le Menestrel on Developing Common Measures

American Evaluation Association 365 Blog - Mon, 04/07/2014 - 01:52

Greetings! My name is Suzanne Le Menestrel and I am a National Program Leader for Youth Development Research at the 4-H National Headquarters, National Institute of Food and Agriculture, U.S. Department of Agriculture.  4-H is a national youth development organization serving 6 million youth throughout the country. We partner with the nation’s Cooperative Extension system operated by the more than 100 land-grant universities and colleges and with National 4-H Council, our private, non-profit partner. Recent trends in funding have elevated the importance of illustrating impact and accountability for nonformal educational programs.  We were also interested in building capacity for evaluation through the creation of easy-to-use and accessible tools.  We partnered with National 4-H Council, state 4-H program leaders, 4-H specialists and Extension evaluators from around the country to create a national 4-H common measures system that will also enable us to aggregate data across very diverse 4-H programs.

I have learned a number of lessons through the implementation of this new system.

Lessons Learned:

    • Common measures must be developmentally appropriate. Children and youth who participate in 4-H range in age from 5 to 19. Because of concerns about reading levels and developmental appropriateness, we focused the common measures on ages 9 to 18. We also divided the measures into two levels—one for children and youth in grades 4 through 7 and one for youth in grades 8 through 12.
    • Common measures must have strong psychometric properties.  As much as possible, we drew from existing measures but have been conducting analyses with both pilot and preliminary data.
    • Measures must be applicable to a broad variety of programs. 4-H looks very different from county to county and state to state. We started with the creation of a national 4-H logic model that represents desired program outcomes.
    • Common measures must be available through a flexible, easy-to-use, and robust online platform. This includes the ability to add custom items.
    • Training and technical assistance are key to the implementation of common measures in a complex, multi-faceted organization such as 4-H.
    • Buy-in and support from stakeholders are critical, as is creating an ongoing system for soliciting stakeholder feedback.
    • Such a system cannot be developed without sufficient funding to support the online platform, technical assistance, and ongoing formative evaluation.
    • Common measures are a flexible product that needs to grow and change with the outcomes of the organization.
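The psychometric point above can be sanity-checked even with small pilot samples. As an illustration (the function and the toy data are mine, not 4-H's actual analysis), Cronbach's alpha for a respondents-by-items score matrix takes only a few lines of Python:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents x items matrix of scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(scores[0])  # number of items
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical pilot data: 3 respondents answering 4 Likert-type items.
pilot = [
    [3, 4, 3, 4],
    [4, 5, 4, 5],
    [2, 3, 2, 3],
]
print(round(cronbach_alpha(pilot), 2))  # perfectly consistent items -> 1.0
```

In practice you would want far more respondents than this toy matrix, but the same calculation is how "strong psychometric properties" gets checked against pilot and preliminary data.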

Rad Resource:

Check out this article written by Pam Payne and Dan McDonald on using common evaluation instruments.


Related posts:

  1. Pam Larson Nippolt on Soft Skills for Youth
  2. Lisa Townson on Tailoring Evaluation to Your Audience
  3. Elizabeth Harris on A Measure of Youth Resiliency

EEE Week: Melissa Cater on Extension Evaluation

American Evaluation Association 365 Blog - Sun, 04/06/2014 - 01:42

My name is Melissa Cater, and I am an assistant professor and evaluation specialist at Louisiana State University AgCenter. I am also serving as Chair of the AEA Extension Education Evaluation Topical Interest Group (EEE-TIG) this year. The EEE-TIG provides a professional development home for Extension professionals who are interested in program evaluation; we also welcome other individuals who are evaluating non-formal education outreach programs in a community setting. The EEE-TIG goals provide a guiding framework for the membership.

Hot Tip: Our TIG has provided a place for Extension professionals to become more collaborative. If you are searching for a way to become more involved in evaluation, join a TIG. The networking opportunities are endless.

This week’s aea365 blog posts are sponsored by the EEE-TIG. I invite you to learn more about who we are through this week’s series of posts. You’ll see that we have a range of interests within our membership, from evaluating agricultural programs, to teaching evaluation, to supporting participatory community research, to building evaluation capacity.

Hot Tip: You can learn even more about the EEE-TIG and the varied interests of our members by viewing our archived blog posts.




Hot Tip: Want to learn about the diversity of programs that are being evaluated in Extension? Check out the Journal of Extension to see the breadth of topics.


Related posts:

  1. EEE Week: Sarah Baughman on Building Evaluation Capacity Across Disciplines: From Feral Hogs and Fire Ants to Families
  2. EEE Week: Mary Arnold on Building Capacity
  3. Lisa Townson on Tailoring Evaluation to Your Audience

Dan McDonnell on Evaluating Your Tweets

American Evaluation Association 365 Blog - Sat, 04/05/2014 - 13:09

Hello, my name is Dan McDonnell and I am a Community Manager at the AEA. 

I’ve written much in the past about different tools and tricks that can help evaluators be more productive in using Twitter, which hopefully have proved worthwhile in helping you make smart use of your time on social media. By evaluating your Twitter activity and engagement, you can better understand what content resonates with your followers, and how your tweets might help you expand your network of contacts and followers.

Hot Tip: Monitor Tweet Click-Throughs with a URL Shortener

While you can’t necessarily measure whether people are reading your tweets, you can see who is taking action and clicking the links that you share – which in turn lets you know that you’re sharing content your followers find interesting! Using a link-shortening tool like Bit.ly or Ow.ly (HootSuite’s built-in shortener) will automatically track the number of times followers click on your links. Periodically check in to see the types of content that get the most attention. Are tweets using certain hashtags, or shorter tweets, getting clicked more often? Let that inform the content and topics you tweet about in the future.
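As a rough sketch of what "periodically check in" can look like (the row layout is hypothetical, not Bit.ly's actual export format), you could tally click-throughs by hashtag from a shortener export:

```python
from collections import Counter

# Hypothetical rows from a link-shortener export: (short_url, hashtag, clicks)
rows = [
    ("bit.ly/a1", "#eval", 42),
    ("bit.ly/b2", "#dataviz", 31),
    ("bit.ly/c3", "#eval", 17),
]

def clicks_by_hashtag(rows):
    """Total click-throughs per hashtag, most-clicked first."""
    totals = Counter()
    for _url, hashtag, clicks in rows:
        totals[hashtag] += clicks
    return totals.most_common()

print(clicks_by_hashtag(rows))  # [('#eval', 59), ('#dataviz', 31)]
```

Even a simple tally like this makes it obvious which topics are earning the clicks.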

Hot Tip: Measure Your Most Engaging Tweets

Another set of metrics that you can look to for wisdom is engagement. Keep an eye on the number of times your tweets are retweeted, favorited, or replied to through the basic Twitter client, or sign up for a free tool along the lines of Sprout Social or HootSuite. This lets you keep track of your top-engaging tweets so you can easily see what stories, resources, and thoughts are most likely to be engaging to your followers.
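A minimal sketch of ranking your top-engaging tweets, assuming you have exported per-tweet counts (the field names and the simple sum-of-counts score are my assumptions, not any tool's built-in metric):

```python
# Hypothetical per-tweet metrics pulled from a dashboard export.
tweets = [
    {"text": "New aea365 tip!", "retweets": 5, "favorites": 8, "replies": 2},
    {"text": "Conference dates", "retweets": 1, "favorites": 3, "replies": 0},
    {"text": "Dataviz resource", "retweets": 9, "favorites": 12, "replies": 4},
]

def top_engaging(tweets, n=2):
    """Rank tweets by a simple engagement score: retweets + favorites + replies."""
    score = lambda t: t["retweets"] + t["favorites"] + t["replies"]
    return sorted(tweets, key=score, reverse=True)[:n]

for t in top_engaging(tweets):
    print(t["text"])  # "Dataviz resource" first, then "New aea365 tip!"
```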

Hot Tip: Evaluate Your Favorite Hashtags

In my last post, I mentioned Tweetbinder as a handy tool for digging into hashtag data. The amount of data you can find is staggering! Simply visit the site and type in your hashtag of choice. The report you pull up will show you the top contributors, when the hashtag is most active, and examples of recent tweets. With this knowledge, you can find new, interesting people to follow, identify good times to tweet on the hashtag, and see where you rank among tweeters for impact, influence, and more.
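To give a flavor of the kind of report such a tool produces (this is my own toy aggregation, not Tweetbinder's API or output), you can derive top contributors and the most active hour from a list of per-tweet records:

```python
from collections import Counter

# Hypothetical hashtag activity: (user, hour_of_day) for each tweet.
activity = [
    ("@EvalMaven", 9), ("@DataDiva", 9), ("@EvalMaven", 13),
    ("@EvalMaven", 9), ("@DataDiva", 16), ("@NewEvaluator", 9),
]

def hashtag_report(activity):
    """Top contributors (with tweet counts) and the busiest hour for a hashtag."""
    contributors = Counter(user for user, _ in activity).most_common()
    busiest_hour = Counter(hour for _, hour in activity).most_common(1)[0][0]
    return contributors, busiest_hour

contribs, hour = hashtag_report(activity)
print(contribs[0])  # ('@EvalMaven', 3)
print(hour)         # 9
```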

This is really just scratching the surface on what Twitter metrics can tell you, and how you can use them to your advantage in evaluation. In a future post, I hope to be able to expand upon these topics and provide additional tips and tricks on digging into the data. How do you use Twitter metrics to your advantage?


Related posts:

  1. Dan McDonnell on Making New Friends and Mastering Lesser-Known Twitter Features Without Third Party Apps
  2. Dan McDonnell on Twitter Etiquette and Data Archiving
  3. Dan McDonnell on 5 Social Media Tools For Curation and Visualization

Erin Blake on Healthy Tips for Traveling Evaluators

American Evaluation Association 365 Blog - Fri, 04/04/2014 - 01:15

My name is Erin Blake and I am worried about your health!

Working for the Caribbean Public Health Agency has raised my awareness of the growing problems with obesity and associated diseases. The go-to axiom in public health is ‘the health of the nation is the wealth of the nation.’ Well, I think that is true for evaluators too. We need to take care of our physical and mental health in order to do the best job possible for our clients and stakeholders.

Many of us (myself included) struggle with our weight and maintaining our health. It can be hard to make good choices when you are travelling frequently, working long hours under stressful deadlines, in places where food options are limited, and/or in locations that have no facilities for exercise.

So how can we better take care of ourselves when we are on the road?

Hot Tip: Do your research. When booking accommodation, look for places that have facilities for exercise or are close to places you can go for a jog/walk/swim. You don’t have to take exercise seriously, just regularly. Thirty minutes a day!

Hot Tip: Be prepared. ALWAYS pack your gym gear and swim wear when you are heading out on the road. Be prepared to exercise in your room. Sit ups, push-ups, squats, yoga, dancing, star jumps and many more exercises don’t need a gym. There are loads of resources on the web that can help you identify exercise that can work for you.

Hot Tip: If you can’t measure it, you can’t manage it. Download some apps for tracking your calories and exercise (or just write it down!).
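In the "just write it down" spirit, even a tiny spreadsheet-style log works. Here is a hypothetical sketch of netting daily intake against exercise (all the numbers are invented):

```python
# Hypothetical daily log: (day, calories_eaten, calories_burned_exercising)
log = [
    ("Mon", 2100, 300),
    ("Tue", 2400, 0),
    ("Wed", 1900, 450),
]

def net_calories(log):
    """Net calories per day: intake minus exercise."""
    return {day: eaten - burned for day, eaten, burned in log}

print(net_calories(log))  # {'Mon': 1800, 'Tue': 2400, 'Wed': 1450}
```

An app will do this for you automatically, but the arithmetic behind "if you can't measure it, you can't manage it" really is this simple.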

Rad Resource: MyFitnessPal is a free app (available for iPhone, Android, Blackberry, and Windows) that includes a daily diary for tracking your nutritional intake and calories burned. It has a surprisingly large database of different foods and their nutritional content which is particularly handy. It also has some fun graphs for your inner data viz nerd!

Get Involved: I want to encourage more AEA members to share their experiences and tips for maintaining their health.



Related posts:

  1. Shortcut Week: John Paul Manning on RescueTime for Time Tracking
  2. Clare Nolan on Transforming Health Care: Evaluating Accountable Care Organizations
  3. BLP TIG Week: Michelle Baron on The Importance of Strategic Planning in Building a Culture of Evaluation

Susan Kistler on Innovative Reporting Part I: The Data Diva’s Chocolate Box

American Evaluation Association 365 Blog - Thu, 04/03/2014 - 01:34

Hello wonderful aea365ers. My name is Susan Kistler and I am, or will be when we launch on June 1, a contributing editor at TheSmarterOne.com, where we aim to “Increase Your ROI on Life and Have Fun While Doing It.” I’m also the Executive Director Emeritus of the American Evaluation Association, and originator of aea365. But enough about me, let’s talk about chocolate!

Back in February, Stephanie Evergreen wrote on her blog about “findings cookies” – homemade fortune cookies with a tiny tidbit from an evaluation report inside.  I loved the idea, but ran into two potential problems with personal execution: 1. I burn most of what I cook, and 2. I needed something that was more portable. Our beloved aea365 curator, Sheila Robinson, came to the rescue when she suggested “findings chocolates” in the comments to Stephanie’s post. These were perfect for making ahead for an upcoming dataviz workshop I was giving where colleagues from the St. Paul Public Schools (SPPS) would be in attendance. The data from their 2011-2012 Vision Cards served as the basis for these examples.

Hot Tip: Wrap Hershey’s Nuggets in hand-made overlays and you’ll have three sides available for information. When we made the ones at the top of this post, we put a graph illustrating the measure on top, the interpretation and goal on one side, and a link to the full report on the other.

Cool Trick: 3/4″ color coding dots fit perfectly on the bottom of Hershey’s Kisses. Buy the dots that can be printed on and add a key finding. Then, peel and stick!

Rad Resource – Chocolate Stickers and Wrappers Templates: We’re launching TheSmarterOne.com on June 1, but I posted a step-by-step tutorial on how to make these over there already, including templates for both the kisses and the nugget wraps. While you’re there, feel free to poke around (with the caveat that we’re still working things out) and sign up for our weekly newsletter. If you’d like to learn more about what we’re doing, or consider writing for us, see this post.

This is the first in what will be an ongoing series on alternative reporting, exploring ways to get your report off the shelf and into people’s hands and heads.


Related posts:

  1. Susan Kistler on Evaluating Website Traffic
  2. Elissa Schloesser on 5 Steps for Translating Evaluation Findings into Infographics
  3. Kristi Pettibone on Evaluating Environmental Change Strategies

Large Systems Change: Producing the Change We Want

Networking Action - Wed, 04/02/2014 - 12:18

Waddell, Steve, Hsueh, Joe, Birney, Anna, Khorsani, Amir, & Feng, Wen. (2014). Turning point – large systems change: Producing the change we want. Journal of Corporate Citizenship, 2014(53), 5-8. doi: 10.9774/GLEAF.4700.2014.ma.00003

Transformation and large systems change is not something that …

Susan Wolfe on When You Can’t Do An Evaluability Assessment

American Evaluation Association 365 Blog - Wed, 04/02/2014 - 01:15

My name is Susan Wolfe and I am the owner of Susan Wolfe and Associates, LLC, a consulting firm that applies Community Psychology principles to strengthening organizations and communities. Prior to initiating my consulting practice, I was employed as an internal evaluator in more than one organization.

Have you ever had a job where they sent you to evaluate a multi-site program or initiative, only to find that there was no clearly defined single intervention, no specific goals or objectives, and no established norms or benchmarks for the performance measures? This has happened to me on more than one occasion. In each case I managed to produce a useful report. How did I do it?

Lesson Learned: Sometimes you are unable to convince the powers that be that you need to address evaluability first. If this happens, describe in writing the evaluation challenges and how they will limit what you will be able to do, and negotiate a longer timeline for the project. Such projects can become quite complex and you will need extra time.

Hot Tip:  Consider using a comparative case study approach that utilizes quantitative, qualitative and participatory methods. After completing a case study of each site, you can then summarize common activities and outcomes.  You can also determine which sites showed better outcomes and which did not, and identify successful strategies and barriers to success.
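As an illustrative sketch of that cross-site summary step (the site names, activities, and outcome scores below are invented), you can intersect each site's documented activities to find a common core and rank sites by outcome:

```python
# Hypothetical per-site case study summaries.
sites = {
    "Site A": {"activities": {"workshops", "mentoring", "outreach"}, "outcome": 0.72},
    "Site B": {"activities": {"workshops", "outreach"}, "outcome": 0.55},
    "Site C": {"activities": {"workshops", "mentoring", "outreach"}, "outcome": 0.81},
}

# Activities implemented at every site: a candidate "common core".
common_core = set.intersection(*(s["activities"] for s in sites.values()))

# Sites ordered from strongest to weakest outcome.
ranked = sorted(sites, key=lambda name: sites[name]["outcome"], reverse=True)

print(sorted(common_core))  # ['outreach', 'workshops']
print(ranked)               # ['Site C', 'Site A', 'Site B']
```

Comparing the activity sets of the higher- and lower-ranked sites is then one systematic way to surface successful strategies and barriers to success.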

Rad Resource: Case Study Research: Design and Methods. Fifth Edition (2014) by Robert K. Yin.

Hot Tip: Identify the common core elements of the program or initiative across sites. Make sure one of your recommendations includes the development of Specific, Measurable, Attainable, Realistic and Time-Bound (SMART) objectives and the development of a framework or model of change.

Rad Resource:  The Community Toolbox (one of my favorite resources) provides instruction and tools for Developing a Framework or Model of Change.



Related posts:

  1. CP TIG Bonus: Susan Wolfe on Using Performance Data to Empower Staff and Build Capacity
  2. Susan Kistler on the Community Toolbox
  3. CEA Week: Tania Rempert, Leanne Kallemeyn, David Ensminger, and Megan Polanin on Developing Evaluation Capacity Through Coaching

Kylie Hutchinson on Searching for Inspiration

American Evaluation Association 365 Blog - Tue, 04/01/2014 - 01:15

My name is Kylie Hutchinson. I am an independent evaluation consultant and trainer with Community Solutions Planning & Evaluation. In addition to my usual evaluation projects, I deliver regular webinars on evaluation topics, tweet weekly at @EvaluationMaven, and co-host the monthly evaluation podcast, Adventures in Evaluation, along with my colleague @JamesWCoyle.

Sometimes when I feel I need an energizer or a refresher in my evaluation practice, I like to go back to my favorite evaluation books and flip through the pages that I’ve specially tabbed and highlighted for inspiration.

Rad Resource:  One of my obvious go-to resources is the body of work by Lahjik Maadel, and in particular his book, On Doing, And Done.  I always find Lahjik’s insights into how programs work to be highly practical, timely, and relevant.  Thirty years on, I still find his views on evaluation highly prescient for his time.

Rad Resource:  Perhaps a less widely known, but still inspiring, resource for evaluators is the book, Quality Assessment, by E. Val Speuff.  While Val is originally known for her groundbreaking work on evaluation methodology, she has also written extensively on the role of evaluators as social change agents.

Hot Tip:  Books can inspire, but don’t always believe everything you read.  Sometimes you need to go to the source!


Related posts:

  1. Dawn Hanson Smart on Reading Outside Your Field
  2. Dreolin Fleischer on the Encyclopedia of Evaluation
  3. Cameron Norman on Complexity Science for Evaluators

Humberto Reynoso-Vallejo on Innovative Approaches for Latino Caregivers of Dementia Elders

American Evaluation Association 365 Blog - Mon, 03/31/2014 - 14:50

I am Humberto Reynoso-Vallejo, a private consultant on health services research. A few years ago, I was part of an exploratory study of Latino caregivers in the Boston area caring for a family member suffering from Alzheimer’s disease. The difficulties facing families coping with the disease have prompted the rise of support groups for diverse population groups. Support groups for racially/ethnically diverse caregivers were scarce, and in the case of Latino caregivers in the Boston area, nonexistent. To respond to this need, I tried to develop a support group for Latinos with the assistance of the Alzheimer’s Association. After several unsuccessful attempts, I conducted a focus group with four caregivers to identify barriers to participation. Findings indicated that caregivers faced a number of issues, including: lack of transportation; lack of available time to take off from other responsibilities; the absence of linguistically appropriate support groups; caring for other family members dealing with an array of health problems (multiple caregiving); and other personal and social stressors.

I designed an alternative, pragmatic support group model, which took the form of a radio program. The “radio support group” directly targeted caregivers’ concerns and aimed to:

a) Disseminate culturally relevant information, largely from the point of view of the caregivers themselves, either as guests on the program or as callers; and,

b) Reduce the sense of isolation that many caregivers feel on a daily basis as a result of their caregiving roles.

I facilitated the radio support group with the participation of caregivers, professionals, and service providers. Four programs were aired, exploring topics such as memory problems, identifying signs of dementia, caregiver needs, and access to services. After each radio program aired, I called the 14 participating caregivers to explore their reactions, and found that the majority of them had not been able to participate. Since the “live” radio support group was not accomplishing its original purpose of disseminating information and reducing caregivers’ sense of isolation, I decided to distribute edited audiotapes of the four programs to all the caregivers. Overall, caregivers found the information useful and many established contact with others.

Lessons Learned:

  • This model of intervention, the radio support group, showed that combining innovation with culturally relevant material is promising.
  • Research and evaluation should adapt to the particular needs and social context of Latino caregivers of family members with Alzheimer’s disease.
  • There is a need for more culturally appropriate types of interventions that mobilize caregivers’ own strengths, values, and resources.

Related posts:

  1. Humberto Reynoso-Vallejo on Cultural Competence and Cultural Humility in Evaluation
  2. CEA Affiliate Week: Grisel M. Robles-Schrader on Increasing Research Literacy, Evaluation, and Engagement led by and with Latino Communities
  3. Cultural Competence Week: Lisa Aponte-Soto and Leah Christina Neubauer on Increasing the AEA Latino/a Visibility and Scholarship