Monitoring, Evaluation and Learning Systems

SEA Week: Dr. Moya Alfonso on the Benefits of Being A Grant Reviewer

American Evaluation Association 365 Blog - Fri, 08/29/2014 - 01:15

My name is Moya Alfonso, and I’m an Assistant Professor at Georgia Southern University and University Sector Representative and Board Member for the Southeast Evaluation Association (SEA), a regional affiliate of the American Evaluation Association (AEA).

So, you need to improve (or develop) your grant writing skills and perform service. A perfect way to address both of these needs is to serve as a grant reviewer!

Lesson Learned: I have honed my grant writing skills by reviewing for local nonprofits, the Centers for Disease Control and Prevention, and the Department of Education, learning what is expected and seeing the mistakes others make. At the same time, I performed an important service to the fields of public health and educational research and evaluation.

Hot Tip: Select the Right Opportunity. When looking for opportunities to be a grant reviewer, consider where your strengths lie. If you’re a program evaluator with a background in education, for example, the Department of Education might be a good place to start. Targeting opportunities will increase your odds of being selected for a review panel – even if you are new to reviews.

Hot Tip: Know What You’re Getting Into. So you’ve found an opportunity that is right up your alley. Now what? It’s time to determine logistics. If detailed information is not provided in the call for reviewers, contact the review administrator about in-person versus remote reviews, estimates of time required, number of applications assigned, grant review dates or time periods, and travel reimbursement or stipends.

Hot Tip: Be Critical Yet Constructive. There’s nothing worse than receiving a “Great Job!” back from a reviewer. No one is perfect. Read (and reread) each application with an eye toward both its strengths and weaknesses. Keep feedback constructive; there is no room for personal insults in grant reviews.

Hot Tip: Know You’re Not Alone. Grant reviewers typically serve on panels comprised of individuals with a variety of perspectives and skill sets. You are not expected to know everything! Feel free to draw upon the wisdom of your grant review administrator and your fellow reviewers.

Hot Tip: Don’t Trust Technology. Technology is amazing – when it works! You will likely need to learn a new online system to complete your reviews. Don’t trust it! Draft your reviews in a word processing program, save the files to your computer, and copy and paste your text into the review system.

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Leslie Goodyear on Serving as a Reviewer for the National Science Foundation
  2. STEM TIG Week: Susan Eriksson on Scientists Becoming Evaluators
  3. SEA Week: Sean Little on Book Reviewing for Skill Growth

SEA Week: Jason Lawrence on Making the Most of Graduate Education in the Evaluation Profession

American Evaluation Association 365 Blog - Thu, 08/28/2014 - 01:15

My name is Jason Lawrence, Grants Manager at the Florida Office of the Attorney General and Student Sector Representative and Board Member for the Southeast Evaluation Association (SEA), a regional affiliate of the American Evaluation Association (AEA). I am also a second-year Master of Public Administration (MPA) student in the Askew School of Public Administration and Policy at Florida State University.

A career in evaluation often begins with solid graduate-level education. But the classroom isn’t the only place where you can prepare for the field. You’ll need additional skills to complement your degree before entering the job market. And those competencies can be acquired in three ways: internships or volunteer assignments; joining professional organizations; and presenting research at conferences.

Hot Tip: If you’re not already working as an evaluator or in a related position, then acquiring practical skills is a critical first step. An internship can help you overcome this deficiency, but landing one can be difficult, as positions are highly competitive.

Hot Tip: There is a shortcut. Instead of going through the painstaking application and interview process for an internship, you may consider volunteering your time with a local non-profit organization. These organizations spend a great deal of time measuring the results and effectiveness of their services, but may not have the resources to conduct rigorous evaluations.

As a volunteer, you may inquire how you could be part of their evaluation process. This quid-pro-quo gives the organization the human capital it needs to be effective and equips you with practical skills you need to advance your evaluation career.

Rad Resources: Many times such arrangements take shape through budding professional relationships.  As a member of AEA, you have access to a cadre of evaluation professionals who joined the organization for the very purpose of making connections and sharing skills of the trade. Memberships in professional organizations also afford opportunities to present your academic research at conferences. This is an impressive addition to a fledgling professional resume. Both SEA and AEA offer presenting opportunities year-round.

These are just a few of the ways an aspiring evaluator can break into the field. If you haven’t managed to do any of the above, there is still time. Having a year or even a semester left in graduate study means you have plenty of time to develop the skills needed to land your dream job. Enrolling in a graduate program is only a starting point.

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. SEA Week: Sheena Horton on Breaking Into the Evaluation Field
  2. Tamara Bertrand Jones on Finding and Working With a Mentor
  3. SEA Week: Jennifer Johnson and Melanie Meyer on Using Mentoring and Training Programs to Develop Knowledge and Skills

SEA Week: Jennifer Johnson and Melanie Meyer on Using Mentoring and Training Programs to Develop Knowledge and Skills

American Evaluation Association 365 Blog - Wed, 08/27/2014 - 01:15

We are Jennifer Johnson and Melanie Meyer, and we are evaluators at the Florida Legislature’s Office of Program Policy Analysis and Government Accountability.  Jennifer has served as Past President and currently serves as Secretary for the Southeast Evaluation Association (SEA), and Melanie holds a Master’s degree in Adult and Continuing Education and is a member of SEA.

The purpose of continued professional development is to help individuals maintain competence in their profession.  It is critical to organizational success and ensures individuals possess current knowledge and skills to effectively do their jobs and contribute to organizational goals.

Cool Tricks: Two ways organizations can address continued professional development are through mentoring and training.  Mentoring consists of a one-on-one relationship in which a more experienced individual serves as a mentor or coach to someone less experienced or new to the organization.  A training program occurs in a group setting in which one or several individuals adopt a trainer/teacher role.

Hot Tips: Developing mentoring and training programs requires significant organizational commitment.  Below are a few tips organizations should consider:

Mentoring Programs

  • Build trust. Ensure the mentor relationship is only about support and growth, not supervision or management.  Mentors should not provide performance evaluations.
  • Specify the mentoring role. Determine what areas mentors should address, e.g., specific skills and knowledge, or organizational processes and culture.
  • Clarify the frequency and method of contact.  Determine the length and format of interactions (e.g., face-to-face or phone; structured meetings or less formal interactions) and whether mentors should be available between sessions.
  • Determine the duration of the relationship.  Decide whether the mentoring relationship should be time limited or ongoing. Periodically assess the relationship to ensure both individuals are happy with the process.

Training Programs

  • Assess internal expertise.  Identify individuals in your organization who can provide training; relying on internal expertise is an effective way to develop and sustain a training program because it is inexpensive, flexible, and keeps the trainer available for post-training consultation.
  • Vary the scope and format.  Provide focused training for essential skills and broader training for areas that require a general knowledge base.  Use a range of formats including classroom-style presentation, interactive and hands-on sessions, and one-on-one tutorials. Develop self-guided materials.
  • Tailor programs to skill levels.  Offer training designed for beginner, intermediate, and advanced skill levels.
  • Encourage participation by management.  Demonstrate to the organization that training is important by encouraging managers to attend and to look for opportunities for individuals to apply what they learn.

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Tamara Bertrand Jones on Finding and Working With a Mentor
  2. Seth Kaplan & Sue Sarpy on Organizational Factors When Trying To Evaluate Training Initiatives
  3. John Wedman on Using the Performance Pyramid in Evaluation

SEA Week: Dr. Fred Seamon on Evaluation Careers in the Private Sector

American Evaluation Association 365 Blog - Tue, 08/26/2014 - 01:15

Hi, I’m Fred Seamon, Senior Partner with MGT of America, Inc. and Past President of the Southeast Evaluation Association (SEA), a regional affiliate of the American Evaluation Association (AEA). I am also a former Assistant Professor in the Askew School of Public Administration and Public Policy at Florida State University, and a former Research Associate for the Pepper Institute on Aging and Public Policy at Florida State University.

Like many evaluation professionals, I had a natural tendency to think of public sector and not-for-profit organizations when considering evaluation career opportunities. When I finally looked at the private sector, I was shocked and amazed by what I found.

Lesson Learned: Evaluation has been a staple of the private sector for many years and is the “bread and butter” of small boutique or niche firms that conduct only evaluation work. In the private sector, evaluation takes on several guises, including program assessment, program improvement, organizational assessment, service delivery systems review, impact analysis, and quality improvement reviews, to name just a few.

Hot Tip: Look beyond common evaluation terms and titles and focus on the required skills and expertise: evaluation is evaluation regardless of what it’s called or the context in which it takes place. If you are looking to expand your horizons and career in evaluation, first learn to broaden how you think about evaluation. Start with who you know and who they might know, and use your research skills to uncover opportunities in the private sector. Finally, broaden how you look at the private sector and potential evaluation career opportunities to reveal more career options.

Hot Tip: In addition to major private sector firms like Accenture and KPMG, other private sector evaluation career opportunities may be found at foundations, think tanks, economic development organizations, and business and industry professional associations.

Rad Resource: Consulting News is an excellent website to use as a starting point for researching information on evaluation careers and opportunities in the private sector and for staying current on the latest in the field.

Once you start to look more broadly at evaluation itself and other sectors, you will be amazed to discover that evaluation is, in fact, everywhere.

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Ayesha Tillman on Finding an Entry to Mid-Level Evaluation Position within a Government Agency
  2. WE Week: David J. Bernstein on Sustaining an Evaluation Community of Practice
  3. SEA Week: Sheena Horton on Breaking Into the Evaluation Field

SEA Week: Sean Little on Book Reviewing for Skill Growth

American Evaluation Association 365 Blog - Mon, 08/25/2014 - 01:15

My name is Sean Little, Consultant for Sean Little Consulting and Newsletter Editor for the Southeast Evaluation Association (SEA). I’ve been reviewing books for SEA’s newsletter since 2007. While I began reviewing books to network and to build my Internet presence, I soon realized a positive unintended consequence: my reviews constituted Internet-accessible writing samples, demonstrating my writing skills and evaluation knowledge.

Cool Trick: Reviewing books on a regular basis can become a form of professional development. I once heard George Grob, President of the Center for Public Program Evaluation, speak at an AEA conference; he described his strategy for professional development. Every year, he would pick one hard and one soft evaluation skill. For that year, he would focus on improving those two skills. He explained that if you picked two new skills every year, you would develop a well-rounded skill set. I’ve adopted this strategy to select books to review. For each of the last three years, I’ve picked a hard evaluation skill and organized my review around it. To review a book, you have to engage more deeply than simply reading it; in writing the review, you have to organize your experience of that engagement. Regular book reviewing can become a self-directed mini-course.

Hot Tips: What should be in a review? I summarize the major themes of the book, pointing out strengths and weaknesses. Given people’s time and money constraints, I include the price and the number of pages. My reviews run between 500 and 1,000 words, and I can usually find time to write two or three a year. It’s good to limit potential reviews to books published within the last one to two years.

Rad Resources: For non-teachers who cannot obtain free review copies, used copies are available on Amazon at reduced prices. Some university libraries allow outsiders to purchase a library membership. The membership fee may be cheaper than a new evaluation book, and you can use it all year long. While you can’t highlight or underline in a library book, you can take notes on a laptop while you’re reading, which is a deeper form of engagement.

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. SEA Week: Sheena Horton on Breaking Into the Evaluation Field
  2. Dawn Hanson Smart on Reading Outside Your Field
  3. Camesha Hill-Carter on Looking at Assessment in a Different Light

SEA Week: Sheena Horton on Breaking Into the Evaluation Field

American Evaluation Association 365 Blog - Sun, 08/24/2014 - 05:56

My name is Sheena Horton, Senior Analyst at MGT of America, Inc., and Private Sector Representative and Board Member for the Southeast Evaluation Association (SEA). I have been in the evaluation field professionally since 2008, and would like to offer a few tips from my experience breaking into the evaluation field.

Lesson Learned: Evaluation opportunities are everywhere, but simply searching “Evaluation” and “Evaluators” will not expose you to all of them. Evaluation job and organization titles vary greatly and can be vague. Similarly, desired skill sets vary across sectors and subject areas. Whether you are starting or switching your career, there are steps you can take to become more acquainted with the evaluation field.

Hot Tip: Investigate and evaluate. Start by researching online and expanding your keyword searches to other common words you encounter and by focusing on agency resource webpages. Evaluation is everywhere, so look everywhere: think tanks, consulting firms, law enforcement agencies, health/human services offices, governors’ offices, legislatures, and universities. Many organizations have their own evaluation units for assessing their processes, performance, and services.

Rad Resources:

  • Browse job listings – even if you are not actively seeking a new position. This includes organization listings and other directories such as AEA’s Jobs Database, The Project Management Network, and Better Evaluation. Browsing will help you become familiar with common titles in the field and the skill sets employers desire, and help you frame your professional development plan.
  • Websites like LinkedIn not only connect you to evaluation organizations and groups but also let you browse the profiles of evaluators to see the skills they list for their positions and learn what is available, such as sector-specific licensures or certifications.

Hot Tip: Connect and get involved. Contact evaluators to discuss their experiences and learn about the skills sought after in your interest area(s). Such contacts may open volunteer or job opportunities. Most importantly – take initiative! Attend/present at conferences, become a peer reviewer, subscribe to email lists or feeds, or submit newsletter articles or online posts. Seize every opportunity!

Rad Resources:

  • Project Management Institute offers certifications that are highly desired in evaluation.
  • Many websites offer free online courses, including MIT’s Open Courseware and Coursera.
  • Advanced Excel skills in data manipulation are critical. Many websites and YouTube channels offer free tutorials.
  • Other valuable skills that may not be obvious to newcomers include marketing, budgeting, contract management, client relations, and implementation management and strategies. Ask any evaluator – quality knowledge and skills in these areas will definitely get you noticed.

Southeast Evaluation Association (SEA) is a regional affiliate of the American Evaluation Association (AEA).

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. IC Week: Susan Wolfe on Small is Beautiful–the “Jack of All Trades”
  2. Susan Kistler on AEA Resources for Finding Work as an Evaluator
  3. SEA Week: Jason Lawrence on Making the Most of Graduate Education in the Evaluation Profession

Jayne Corso on Using HootSuite to Monitor Evaluation Trends and Conversations on Social Media

American Evaluation Association 365 Blog - Sat, 08/23/2014 - 06:00

Hello, my name is Jayne Corso and I work with Dan McDonnell as a Community Manager for AEA. As a frequent social media user, and one of the voices behind @aeaweb, I am always searching for new tools that can organize my social media feeds and help me stay up-to-date on the latest conversations, topics, and hashtags surrounding the evaluation community.

HootSuite is my primary tool for monitoring industry news and evaluating our social media posts. The ease of access to industry information that this tool provides makes research much more effective – and easy!

Rad Resource: Manage your Social Media Accounts Through the HootSuite Dashboard

Each HootSuite user has a personal dashboard, which can be customized to fit posting or research needs. The dashboard can manage multiple platforms, including Twitter, Facebook, LinkedIn, and WordPress, with a separate tab for each platform. Each tab can be customized with ‘streams’ (feeds, keyword searches, lists, etc.) so you can curate the most relevant information on one screen.

This is a great way to see how the evaluation community is engaging with @aeaweb’s daily Tweets. The different streams help better identify good times to share posts, what content is most popular, and the best ways to present information. Using these insights, AEA seeks to better connect with the evaluation community on Twitter and other social media channels.

Here is a quick-and-easy guide to adding tabs and streams to your dashboard.

Rad Resource: Using Hashtags and Keywords to Follow the Conversation

HootSuite is an excellent resource for staying connected with other evaluators on social media and joining evaluation-related conversations. Add streams to your dashboard that follow keywords or hashtags and HootSuite will search social platforms for the most recent and relevant posts. This is where you come in – jump in, and say hello! Offer your thoughts, insights, and experience to add value to one of the many conversations that are happening. You may just meet some new friends!

Choosing your hashtags depends on the topics you are interested in, be it evaluation (#eval), data visualization (#dataviz), or even helpful Excel tips (#exceltip). Hashtags also allow you to follow along with industry events like AEA’s Evaluation 2014. By adding #Eval14 as a stream to your dashboard, you’ll see the most recent tweets about the event and what other evaluators are saying about it.

Want to learn more? Here’s a helpful resource from Fresh View Concepts on how to set up your HootSuite Dashboard.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Dan McDonnell on Evaluating Your Tweets
  2. Dan McDonnell on Using Lists to Become a Twitter Power User
  3. Dan McDonnell on 3 Tools to Make Reading and Sharing On Twitter Even Easier

MIE TIG Week: Andrea Guajardo on Culturally Responsive Evaluation in a Health Setting

American Evaluation Association 365 Blog - Fri, 08/22/2014 - 01:15

I am Andrea Guajardo, Director of Community Health for CHRISTUS Santa Rosa Health System and a doctoral student at the University of the Incarnate Word seeking a PhD focused on Organizational Leadership and Program Evaluation.  My role at the hospital provides numerous opportunities for evaluation of community-based programs as well as community collaborations, but the concept of culturally competent evaluation is largely foreign to practitioners who are focused on day-to-day operations.

Lessons Learned: Program managers and coordinators of community-based programs recognize that evaluation is crucial for improved health outcomes and process improvement. They also know how important cultural competence can be to the successful operation of a program, but unless these program managers take the proactive step of delving deeper, a more robust evaluation plan will not come to fruition. Evaluators must build a bridge from the accepted notion of basic evaluation to that of culturally responsive evaluation by emphasizing that the same culture that shapes program operations may also significantly affect the evaluation process. Making this leap with program managers is especially important in programs that serve diverse populations whose social determinants put them at risk of disparities in healthcare.

Managers, staff, and other stakeholders in community-based programs and interventions may not be fully aware of the implications for successful operations when evaluation is conducted in a way that does not consider the culture in which the program exists. Highlighting this and advocating for the continuous consideration of cultural competence during service provision as well as evaluation will facilitate program improvement and healthy communities.

Rad Resources:  Check out the Office of Minority Health’s page on cultural competence for their slant on the importance of this construct.

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Cultural Competence Week: Karen Anderson on Cultural Competence in Evaluation Resources
  2. Cultural Competence Week: Melanie Hwalek on the Adoption of the AEA Public Statement on Cultural Competence in Evaluation – Moving From Policy to Practice and Practice to Policy
  3. Cultural Competence Week: Rupu Gupta, Dominica McBride and Leah C. Neubauer on Critical Self-Reflection and Cultural Competence: From Theory to Practice

MIE TIG Week: Maurice Samuels on Building an Evaluator’s Capacity to Conduct Culturally Competent Evaluations

American Evaluation Association 365 Blog - Thu, 08/21/2014 - 01:15

Greetings! My name is Maurice Samuels and I’m a Lead Evaluation and Research Associate at Outlier Research and Evaluation, CEMSE|University of Chicago. Our group recently hosted an American Evaluation Association Graduate Education Diversity Intern (GEDI). This was a wonderful opportunity for me and my colleagues to influence the development of a new member of the field. She gained experience conducting an evaluation; more importantly, we supported her thinking about and practice of cultural competence in evaluation. Below are several helpful tips to introduce evaluators to cultural competence in evaluation:

Hot Tips:

  1. Immerse yourself in the literature – It is important to have an understanding of evaluation frameworks and approaches (e.g., culturally responsive evaluation, contextually responsive evaluation, cross-cultural evaluation) that are sensitized to culture and context in order to stimulate thinking about the role of culture in evaluation. Equally important is to have a comprehensive understanding of how culture has been characterized in other fields such as anthropology, health, and social work. This is particularly helpful given the various ways in which culture can be understood. For articles on the role of culture in evaluation check out http://education.illinois.edu/crea/publications.
  2. Use the resources available through the American Evaluation Association (AEA) – The AEA has several Topical Interest Groups (TIGs) that have an explicit commitment to culture and diversity (e.g., Multiethnic Issues in Evaluation (MIE) TIG; Disabilities and Other Vulnerable Populations (DOVP) TIG; Lesbian, Gay, Bisexual, and Transgender Issues (LGBT) TIG; Indigenous Peoples in Evaluation TIG; Feminist Issues in Evaluation TIG; International and Cross Cultural Evaluation (ICCE) TIG). In addition, commit yourself to AEA’s Cultural Competence in Evaluation and review their Introduction to the Cultural Readings of The Program Evaluation Standards and the Guiding Principles for Evaluators.
  3. Create opportunities to engage in dialogue about cultural competence – Introducing yourself to, or networking with, people in the field who share your interests and who are enacting cultural competence is important for making the practice concrete. Further, this encourages open conversations about culture, which helps refine one’s notions of cultural competence and provides multiple perspectives to draw upon.
  4. Encourage strong field work practices and self-reflection – When in the field, it is important that the evaluator builds relationships with clients and stakeholders, understands the context of the program and the surrounding community, and gives back to the community in tangible ways, such as volunteering at the program or attending program-sponsored events that are not related to the evaluation. As for self-reflection, it is important to document and share the decisions and assumptions made in the field through journaling and debriefing with a colleague.

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Cultural Competence Week: Dominica McBride on AEA 2013 and Spreading the Word on Cultural Competence
  2. Cultural Competence Week: Melanie Hwalek on the Adoption of the AEA Public Statement on Cultural Competence in Evaluation – Moving From Policy to Practice and Practice to Policy
  3. Cultural Competence Week: Osman Ozturgut, Tamara Bertrand Jones, and Cindy Crusto on Cultural Competence in Evaluation Dissemination Working Group: Teaching in the 21st Century

MIE TIG Week: Stafford Hood and Rodney Hopson on Continuing the Journey on Culture and Cultural Context in Evaluation

American Evaluation Association 365 Blog - Wed, 08/20/2014 - 01:15

Greetings AEA and evaluation family, we’re Stafford Hood, professor, University of Illinois-Urbana Champaign and Director, Center for Culturally Responsive Evaluation and Assessment (CREA) and Rodney Hopson, professor, George Mason University and Senior Research Fellow, Center for Education Policy and Evaluation.

We are members of the AEA Multiethnic Issues in Evaluation (MIE) TIG, having been long-time members and having seen the TIG grow over twenty (20) years. Additionally, we promote the historical and contemporary development of Culturally Responsive Evaluation (CRE). Grounded in the tradition of Robert Stake’s Responsive Evaluation of the 1970s, and influenced by the work of Gloria Ladson-Billings, Jackie Jordan Irvine, and Carol Lee, who coined Culturally Responsive Pedagogy twenty years later, CRE marries these perspectives into a holistic evaluation framework that centers culture throughout evaluation. With particular attention to historically marginalized groups, CRE seeks to bring their interests and matters of equity into the evaluation process.

Hot Tip: Refer to the CRE framework in the 2010 NSF User-Friendly Guide (especially the chapter by Henry Frierson, Stafford Hood, Gerunda Hughes, and Veronica Thomas) and the previous Hot Tip to illustrate how CRE can be applied in evaluation practice.

Lesson Learned: There is recognizable growth in what some may now call our culturally responsive evaluation community, particularly in the presence of a younger and more diverse cadre of evaluators. A recent search of scholar.google.com for the terms culturally responsive evaluation (CRE) and culturally competent evaluation (CCE), appearing anywhere in an article, chapter, or title published between 1990 and 2013, indicates a major increase in this discourse over a little more than a decade.

Rad Resources:

  • CREA is an international and interdisciplinary evaluation center that is grounded in the need for designing and conducting evaluations and assessments that embody cognitive, cultural, and interdisciplinary diversity that are actively responsive to culturally diverse communities and their academic performance goals;
  • CREA’s second conference is coming up: “Forging Alliances For Action: Culturally Responsive Evaluation Across Fields of Practice” will be held September 18-20, 2014, at the Oak Brook Hills Resort in Oak Brook, IL (Chicago area), and will feature seasoned and emerging scholars and practitioners in the field;
  • The AEA Statement on Cultural Competence in Evaluation is the 2011 membership-approved document that resulted from the Building Diversity Initiative (co-sponsored by AEA and the W.K. Kellogg Foundation in 1999);
  • Indigenous Framework for Evaluation, which synthesizes Indigenous ways of knowing and Western evaluation practice, is summarized in a Canadian Journal of Program Evaluation 2010 paper by Joan LaFrance and Richard Nichols.

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Stafford Hood on the 2013 CREA Conference: Repositioning Culture in Evaluation and Assessment
  2. Cultural Competence Week: Karen Anderson on Cultural Competence in Evaluation Resources
  3. Cultural Competence Week: Leah Neubauer on Resources and Updates from the Practice Committee

MIE TIG Week: Ray Kennard Haynes on the Use of Domestic Diversity Evaluation Checklists in Higher Education

American Evaluation Association 365 Blog - Tue, 08/19/2014 - 01:15

My name is Ray Kennard Haynes and I am an Assistant Professor at Indiana University-Bloomington with a keen interest in domestic racial Diversity in Higher Education (HE). Since the 1970s the United States (U.S.) has attempted to address Diversity by focusing primarily on race and gender through Equal Employment Opportunity (EEO) legislation. This legislation produced some gains; however, those gains have now eroded and are under threat due to legal challenges.

HE institutions in the US have ostensibly embraced Diversity and even claim to manage it. Evidence of this commitment to diversity can be seen in the proliferation of Diversity offices and programs at HE institutions and in the advent of the position of Chief Diversity Officer (CDO). The casual observer could reasonably conclude that Diversity has been achieved in HE. Surely, we see evidence of this reality in the CDO position and ubiquitous Diversity commitment statements. Note, too, that the term university can be construed as “the many and different in one place.” Given this meaning and the fact that one in every two U.S. residents will be non-white by the year 2050, Diversity in higher education looks like a fait accompli. But is HE really diverse with respect to domestic racial groups (i.e., African-Americans and Latino-Americans)?

Hot Tips: Research suggests that despite increasing racial diversity, communities and schools are re-segregating to levels representative of the 1960s. In highly selective institutions, diversity has come to mean many things, and underrepresented domestic students and faculty are becoming an increasingly smaller part of the Diversity calculus. The evidence suggests HE is becoming less domestically diverse: as domestic racial diversity increases nationally, access to higher education for African-Americans and Latino-Americans is decreasing, especially at highly selective schools.

One way for HE to show its commitment to domestic Diversity is to define and evaluate it within the broader construct of DIVERSITY that includes visible and non-visible differences.

Evaluation checklists can be applied to assess domestic diversity deficits and the thoroughness of related program implementation.

For HE institutions and evaluators who believe that domestic diversity matters, a good place to start is to create Domestic Diversity Evaluation Checklists that assess for both Diversity and Inclusion. These checklists should include dimensions that capture:

  • Diversity investment: the budget (investment) associated with domestic racial diversity
  • Structural diversity: the numbers of underrepresented domestic students and faculty
  • Diversity Climate: decision making and the level of meaningful cross-race interaction and inclusion in shaping the culture and direction of the HE institution

Rad Resources: For practical help with checklists, see Western Michigan University’s page on evaluation checklists and some examples of evaluation checklists.

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Cultural Competence Week: Lisa Aponte-Soto and Leah Christina Neubauer on Increasing the AEA Latino/a Visibility and Scholarship
  2. GEDI Week: Faheemah Mustafaa on Pursuing Racial Equity in Evaluation Practice
  3. GEDI Week: Nnenia Campell and Saúl Maldonado on Amplifying Definitions of Diversity in the Discourse and Practice of Culturally Responsive Evaluation, Part 1

MIE TIG Week: Nicole Clark on Engaging Young Women of Color in Program Design & Evaluation

American Evaluation Association 365 Blog - Mon, 08/18/2014 - 01:15

Hello! I’m Nicole Clark, a licensed social worker and independent evaluator for Nicole Clark Consulting. I specialize in working with organizations and agencies to design, implement, and evaluate programs and services specifically for women and young women of color.

Young women of color (YWOC) face many issues, including racism, sexism, and ageism, as well as challenges related to immigration status, socioeconomic status, and sexuality. How can evaluators make sure the programs we design and evaluate are affirming, inclusive, and raise the voices of YWOC?

To help you be more effective at engaging young Black, Latina, Asian/Pacific Islander, and Native/Indigenous women in your evaluation work, here are my lessons learned and a rad resource on engaging YWOC:

Lessons Learned: Not all YWOC are the same- YWOC are not a monolithic group. Within communities of color, there are a variety of cultures, customs, and regional differences to consider.

Meet YWOC where they are- What are the priorities of the YWOC involved in the program or service? When an organization is developing a program on HIV prevention while the YWOC they’re targeting are more concerned with the violence happening in their community, there’s a disconnect. What the organization (and even you as the evaluator) considers a high priority may not be to the YWOC involved.

Be mindful of slang and unnecessary jargon- Make your evaluation questions easy to understand and free from jargon. Be mindful of using slang words with YWOC. Given cultural and regional considerations (along with the stark difference in age between you as the evaluator and of the YWOC), slang words may not go over well.

Start broad, then get specific- Let’s use the example of creating evaluation questions on reproductive rights for YWOC. Creating evaluation questions around “reproductive rights” may not be as effective with YWOC as creating evaluation questions about “taking care of yourself.” While both can mean the same thing, “taking care of yourself” evokes an overall feeling of wellness and can get YWOC thinking of specific ways in which they want to take care of themselves. This can be narrowed down to aspects of their health they want to be more empowered about, and you can help organizations home in on these needs to develop a program or service that YWOC would be interested in.

Rad Resource: A great example of a YWOC-led program is the Young Women of Color Leadership Council (YWOCLC), a youth initiative through Advocates For Youth. Through thoughtful engagement of young people in their work, the YWOCLC cultivates a message of empowerment for young women of color, and it serves as a great example of a true youth-organization partnership framework. Pass this resource along to the youth-focused organizations you work with!

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Cheri Hoffman on Youth-Led Evaluation
  2. Kim Sabo Flores on Engaging Youth in Evaluation
  3. YFE Week: Katie Richards-Schuster on a Declaration for Youth Participation

MIE TIG Week: Dominica McBride on Living with the People

American Evaluation Association 365 Blog - Sun, 08/17/2014 - 01:15

Hi again, I’m Dominica McBride, Founder and CEO of Become: Center for Community Engagement and Social Change. A few weeks ago, I wrote a tip on the importance of cultural competence. I wrote on the perpetual sociopolitical dilemmas we face as a society. Today, I’m providing a way to contribute to alleviating these issues.

Start with the wisdom of Lao Tzu: “Go to the people. Live with them. Learn from them. Love them. Start with what they know. Build with what they have. But with the best leaders, when the work is done, the task accomplished, the people will say ‘We have done this ourselves.’”

It is out of this concept that real and sustainable transformation happens. I work in communities that are marginalized both socio-politically and economically. A remedy to this reality is co-creation; those affected by the decisions sit at the table, are equal partners in making the decisions, and co-create the conditions they desire.

Lesson Learned: For this to happen, framing and language are key. In one community project, we decided to call the community evaluation team the “elevation team,” which connotes a collective process of creating and realizing a vision. From this framing and subsequent relationship building, we built an evaluation team with parents, youth, elders, and organizational staff. Together, we’ve established a team vision and mission, evaluation questions, and methods, and we are now collecting data.

Hot Tips:

Believe in people. Just because someone has not completed high school, or is not yet over the age of 12, doesn’t mean they are not capable. The youth on our community evaluation team have come up with some of the best evaluation questions and are now engaging other youth in ways we, as adults, cannot.

Ask. Some make the mistake of thinking that community members (especially in marginalized areas) would not want to be involved in an evaluation or social change process. I’ve found this to be far from the truth, especially if the evaluation targets an issue about which they are passionate. In the discovery process, we learned what they cared about and then asked if they would be involved.

Build genuine, interdependent relationships. One-on-ones are at the heart of community organizing. Why? It’s because relationships are necessary in developing and maintaining cohesion and motivation. They are the glue for teams, especially those addressing challenging social issues. If relationships fall apart, the initiative will likely fail.

Rad Resources:  Check out the Community Tool Box for tips and tools on strengthening partnerships, advocacy, and sustaining the initiative.

Read Whatever It Takes by Paul Tough, an inspiring story about how to create change.

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. 

Related posts:

  1. YFE Week: Katie Richards-Schuster on a Declaration for Youth Participation
  2. Linda Lee on Using Visual Methods to Give Voice in Evaluations
  3. Pam Larson Nippolt on Soft Skills for Youth

Sheila B Robinson on Seeking an aea365 Intern!

American Evaluation Association 365 Blog - Sat, 08/16/2014 - 05:50

Hello loyal readers! I am Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor and I’m looking for a partner!

Get Involved: We’re looking for an aea365 intern  – a volunteer curator willing to donate perhaps an hour or two a week to help aea365 continue its commitment to maintaining a high quality daily blog by and for evaluators.

The aea365 intern will primarily assist with:

  • Recruiting contributors – sending invitations, communicating with leaders of sponsoring groups
  • Shepherding contributors – sending reminders, asking questions, giving thanks
  • Uploading contributions – entering posts into the aea365 WordPress-based website (very easy to learn!)
  • Contributing occasional aea365 posts

The commitment requires on average approximately 1-2 hours per week for six months beginning approximately October 1 and running through approximately April 1.

Lesson Learned: I began my position as Lead Curator with a six-month commitment and that was a year and a half ago! Yes, it’s just that fun and rewarding. I learn so much from reading each post, and I love “meeting” evaluators through my communication with them.

Hot Tip: The ideal intern has contributed to aea365 before, or at a minimum is a regular reader familiar with the format, breadth, and style of entries. She or he has strong writing and communication skills and is interested in making connections across the evaluation community. Finally, the work can be done remotely, from anywhere, so the intern should be self-directed, organized, and adept at meeting deadlines.

Serving as an aea365 intern is a great way to build your professional network and expand your knowledge of the breadth and depth of the field. The intern will receive ongoing mentoring throughout the term of the internship as well as support in learning how to use WordPress.

This is a volunteer position, and as such compensation will be in the form of our sincerest gratitude, thanks, and recognition of your contribution!

To apply – on or before Friday, September 12, send the following to aea365@eval.org: (1) a brief letter of interest noting your favorite type(s) of aea365 posts and why, and (2) an example original aea365 post following the contribution guidelines and demonstrating your writing/editing capacity.

Cool Trick: Be sure to follow the contribution guidelines, and proofread your work when submitting your example post.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Susan Kistler on Seeking an aea365 Intern
  2. Susan Kistler on Seeking an aea365 Intern
  3. Sheila B. Robinson on Joining the aea365 Author Community

Changing the Meme of Constant Growth

Networking Action - Fri, 08/15/2014 - 10:34

Constant growth is driving humanity off an ecological cliff, writes Dr. Sandra Waddock (Boston College). She describes how growth has become a “meme,” a core value, and she identifies diverse alternatives, from “thrivability” to “enough.” Dr. Waddock is active with

Kirk Knestis on Innovation Research and Development (R&D) vs. Program Evaluation

American Evaluation Association 365 Blog - Fri, 08/15/2014 - 01:15

Kirk Knestis, here, CEO of Hezel Associates—a research and evaluation firm specializing in education innovations. Like many of you, I’ve participated in “evaluation versus research” conversations. That distinction is certainly interesting, but our work in studying science, technology, engineering, and math education (STEM) leaves me more intrigued with what I call the “NSF Conundrum”—confusion among stakeholders (not least National Science Foundation [NSF] program officers) about the expected role of an “external evaluator” as described in a proposal or implemented for a funded project. This has been a consistent challenge in our practice, and is increasingly common among other agencies’ programs (e.g., Departments of Education or Labor). The good news is that a solution may be at hand…

Lessons Learned – The most constructive distinction here is between (a) studying the innovation of interest, and (b) studying the implementation and impact of the activities required for that inquiry. For this conversation, call the former “research” (following NSF’s lead) and the latter “evaluation”—or more particularly “program evaluation,” to further elaborate the differences. Grantees funded by NSF (and increasingly by other agencies) are called “Principal Investigators.” It is presumed that they are doing some kind of research. The problem is that their research sometimes looks like, or gets labeled “evaluation.”

Hot Tip – If it seems like this is happening (purposes and terms are muddled), reframe planning conversations around the differences described above—again, between research, or more accurately “research and development” (R&D) of the innovation of interest, and assessments of the quality and results of that R&D work (“evaluation” or “program evaluation”).

Hot Tip – When reframing planning conversations, take into consideration the new-for-2013 Common Guidelines for Education Research and Development developed by NSF and the US Department of Education’s Institute of Education Sciences (IES). The Guidelines delineate six distinct types of R&D based on the maturity of the innovation being studied. More importantly, they clarify “justifications for and evidence expected from each type of study.” Determine where in that conceptual framework the proposed research is situated.

Hot Tip – Bearing that in mind, explicate ALL necessary R&D and evaluation purposes associated with the project in question. Clarify questions to be answered, data requirements, data collection and analysis strategies, deliverables, and roles separately for each purpose. Define, budget, assign, and implement the R&D and the evaluation, noting that some data may support both. Finally, note that the evaluation of research activities poses interesting conceptual and methodological challenges, but that’s a different tip for a different day…

Rad Resources – The BetterEvaluation site features an excellent article framing the research-evaluation distinction: Ways of Framing the Difference between Research and Evaluation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Climate Ed Eval Week: Dan Zalles on Maintaining Flexibility in Evaluating Outcomes Across Varying Implementations
  2. STEM TIG Week: Kim Kelly on Key Insights on the Journey from Psychological Science Researcher to Program Evaluator
  3. STEM Week: Veronica Smith on Evaluating Student Learning in STEM Subjects

Carla Hillerns and Pei-Pei Lei on You had me at Hello: Effective Email Subject Lines for Survey Invitations

American Evaluation Association 365 Blog - Thu, 08/14/2014 - 01:15

Did we get your attention? We hope so. We are Carla Hillerns and Pei-Pei Lei – survey enthusiasts at the Office of Survey Research at the University of Massachusetts Medical School.

An email subject line can be a powerful first impression of an online survey. It has the potential to convince someone to open your email and take your survey. Or it can be dismissed as unimportant or irrelevant. Today’s post offers ideas for creating subject lines that maximize email open rates and survey completion rates.

Hot Tips:

  • Make it compelling – Include persuasive phrasing suited for your target recipients, such as “make your opinion count” and “brief survey.” Research in the marketing world shows that words that convey importance, like “urgent,” can lead to higher open rates.
  • Be clear – Use words that are specific and recognizable to recipients. Mention elements of the study name if they will resonate with respondents but beware of cryptic study names – just because you know what it means doesn’t mean that they will.
  • Keep it short – Many email systems, particularly on mobile devices, display a limited number of characters in the subject line. So don’t exceed 50 characters.
  • Mix it up – Vary your subject line if you are sending multiple emails to the same recipient.
  • Avoid words like “Free Gift” (even if you offer one) – Certain words may cause your email to be labeled as spam.
  • Test it – Get feedback from stakeholders before you finalize the subject line. To go one step further, consider randomly assigning different subject lines to pilot groups to see if there’s a difference in open rates or survey completion rates (see the sketch just after this list).
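
The random assignment behind that pilot test is easy to script. Below is a minimal Python sketch, not part of the original post: the recipient list, subject lines, and "opened" set are hypothetical placeholders, and in practice the open data would come from your email or survey platform.

```python
import random

# Hypothetical pilot list of recipient email addresses (placeholder data).
recipients = [f"person{i}@example.org" for i in range(200)]

# Candidate subject lines to compare.
subject_lines = [
    "Make your opinion count: brief 5-minute survey",
    "We want your feedback on our services",
]

# Randomly assign each pilot recipient to one subject line.
random.seed(42)  # fixed seed so the assignment is reproducible
assignments = {email: random.choice(subject_lines) for email in recipients}

# After sending, your email or survey platform would report who opened the
# message; this set is a stand-in for that real data.
opened = set(random.sample(recipients, 70))

# Compare open rates by subject line.
for line in subject_lines:
    group = [e for e, s in assignments.items() if s == line]
    opens = sum(1 for e in group if e in opened)
    print(f"{line!r}: {opens}/{len(group)} opened ({opens / len(group):.1%})")
```

With roughly 100 recipients per arm, even this crude comparison will show whether one line clearly outperforms the other; for a formal comparison you could feed the resulting counts into a two-proportion test.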

Cool Trick:

  • Personalization – Some survey software systems allow you to merge customized/personalized information into the subject line, such as “Rate your experience with [Medical Practice Name]” (a short sketch of the idea follows below).
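
Survey platforms usually handle this merge for you, but the mechanics are easy to see in a short, hypothetical Python sketch; the recipient records and field names below are made up for illustration.

```python
# Each record pairs an address with the value to merge into the subject line.
recipients = [
    {"email": "jdoe@example.org", "practice": "Riverside Family Medicine"},
    {"email": "asmith@example.org", "practice": "Hilltop Pediatrics"},
]

template = "Rate your experience with {practice}"

for person in recipients:
    subject = template.format(practice=person["practice"])
    print(person["email"], "->", subject)
```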

Lesson Learned:

  • Plan ahead for compliance – Make sure that any recruitment materials and procedures follow applicable regulations and receive Institutional Review Board approval if necessary.

Rad Resource:

  • This link provides a list of spam trigger words to avoid.

We’re interested in your suggestions. Please leave a comment if you have a subject line idea that you’d like to share.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Shortcut Week: Ana Drake on Macros and Hotkeys
  2. Susan Kistler on Formatting Qualitative Questions for Online Surveys
  3. Katya Petrochenkov on Surveygizmo

Molly Ryan on Using Icon Array to Visualize Data

American Evaluation Association 365 Blog - Wed, 08/13/2014 - 01:15

Hello! I’m Molly Ryan, a Research Associate at the Institute for Community Health (ICH), a non-profit in Cambridge, MA that specializes in community based participatory research and evaluation. I am part of the team evaluating the Central Massachusetts Child Trauma Center (CMCTC) initiative, which seeks to strengthen and improve access to evidence-based, trauma-informed mental health treatment for children and adolescents. I would like to share a great resource that we use to visualize and communicate findings with our CMCTC partners.

Rad Resource: Icon Array. University of Michigan researchers developed Icon Array to communicate risks to patients simply and effectively. For more information on why icons are so rad, check out Icon Array’s explanation and bibliography.

Hot Tip: Icon Array offers 5 different icons to choose from.

[Example icon array: 6 out of 11 reassessments (54.5%) received]

Hot Tip: Icons aren’t just for risk communication! We use icons to help our partners understand and visualize their progress collecting reassessment data for clients.

[Example icon array: 14 out of 24 reassessments (58.3%) received; 9 of 14 (64.3%) complete, 5 of 14 (35.7%) incomplete]

Cool Trick: Icon Array allows you to illustrate partial risk by filling only a portion of the icon. We used this feature to communicate whether a reassessment was complete or incomplete for a given client.
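
To show the idea outside the Icon Array site, here is a rough, hypothetical matplotlib sketch of the same kind of display, using the example counts above (9 complete squares fully filled, 5 incomplete squares half-filled, 10 not-yet-received squares empty). It approximates the concept; it is not the Icon Array tool itself.

```python
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle

# Counts mirror the example above: 24 expected reassessments in total.
counts = [("complete", 9, "full"), ("incomplete", 5, "half"), ("not received", 10, "empty")]

cols, rows = 6, 4
fig, ax = plt.subplots(figsize=(4, 3))

# Expand the counts into one fill style per cell.
cells = []
for _, n, fill in counts:
    cells.extend([fill] * n)

for i, fill in enumerate(cells):
    x, y = i % cols, rows - 1 - i // cols  # fill left-to-right, top-to-bottom
    # Outline for every cell.
    ax.add_patch(Rectangle((x, y), 0.9, 0.9, facecolor="white", edgecolor="black"))
    if fill == "full":
        ax.add_patch(Rectangle((x, y), 0.9, 0.9, facecolor="steelblue", edgecolor="black"))
    elif fill == "half":
        # Partial fill: shade only the left half of the cell.
        ax.add_patch(Rectangle((x, y), 0.45, 0.9, facecolor="steelblue", edgecolor="none"))

ax.set_xlim(0, cols)
ax.set_ylim(0, rows)
ax.set_aspect("equal")
ax.axis("off")
ax.set_title("14 of 24 reassessments received")
plt.show()
```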

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Sheila B Robinson on Being an “Iconic” Presenter! ;-)
  2. Susan Kistler on a Free Tool for Adding Interactivity to Online Reports: Innovative Reporting Part IV
  3. Loraine Park, Carolyn Verheyen, and Eric Wat on Tips on Asset Mapping

Emily Lauer and Courtney Dutra on Person-Centered Evaluation: Aging and Disability Services

American Evaluation Association 365 Blog - Tue, 08/12/2014 - 01:15

Hello, we are Emily Lauer and Courtney Dutra from the University of Massachusetts Medical School’s Center for Developmental Disability Evaluation and Research (CDDER). We have designed and conducted a number of evaluations of programs and projects for elders and people with disabilities. In this post, we focus on the topic of person-centered evaluations. We have found this type of evaluation to be one of the most effective strategies for evaluating aging and/or disability services, as it tends to provide results that are more valid and useful through empowering consumers in the evaluation process.

Why person-centered evaluation? Traditional evaluations tend to use a one-size-fits-all approach that risks substituting the evaluator’s judgment for consumers’ individual perspectives and may not evaluate components that consumers feel are relevant. In a person-centered evaluation, consumers of the program’s or project’s services are involved throughout the evaluation process. A person-centered evaluation ensures the program or project is evaluated in a way that:

  • is meaningful to consumers;
  • is flexible enough to incorporate varied perspectives; and
  • results in findings that are understandable to and shared with consumers.

Lessons Learned:

Key steps to designing a person-centered evaluation:

  1. Design the evaluation with consumers. Involve consumers in the development process for the evaluation and its tools.
  2. Design evaluations that empower consumers:
    • Utilize evaluation tools that support consumers in thinking critically and constructively about their experiences and the program under evaluation. Consider using a conversational format to solicit experiential information.
    • Minimize the use of close-ended questions that force responses into categories. Instead, consider methods such as semi-structured interviews that include open-ended questions which enable consumers to provide feedback about what is relevant to them.
    • Consider the evaluation from the consumer’s perspective. Design evaluation tools that support varied communication levels, are culturally relevant, and consider the cognitive level (e.g. intellectual disabilities, dementia) of consumers.
  3. Involve consumers as evaluators. Consider training consumers to help conduct the evaluation (e.g., as interviewers).
  4. Use a supportive environment. In a supportive environment, consumers are more likely to feel they can express themselves without repercussion, their input is valued, and their voices are respected, resulting in more meaningful feedback.

Hot Tip: Conduct the evaluation interview in a location that is comfortable and familiar for the consumer. When involving family or support staff to help the consumer communicate or feel comfortable, ensure they do not speak “for” the consumer, and that the consumer chooses their involvement.

  5. Involve consumers in synthesizing results. Bring consumers into the process of formulating and interpreting the evaluation’s findings.

Rad Resource: Use Plain Language to write questions and summarize findings that are understandable to consumers.

Many strategies exist to elicit feedback from consumers who do not communicate verbally; use these methods to include their perspectives as well.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. EPE TIG Week: Yvonne M. Watson on Buying Green Today to Save and Sustain Our Tomorrow
  2. MA PCMH Eval Week: Christine Johnson on Self-Assessment Medical Home Transformation Tools
  3. Joi Moore on Facilitating Systematic Evaluation Activities

PD Presenters: Kerry Zaleski, Mary Crave and Tererai Trent on Whose Judgment Matters Most? Using Child-to-Child approaches to evaluate vulnerability-centered programs

American Evaluation Association 365 Blog - Mon, 08/11/2014 - 01:15

Hi Eval Friends! We are Kerry Zaleski and Mary Crave of the University of Wisconsin-Extension and Tererai Trent of Tinogona Foundation and Drexel University. Over the past few years we have co-facilitated workshops on participatory M&E methods for centering vulnerable voices at AEA conferences and eStudies.

This year, we are pleased to introduce participatory processes for engaging young people in evaluation during a half day professional development workshop, borrowing from Child-to-Child approaches. Young people can be active change agents when involved in processes to identify needs, develop solutions and monitor and evaluate changes in attitudes and behaviors for improved health and well-being.

Child-to-Child approaches help center evaluation criteria around the values and perspectives of young people, creating environments for continual learning among peers and families. Children learn new academic skills and evaluative thinking while having fun solving community problems!

Child-to-Child approaches help young people lead their communities to:

  • Investigate, plan, monitor and evaluate community programs by centering the values and perspective of people affected most by poverty and inequality.
  • Overcome stigma and discrimination by intentionally engaging marginalized people in evaluation processes.

We are excited to introduce Abdul Thoronka, a community health specialist from Sierra Leone, as a new member of our team. Abdul has extensive experience using participatory methods and Child-to-Child approaches in conflict- and trauma-affected communities in Africa and the US.

Lessons Learned:

  • Adult community members tend to be less skeptical and more engaged when ‘investigation’ types of exercises are led by children in their community rather than external ‘experts’. The exercises make learning about positive behavior change fun and entertaining for the entire community.
  • Young people are not afraid to ‘tell the truth’ about what they observe.
  • Exercises to monitor behaviors often turn into a healthy competition between young people and their families.

Hot Tips:

  • Child-to-child approaches can be used to engage young people at all stages of an intervention. Tools can include various forms of community mapping, ranking, prioritizing, values-based criteria-setting and establishing a baseline to measure change before and after an intervention.
  • Build in educational curricula by having the children draw a matrix, calculate percentages or develop a bar chart to compare amounts or frequency by different characteristics.
  • Explain the importance of disaggregating data to understand health and other disparities by different attributes (e.g., gender, age, ability, race, ethnicity).
  • Ask children to think of evaluation questions that would help them better understand their situation.

Rad Resources:

Child-to-Child Trust

The Barefoot Guide Connection

AEA Coffee Break Webinar 166: Pocket-Chart Voting: Engaging vulnerable voices in program evaluation with Kerry Zaleski, December 12, 2013 (recording available free to AEA members).

Robert Chambers’ 2002 book, Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities.

Want to learn more? Register for Whose Judgment Matters Most: Using Child-to-Child approaches to evaluate vulnerability-centered programs at Evaluation 2014.

We’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO. Click here for a complete listing of Professional Development workshops offered at Evaluation 2014. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Kim Sabo Flores on Engaging Youth in Evaluation
  2. Ed Eval Week: Silvana Bialosiewicz on Tips for Collecting Valid and Reliable Data from Children
  3. YFE Week: Kim Sabo Flores and David White on Youth Focused Evaluation: One Year And Growing Strong!