Resource Feeds

PD Presenters: Brian Yates on Doing Cost-Inclusive Evaluation. Part IV: Cost-Effectiveness Analysis and Cost-Utility Analysis

American Evaluation Association 365 Blog - Sun, 08/10/2014 - 05:30

Hi! I’m Brian Yates. This is the fourth piece in a series of aea365 posts on using costs in evaluation. I started using costs as well as outcomes in my program evaluations in the mid-1970s, when I joined the faculty of the Department of Psychology at American University in Washington, DC. Today I’m still including costs in my research and consultation on mental health, substance abuse, and consumer-operated services.

Three earlier aea365 posts focused on evaluating costs, benefits, and the cost-benefit of programs; there’s even more to cost-inclusive evaluation!

Lesson Learned: What if important outcomes of a program are not monetary, and cannot be converted into monetary units? Easy answer: do a cost-effectiveness analysis or a cost-utility analysis!

Cost-effectiveness analysis (CEA) describes relationships between types, amounts, and values of resources consumed by a program and the outcomes of that program — with outcomes measured in their natural units. For example, the outcome of a prevention program for seasonal depression could be measured as days free of depression. Program costs could be contrasted to these outcomes by calculating “dollars per depression-free day” or “average hours of therapy A versus therapy B per depression-free day generated.”
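
To make that arithmetic concrete, here is a minimal sketch, in Python with purely illustrative numbers, of how a cost-effectiveness ratio such as “dollars per depression-free day” might be computed for two hypothetical therapies. Nothing in it comes from an actual evaluation; the program names, costs, and outcome counts are assumptions chosen only to show the calculation.

```python
# Minimal sketch of a cost-effectiveness comparison (illustrative numbers only).

def cost_effectiveness_ratio(total_cost, outcome_units):
    """Cost per unit of outcome, e.g. dollars per depression-free day."""
    return total_cost / outcome_units

# Hypothetical programs: total cost of delivering each therapy and the
# depression-free days generated across all participants.
therapy_a = {"cost": 120_000.0, "depression_free_days": 9_500}
therapy_b = {"cost": 95_000.0, "depression_free_days": 6_800}

for name, program in [("Therapy A", therapy_a), ("Therapy B", therapy_b)]:
    cer = cost_effectiveness_ratio(program["cost"], program["depression_free_days"])
    print(f"{name}: ${cer:.2f} per depression-free day")
```

The lower ratio identifies the therapy that produces a depression-free day at less cost, which is exactly the comparison CEA is designed to support.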

Hot Tip: How to compare apples and oranges. “But how can you compare costs of generating one outcome with costs of generating another? Cost per depression-free day versus cost per drug-free day?!” No problem: compare these “apples” and “oranges” by bumping the units up one notch of generality, to fruit. Diverse health program outcomes are now measured in common units of Quality-Adjusted Life Years (QALYs), with a year of living with depression counted as worth substantially less than a year of living without depression. This and other forms of cost-utility analysis (CUA) are increasingly used in health services funding decisions.
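
In the same spirit, here is a minimal sketch of a cost-utility calculation. It assumes hypothetical utility weights (say, 0.6 for a year lived with depression and 1.0 for a year lived without) and an assumed incremental program cost; these figures are illustrative, not published estimates, and simply show how “cost per QALY gained” could be computed.

```python
# Minimal sketch of a cost-utility (QALY) calculation; utility weights and
# costs are hypothetical assumptions, not published estimates.

def qalys(years_by_state, utility_weights):
    """Sum years lived in each health state, weighted by its utility (0 to 1)."""
    return sum(years * utility_weights[state] for state, years in years_by_state.items())

utility_weights = {"depressed": 0.6, "not_depressed": 1.0}  # assumed weights

# Outcome profiles for one participant over ten years, with and without the program.
without_program = {"depressed": 2.0, "not_depressed": 8.0}
with_program = {"depressed": 0.5, "not_depressed": 9.5}

qalys_gained = qalys(with_program, utility_weights) - qalys(without_program, utility_weights)
incremental_cost = 3_000.0  # assumed extra cost of the program per participant

print(f"QALYs gained per participant: {qalys_gained:.2f}")
print(f"Cost per QALY gained: ${incremental_cost / qalys_gained:,.0f}")
```

Because QALYs are a common currency across health conditions, the same calculation lets funders compare, say, a depression-prevention program with a substance-abuse program on one scale.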

Lessons Learned:

Insight Offered: It’s easy to dismiss the use of costs in evaluation with “…shows the price of everything and the value of nothing.” Actually, cost-inclusive evaluation encompasses the types and amounts of limited societal resources used to achieve outcomes measured in ways meaningful to funders and other stakeholders.

More? Yes! Lately I’ve gained better understanding of relationships between resources invested in programs and outcomes produced by programs when I work with stakeholders to also include information on program activities and clients’ biopsychosocial processes. More on that later.

Rad Resources:

Cost-effectiveness analysis (2nd edition) by Levin and McEwan.

Analyzing costs, procedures, processes, and outcomes in human services by Yates.

Want to learn more? Brian will be presenting a Professional Development workshop at Evaluation 2014 in Denver, CO. Click here for a complete listing of Professional Development workshops offered at Evaluation 2014. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. Brian Yates on Doing Cost-Inclusive Evaluation. Part II: Measuring Monetary Benefits
  2. Brian Yates on Doing Cost-Inclusive Evaluation – Part I: Measuring Costs
  3. PD Presenters Week: Brian Yates on Doing Cost-Inclusive Evaluation. Part III: Cost-Benefit Analysis

Dan McDonnell on Becoming an Amateur Graphic Designer with Canva

American Evaluation Association 365 Blog - Sat, 08/09/2014 - 11:16

Hello, my name is Dan McDonnell and I am a Community Manager at the American Evaluation Association (AEA) and a regular contributor to AEA365. Sharing photos has always been a popular pastime, and the rise of social media has made it easier than ever. The only major obstacle is that file-dimension requirements differ by channel: you’ll often have to resize a photo to fit the dimensions of the channel you’re using, and there’s no uniformity in requirements among Facebook, LinkedIn, and Twitter.

Canva is an online tool that turns you into an amateur graphic designer. Let’s say your research has uncovered some interesting facts, and you’d like to visualize this data in an interesting way. Canva can help you turn that data into a visually stunning infographic or a cool flyer, with no graphic design experience needed!

With drag-and-drop functionality, you can create collages, create or alter text, pull stunning backgrounds or personal photos into preset templates, and browse an extensive library of graphics, stock photos, and layout options to use in your designs. The main graphics-manipulation features are free, and some stock graphics are available for purchase (usually around $1.00 each).

Hot Tip: Create a Photo Collage

Canva

Starting a new design on Canva is a cinch. First, select one of the preset templates from the top bar (see image above). You can choose from a number of options, including a header photo for a handful of social networks, business cards, and general social media graphics, or you can even set your own custom dimensions. Once you’ve made your selection, you can choose a layout to customize from hundreds of different examples. I recommend starting with something simple, like a photo collage. Click ‘Search’ and select the ‘Grids’ option.

Make Your Selection

Select one of the Grid layouts by clicking it, then click the ‘Uploads’ button to upload photos from either your Facebook page or your hard drive. Once they’ve been uploaded, simply drag and drop them into the design area. When you’re pleased with how it looks, click ‘Link and Publish,’ and you’re ready to go. Simply download the image file and share it on the social media platform of your choice!

This is really just scratching the surface of what Canva is capable of, but hopefully it gives you enough skills to be dangerous as an amateur graphic designer. Enjoy!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Dan McDonnell on Google+ | Dan McDonnell on Twitter

Related posts:

  1. Dan McDonnell on Setting Up Your New Twitter Profile Page
  2. Dan McDonnell on Using Lists to Become a Twitter Power User
  3. Dan McDonnell on Upcoming Changes to Twitter

Eritrea: Ending the Exodus?

Authoritarian rule, social malaise and open-ended national service drive thousands of young people to flee Eritrea every month, exposing the shortcomings of a leadership that has lost the confidence of the next generation. The International Crisis Group’s latest briefing, Eritrea: Ending the Exodus?, shows that while the government turned this flight to its advantage for a time, the scale – and attendant criminality – of the exodus are now pressing problems.

CP TIG Week: Karen Countryman-Roswurm and Bailey Patton on Qualitative Research Methods as an Empowering Practice with Marginalized Youth

American Evaluation Association 365 Blog - Fri, 08/08/2014 - 01:15

We are Karen Countryman-Roswurm, Executive Director, and Bailey Patton, Community Outreach Coordinator, at Wichita State University’s Center for Combating Human Trafficking (CCHT). CCHT provides education, training, consultation, research, and public policy services to build the capacity for effective anti-trafficking prevention, intervention, and aftercare responses.

A primary service of CCHT is training organizations on The Lotus Victim to Vitality Anti-Trafficking Model™ – a model that includes practice tools such as the Domestic Minor Sex Trafficking Risk and Resiliency Assessment (DMST-RRA). The DMST-RRA is based on interviews with 258 youth and is intended to assist direct-service providers in 1) increasing identification of young people at-risk of and/or subjugated to DMST; and 2) providing individualized strengths-based prevention and intervention strategies. During the development of the DMST-RRA, we learned invaluable lessons on engaging youth in empowering practices through qualitative research.

Lessons Learned:

  • Allow the process to be organically healing – This could be the first time the participant has spoken about or reflected on their experience. The process of sharing one’s story can be empowering and healing when done in a safe and non-exploitive environment.
  • Let the participant lead – Be flexible and fully engaged in the process. Allow the participant to take the interview where it needs to go. Do not let your desire for information or research curiosities define the experience for the participant.
  • Reflect the participant’s words back to them – By hearing their words repeated back to them, participants have the opportunity to gain insight, process, and reach their own epiphany.
  • Allow participants the opportunity to find and use their own voice – Do not try to define the experience for the participant. Let them give words to their feelings, emotions and thoughts. Telling participants what you think of their situation is disempowering.
  • Offer Validation – Help relieve the participant of self-blame and guilt for their past experiences and encourage them to focus on resiliency factors and strength.

Hot Tip:

  • During the development of the DMST-RRA, facilitating, transcribing, and coding qualitative interviews about the real lives of those at risk of and/or subjugated to DMST was at times painfully heart-wrenching. Therefore: 1) Have supportive, competent partners. Tara Gregory with Wichita State University’s Center for Community Support and Research was extremely helpful during this process. 2) Recognize yourself as a “wounded healer.” Whether engaging in therapeutic and/or research practices, we must consistently seek to heal ourselves in a manner that enables us to use our full professional selves.

Rad Resources:

  • Wichita State University, Center for Combating Human Trafficking. Our website includes resources such as Sharing the Message of Human Trafficking: A Public Awareness and Media Guide to assist those interested in joining the anti-trafficking movement.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. CP Week: Tara Gregory on Using Storytelling to Help Organizations Develop Logic Models
  2. Oliwier Dziadkowiec and Trish Peaster on Using Social Network Analysis in Evaluation
  3. Martha Henry on Data Confidentiality and Data Ownership

Sustainable Development Goals

ODI general feed - Fri, 08/08/2014 - 00:00

A United Nations working group has drawn up proposed sustainable development goals to come into force after 2015.
This page brings together the work done on this project by different departments at the Overseas Development Institute.

Categories: Resource Feeds

Share your feedback with us!

FSG - Thu, 08/07/2014 - 04:16
Take our annual communications survey and be entered to win a $100 Amazon gift card.
Categories: Resource Feeds

CP TIG Week: Jeff Sheldon on using SEEPPO to determine empowerment evaluation model fidelity, adherence to empowerment evaluation principles, and the likelihood of psychological well-being outcomes

American Evaluation Association 365 Blog - Thu, 08/07/2014 - 01:15

I’m Jeff Sheldon from the School of Social Science, Policy, and Evaluation at Claremont Graduate University and today I’m introducing the Survey of Empowerment Evaluation Practice, Principles, and Outcomes (SEEPPO). I developed SEEPPO for my dissertation, but more important, as a tool that can be modified for use by researchers on evaluation and evaluation practitioners.

For practitioners, SEEPPO is an 82-item self-report survey (92 items for researchers) across seven sections (nine for researchers).

  • Section one items (“Your evaluation activities”) ask for a report on behaviors in terms of the specific empowerment evaluation steps implemented.
  • Section two (“Evaluation Participant Activities”) asks for observations on the evaluation-specific behaviors of those engaged in the evaluation as they relate to the empowerment evaluation steps implemented.
  • Section three (“Changes you observed in individuals’ values”) asks for a report on changes in evaluation-related values by comparing the values observed at the beginning of the evaluation to those observed at the end of the evaluation.
  • Section four items (“Changes you observed in individuals’ behaviors”) ask for a report on changes observed in evaluation-related behavior and whether the sub-constructs characterizing the psychological well-being outcomes of empowerment (i.e., knowledge, skills/capacities, self-efficacy) and self-determination (competence, autonomy, and relatedness) were present by comparing observed behaviors at the beginning of the evaluation to those at evaluation’s end.
  • Section five (“Changes you observed within the organization”) items ask for a report on the changes observed within the organization as a result of the evaluation by comparing various organizational capacities at the beginning of the evaluation to those observed at evaluation’s end.
  • Section six (“Inclusiveness”) asks about the extent to which everyone who wanted to fully engage in the evaluation was included.
  • Section seven (“Accountability”) items ask about who the evaluator was accountable to during the evaluation.
  • Lastly, the items in sections eight and nine, for researchers, ask about the evaluation model used and demographics.

This is a brief “snapshot” of SEEPPO. Item development was based on: 1) constructs found in the literature regarding the three known empowerment evaluation models and their respective implementation steps; 2) the ten principles (i.e., six process and four outcome) of empowerment evaluation; 3) the purported empowerment and self-determination outcomes for individuals and organizations engaged in the process of an empowerment evaluation; and 4) constructs found in the humanistic psychology literature on empowerment theory and self-determination theory.
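
For readers who want to adapt a SEEPPO-style structure in their own instruments, the sketch below shows one possible way to represent the sections described above as a simple data structure. The section titles follow the description in this post, but the field layout, focus summaries, and Python representation are illustrative assumptions, not the actual instrument.

```python
# Illustrative representation of the SEEPPO section structure described above;
# the layout is an assumption for adaptation purposes, not the actual survey.
from dataclasses import dataclass

@dataclass
class SurveySection:
    number: int
    title: str
    focus: str
    researcher_only: bool = False

seeppo_sections = [
    SurveySection(1, "Your evaluation activities", "empowerment evaluation steps implemented by the evaluator"),
    SurveySection(2, "Evaluation Participant Activities", "participants' evaluation-specific behaviors"),
    SurveySection(3, "Changes you observed in individuals' values", "pre/post changes in evaluation-related values"),
    SurveySection(4, "Changes you observed in individuals' behaviors", "pre/post changes in behavior and well-being sub-constructs"),
    SurveySection(5, "Changes you observed within the organization", "pre/post changes in organizational capacities"),
    SurveySection(6, "Inclusiveness", "whether everyone who wanted to engage was included"),
    SurveySection(7, "Accountability", "to whom the evaluator was accountable"),
    SurveySection(8, "Evaluation model used", "which empowerment evaluation model was implemented", researcher_only=True),
    SurveySection(9, "Demographics", "respondent demographics", researcher_only=True),
]

# Practitioner version: the seven sections that are not researcher-only.
practitioner_sections = [s for s in seeppo_sections if not s.researcher_only]
```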

Hot Tip: The results of SEEPPO can be used to determine whether you or your subjects are adhering with fidelity to the empowerment evaluation model being implemented, which principles of empowerment evaluation are in evidence, and the likelihood of empowerment and self-determination outcomes.

Rad Resource: Coming soon! SEEPPO will soon be widely available.

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Alice Hausman on Measuring Community-Defined Indicators of Success
  2. CPE Week: Abraham Wandersman on Empowerment Evaluation and Getting to Outcomes
  3. Jeff Sheldon on the Readiness for Organizational Learning and Evaluation instrument

Maximising international finance for development in the poorest and most vulnerable countries

ODI general feed - Thu, 08/07/2014 - 00:00
This paper examines post-crisis trends and issues and proposes policy options to support positive outcomes.
Categories: Resource Feeds

Fossil fuel subsidies in developing countries

ODI general feed - Thu, 08/07/2014 - 00:00
This paper aims to provide a review of the organisations and governments involved in supporting other countries to reform their fossil fuel subsidies and the approaches being undertaken.
Categories: Resource Feeds

CP TIG Week: Rachel Becker-Klein on Using Most Significant Change: A Participatory Evaluation Strategy That Empowers Clients and Builds Their Evaluation Capacity

American Evaluation Association 365 Blog - Wed, 08/06/2014 - 01:15

My name is Rachel Becker-Klein and I am an evaluator and a Community Psychologist with almost a decade of experience evaluating programs. Since 2005, I have worked with PEER Associates, an evaluation firm that provides customized, utilization-focused program evaluation and educational research services for organizations nationwide.

Recently I have been using an interview and analysis methodology called Most Significant Change (MSC). MSC is a strategy that involves collecting and systematically analyzing significant changes that occur in programs and in the lives of program participants. The methodology has been found to be useful for monitoring programmatic changes, as well as for evaluating the impact of programs.

Lessons Learned: Many clients are interested in taking an active role in their evaluations, but may not be sure how to do so. MSC is a fairly intuitive approach to collecting and analyzing data that clients and participants can be trained to use. Having project staff interview their own constituents can help create a high level of comfort for interviewees, allowing them to share more openly. Staff-conducted interviews also provide staff with a sense of empowerment in collecting data. The MSC approach also includes a participatory approach to analyzing the data. In this way, the methodology can be a capacity-building process in and of itself, supporting project staff to learn new and innovative monitoring and evaluation techniques that they can integrate into their own work once the external evaluators leave.

Cool Trick: In 2012, Oxfam Canada contracted with PEER Associates to conduct a case study of their partner organization in the Engendering Change (EC) program in Zimbabwe – Matabeleland AIDS Council (MAC). The EC program funds capacity-building of Oxfam Canada’s partner organizations. This program is built around a theory of change that suggests partners become more effective change agents for women’s rights when their organizational structures, policies, procedures, and programming are also more democratic and gender just.

The evaluation employed a case study approach, using MSC methodology to collect stories from MAC staff and their constituents. In this case study, PEER Associates trained MAC staff to conduct the MSC interviews, while the external evaluators documented the interviews with video and/or audio and facilitated discussions on the themes that emerged from those interviews.

Rad Resources:

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Wilder Research Week: Caryn Mohr on Case Studies
  2. Chris Michael Kirk on Negotiating the Value of Evaluation
  3. Ann Zukoski on Participatory Evaluation Approaches

Post-crisis trends in private capital flows to developing countries

ODI general feed - Wed, 08/06/2014 - 00:00
This paper focuses on the poorest and most vulnerable low income countries (“LICs”).
Categories: Resource Feeds

Indonesian climate change efforts at stake

ODI general feed - Wed, 08/06/2014 - 00:00
If Indonesia's new president cannot seize the momentum gathered under Susilo Bambang Yudhoyono, and misses the opportunity to sustain international public and private investment in sustainable agriculture and renewable energy, it would be fateful for Indonesia and the world at large.
Categories: Resource Feeds

Climate Finance Leadership and Learning Forum

ODI general feed - Wed, 08/06/2014 - 00:00
The Africa Climate Finance Hub hosts the first annual seminar under the Climate Finance Readiness Leadership Programme 2014. The seminar is aimed at participants from the government institutions responsible for national planning, budgeting, climate change policy-making and coordination. A team from the African Climate Finance Hub and the Overseas Development Institute will be facilitating, with additional input provided by leading regional and international experts.

Categories: Resource Feeds

BRICS development bank, too good to be true?

ODI general feed - Wed, 08/06/2014 - 00:00
The willingness and increasing capability of the BRICS are, at last, providing a form of global public good in development finance at a time when many of the global economic governance negotiations are in jeopardy.
Categories: Resource Feeds

Incredible Budgets - budget credibility in theory and practice

ODI general feed - Wed, 08/06/2014 - 00:00
In many countries, particularly low-income and/or fragile states, national budgets are often poor predictors of revenue and expenditure. This roundtable will bring together CAPE staff, academics and public financial management practitioners to discuss why, in spite of considerable efforts to strengthen budget credibility, many governments still struggle to execute their budgets to plan.
Categories: Resource Feeds

Ecosynomics: Transforming to a Collaborative Future

Networking Action - Tue, 08/05/2014 - 09:33

Transformation involves changes in basic assumptions.  Changing the basic economic paradigm assumption from managing scarcity to an “ecosynomics” one of great abundance is a transformation that my good colleague Jim Ritchie-Dunham is working on.  In his new book Ecosynomics, …

Risks of Intelligence Pathologies in South Korea

A series of intelligence scandals has plagued South Korea since the fall of 2012, exposing the risk of intelligence failure, the politicisation of intelligence and direct intervention by intelligence agencies in domestic politics. In its latest report, Risks of Intelligence Pathologies in South Korea, the International Crisis Group examines measures needed to reduce those vulnerabilities and explains why failure or manipulation of intelligence in South Korea could have serious consequences for security on the peninsula and beyond.

CP TIG Week: Abraham Wandersman on Demystifying Evaluation: Even A Fourth Grader Likes Empowerment Evaluation

American Evaluation Association 365 Blog - Tue, 08/05/2014 - 01:15

Hi, I’m Abe Wandersman and I have been working since the last century to help programs achieve outcomes by building capacity for program personnel to use evaluation proactively.  The words “evaluation” and “accountability” scare many people involved in health and human services programs and in education.   They are afraid that evaluation of their program will prove embarrassing or worse and/or they may think the evaluation didn’t really evaluate their program.   Empowerment evaluation (EE) has been devoted to demystifying evaluation and putting the logic and tools of evaluation into the hands of practitioners so that they can proactively plan, implement, self-evaluate, continuously improve the quality of their work, and thereby increase the probability of achieving outcomes.

Lesson Learned: Accountability does not have to be relegated solely to “who is to blame” after a failure occurs (e.g., problems in the U.S. government’s initial rollout of the health insurance website, and Secretary of Health and Human Services Kathleen Sebelius’ resignation, or the Veterans Administration scandal and Secretary Shinseki’s resignation). It actually makes sense to think that individuals and organizations should be proactive and strategic about their plans, implement the plans with quality, and evaluate whether or not the time and resources spent led to outcomes. It is logical to want to know why certain things are being done and others are not, what goals an organization is trying to achieve, that the activities are designed to achieve the goals, that a clear plan is put into place and carried out with quality, and that there be an evaluation to see if it worked. EE can provide funders, practitioners, evaluators, and other key stakeholders with a results-based approach to accountability that helps them succeed.

Hot Tip: I am very pleased to let you know that in September 2014, there will be a new EE book: Empowerment Evaluation: Knowledge and Tools for Self-Assessment, Evaluation Capacity Building, and Accountability (Sage, second edition), edited by Fetterman, Kaftarian, & Wandersman. Several chapters are authored by community psychologists: Langhout and Fernandez describe EE conducted by fourth and fifth graders; Imm et al. write about the SAMHSA Service to Science program that helps practice-based programs reach evidence-based criteria; Haskell and Iachini describe empowerment evaluation in charter schools to reach educational impacts; Chinman et al. describe a decade of research on the Getting To Outcomes® accountability approach; Suarez-Balcazar, Taylor-Ritzler, & Morales-Curtin describe their work on building evaluation capacity in a community-based organization; and Lamont, Wright, Wandersman, & Hamm describe the use of practical implementation science in building quality implementation in a district school initiative integrating technology into education.

Rad Resources:

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. CPE Week Bonus: Abraham Wandersman on Empowerment Accountability
  2. CPE Week: Abraham Wandersman on Empowerment Evaluation and Getting to Outcomes
  3. CP TIG Week: Wendi L. Siebold on Where The Rubber Meets The Road: Taking A Hard Look At Organizational Capacity And The Use Of Empowerment Evaluation

Surveying livelihoods service delivery and governance - baseline evidence from Nepal

ODI general feed - Tue, 08/05/2014 - 00:00
In 2012, SLRC implemented the first round of an original sub-regional panel survey in Nepal, aimed at producing data on people's livelihoods, their access to and experience of basic services, and their perceptions of governance.
Categories: Resource Feeds

No development unless somebody pays

ODI general feed - Tue, 08/05/2014 - 00:00
In this financing progress blog, Jonathan Glennie concludes that the world does not need another list of financial problems and solutions; it needs a list of priorities.
Categories: Resource Feeds