Monitoring, Evaluation and Learning Systems

PD Presenters Week: Mindy Hightower King and Courtney Brown on A Framework for Developing High Quality Performance Measurement Systems of Evaluation

Hello. We are Mindy Hightower King, Research Scientist at Indiana University and Courtney Brown, Director of Organizational Performance and Evaluation at Lumina Foundation. We have been working to strengthen and enhance performance management systems for the last decade and hope to provide some tips to help you create your own.

Lesson Learned: Why is this important?

Funders increasingly emphasize the importance of evaluation, often through performance measurement. To meet this expectation, you must develop high quality project objectives and performance measures, both of which are critical to good proposals and successful evaluations.

A performance measurement system:

  • Makes it easier for you to measure your progress
  • Allows you to report progress easily and quantitatively
  • Allows everyone to easily understand the progress your program has made
  • Can make your life a lot easier

Two essential components to a performance measurement system are high quality project objectives and performance measures.

Project objectives are statements that reflect specific goals that can be used to gauge progress. Objectives orient you toward measuring performance outcomes and typically focus on only one aspect of a goal. Strong project objectives concisely communicate the aims of the program and establish a foundation for high quality performance measures.

Cool Trick: When developing project objectives, be sure to consider the following criteria of high quality project objectives: relevance, applicability, focus, and measurability.

Performance measures are indicators used at the program level to track the progress of specific outputs and outcomes a program is designed to achieve. Strong performance measures are aligned with program objectives. Good performance measurement maximizes the potential for meaningful data reporting.

Cool Trick: When developing performance measures, be sure to answer the following questions:

  • What will change?
  • How much change do you expect?
  • Who will achieve the change?
  • When will the change take place?

For example, a measure might read: “By June 2016, 75% of participating teachers will increase their use of formative assessment strategies, as measured by classroom observation rubrics.”

Hot Tip: Make sure your performance measures:

  • Have an action verb
  • Are measurable
  • Don’t simply state what activity will be completed 

Rad Resources: There are a host of books and articles on performance measurement systems, but here are two good online resources with examples and tips for writing high quality objectives and measures:  Guide for Writing Performance Measures  and Writing good work objectives: Where to get them and how to write them.

Want to learn more? Register for A Framework for Developing and Implementing a Performance Measurement System of Evaluation at Evaluation 2014.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO. Click here for a complete listing of Professional Development workshops offered at Evaluation 2014. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. GOVT Week: David Bernstein on Top 10 Indicators of Performance Measurement Quality
  2. MA PCMH Eval Week: Ann Lawthers, Sai Cherala, and Judy Steinberg on How You Define Success Influences Your Findings
  3. Susan Wolfe on When You Can’t Do An Evaluability Assessment

PD Presenters Week: Osman Özturgut and Cindy Crusto on Remembering the “Self” in Culturally Competent Evaluation

American Evaluation Association 365 Blog - Mon, 07/28/2014 - 01:15

Hi, we’re Osman Özturgut, assistant professor, University of the Incarnate Word and Cindy Crusto, associate professor, Yale University School of Medicine. We are members of the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group. We are writing to inform you of our forthcoming professional development workshop at Evaluation 2014 in Denver. Since the last meeting in Washington, we have been “learning, unlearning, and relearning” with several groups and workshop participants with respect to cultural competence. We wanted to share some of our learning experiences.

Our workshop is entitled, “Acknowledging the ‘Self’ in Developing Cultural Competency.” We developed the workshop to highlight key concepts written about in the AEA Public Statement that focus on the evaluator hirself and what the evaluator can do to better engage in work across cultures. As the AEA’s Public Statement explains, “Cultural competence requires awareness of self, reflection on one’s own cultural position.” Cultural competence begins with awareness and an understanding of one’s own viewpoints (learning). Once we become aware of, reflect on, and critically analyze our existing knowledge and viewpoints, we may need to reevaluate some of our assumptions (unlearning). It is only then that we can reformulate our knowledge to accommodate and adapt to new situations (relearning). This process of learning, unlearning, and relearning is the foundation of becoming a more culturally competent evaluator.

We learned that evaluators really want a safe place to talk about culture, human diversity, and issues of equity. In our session, we provide this safe place and allow for learning. Participants can explore their “half-baked ideas”, as one of our previous workshop participants had mentioned. This is the idea that we don’t always have the right words or have fully formulated thoughts and ideas regarding issues of culture, diversity, and inclusion. We believe it is crucial to provide a safe place to share ideas, even if they are “half-baked”.

Lessons Learned: We learned that the use of humor is critically important when discussing sensitive topics and communicating across cultures. It reduces anxiety and tension.

Providing a safe place for discussion is crucial, especially with audiences with diverse cultural backgrounds and viewpoints. Be open to unlearning and relearning – Remember, culture is fluid and there is always room for improvement. Get out of your comfort zone to realize the “self”.

Rad Resource: AEA’s Public Statement on Cultural Competence in Evaluation

Also, see Dunaway, Morrow & Porter’s (2012) Development and validation of the cultural competence of program evaluators (CCPE) self-report scale.

Want to learn more? Register for Acknowledging the “Self” in Developing Cultural Competency at Evaluation 2014.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO. Click here for a complete listing of Professional Development workshops offered at Evaluation 2014. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Scribing: Vidhya Shanker on Discussions Regarding the AEA Cultural Competence Statement
  2. CC Week: Osman Özturgut and Tamera Bertrand Jones on Integrating Cultural Competence into Your AEA Presentation
  3. Cultural Competence Week: Rupu Gupta and Tamara Bertrand Jones on Cultural Competence Working Group Evaluation

PD Presenters Week: M. H. Clark and Haiyan Bai on Using Propensity Score Adjustments

American Evaluation Association 365 Blog - Sun, 07/27/2014 - 01:15

Hi! We are M. H. Clark and Haiyan Bai from the University of Central Florida in Orlando, Florida. Over the last several years propensity score adjustments (PSAs) have become increasingly popular; however, many evaluators are unsure of when to use them. A propensity score is the predicted probability of a participant selecting into a treatment program based on several covariates. These scores are used to make statistical adjustments (e.g., matching, weighting, stratification) to data from quasi-experiments to reduce selection bias.
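To make this concrete, here is a minimal sketch (not from the original post) of one common PSA, 1:1 nearest-neighbor matching on an estimated propensity score, written in Python with pandas and scikit-learn. The synthetic data and column names are purely illustrative.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def propensity_match(df, treatment_col, covariate_cols):
    """1:1 nearest-neighbor matching (with replacement) on an estimated propensity score."""
    # Model the probability of being in the treatment group from the covariates.
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariate_cols], df[treatment_col])
    df = df.assign(pscore=model.predict_proba(df[covariate_cols])[:, 1])

    treated = df[df[treatment_col] == 1]
    control = df[df[treatment_col] == 0]

    # For each treated unit, pick the control unit with the closest propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_controls = control.iloc[idx.ravel()]

    # Outcome comparisons (and balance checks) are then run on this matched sample.
    return pd.concat([treated, matched_controls])

# Illustrative synthetic data: older participants are more likely to self-select in.
rng = np.random.default_rng(0)
df = pd.DataFrame({"age": rng.normal(40, 10, 500), "pretest": rng.normal(50, 8, 500)})
df["treated"] = (rng.random(500) < 1 / (1 + np.exp(-(df["age"] - 40) / 10))).astype(int)
matched = propensity_match(df, "treated", ["age", "pretest"])
```

Weighting (e.g., inverse probability of treatment) and stratification use the same estimated score; only the adjustment step differs.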

Lesson Learned:

PSAs are not the magic bullet we had hoped they would be. Never underestimate the importance of a good design. Many researchers assume that they can fix poor designs with statistical adjustments (either with individual covariates or propensity scores). However, if you are able to randomly assign participants to treatment conditions or test several variations of your intervention, try that first. Propensity scores are meant to reduce selection bias due to non-random assignment, but can only do so much.

Hot Tip:

Plan ahead! If you know that you cannot randomly assign participants to conditions and you MUST use a quasi-experiment with propensity score adjustments, be sure that you measure covariates (individual characteristics) that are related to both the dependent variable and treatment choice. Ideally, you want to include all variables in your propensity score model that may contribute to selection bias. Many evaluators consider propensity score adjustments only after they have collected data and cannot account for some critical factors that cause selection bias; in that case, treatment effects may still be biased even after PSAs are applied.

Hot Tip:

Consider whether or not you need propensity scores to make your adjustments. If participants did not self-select into a treatment program, but were placed there because they met a certain criterion (e.g., having a test score above the 80th percentile), a traditional analysis of covariance used with regression discontinuity designs may be more efficient than PSAs. Likewise, if your participants are randomly assigned by pre-existing groups (like classrooms), a mixed-model analysis of variance might be preferable. On the other hand, sometimes random assignment does not achieve its goal of balancing all covariates between groups. If you find that some of your covariates (e.g., average age) differ between treatment conditions even after randomly assigning your participants, PSAs may be a useful way of achieving the balance random assignment failed to provide.
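Building on this tip, here is a small illustrative helper (again, not from the post) for checking covariate balance with standardized mean differences; it assumes a pandas DataFrame like the one in the earlier sketch, with a 0/1 "treated" column.

```python
import numpy as np
import pandas as pd

def standardized_mean_difference(df, covariate, treatment_col="treated"):
    """Difference in group means divided by the pooled standard deviation."""
    t = df.loc[df[treatment_col] == 1, covariate]
    c = df.loc[df[treatment_col] == 0, covariate]
    pooled_sd = np.sqrt((t.var(ddof=1) + c.var(ddof=1)) / 2)
    return (t.mean() - c.mean()) / pooled_sd

# e.g., standardized_mean_difference(df, "age")
# Absolute values above roughly 0.1-0.25 are commonly read as meaningful imbalance.
```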

Rad Resource:

William Holmes recently published a great introduction to using propensity scores, and Haiyan Bai and Wei Pan have a book that will be published next year.

Want to learn more? Register for Propensity Score Matching: Theories and Applications at Evaluation 2014.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO. Click here for a complete listing of Professional Development workshops offered at Evaluation 2014. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. Wei Pan on Propensity Score Analysis
  2. LAWG Week: Joseph Gasper on Propensity Score Matching for “Real World” Program Evaluation
  3. LAWG WEEK: Michelle Slattery on the Need for Evaluators in Problem Solving Courts

Dan McDonnell on Learning More About Your Twitter Community with FollowerWonk

American Evaluation Association 365 Blog - Sat, 07/26/2014 - 12:37

Hello, my name is Dan McDonnell and I am a Community Manager at the American Evaluation Association (AEA). If you’re a frequent Twitter user, you’re probably familiar with Twitter’s ‘Who to Follow’ feature – a widget in the right sidebar that ‘suggests’ Twitter users for you to follow, based on your profile and following list. If you’re like me, you use this feature often, sometimes feel you’ve exhausted the suggestions Twitter provides, and are interested in digging a bit deeper. Enter: Followerwonk!

Followerwonk


Hot Tip: Search Twitter Bios

For starters, Followerwonk offers a robust Twitter bio/profile search feature. When you search a keyword like ‘evaluation’, Followerwonk will return a full list of results with several sortable criteria: social authority, followers, following, and age of the account. The really cool part, however, is the Filters option. You can narrow the results to individuals with whom you have a relationship (they follow you or you follow them), to reciprocal followers, or to those with whom you are not currently connected, which is a great way to find interesting new people to follow.

Hot Tip: Learn More About Your Followers

Using the ‘Analyze Followers’ tab, you can search for a Twitter handle and find some really interesting details about your network of followers (or folks that you follow). Like Twitonomy, Followerwonk will map out the location of your followers and the most active hours that they are Tweeting (great for identifying optimal times to post!). In addition, you’ll see demographic details, Tweet frequency information and even a nifty wordcloud of the most frequently Tweeted keywords.

Hot Tip: Compare Followers/Following

Now here’s where Followerwonk really shines. Let’s say I want to see how many followers of @aeaweb also follow my personal Twitter account, @Dan_McD. Or maybe you’re a data visualization geek, and want to see what accounts both Stephanie Evergreen (@evalu8r) and AEA (@aeaweb) are following to find some new, interesting Twitter users to follow. The Compare Users tab allows you to see what followers certain accounts have in common and add them to your network!

Using Followerwonk can give you a better overall view of your Twitter community, whether it be identifying interesting connections between your followers or surfacing new users to follow by comparing followers of those you trust. Many of the features of Followerwonk (including some I didn’t cover today) are available for free – and for those that aren’t, a 30-day free trial is all you need. What are you waiting for?

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Dan McDonnell on Google + | Dan McDonnell on Twitter

 

Related posts:

  1. Dan McDonnell on Making New Friends and Mastering Lesser-Known Twitter Features Without Third Party Apps
  2. Dan McDonnell on Even More Ways to Analyze Tweets with Twitonomy
  3. Dan McDonnell on Twitter Etiquette and Data Archiving

GSNE Week: Alice Walters on Stakeholder Engagement

American Evaluation Association 365 Blog - Fri, 07/25/2014 - 01:15

I’m Alice Walters, a member of AEA’s Graduate Student and New Evaluator TIG.  I am a doctoral student in human services and work as a non-profit consultant in fund development, marketing, and evaluation.  Here, I explore potential pitfalls and offer recommendations for new evaluators, based on my experience working with stakeholders.

Hot Tip 1:  Stakeholders are central to evaluation – include them in every step of the process.

This may be Evaluation 101 but it bears emphasizing.  Identify, include, and inform stakeholders.  Think carefully and critically about all parties involved in evaluation outcomes.  Leaving out key stakeholders may lead to a poor quality evaluation that misses important perspectives.  Key decision-making stakeholders should be engaged in the evaluation process to keep the evaluation relevant.

Rad Resource: Engaging Stakeholders. This CDC guide has a worksheet for identifying and including stakeholders in evaluation.

Hot Tip 2:  Be proactive in frequent & ongoing communication with stakeholders.

Don’t assume that the perspectives shared in initial evaluation conversations remain unchanged.  Frequent communication with stakeholders will alert you to any changes in stakeholder perspectives toward the evaluation.  Ongoing communication will also keep lines of communication open and inform stakeholders of evaluation progress.

Rad Resource: A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions. This 48-page resource from the Robert Wood Johnson Foundation covers engaging stakeholders throughout the evaluation process.  It provides worksheets and a range of useful communication strategies.

Hot Tip 3:  Take the time to consider stakeholders’ views at every stage of evaluation.

Stakeholders may be unclear about the evaluation process, its steps, and the methods used.  Be sure to explain and continue to inform at every stage of evaluation.  As a new evaluator, I made the faulty assumption that stakeholder views were unchanged from initial evaluation meetings.  I also missed opportunities to communicate during later evaluation stages, when stakeholder responses might have signaled changing circumstances.  Evaluators should be cautious about assuming that evaluation environments and stakeholder views are static.

Rad Resource: Who Wants to Know? A 4-page tip sheet from Wilder Research on stakeholder involvement. An evaluator’s expertise may require working at a remove from direct stakeholder contact, particularly from key decision-making stakeholders.  The relevance of an evaluation requires ongoing stakeholder input.  Successful evaluation requires keeping communication channels open with stakeholders.

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. OL-ECB Week: Bonnie Richards on Setting the Stage: Evaluation Preparation and Stakeholder Buy-In
  2. Bikash Kumar Koirala on The Importance of Effective Communication in Participatory Monitoring and Evaluation
  3. Cultural Competence Week: Asma Ali and Anthony Heard on Beyond the Findings: Reflections on the Culturally Competent Evaluator’s Role

GSNE Week: Laura Pryor on Considerations for Teacher Evaluations with Multiple Measures

American Evaluation Association 365 Blog - Thu, 07/24/2014 - 01:15

Greetings, I am Laura Pryor. In addition to being a GEDI alumna, I am a student at UC Berkeley’s Graduate School of Education in the Quantitative Methods and Evaluation program. As part of my graduate evaluation work, I have been exploring the recent trend of using multiple measures to evaluate teachers. As part of this trend, many policymakers and district leaders are combining multiple measures into a summative composite score, often for the purposes of high-stakes decision making (such as salary and personnel).

As a graduate student evaluator, I have been exploring two questions:
1) Is it necessary and/or purposeful to create a composite score?
2) If so, how should an evaluator combine multiple measures into a single composite score?

I hope this post provides insight into these questions so that evaluators can more easily navigate the increasingly popular context of high-stakes teacher evaluations.

Hot Tip 1: The purpose of the evaluation should determine whether a composite score is needed. While it may be a current trend, not all multiple-measure evaluation systems are used for personnel or salary decisions. For many districts and schools, the evaluation system is used to help teachers/staff identify areas for improvement; in this case, a composite score is not always necessary. If the evaluation system is intended for multiple purposes, prioritize those purposes with stakeholders and discuss whether the evaluation system can feasibly serve multiple uses.

Hot Tip 2: If creating a composite score, select a model that is most appropriate for the evaluation:
a. The conjunctive approach: A pass/fail score is given; individuals must score at a specified passing level on every measure.
b. The disjunctive approach: A pass/fail score is given; individuals are only required to score at a passing level on one of the measures.
c. The compensatory approach: Individuals are given a continuum of scores; low scores on certain measures are compensated for by high scores on other measures.

Hot Tip 3: When using a compensatory approach, decide how to combine the measures (a brief code sketch of both options follows this list):
a. Clinically: Evaluation stakeholders decide how to weight each measure; this is often called the ‘eyeballing’ approach.
b. Statistically: Select a criterion target and use regression methods to statistically determine the weights for each measure; this approach is considered more accurate than the clinical approach.
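The sketch below is an added illustration, not part of the original post; it shows how the conjunctive, disjunctive, and compensatory approaches from Hot Tips 2 and 3 might be computed in Python with pandas and scikit-learn. The measure names, cut scores, and criterion variable are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

measures = ["observation", "student_survey", "value_added"]            # hypothetical measures
cut_scores = pd.Series({"observation": 3.0, "student_survey": 3.5, "value_added": 0.0})

def conjunctive(df):
    # Pass only if every measure meets its cut score.
    return (df[measures] >= cut_scores).all(axis=1)

def disjunctive(df):
    # Pass if at least one measure meets its cut score.
    return (df[measures] >= cut_scores).any(axis=1)

def compensatory_clinical(df, weights):
    # Stakeholder-chosen ("eyeballed") weights; they should sum to 1.
    return sum(df[m] * w for m, w in weights.items())

def compensatory_statistical(df, criterion):
    # Regression-derived weights against a chosen criterion (e.g., next-year growth).
    reg = LinearRegression().fit(df[measures], df[criterion])
    return df[measures].mul(pd.Series(dict(zip(measures, reg.coef_)))).sum(axis=1)

# Usage with synthetic data:
rng = np.random.default_rng(1)
df = pd.DataFrame(rng.normal(3, 1, size=(200, 4)), columns=measures + ["next_year_growth"])
df["pass_conjunctive"] = conjunctive(df)
df["composite_clinical"] = compensatory_clinical(
    df, {"observation": 0.5, "student_survey": 0.2, "value_added": 0.3})
df["composite_statistical"] = compensatory_statistical(df, "next_year_growth")
```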

Rad Resources:

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. EdEval Week: Jade Caines on Survey Design
  2. Linda Cabral and Laura Sefton on So many to choose from: How to Select Organizations for a Site Visit
  3. MME Week: Hongling Sun on Mixed Methods Design

GSNE Week: Jenna LaChenaye on Re-Socializing Yourself from Practitioner to Academic

American Evaluation Association 365 Blog - Wed, 07/23/2014 - 01:15

Hello, evalusphere! I am Jenna LaChenaye from the University of Alabama at Birmingham. As an evaluation practitioner transitioning into the world of academia, I have found myself squarely facing the learning curve that separates these two vital yet divergent arenas of evaluation. As an evaluator and lover of social research inquiry, I reveled in the pursuit of solving real world issues, completing utilization-focused reporting and training, and moving on to the next challenging project. My goal was (and continues to be) to complete rigorous and professional work that addresses local issues through the tools of evaluation. I prized spending time on activities that I deemed immediately and visibly impactful.

Transitioning into the world of academia, however, has put me in a position of re-socialization. I must not only continue to produce useful work that is rooted in real problems, but must also generate products that build on the academic community’s current work and the university/department’s mission (which can often seem like two very different conversations). However, academia provides many benefits that I did not find as an independent evaluator, such as access to immense resources, funding, and an impressive community of practice. Furthermore, I have come to see the evaluator-to-academic role as even more of a service to our profession because of the value of bringing practical experience and a focus on action into the academic sphere.

Hot Tip 1: Evaluation is often misunderstood by more traditional faculty. Share your knowledge of evaluation and you will often find colleagues who have a need for your action-based skill set.

Hot Tip 2: Many universities have centers that conduct evaluation work for the school and community. Seek out and connect with these groups as a way to seamlessly transition to the academic world.

Hot Tip 3: Many universities offer mentoring and development programs. Contact your faculty development center and/or department for more information.

Hot Tip 4: Academia and the next generation of scholars can benefit immensely from your knowledge and experience. If you work strictly as a practitioner, consider teaching an online or adjunct course.

Lessons Learned:

  • Like any other shift in work, moving to academia comes with a learning curve as you re-socialize into the role.
  • Academia is more of a translation of practitioner evaluation work than the very divergent jump it seems to be.
  • Colleagues are more than happy to provide support if asked.

Rad Resources:

  • Tanya Golash-Boza, Ph.D. writes a great, simple blog on navigating the academic world and maintaining a work/life balance, a helpful resource for those of us who want a jump start
  • Translating evaluation reporting to a journal format can be tough. Search Eval.org for resources addressing this transition

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. SW TIG Week: Carolyn Sullins and Ladel Lewis on Cultural Competence in the Evaluation of Community Programs
  2. Deshonna Collier-Goubil on Evaluator-Practitioner Collaboration
  3. WE Week: Nick Hart on the Value of Affiliate Connections with the Academic Community

GSNE Week: Kate Westaby and Valerie Moody on How to Hit the Ground Running in your New Evaluation Position!

American Evaluation Association 365 Blog - Tue, 07/22/2014 - 01:15

Greetings! We are Kate Westaby and Valerie Moody, new evaluators from two Clinical and Translational Science Award (CTSA) institutes. Kate is an Evaluation Research Specialist at the University of Wisconsin-Madison Institute for Clinical and Translational Research and Valerie is the Evaluation Coordinator at the University of Iowa Institute for Clinical and Translational Science. At the 62 CTSA institutes nationwide, program evaluation operates in a complex, dynamic, unpredictable environment: it is mandated by NIH but implemented in a wide variety of ways by evaluators with diverse backgrounds.

Due to our personal efforts learning to adapt to these complicated surroundings, we wanted to know if there were best practices for new evaluators to orient themselves to their workplaces. Last year, we interviewed 16 new evaluators from 14 CTSA institutes to gather the most helpful strategies for learning about evaluation, thus allowing new evaluators to hit the ground running.

“I felt it was like putting together a 1,000-piece puzzle, but nobody gave you the cover,” — quote from a new CTSA evaluator.

Hot Tip 1: Learn the history of evaluation efforts at your workplace. New evaluators found this to be the most helpful strategy. Many suggested using programmatic documents (e.g., grant proposals, strategic goal documents, etc.) to find useful historical information. They were better able to understand evaluation needs and review progress towards those needs in a short period of time.

Hot Tip 2: Attend face-to-face meetings (or a conference) with evaluators who are doing similar work. This setting allowed new evaluators to hear what strategies others are using, what their struggles have been, and how they turned their struggles into successes. It also allowed them to establish face-to-face networks for future communication.

Hot Tip 3: Ask questions! Supervisors or colleagues can provide insight into program history, politics, and help you avoid reinventing the wheel. Don’t be afraid to speak up!

Rad Resource: For more tips on how to get comfortable in your new workplace or to look into which strategies were least helpful to our interviewees, check out our AEA 2013 poster below (or download a larger version from AEA’s public elibrary here).

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Boris Volkov on What (Internal) Evaluators Can Do to Advance Evaluation Capacity Building
  2. OPEN Week: Adrienne Zell on Evaluation for People Who Can’t Be Bothered to Do It
  3. BLP TIG Week: Carla Forrest on Using Appreciative Approaches to Drive Workplace Performance

Labs as a Large Systems Change Strategy

Networking Action - Mon, 07/21/2014 - 12:28

The idea of “labs” as a way to address complex issues is relatively new. Zaid Hassan, in The Social Labs Revolution, estimates it to be about 20 years old. His intimate involvement in developing a number of them …

GSNE Week: Alice Walters on The Art and Science of Networking

American Evaluation Association 365 Blog - Mon, 07/21/2014 - 01:15

I’m Alice Walters, a member of AEA’s Graduate Student and New Evaluator TIG.  I am a doctoral student in human services and work as a non-profit consultant in fund development, marketing, and evaluation.  I share some networking tips, below.

Networking is needed at every career stage.  Review tips and resources to increase your effectiveness.  Enjoy using your networking skills as both art and science to see what serendipitous outcomes transpire!

Hot Tip 1:  Networking is developing informal connections with other professionals.

You can build informal connections any time you meet other professionals.  Don’t exclude those outside your usual networks, who can be a source of unexpected developments.

Rad Resource: “Developing a Strong Professional Network” by the Penn State Alumni Association

Hot Tip 2:  Networking is about more than just a job hunt. Networking is often associated with job-hunting success, but it can be much more than that.  Networking can lead you to new avenues, help you develop new collaborations, and bring attention to your own work in new venues.

Rad Resources: “Tips for Successful Business Networking” and “10 Advantages of Business Networking” by Susan M. Heathfield

Hot Tip 3:  Networking is not really an “activity”; it is a lifestyle. Networking is not an isolated activity you add to your calendar.  Instead, it is a process, an approach, and an outlook on professional relationships.

Rad Resource: “Cheat Sheet: 9 Professional Networking Tips” by Jillian Kurvers

Hot Tip 4:  Networking for the shy is easier when you don’t think of it as “networking.” Even the most outgoing people can struggle with the pressure to force a professional connection.  Instead, it is better to explore relationships by asking questions that occur naturally to you.

Rad Resource: “How to Network: 12 Tips for Shy People” by Meridith Levinson

Hot Tip 5:  Networking is an art.  It’s creative, flexible, and individualistic. Use your strengths to network.  Just as art appeals differently to individuals, networking can accommodate a variety of styles.

Hot Tip 6:  Networking is a science.  It deserves study and analysis. Science is study.  Networking is thoughtful.  It seeks to connect the random dots.  Networking requires analysis of input data.  It’s not an oxymoron to look for serendipity.  Serendipity is defined as finding something valuable but not sought for.  Still, if you are looking for connections and value, you will be more likely to find them.

Hot Tip 7:  AEA is a great resource for networking. AEA is the hub of evaluation professionals.  The AEA Topical Interest Groups, conferences, and local affiliates are a great place to start. On the AEA home page, go to Read > Links of Interest > Professional Groups (the third tab to the right): http://www.eval.org/p/cm/ld/fid=69

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. OPEN Week: Erin Stack & Lindsey Patterson on Successfully Transitioning from Student to Professional
  2. Tamara Bertrand Jones on Finding and Working With a Mentor
  3. WE Week: Nick Hart on the Value of Affiliate Connections with the Academic Community

GSNE Week: Ayesha Tillman on Graduate Student and New Evaluator (GSNE) TIG Mentorship program

American Evaluation Association 365 Blog - Sun, 07/20/2014 - 03:38

Hello, I am Ayesha Tillman, and I have all but deposited my dissertation for a Ph.D. degree in Educational Psychology with an evaluation specialization from the University of Illinois at Urbana-Champaign (Illinois). Along with Rae Clementz, Sarah Wilkey-Gordon, Tiffany Smith, Pat Barlow, and Nora Gannon-Slater, I am a mentor in the Graduate Student and New Evaluator (GSNE) TIG Peer-Mentorship program. I have five mentees located across the U.S. and abroad, in Louisiana, California, Michigan, Texas, and the Dominican Republic. So far, my participation as a mentor has been an incredibly rewarding and worthwhile endeavor.

Lessons Learned: All of my mentees joined the GSNE TIG peer-mentoring program because they were looking for someone to bounce ideas off of, share experiences with, and get tips and advice from. Below is advice I have shared.

Hot Tip 1: Presenting. AEA, the American Educational Research Association (AERA), and the Center for Culturally Responsive Evaluation and Assessment (CREA) host three conferences to which evaluators can submit presentation proposals. If you are uncomfortable submitting a paper, start with roundtable and poster presentations.

Hot Tip 2: Publishing. Publishing in evaluation can be tricky. Evaluation journals (and conferences) do not want submissions that simply report the results of an evaluation. Research on evaluation and reflections on evaluation practice are well suited for publication. For example, the American Journal of Evaluation “explores decisions and challenges related to conceptualizing, designing and conducting evaluations.”

Hot Tip 3: Capacity building. The workshops at the AEA conference, the AEA summer evaluation institute, and AEA eStudies are great professional development opportunities for evaluators. The AEA Graduate Education Diversity Internship Program is an awesome opportunity for graduate students of color and from other under-represented groups who would like to extend their research capacities to evaluation.

Rad Resources:

  • GSNE mentorship program mentees. If you are interested in being a mentee, make sure you are a member of the GSNE TIG. You will receive an email once a quarter with the opportunity to become a mentee. If you are interested in being paired with a GSNE mentor, please send an email to Kristin Woods.
  • GSNE mentorship program mentors. If you are interested in being a mentor, you should have been an AEA member for two or more years and have attended at least one annual conference. If you are interested and willing to be a GSNE mentor, please send an email to Kristin Woods.
  • GSNE TIG Facebook page. The GSNE Facebook group is a great place to connect with other graduate students and new evaluators. We share resources, opinions, advice, and network on Facebook.

AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. GSNE Week: Kristin Woods on Gaining Practical Experience as a New Evaluator
  2. Tamara Bertrand Jones on Finding and Working With a Mentor
  3. STEM TIG Week: Ayesha Tillman on Advice to New Evaluators

Sheila B Robinson on EvalYear: A Taste of 2015 and a bit of Alphabet Soup to Whet Your Appetite

American Evaluation Association 365 Blog - Sat, 07/19/2014 - 06:38

Good morning! I’m Sheila B Robinson, aea365’s Lead Curator and sometimes Saturday contributor. I love to share evaluation news and, ever the teacher, I look for opportunities to educate aea365 readers. Today’s lesson is about EvalYear, the International Year of Evaluation. If you haven’t yet heard about this, it’s time to get reading!

In October 2013, at the Third International Conference on National Evaluation Capacities in São Paulo, Brazil, it was announced that 2015 would be the International Year of Evaluation (EvalYear). EvalPartners, the global movement to strengthen national evaluation capacities, is behind the effort, and it’s a big effort! Two leading partners and 47 core partners (of which AEA is one), along with 1,580 evaluators/activists, have joined or expressed interest in the declaration of EvalYear.

“The aim of designating 2015 as the International Year of Evaluation is to advocate and promote evaluation and evidence-based policy making at international, regional, national and local levels.”

Image credit: Hans Watson via Flickr

Lesson Learned: When you visit the EvalYear website and start reading, you will come across no fewer than 15 acronyms! Most are spelled out, but not all of them (or not on every page), so to prepare, have a little taste of alphabet soup!

  • IOCE – International Organization for Cooperation in Evaluation
  • IEG – Independent Evaluation Group
  • OECD/DAC – Organization for Economic Cooperation and Development / Development Assistance Committee
  • VOPE – Voluntary Organization of Professional Evaluators
  • MDGs – Millennium Development Goals
  • SDGs – Sustainable Development Goals
  • UNEG – United Nations Evaluation Group
  • ECG – Evaluation Cooperation Group
  • ALNAP – Active Learning Network for Accountability and Performance 
  • TF – Task Force
  • NECD – National Evaluation Capacity Development
  • QCPR – Quadrennial Comprehensive Policy Review
  • CSO – Civil Society Organization
  • ECD – Evaluation Capacity Development
  • EFGR – Equity Focused and Gender Responsive

“EvalYear will position evaluation in the policy arena, by raising awareness of the importance of embedding monitoring and evaluation systems in the development and implementation of the forthcoming Sustainable Development Goals, and all other critical local contextualized goals, at the international and national levels. EvalYear is about taking mutual responsibility for policies and social action through greater understanding, transparency, and constructive dialogue.”

Hot Tip: Visit EvalYear to learn more and consider how you will get involved!

Rad Resources: Check out the resource center for presentations and updates. The EvalYear logo and brochure are currently available in 18 languages and are being translated into many more.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Neha Karkara on the Evaluation Advocacy Toolkit
  2. Susan Kistler on VOPEs and the EvalPartners Innovation Challenge
  3. Amir Fallah on Resources for International Evaluators

LAWG Week: Amber Hill on Boosting Online Survey Participation

American Evaluation Association 365 Blog - Fri, 07/18/2014 - 01:15

Hello! My name is Amber Hill and I am a research specialist at McREL International’s Denver, Colorado office. My work focuses on education research and my responsibilities include managing online surveys administered to state departments of education, districts, school staff, parents, and students in the United States, Pacific region, and Australia. Encouraging online survey participation can be tricky, which is why I use a variety of methods.

Hot Tip – Work with the IT Pros

Ensuring that participants receive the survey in the first place can be half the battle. No matter your level of information technology (IT) expertise, it is helpful to coordinate efforts between the IT pros who work for your survey software provider, your own organization, and the organization for which you are administering the survey. Those three groups can help you with whitelisting, test emails, firewalls, and broken links.

Hot Tip – Communicate Early, Communicate Often

Participants are often leery of participating in a survey administered by a stranger, especially if the content is sensitive. Working with a partner organization that is familiar to participants helps increase understanding about the purpose, value, and trustworthiness of the survey and evaluator. Partner organizations may send an e-mail to participants with the evaluator’s name and contact information in advance of the recruitment e-mail. Follow-up and reminder e-mails from the evaluator that include references to the partner organization show participants the coordination between the organizations. Keeping surveys open for extended periods of time also allows for more reminders and more opportunities for participants to ask questions.

Cool Trick – Provide Appropriate Participant Incentives

Incentives such as monetary compensation or prizes can motivate participants to spend their time on the survey. Try to think of something that participants would genuinely enjoy or find useful. Incentives may go to participants or survey administrators, depending on how the survey is distributed. When funding is limited, a drawing for a prize among participants who elect to provide their contact information may be effective.

Rad Resources – Denver’s Urban Trails and Parks

While at Evaluation 2014, you will notice that Denver’s outdoor culture thrives everywhere, from mountain peaks to downtown. The Colorado Convention Center bumps up against the Cherry Creek Trail, which, taken north, leads to Confluence Park and, taken south, to Sunken Gardens Park and beyond. A quick exploration west will hook you up with the South Platte River Trail and Sloan’s Lake Park.  Longer treks east of downtown will reward visitors with mountain views at Cheesman Park (go to the Pavilion) and animal life at the Denver Zoo and City Park. Get outside!

We’re thinking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Registration will soon be open! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Related posts:

  1. Carla Hillerns on Thoughtful Tokens of Appreciation to Encourage Study Participation
  2. Jessica Foster on Maximizing Survey Response Rates
  3. LAWG Week: Antonio Olmos, Stephanie Fuentes, and Sheila Arens on Welcoming You to LAWG Week and Inviting You to Come to Evaluation 2014 in Denver this October

LAWG WEEK: Michelle Slattery on the Need for Evaluators in Problem Solving Courts

American Evaluation Association 365 Blog - Thu, 07/17/2014 - 01:15

Greetings from Colorado!  I am Michelle Slattery, President and founder of Peak Research, a consulting firm specializing in program evaluation and STEM research, and evaluator for the 4th Judicial District Veteran Trauma Court at the University of Colorado – Colorado Springs Trauma, Health & Hazards Center. I am writing about the need for evaluators in problem solving courts. These courts provide treatment and alternatives to incarceration for people with community health issues like substance use disorders or combat trauma. They are growing rapidly (more than 2,500 courts nationwide) because they show promise for increasing treatment engagement and reducing recidivism (relapse into negative behavior). They are easily criticized, however, because little evaluation research is currently being done to inform their processes, measure their cost savings, and provide evidence of impact.

My team recently concluded work on a 5-year Jail Diversion and Trauma Recovery – Priority to Veterans (JDTR) grant administered by the Colorado Office of Behavioral Health and funded by the Substance Abuse and Mental Health Services Administration (SAMHSA), which required and funded extensive process and outcome evaluation. During our tenure on the JDTR grant, we helped improve the court and obtain funding and sustainability by sharing and publishing results that document significant improvements in recidivism, post-traumatic stress disorder, substance use, depression, self-harm, and resilience. You can read more about the work here. Problem solving courts are a natural fit for evaluators, providing a rare opportunity to conduct evaluation that can help save lives while also improving communities.

Cool Trick: Check out Says-It.com to create your own custom signs like the Uncle Sam recruiting poster above.

Lesson Learned: Experimental studies with random selection and random assignment are very difficult to implement in the judicial system.  Quasi-experimental designs using propensity score matching may be the best alternative.

Rad Resource: Check out the Interactive Map provided by the National Association of Drug Court Professionals to find problem solving courts in your state.

Hot Tips:

  • While you’re in town, the Colorado Convention Center is a 5-10 minute cab ride from some fun diversions – Voodoo Doughnuts (1520 E. Colfax) and the Tattered Cover Book Store (2526 E. Colfax).  Voodoo is famous for their voodoo dolls, complete with a pretzel stake through the raspberry jelly heart, as well as an assortment of vegan doughnuts, throwback flavors to the 70s like Captain Crunch, and a giant glazed doughnut called the “Tex Ass.”
  • If you’re in the mood for a short road trip, you are just an hour’s drive from Colorado Springs, home of the U.S. Air Force Academy, Garden of the Gods park, and the Cog Railroad which goes to the top of Pikes Peak.  At the top, you will also find doughnuts being made – at 14,110 feet!

We’re thinking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Registration will soon be open! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Related posts:

  1. Margaret Braun and Shannon Myrick on Evaluating Drug Courts
  2. FIE TIG Week: Kathryn Sielbeck-Mathes and Rebecca Selove on Feminist Evaluation and Framing
  3. Olmos-Gallo, DeRoche, and McKinney on Increasing Stakeholder Sophistication

LAWG Week: Laureen Trainer on Enjoying Evaluation in the Moment

American Evaluation Association 365 Blog - Wed, 07/16/2014 - 01:15

My name is Laureen Trainer and in January I became Principal of Trainer Evaluation in Denver; to say the past six months have been a mixture of excitement and terror would be an understatement. But, the good news is that I love the decision to start my own business and I’m excited to begin the next phase of my evaluation career.

As a one-person company, I do all of the data collecting, which leads to some really cool moments. Recently I’ve been observing the new school experience at the Clyfford Still Museum. My past life includes an MA in art history, so I could tell you all about Still and Abstract Expressionism and his role in the pantheon of American art. Yet, I was never a big fan of his work. But these past few weeks, I’ve listened to a lot of kids trying to describe their thoughts and feelings when looking at his art, and it has been awesome! Truly.

For example, when looking at one abstract work with a large field of orange covering 2/3 of the canvas and a mixture of purple, gray and black vertical lines covering the remaining 1/3, one 8th grader commented that it reminded him of day and night. It was a good start, and I could see that, but he went further. He said that the orange took up the majority of the canvas because we remember more and live more during the day, so it is bigger, and even though we sleep a lot, we don’t remember that part of our lives, so that is why the dark part covers less of the canvas. I could go on with other examples of zen, autumn, war, hope, despair…but there isn’t time.

Their ideas have made me take another look at some of the artwork and have led to a new appreciation of the Clyfford Still Museum. Plus, it has been wild to combine my two worlds of art history and evaluation.

Hot Tip: It’s okay to live in the moment and enjoy what you are observing. I know that as an evaluator, I am there to document certain aspects of the tour and the interaction between the students and gallery teachers. However, I have totally enjoyed listening to the kids during my observations. I’ve even gone back through the galleries at the end of a tour to take a new look at an old painting.

Rad Resource: For additional great modern and contemporary art when you’re in Denver (all are close to the Colorado Convention Center), visit Museum of Contemporary Art, Denver Art Museum, Kirkland Museum and our great public art program, which includes mustangs, bears and brooms!

We’re thinking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Registration will soon be open! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Related posts:

  1. LAWG Week: Marley Steele-Inama on Collaboratively Building Evaluation Capacity
  2. LAWG Week: Antonio Olmos, Stephanie Fuentes, and Sheila Arens on Welcoming You to LAWG Week and Inviting You to Come to Evaluation 2014 in Denver this October
  3. Sheila B. Robinson on the Annual Call for Proposals – Evaluation 2014

LAWG Week: Mya Martin-Glenn & Lisa M. Jones on Evaluation in Public School Districts

American Evaluation Association 365 Blog - Tue, 07/15/2014 - 01:15

We are Mya Martin-Glenn and Lisa M. Jones, and we work in the Division of Accountability & Research at Aurora Public Schools in Colorado. We will share some of the nuances external evaluators should know when requesting school data. We also will give you a few hot tips for attending the AEA conference in Denver this October.

Lesson Learned: Know the district policies as well as the federal laws governing student data sharing. There are specific federal laws and rules that govern student data sharing, including the Family Educational Rights and Privacy Act (FERPA) and Children’s Online Privacy Protection Act (COPPA).

FERPA protects student education records and COPPA requires online sites and services (such as Survey Monkey and others) to provide notice and obtain permission from a child’s parents (for children under 13) before collecting personal information from that child.

Hot Tip: Talk with someone in the district prior to requesting student data even if the evaluation is being conducted as a requirement of a grant. See if there is a central research and evaluation division that oversees data sharing with external entities. Also, check with the state – often the data you need is readily available.

Lesson Learned: Be sure you understand data coding. School district personnel download student data from data management systems such as Infinite Campus (IC). Frequently, data are stored in these systems using programmatic codes specific to the school district. It often takes considerable time to download and “clean” the data file for distribution to external evaluators.

Hot Tip: Ask for a “data dictionary” to help with any coding that may be unfamiliar to you.
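For evaluators who work in Python, here is a tiny illustrative sketch (not from the original post) of putting a data dictionary to work: mapping district-specific codes to readable labels and flagging any codes the dictionary does not cover. The field names and codes are hypothetical.

```python
import pandas as pd

# Hypothetical extract from a student information system, with district-specific codes.
students = pd.DataFrame({
    "student_id": [101, 102, 103],
    "exit_status": ["00", "40", "99"],
})

# Hypothetical entries from the district's data dictionary for the exit_status field.
exit_status_codes = {"00": "Still enrolled", "40": "Graduated", "70": "Transferred out of district"}

students["exit_status_label"] = students["exit_status"].map(exit_status_codes)

# Flag any codes missing from the dictionary so you can ask district staff about them.
unknown = students.loc[students["exit_status_label"].isna(), "exit_status"].unique()
print("Codes not in the data dictionary:", list(unknown))
```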

Rad Resources: Currently our district is working on revising the external data request process, but here are some examples of other school district requirements for collecting data in schools.

Hot Tips: AEA Annual Meeting in Denver

  • Drink plenty of water – Start a week or so before arriving in Denver so your body has a chance to acclimate to the altitude, which can be dehydrating.
  • Wear sunscreen and lip balm – Even in October, the mile high city is closer to the sun.
  • Bring your walking shoes – There are a lot of fun places within walking distance of the conference hotels (as well as a Light Rail system):

    • Comedy Works, 1226 15th St.
    • Denver Performing Arts Complex, 950 13th St.
    • Mercury Café, 2199 California St.
    • Denver Microbrew Tour, Great Divide Brewing Company – 303-578-9548
    • Brown Palace Hotel, 321 17th St. – High tea is a lovely experience, or take a tour of the historic hotel.

We’re thinking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Registration will soon be open! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Related posts:

  1. LAWG Week: Antonio Olmos, Stephanie Fuentes, and Sheila Arens on Welcoming You to LAWG Week and Inviting You to Come to Evaluation 2014 in Denver this October
  2. MNEA Week: Katherine Drake on Requesting Data from School Districts
  3. LAWG Week: Marley Steele-Inama on Collaboratively Building Evaluation Capacity

LAWG Week: Marley Steele-Inama on Collaboratively Building Evaluation Capacity

American Evaluation Association 365 Blog - Mon, 07/14/2014 - 01:15

My name is Marley Steele-Inama, and I manage Audience Research and Evaluation at Denver Zoo. The Local Arrangements Working Group (LAWG) is excited to share with you the great evaluation work taking place in Colorado, as well as to give you advice for making the most of Evaluation 2014 in Denver. Coloradans are very proud of our state; don’t be shocked to see many locals wearing clothing adorned with the state flag’s emblem!

Denver harbors a spirit of collaboration, and this rings true for an initiative of which I’m a part – the Denver-area Evaluation Network (DEN). This network is made up of 15 different museums and cultural institutions, most of which are part of the Scientific and Cultural Facilities District (SCFD), which distributes sales and use tax revenue to support cultural facilities throughout the seven-county Denver metropolitan area. DEN’s goal is to increase evaluation capacity building (ECB) among museum professionals through a multidisciplinary model that includes training with national evaluation experts, attendance at workshops and conferences, mentoring and technical assistance, dissemination and meetings, and institutional and pan-institutional studies. Thanks to a grant from the Institute of Museum and Library Services (IMLS), all DEN members will be attending this year’s AEA conference in Denver – a first for most of these participants.

Lessons Learned: Collaboration is core to DEN; however, working together is challenging. We've learned that to be successful, we need:

  • Champions to steer the project, and subcommittees to engage members and activate the work.
  • Frequent in-person meetings to stay motivated and connected.
  • Flexibility and the willingness to make adjustments quickly when needed.
  • Leadership involvement at our institutions to sustain such a large and time-consuming ECB effort; buy-in to the value of the work is critical.
  • Two members from each institution as part of the project – institutions with two members in DEN, compared to one, are more successful at transferring ECB back to their institutions.
  • To accept that pan-institutional studies don’t always work with such a large and diverse group; we’ve learned that cohort studies often work better.

Hot Tip: Colorado is home to endless adventure, and that includes its booming running culture. Start a training plan now and lace up for the Denver Rock n’ Roll Marathon and Half Marathon, scheduled for Sunday, October 19, one day after the conference ends. Prefer a “hoppy” adventure? Colorado is booming with craft breweries, and you won’t have to walk far to taste some of Denver’s finest ales. Taprooms close to the conference hotels include Denver Beer Company, Great Divide Brewing Company, Jagged Mountain Brewery, Prost Brewing, Renegade Brewing Company, and the legendary Wynkoop Brewing Company. Of course, feel free to stick around after the conference and sample from more of Colorado’s 230+ craft breweries!

We’re thinking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Registration will soon be open! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Related posts:

  1. LAWG Week: Antonio Olmos, Stephanie Fuentes, and Sheila Arens on Welcoming You to LAWG Week and Inviting You to Come to Evaluation 2014 in Denver this October
  2. AHE TIG Week: Sean McKitrick on Accountability Demands for Assessment in Higher Education
  3. Sheila B. Robinson on the Annual Call for Proposals – Evaluation 2014

LAWG Week: Antonio Olmos, Stephanie Fuentes, and Sheila Arens on Welcoming You to LAWG Week and Inviting You to Come to Evaluation 2014 in Denver this October

American Evaluation Association 365 Blog - Sun, 07/13/2014 - 01:15

We are Antonio Olmos, Stephanie Fuentes, and Sheila Arens of the Colorado Evaluation Network (COEN), the local chapter of AEA. Welcome to the Local Arrangements Working Group (LAWG) week. We look forward to your visit to Denver this October for Evaluation 2014. We are working hard with AEA to plan a great event for you while you are in the Mile High City.

In the last year or so, our state has experienced some shifts that may interest evaluators. On January 1, 2014, we legalized the use of recreational marijuana. There have been many anticipated and unanticipated consequences, including possible reductions in crime and new sources of city and state revenue (along with potential ramifications in traditional evaluation arenas like public education and public health). It’s too early in this “experiment” to assess overall impact, but we know everyone is watching. Similarly, energy is on the minds of many Coloradans: intense debates over the oil and gas industry’s use of fracking have driven initiatives to ban the practice in some jurisdictions due to health and safety concerns. At the same time, there are calls to expand research and evaluation of green energies. These two examples speak to the balance between meeting energy demands and addressing environmental consequences. Clearly, program evaluation may be in a position to help.

Hot Tips – What to Visit When You Come to Denver

In addition to the 2014 AEA conference, there are many other things worth keeping in mind when you plan your visit! Check http://www.denver.org/ for events in Denver. Be on the lookout for a local guide; in the meantime consider any of the following:

  • Outdoor activities: Jog or bike around the many trails through Downtown Denver. Denver has a B-cycle bike-share program you can use to borrow a bike and explore the city. Or get out of the city! There are plenty of hiking trails just a short drive away in Boulder, Red Rocks, or Golden.
  • Sporting events: The Denver Nuggets, Colorado Avalanche, Colorado Rapids (soccer), and Denver Broncos will be in full swing. If the Colorado Rockies are in the pennant race, baseball could be a fun option. Except for the Rapids, all venues are within walking distance or a short shuttle/light rail ride away.
  • Cultural events: Downtown Denver and its immediate surroundings are home to multiple art, nature and history museums, as well as theaters and music halls. Within walking distance are aquariums and historic buildings.
  • Explore the city … and beyond: There are multiple places in Downtown and Lower Downtown (LoDo) to go at night. Both are full of microbreweries – take a tour of them! Using the light rail/buses you can go to Golden, Boulder or Fort Collins. Or rent a car and explore the majestic Rocky Mountains.

We’re thinking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Registration will soon be open! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Related posts:

  1. Sheila B. Robinson on the Annual Call for Proposals – Evaluation 2014
  2. Susan Kistler on making the most of AEA2010
  3. Olmos-Gallo, DeRoche, and McKinney on Increasing Stakeholder Sophistication

Jayne Corso on Boosting Your LinkedIn Presence with Evaluation Keywords

American Evaluation Association 365 Blog - Sat, 07/12/2014 - 09:50

Hello, my name is Jayne Corso and I work with Dan McDonnell as a Community Manager for the American Evaluation Association (AEA).

As you probably know, LinkedIn is the social platform for professional development, career hunting, and thought leadership. It is an excellent resource for presenting yourself as an experienced, savvy evaluation professional, and it helps you find resources and networking opportunities that will benefit your practice.

One of the most powerful features of LinkedIn is its ability to search for people by name, profession, keyword, or location. Results from these searches depend on the strength of personal profiles. I’d like to share a few tips that will help you create a stronger personal profile and become better connected with your professional peers in the evaluation community.

LinkedIn Search

Hot Tip: Utilize all aspects of your profile.
Go beyond just including a photo, your work experience, and education. Add in your publications, skills, awards, independent course work, volunteer experience, and organizations you belong to. All of these features allow you to have a robust, well-rounded profile and will better highlight your expertise as an evaluation professional.

Hot Tip: Incorporate keywords.
Create a list of keywords that accurately communicate your expertise. Are data communication, data visualization, or monitoring among your greatest strengths? Improve your profile by incorporating these keywords throughout your profile descriptions. This will help your profile rank higher when those words are searched within LinkedIn (who you are connected to also influences these rankings). Placing keywords in your profile headline is also a great way to publicly show your expertise and helps other users make an informed decision about connecting with you.

Hot Tip: Customize your LinkedIn URL.
When you join LinkedIn, the site creates a generic URL for your profile that includes a series of numbers. As with any website URL, a string of numbers does not rank well in a search. Placing your name or keywords into your URL will improve the visibility of your profile. Here are a few tips from LinkedIn on how to get started customizing your URL.

Rad Resource:
The search function of LinkedIn is also a great resource if you’re looking to expand your network and make connections. Searching industry keywords provides you with a full list of professionals and organizations dedicated to evaluation. You can also use advanced search to connect with colleagues, clients, and industry thought leaders. You’ll be surprised at how quickly you can expand your evaluation network with just a few searches. Try it out!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Related posts:

  1. Dan McDonnell on 4 Recent Social Media Changes You May Have Missed
  2. Dan McDonnell on Adding a Photo to Your Blog Post in Google Search Results
  3. Nicole Porter on Looking for Evaluation Opportunities

Cultural Competence Week: Lisa Aponte-Soto and Leah Neubauer on Culturally Responsive Evaluation 101

American Evaluation Association 365 Blog - Fri, 07/11/2014 - 01:15

We are Lisa Aponte-Soto and Leah Christina Neubauer, members of the AEA Public Statement on Cultural Competence in Evaluation (The Statement) Dissemination Working Group. Aponte-Soto teaches at DePaul University and is an independent consultant. Neubauer is based in DePaul’s MPH Program.

The Statement reminds us that cultural competence is essential to all evaluation theory and practice. Being a culturally responsive evaluator requires a conscious effort of self-awareness and acknowledgement of our biases and the assumptions we make about cultural groups. It also requires a willingness to attend to a community’s unique contextual dimensions and perspectives, which calls for open communication and dialogue, as well as a responsibility to contribute to the greater good of society. The following will help you assemble an evaluation team to apply culturally responsive evaluation (CRE) practices.

Four steps for building CRE practices:

  1. Attend to assumptions and context: Examine cultural assumptions and conduct an analysis of the historical context, sociopolitical changes, and environmental strengths and weaknesses.
  2. Establish a CRE team and identify the team’s resources: Assemble a team that values and can attend to the unique cultural context of the community served and is inclusive of the key stakeholders and consumers of the evaluation as active agents.
  3. Apply CRE action steps to the evaluation plan: Work with stakeholders to develop, design and implement culturally sensitive and appropriate instruments.
  4. Disseminate evaluation results: Share findings of the program’s influence and impact with all stakeholder groups.

Becoming culturally “competent” is not a prescriptive goal. It is simply a way of being that requires a lifelong process of interaction and learning from experience. The tips below will help you get started.

Top 3 tips for personal growth and development:

  1. Practice mindfulness and be present. Attend to what you say to others and to how your body language may come across as inappropriate or insensitive.
  2. Journal your thoughts, perspectives, feelings and experiences. Reflect on these and revisit them to assess progress.
  3. Once a week, give yourself a stretch assignment on your automatic processes by listing five assumptions you made about someone you interacted with that day.

The following resources will help you continue to engage in developmental exercises.

Rad Resources:

Explore the Implicit Association Tests, which allow you to check the automatic assumptions you have absorbed from your past experiences, the media, and other cultural norms.

Visit the Teaching Tolerance website for resources on building greater understanding of diverse cultural groups among youth.

Gauge your emotional intelligence by taking the Body Language Quiz, and hone your social and emotional intelligence by attending workshops or reading books from The Wright Leadership Institute, which can help you tune in to your negative assumptions, their root sources, and how to resolve them.

This week, we’re diving into issues of Cultural Competence in Evaluation with AEA’s Statement on Cultural Competence in Evaluation Dissemination Working Group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Related posts:

  1. Cultural Competence Week: Cindy Crusto and Osman Ozturgut on the Re-Introduction to AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group and Reminder to Examine the “Self”
  2. CC Week: Osman Özturgut and Tamera Bertrand Jones on Integrating Cultural Competence into Your AEA Presentation
  3. CC Week: Jori Hall on Integrating Cultural Competence into Everyday Practice, Part 1