NEO Shop Talk July 20th, 2019


Logic Model Hack: Project Management Tool

Posted by Cindy Olney on August 4th, 2017. Posted in: Blog, Logic Models, Practical Evaluation


My whole career has been an evaluation project juggling act.  At times, I was the only evaluation consultant for an entire university campus. Logic models were a game changer for me. When a client showed up in my office three months after our initial consult, I could pull out our logic model and we could catch up on the plan in less than a minute.

Now, on project management teams, I am the self-appointed “Logic Model Queen.” I’m sure my teammates roll their eyes behind my back when I start drawing the iconic logic model structure on the conference room whiteboard. That’s okay. I always win them back by the end of the project, because logic models are excellent project management tools.

Creating the Logic Model

Let’s review how a project team develops a logic model. Using the template shown below, the group first establishes desired outcomes (results) for the project. Team members then work “backwards” to check whether the planned activities could logically lead to those outcomes. In the Resources column, they list everything needed to conduct the activities. Once the columns are filled, the team members can reflect on the assumptions underlying their plan and identify known challenges. From there, the team establishes methods to assess program implementation (process evaluation) and results (outcomes evaluation).

Logic model template: three boxes connected by right-pointing arrows. Resources (“If we get these resources…”), Activities (“…we can conduct these activities and deliver these services/products…”), and Outcomes (“…we will accomplish these outcomes”). Under the three boxes, two boxes spanning all three columns are labeled “Assumptions” and “Challenges.”
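For teams that keep their plans alongside code or in version control, the template above can be sketched as a simple data structure. This is a hypothetical illustration, not part of the original post; the class and field names are my own, chosen to mirror the template's columns.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Columns of the logic model template: resources feed activities,
    which lead to outcomes; assumptions and challenges span all three."""
    resources: list[str] = field(default_factory=list)   # "If we get these resources..."
    activities: list[str] = field(default_factory=list)  # "...we can conduct these activities..."
    outcomes: list[str] = field(default_factory=list)    # "...we will accomplish these outcomes"
    assumptions: list[str] = field(default_factory=list)
    challenges: list[str] = field(default_factory=list)

    def summary(self) -> str:
        """One-line recap a team could glance at during a five-minute review."""
        return (f"{len(self.resources)} resources -> "
                f"{len(self.activities)} activities -> "
                f"{len(self.outcomes)} outcomes")

# Example: a minimal (invented) model for a health-information workshop series
model = LogicModel(
    resources=["trainer", "meeting room", "grant funds"],
    activities=["run 4 workshops", "distribute handouts"],
    outcomes=["participants locate reliable health information"],
    assumptions=["participants can attend weekday sessions"],
    challenges=["staff turnover"],
)
print(model.summary())  # 3 resources -> 2 activities -> 1 outcomes
```

Keeping the model in a plain-text form like this makes it easy to diff when the plan changes midstream, which, as discussed below, it almost always does.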

Centerpiece of Project Management 

Once the logic model is drafted, don’t bury it in some dusty folder in cyberspace. Your team should review it at each meeting. Lack of meeting time is no excuse: a logic model review can be done in five minutes, preferably at the top of the meeting. The general question the team should consider is this: Does our logic model still reflect reality? If the answer is basically yes, the team can then use the following logic-model-inspired questions to review the project more thoroughly:

  1. Have we been able to acquire everything we needed, as listed in the resources section of our logic model? Do we have to do without or find a substitute for anything?
  2. Does our process evaluation data show we are on track to complete our outputs (aka deliverables)? How are the participants responding to our project?
  3. How well are we documenting our progress? Are our records complete? Are the project implementers submitting assessment information? Are there ways to make the data collection process more usable?
  4. What wrong assumptions did we make that we need to address?
  5. What unexpected challenges are we encountering?
  6. Are our outcomes still reasonable?
  7. Do we have the evaluation information we need to make decisions at different stages of this project?
  8. Are we recording our successes well? Will we be able to tell a compelling story to key stakeholders at the end of this project?

Change Shows Learning 

So let’s say your team considers the question, “Does our logic model still reflect reality?” and the answer is “no.” Now what do you do? First, be comforted by the fact that this is not an unusual occurrence. Good project management almost always leads to change. Here are some reasons that programs must be adapted midstream:

Queen saying "'Tis okay to change your logic model"

Your expectations were too high. Often, in the middle of programs, we realize the change we hoped for is going to take more time than expected. It is not unusual to discover “things take longer than they do.” (That’s a direct quote from one of my first evaluator-bosses.) In such cases, the most comforting words in the English language are “pilot project.” Let go of your expectations, revise your outcomes, and, if necessary, alter your evaluation methods to capture evidence of the modified outcomes.

You learn new things about the community you’re serving. Let’s face it, needs and community assessments only give you a tiny piece of the information you need to run a successful project. Only by working directly with participants and stakeholders do you truly learn about their desires, needs, and values. As the great poet Maya Angelou said, “Do the best you can, and when you know better, do better.” If you can, adjust your program activities and outcomes as much as possible to better meet your stakeholders’ wants or needs. Otherwise, record what you learn so you know what to do differently next time.

Unexpected changes arise for you, your community, or your partners. We all work in complex systems that can change with little notice, disrupting best-laid plans. Regular discussion related to your logic model will help your team assess and respond to shifting terrain, and those team discussions will document the challenges for final reports.

I have written before that I think logic models are the duct tape of the evaluation world. I hope this post has convinced you to use them early and often during project management discussions. You also might like reading about how to use them for proposal writing, conducting program reality checks, or even planning your child’s birthday party.

If you want training on logic models, you might be interested in the half-day workshop Logic Models for Program Evaluation and Planning, being offered at the American Evaluation Association’s 2017 conference in Washington, DC. (It’s number 64 on this list of AEA 2017 workshops.) You do not have to register for the conference to enroll in the workshop.


ABOUT Cindy Olney
Cindy Olney is the Assistant Director of the NNLM Evaluation Office. She leads NNLM's evaluation efforts, designs evaluation methods, and guides analysis and reporting of evaluation findings.

This project is funded by the National Library of Medicine (NLM), National Institutes of Health (NIH) under cooperative agreement number UG4LM012343 with the University of Washington.

NNLM and NATIONAL NETWORK OF LIBRARIES OF MEDICINE are service marks of the US Department of Health and Human Services.