Most of the sessions I chose on Friday at the American Evaluation Association meeting had a technology slant.
Blogging to the Beat of a New Drum: The Use of Blogs and Web Analytics in Evaluation (Cary Johnson and Stephen Hulme, Brigham Young University)
This roundtable discussed possible uses of blogs and web analytics in evaluation. Content analysis could be applied to blog conversations, and blogs could be used to identify survey participants. Potential issues include lack of trust, lack of time, uneven access to computers and networks, extraneous data, uneven representation, discomfort with written expression, social desirability effects, and limited knowledge about who is “most likely to blog.” Video blogging, in which participants click and speak, can capture student feedback in classes, but the resulting data are challenging to analyze. Google Analytics is a free tool that shows what percentage of visitors view each post, who clicks “read more” and how long they stay, and the top search terms.
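As a rough illustration of the kind of content analysis mentioned above, a minimal sketch in Python might count term frequencies across a set of blog posts (the function name, stopword list, and sample posts are my own illustrative assumptions, not anything presented in the session):

```python
from collections import Counter
import re

# Illustrative stopword list -- a real analysis would use a fuller one.
STOPWORDS = frozenset({"the", "a", "an", "and", "to", "of", "in", "is"})

def term_frequencies(posts, stopwords=STOPWORDS):
    """Count word frequencies across blog posts, skipping common stopwords."""
    counts = Counter()
    for post in posts:
        # Lowercase and split on non-letter characters.
        for word in re.findall(r"[a-z']+", post.lower()):
            if word not in stopwords:
                counts[word] += 1
    return counts

# Hypothetical sample data.
posts = [
    "Evaluation of the program showed strong results.",
    "The program evaluation surveyed participants.",
]
print(term_frequencies(posts).most_common(3))
```

Frequency counts like these are only a starting point; a real evaluation would likely add qualitative coding of themes on top of them.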
Voicethread: A New Way to Evaluate (Stephen Hulme and Tonya Tripp, Brigham Young University)
Voicethread.com is a new website where you can upload videos, images, documents, and presentations, and viewers can then leave audio and video comments. This can yield richer data than a survey or interview alone.
Can You Hear Me Now? Use of Audience Response Systems to Evaluate Educational Programming (Karen Ballard, University of Arkansas)
This was a hands-on demonstration of a personal/classroom/audience response (“clicker”) system. The Vanderbilt Center for Teaching maintains a bibliography of articles about clickers and provides a guide for using them in teaching.
Evaluation Dashboards: Practical Solutions for Reporting Results (Veena Pankaj and Ehren Reed, Innovation Network Inc.)
In evaluation, a dashboard is a performance-monitoring tool that gives a quick view of how well goals are being met. Borrowed from the corporate world, dashboards are also useful for nonprofits: they display indicators and targets and use simple visuals (such as color codes) to show levels of achievement. Specialized software is available to create them, but Excel also works. More information is available at Dashboard for Nonprofits.
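The color-coding idea can be sketched in a few lines of Python. This is a minimal illustration of comparing indicators to targets, not the presenters' method; the indicator names, targets, and the 75% warning threshold are all invented for the example:

```python
# Hypothetical indicators: (name, target, actual) -- illustrative only.
INDICATORS = [
    ("Clients served",  500,  530),
    ("Workshops held",   24,   18),
    ("Volunteer hours", 1200,  600),
]

def status(actual, target, warn=0.75):
    """Color-code achievement: green at/above target,
    yellow at/above warn * target, red below that."""
    ratio = actual / target
    if ratio >= 1.0:
        return "green"
    if ratio >= warn:
        return "yellow"
    return "red"

def dashboard(rows):
    """Render a simple text dashboard, one line per indicator."""
    return "\n".join(
        f"{name:<16} {actual:>5}/{target:<5} {status(actual, target)}"
        for name, target, actual in rows
    )

print(dashboard(INDICATORS))
```

In practice the same logic maps directly onto Excel conditional formatting, which is presumably why the presenters note that Excel works fine for this.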
Walking the Talk: Evaluation Is as Evaluation Does (Matt Gladden, Michael Schooley, Rashon Lane, Centers for Disease Control and Prevention)
Representatives from CDC’s Division for Heart Disease and Stroke Prevention presented an overview of their strategies to build an organizational culture that values and routinely uses evaluation techniques. Evaluation capacity building approaches include planning evaluations, conducting evaluations, using results, and supporting evaluation through consultation, training, and resources. Lessons learned included: establish systems that support evaluation; define boundaries and priorities; balance ability to implement a project with its potential benefit; recognize that an evaluator’s role is to identify issues but not necessarily solve them; differentiate between long-term and workplan goals.