NEO Shop Talk January 21st, 2018




Evaluating Websites with Web Analytics: Contextual Use and Interpretation

Posted by Mandy Gonnsen on March 26th, 2012 Posted in: Logic Models, Practical Evaluation

If you missed the previous post in this series on “Evaluating Websites with Web Analytics,” you can find it here!

We left off in our discussion of web analytics by talking about a few of the basic metrics and how they can help libraries and non-profit organizations assess their websites and online presence.  The “big three,” as described by the Digital Analytics Association, are visits/sessions, unique visitors, and page views.  These three metrics are only a starting point for understanding user patterns and actions; more comprehensive metrics such as bounce rate, referral and direct traffic sources, traffic flow, and conversion rate give a fuller view of how users are accessing your online resources.  To get the most comprehensive, “big picture” view of your website’s success and customer satisfaction, it’s important to use these web metrics alongside other evaluative tools, including logic models, surveys, and formal usability studies (Marek, Chapter 4, 2011). Each metric should be interpreted in the context of other metrics and assessments, as web analytics contributes only one part of the larger picture of user behavior. Finally, the organization’s mission and specific goals will determine how, when, and which web analytics should be used and interpreted.
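To make the “big three” and bounce rate concrete, here is a small Python sketch that computes them from a toy page-view log. The data and field layout are invented for illustration; a real analytics tool collects and aggregates this for you.

```python
from collections import defaultdict

# Toy page-view log: (visitor_id, session_id, page) tuples.
# Illustrative data only, not an actual analytics export.
hits = [
    ("v1", "s1", "/home"), ("v1", "s1", "/guides"),
    ("v2", "s2", "/home"),
    ("v1", "s3", "/blog"), ("v3", "s4", "/guides"),
]

page_views = len(hits)                      # every hit is one page view
sessions = {s for _, s, _ in hits}          # visits/sessions
unique_visitors = {v for v, _, _ in hits}   # distinct visitor IDs

# Bounce rate: share of sessions with exactly one page view.
views_per_session = defaultdict(int)
for _, s, _ in hits:
    views_per_session[s] += 1
bounces = sum(1 for n in views_per_session.values() if n == 1)
bounce_rate = bounces / len(sessions)

print(page_views, len(sessions), len(unique_visitors), bounce_rate)
# → 5 4 3 0.75
```

Note how the same five hits yield different counts depending on the metric: one returning visitor (v1) generates two sessions, which is exactly why visits and unique visitors must be read together.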

At OERC, we use these comprehensive metrics to assess the OERC blog and website (separately), in addition to the big three.  Our web analytics tracking program is still fairly new and developing, but we wanted to share our process so you can see how easily web analytics can fit into the evaluation strategy of a non-profit organization or library.  Unlike for-profit companies, we aren’t using web metrics to calculate return on investment (ROI) and assign a monetary value to the metrics.  However, we can mimic some of those steps taken by for-profit companies to help guide our use of web metrics.

At OERC, we wanted to learn more about who accesses our blog and website and whether they are finding the information they need. As Marek suggests, you need to evaluate your website against clear goals “based on your library’s mission and strategic plan” (Chapter 1, 2011).  From there, web analytics provides the raw data, which you can assess within the context of your organization’s mission and strategic plan to make informed decisions and the most appropriate changes.  Web analytics statistics are rarely exact or 100 percent accurate, so capturing and evaluating trends over time is the best way to use them as an assessment tool (Marek, Chapter 1, 2011). Web analytics can also be a strong and valuable input when creating evaluation tools such as logic models to guide your organization’s projects and outcomes. As you develop your short-term outcomes at the individual level, web analytics offers a quantitative way to measure the targets within your indicators.  This is where you can look at the patterns in bounce rate, traffic sources, referrals, and conversion rates to determine whether you see a change and whether your organization is achieving its logic model outcomes.  Unlike many evaluations, web analytics provides near-instant feedback, so an organization can consult the data and adjust its website right away.  This flexibility helps you reach your defined logic model outcomes and produce the best results from your structured plan.
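One simple way to look at trends over time rather than noisy single-day numbers is a moving average. The sketch below smooths weekly visit counts with a 3-week window; the counts are made up for illustration.

```python
# Hypothetical weekly visit counts (invented numbers).
weekly_visits = [120, 135, 128, 150, 160, 155, 170]

# 3-week moving average: each point averages the current week
# and the two weeks before it, smoothing week-to-week noise.
window = 3
trend = [
    sum(weekly_visits[i - window + 1 : i + 1]) / window
    for i in range(window - 1, len(weekly_visits))
]
print([round(t, 1) for t in trend])
# → [127.7, 137.7, 146.0, 155.0, 161.7]
```

The smoothed series makes the gradual upward trend visible even though individual weeks (e.g., the dip from 135 to 128) move in both directions.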

At OERC, we determined specific goals to frame our web analytics strategy and decided how we wanted web analytics to fit into the overall structure of our evaluation plan.  Our overall goal is to increase blog post views and push more blog visitors to the OERC website and its evaluation tools and resources. In our situation, we decided to track returning and new visitors as we publish new OERC blog posts. Then we can compare those web statistics against baseline traffic patterns during days or weeks without a new blog post.  For the OERC blog, traffic sources and traffic flow are among the biggest indicators of our visitors’ patterns. We can also separate new and returning visitors to the blog into different groups and evaluate their traffic sources separately.  This way, we can work with our overall marketing plan to see whether any marketing efforts have been successful in garnering new visitors, or in pushing returning visitors back to the blog.
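The comparison described above — post-day traffic against a no-post baseline, split by new and returning visitors — can be sketched in a few lines. The daily counts and dates here are hypothetical, not real OERC data.

```python
# Hypothetical daily session counts, split by visitor type.
daily = {
    "2012-03-19": {"new": 12, "returning": 30},
    "2012-03-20": {"new": 14, "returning": 28},
    "2012-03-21": {"new": 40, "returning": 55},  # new blog post published
    "2012-03-22": {"new": 22, "returning": 41},
}
post_days = {"2012-03-21"}  # days on which a post went up

def mean(xs):
    return sum(xs) / len(xs)

# Compare average traffic on post days vs. baseline days, per group.
for group in ("new", "returning"):
    post = mean([d[group] for day, d in daily.items() if day in post_days])
    base = mean([d[group] for day, d in daily.items() if day not in post_days])
    print(group, round(post / base, 2))
# → new 2.5
# → returning 1.67
```

In this made-up example, a new post lifts new-visitor traffic (2.5× baseline) more than returning-visitor traffic (1.67×) — exactly the kind of group-level difference that tells you which audience your marketing is reaching.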

This step of web analytics complements other evaluation strategies because you can use web analytics to assess the same goals that those strategies are targeting.  For example, when an organization surveys users about how they obtain a particular form or brochure, web analytics can provide supporting evidence about online access and downloads.  If the survey and the web analytics both show that more users are accessing the needed form online, the organization can save printing costs on hard copies.  Furthermore, the organization can commit to expanding its online selection of forms and resources to support the greater interest in accessing them online. In this way, web analytics can corroborate patterns from smaller assessments like surveys to show the full picture.   At OERC, we will be launching updated evaluation guides this year and tracking how often they are requested. Web analytics complements our tally of in-person requests: we will track conversion rate to determine how often visitors download these new materials after visiting other webpages.  Conversion rate complements the other metrics and can provide support for our specified communication goals and logic model outcomes.  Another OERC goal is to increase requests for informational materials, as well as requests for classes and webinars.  As we market new materials and new online resources over the next few months, any increase in new visitors to specific OERC webpages, as shown in the web analytics, will complement our evaluation of those marketing techniques.  Web traffic to specific webpages or blog posts will complement advertising efforts during webinars, social media output, and word of mouth.  In the next post in this series, we will take a closer look at the actual analysis and discuss what changes and patterns we see in the OERC web analytics.
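Conversion rate itself is just a ratio: conversions divided by total sessions. A quick sketch, with invented numbers standing in for guide-download tracking:

```python
# Hypothetical figures: a "conversion" here is a session in which a
# visitor downloads an updated evaluation guide after viewing other pages.
sessions = 1200          # total sessions in the reporting period
guide_downloads = 90     # sessions that included a guide download

conversion_rate = guide_downloads / sessions
print(f"{conversion_rate:.1%}")
# → 7.5%
```

The value of the metric comes from the trend, not the single number: a rising rate after a marketing push is evidence the push worked, while the raw 7.5% on its own says little without a baseline.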

Web analytics provides valuable quantitative data that can support organizational goals and logic model outcomes. By integrating simple web metrics into your evaluation program, you can increase your understanding of visitor patterns, website use, and access to online materials.  With this data, you can adjust current practices to improve your online presence.  Combining it with results from other evaluations, such as surveys or focus groups, can produce “big picture” statements, perfect for making larger decisions about future goals and actions.


Digital Analytics Association. (n.d.). Retrieved March 26, 2012, from

Direct Traffic. (n.d.). Google Analytics Help. Retrieved March 26, 2012, from

Evaluation Guides from the OERC. (2008, July 2). Retrieved March 26, 2012, from

Gould, D. (2011, November 22). Flow visualization: Traffic and conversion reports in Google Analytics. Vertical Measures. Retrieved from

Kaushik, A. (2006, July 31). Excellent analytics tip #5: Conversion rate basics & best practices. Occam’s Razor by Avinash Kaushik. Retrieved from

Kaushik, A. (2007, August 6). Standard metrics revisited: #3: Bounce rate. Occam’s Razor by Avinash Kaushik. Retrieved from

Kaushik, A. (2008, March 26). Excellent analytics tip #13: Measure macro and micro conversions. Occam’s Razor by Avinash Kaushik. Retrieved from

Marek, K. (2011). Chapter 1: Web Analytics Overview. Library Technology Reports, 47(5), 5–10.

Marek, K. (2011). Chapter 4: Reporting and Analysis. Library Technology Reports, 47(5), 26–32.

NNLM Outreach Evaluation Resource Center. (2012, January 3). Retrieved March 26, 2012, from

OERC Blog: A Weblog from the NN/LM Outreach Evaluation Resource Center. (2011, December 21). Outreach Evaluation Resource Center. Retrieved March 26, 2012, from

Olney, C., & Barnes, S. (2006). Step one: Develop an outcomes-based project plan. Including Evaluation in Outreach Project Planning, Planning and Evaluating Health Information Outreach Projects. National Library of Medicine. Retrieved from

Olney, C., & Barnes, S. (2006). Step two: Develop an outcomes assessment plan. Including Evaluation in Outreach Project Planning, Planning and Evaluating Health Information Outreach Projects. National Library of Medicine. Retrieved from

Referral Traffic. (n.d.). Google Analytics Help. Retrieved March 26, 2012, from

ROI. (n.d.). Webopedia. Retrieved March 26, 2012 from

About the author: Mandy Gonnsen


This project is funded by the National Library of Medicine (NLM), National Institutes of Health (NIH) under cooperative agreement number UG4LM012343 with the University of Washington.

NNLM and NATIONAL NETWORK OF LIBRARIES OF MEDICINE are service marks of the US Department of Health and Human Services.