Module 7: Dissemination and Utility of Evaluation Findings

Once data collection and analysis are complete, two major steps remain in finalizing the evaluation process: dissemination and utilization of the findings. The ultimate purpose of evaluation is to inform decisions that improve quality of life. Although this can be done through small programmatic alterations or major policy changes, neither can occur without the proper dissemination and utilization of findings. It is important to note that although this module falls under the Course on Evaluation, the following information is applicable to a wide variety of health communications, such as the research and media components of global health programs.


Once results have been analyzed, evaluation findings may be shared with clients and stakeholders, including participants and community members. Ideally, these individuals will be ready and receptive to recommendations, since the evaluation process will have incorporated the priorities and interests of these groups (see Focusing the Evaluation). The findings can be disseminated in a number of ways: detailed reports, news releases, press conferences, seminars, or email-based listservs, to name a few.(1) When comparing modes of dissemination, a study by Muller et al. (2008) found that simultaneous use of print reports, a website, and workshops to share findings from a tobacco-control evaluation was the most effective approach at improving stakeholder satisfaction and prompting further dissemination of the results.(2) This research highlights the importance of using multiple modes of communication, as well as understanding the channels to which your stakeholders are accustomed. Dissemination strategies can be very comprehensive documents, but there are a few key features that should be emphasized.

Goals and Objectives

Like health programs, dissemination plans should have goals and objectives. These should be established at the beginning of the planning stages and should guide all other aspects of the dissemination strategy.(3) For example, if the goal is to create or change existing policy, mass media should be considered a main channel of communication, as the public would play a key role in contacting and communicating with policymakers.


When developing a dissemination plan, it is important to understand who will be using the evaluation findings; this information should be determined through stakeholder analysis. Research shows that dissemination plans tailored to users’ needs are more likely to be utilized.(4) With knowledge of the user, it will also be easier to choose the modes of communication most likely to ensure the information is understood. If possible during stakeholder analysis, it may be useful to survey common channels of communication and learn how clients envision the findings being used.

The user also defines the tone of the communication. Findings disseminated to the general public should look, sound, and read differently than information communicated to researchers or policymakers.(5) An example of this is the Centers for Disease Control and Prevention’s (CDC) publication materials on the benefits of immunizations. Based on extensive longitudinal evaluation of the efficacy of immunizations, the CDC has developed a number of campaign materials to disseminate evaluation findings to stakeholders. With a wide range of groups benefiting from immunization programs, the CDC has tailored communications to each target audience. For example, the materials targeted towards pre-teens and teenagers are in tune with their preferred channels of communication: podcasts, YouTube videos, and color print advertisements in magazines. Materials for adults consist of brochures and pamphlets, media that are more effective for this population than internet-based sources.

Access and Availability

Users will read, process, and utilize evaluation findings at their own convenience, not necessarily at the time of initial information dissemination. Therefore, findings should be made available through a number of channels and archived for future use.(6)



Despite consideration of users, access, and availability, information dissemination may still have unintended effects. For example, in a randomized trial conducted to test the effectiveness of messages designed to reduce vaccine misperceptions and increase MMR vaccination rates, the messages were found to actually decrease intent to vaccinate and reinforce misperceptions.(7) In this case, though the audience was properly informed and the modes of communication were appropriately tailored, context and pre-existing beliefs prevented the dissemination of information from achieving its intended outcome: decreased vaccine misperceptions and increased vaccination rates. Though it cannot be expected that audience reactions will be the same for different content, appropriate measures should be taken to examine previous methods of information dissemination in the target population and each method’s level of success.


Once evaluation findings have been distributed through the appropriate channels, the goal of the evaluator is to improve the likelihood that the information will be utilized in some way, whether in policy, program, or organizational changes. For policymakers and many other audiences, “the quality of the research is extraneous to making decisions.”(8) In a series of connections, “the more complex the evaluation, the more jargon, the more equivocal the conclusions, the more caveats in the preamble, the more sensitive the issue, the more complex the writing, the more obscure the evaluators, [and] the more apt that the report will be discarded by policymakers and legislators.”(9) Lipton (1992) stresses the importance of tailoring findings to the user: not fundamentally altering the findings themselves, but simply reducing jargon and simplifying the information.

There are a number of steps that evaluators can take to increase the likelihood that evaluation findings will be utilized positively by the intended audience. One strategy is to make evidence-based recommendations in the evaluation report. Recommendations are clear action items that clients and program staff can apply directly to the program under evaluation. For example, an evaluation of a local immunization program found that 35% of vaccine providers were able to recall the content of a monthly newsletter. To meet the current objective of a 50% recall rate among this population, the governing body recommended varying the media message and increasing the number of messages delivered through specialty-specific channels.(10) This detailed, evidence-based recommendation was directed at clients and stakeholders such that they could act on it quickly and make changes to their programs.

It is not sufficient to disseminate reports of findings or various communication materials to stakeholders and expect immediate application of the information. Feedback and stakeholder discussion are important steps in the dissemination process that can improve both the chances and the quality of utilization. Facilitating conversation among stakeholders can also help avoid miscommunication of findings, generate strategies for implementing recommendations, and prevent misuse of the findings.(11) In the course of evaluation, follow-up and technical assistance to stakeholder groups are critical components for ensuring the proper use of evaluation results.

Dissemination and Utilization Conclusion

Evaluators have an interest in ensuring that findings from program evaluations are disseminated to the proper audiences and subsequently utilized in a manner that will be most effective in improving the quality of life of the target population. In 2011, Rajiv Shah, the Administrator of the U.S. Agency for International Development, commented on the need for evaluation to inform future program improvements: “In the end, the measure of our success will not be predicated on the number of evaluations done, or stored within a database, or even solely upon the quality of the findings…Our success will depend on our ability to use evaluation findings to strengthen our efforts and sharpen our decision-making.”(12)

Go To Module 8: Five-Year Evaluation of the Global Fund >>


(1) United Nations Population Fund. (2004). Tool number 5: Planning and managing an evaluation. Programme Manager’s Planning Monitoring and Evaluation Toolkit. New York, NY: Division for Oversight Services. Accessed on 20 June 2019.

(2) Muller, N.B., Burke, R.C., Luke, D.A. and Harris, J.K. (2008). Getting the word out: multiple methods for disseminating evaluation findings. J Public Health Manag Pract, 14(2):170-176.

(3) National Institute on Disability and Rehabilitation Research. (2001). Developing an effective dissemination plan. Accessed on 20 June 2019.

(4) Hawkins, R.P., Kreuter, M., Resnicow, K., Fishbein, M., and Dijkstra, A. (2008). Understanding tailoring in communicating about health. Health Education Research, 23(3):454-466.

(5) Yale Center for Clinical Investigation. (n.d.). Beyond scientific publication: Strategies for disseminating research findings. New Haven, CT: Yale Center for Interdisciplinary Research on AIDS (CIRA). Accessed on 20 June 2019.

(6) National Institute on Disability and Rehabilitation Research. (2001). Developing an effective dissemination plan. Accessed on 20 June 2019.

(7) Nyhan, B., Reifler, J., Richey, S., and Freed, G. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4):835-842.

(8) Lipton, D.S. (1992). How to maximize utilization of evaluation research by policymakers. Annals of the American Academy of Political and Social Science, 521.

(9) Ibid.

(10) Centers for Disease Control and Prevention. (2006). Introduction to program evaluation for public health programs; Evaluating appropriate antibiotic use programs. Atlanta, GA: CDC. Accessed on 20 June 2019.

(11) Milstein, B., Wetterhall, S. and CDC Evaluation Working Group. (1999). Recommended framework for program evaluation in public health practice. Atlanta, GA: CDC.

(12) U.S. Agency for International Development. (2011). Evaluation learning from experience: USAID evaluation policy. Washington, DC: USAID. Accessed on 20 June 2019.