[Guest blog post by the principal co-investigators of the Ushahidi Haiti Project evaluation]
With the release of the final evaluation report, the evaluation team would like to share a few additional thoughts on the significance of the Ushahidi Haiti Project (UHP). Information systems used by the humanitarian response community have rarely been evaluated. The ten-year evaluation of ReliefWeb in 2006 is a notable exception, and it prompted some of the ReliefWeb functionality we all enjoy today. Needs assessment has received some attention, but there are no other well-evaluated examples in the humanitarian realm. Nonetheless, one of the most common recommendations of almost any evaluation of humanitarian response is a call for improved participation of beneficiary populations and better information systems. Both the UHP team and the evaluators knew that this evaluation was a unique opportunity to contribute important learning in these areas to the humanitarian response community.
Being able to document that the humanitarian spirit is alive and well among young people today was perhaps one of the most rewarding aspects of the evaluation. As worrying as Jan Egeland's proclamation in OCHA's recent Stay and Deliver report that "humanitarian action is under attack" must be, we think that UHP and other projects like it continue to show that humanitarians will be more dynamic and innovative than those who would attempt to pervert or repress perhaps the most noble human motivation: to help those in need. In our minds, UHP's ability to connect volunteers, those who are suffering, and those who want to help will be the project's lasting relevance.
Volunteer efforts are at the core of any emergency response. Whether it is family members or neighbors sheltering someone who has lost their home, or students collecting money for an international organization, it is voluntary action that makes the difference in emergencies, especially during the first seventy-two hours, the time often required to launch a vigorous formal humanitarian response. UHP showed us how the principles and technology behind social media and rapid information integration can quickly connect volunteers. That speed matters because it allows local people and smaller voluntary groups to act in the immediate aftermath of an emergency.
The evaluation reaffirmed ongoing problems in evidence-based humanitarian action. The disconnect between information, needs, and decision making continues to thwart effective humanitarian response. Much can and should be done to scrutinize the relationship between needs, information, and action across differing humanitarian stakeholder groups. Most significantly, crowdsourcing through social media is an emergent and highly potent instrument for humanitarian information and action. We are on the frontier of applying this tool effectively for population well-being.
Of interest to evaluators: the data sources the evaluation team used were not your grandfather's document review, key informant interviews, or probability surveys. Skype chats, SMS messages, email, and web postings carry automatic time stamps, an invaluable aid for evaluators trying to triangulate information. On the other hand, analyzing so many small and often disparate threads of conversation poses significant challenges. Reinforced and adaptive models of program logic, along with improved techniques for intelligent analysis of these data, including multimedia data, will become increasingly important for evaluators of this generation.
Crowdsourcing is proving itself globally as a powerful integration of information and social movements that is transforming the world. Georeferenced, aggregated crowdsourced information is transformational for disaster management, especially catastrophe management. We were honored to be part of an analytical effort that might foster learning and adaptation. We favor developmental evaluation as an evaluation strategy, in contrast to traditional evaluation frameworks.