Maps & Decision Making- the best can happen offline

Ethiopian Government office. What would you like to see on this map? Before we all get online and create a map: who are our users, and are we taking the time, traveling the distance, to ask them? Pens, paper and face-to-face discussion are powerful… then comes the tech. First, people.

Blogging vs processing what’s before you


It’s been days since the last blog and quite a lot has happened. This is a venture back into the realm of reflection about humanitarian tech evaluations, field work and, for this blog… sometimes just about being a person and a humanitarian.

I’m always truly humbled by the dedication of communities affected by crisis and complex humanitarian challenges. And for all those big words… it’s truly about resilience. However large and inflated the term is becoming in the global sector, and no matter how people continue to debate what it “really means,” it is silencing, and rightfully so… to breathe, share space, and deeply learn from those who live and breathe resilience. We need more of that.

I spent some of the most inspiring days to date of my humanitarian career with three women from a few villages in southern Ethiopia. Recently they traveled hundreds of kilometers to the capital to speak to those who would listen, and to learn from those around them. And to participate in a world called the “humanitarian space,” one they are core to but are rarely invited into. And for all of the meta-discussions and working groups at this international accountability conference about how best to empower and engage with “affected populations,” to me some very clear, honest, and inspiring voices rang through. I’m deeply saddened by the challenges we face and create for ourselves in the humanitarian system, and at the same time enraged to tears by the obstacles in the way. But in the end, it’s not about me. It’s about supporting the sharp, powerful and amazing voices that are not just to be heard, but will reshape the future… if we are willing to help make it happen.

Evaluation approaches for humanitarian ICT programs – Questions before Answers

What are the best evaluation metrics for humanitarian ICT programs? In the past few weeks I’ve spent days in the field with communities who’ve been integrating ICT into their existing drought early warning surveillance system. So much of this specific evaluation has been about shared learning: listening to people’s challenges and successes, thinking together about ways to make actionable changes going forward, and, more importantly… finding ways to sustain the successes to date.

But at the same time there are many guidelines, metrics, and criteria in our humanitarian system that aim to “define” evaluation. Much of this is related to accountability. The well-known OECD criteria come to mind; they have been applied to complex humanitarian emergencies in the Guidance for Evaluating Humanitarian Assistance in Complex Emergencies document and are supported by evaluation entities like ALNAP. (link)

The deep and burning question for me right now is what part of the OECD criteria is applicable to humanitarian ICT program evaluations, especially programs that are driven by community-based approaches. While it may be easy to apply the terms “efficiency, effectiveness, coverage, scale, etc.,” taking a step back for just a moment and asking…

“is this the right approach?”

can open up many more questions.

Are we leveraging and complementing existing criteria like the OECD’s with nuanced approaches that appropriately fit this new and emerging environment we call humanitarian technologies or humanitarian ICT? The ways of working may be quite a different animal from what existing evaluation frameworks were designed for. For example, some innovative ICT programs support and even encourage iteration and change during the program or project cycle. What does this mean for frameworks that require pre-determined indicators with logframes and provide very little room for iterative changes?

How to understand the OECD criteria in these settings, reapply them, and potentially reshape them for humanitarian ICT programs are the burning questions for me right now. While other approaches from the innovation and technology disciplines can certainly inform us (ref), I believe that we may just not be there yet.

At the moment I’m struggling with how to look at outcomes and “effectiveness” of a humanitarian ICT project during the pilot phase. The Humanitarian Innovation Fund (HIF) has provided guidance on this for their funded pilot projects and uses the OECD criteria as well. (ref) This has provided some valuable approaches, but I’m still left with questions.

What should we expect when evaluating for outcomes, and especially scale, for a community-based program that began to integrate digital data collection only 9-12 months ago? How do you measure something that is expected in many ways to evolve, iterate and change during an early stage of development? And should you force order and pre-existing metrics onto holistic, dynamic, evolving systems just so one can evaluate using currently accepted approaches?

Is it possible that this approach will lead us to erroneous conclusions because the method was not fully fit for purpose?

I would say let’s not throw out the baby with the bathwater, but let’s accept the possibility that we need to take a new approach or at least question whether or not we need to integrate new ways of working into existing evaluative methods for humanitarian ICT projects.

Back in bandwidth and crunching data

After a full week in the field (more posts to come about this) it’s now time to hustle and crunch data from a community-based early warning system. There are urgent issues identified on the trip that require pushing analysis forward as quickly as possible. With limited internet in the office yesterday, progress was delayed, but now the connectivity is better and it’s time to make some headway. Headphones on, a little music to keep the sleep-deprived mind moving, it’s about turning data into information and visualization as quickly as possible, and in a way that decision makers can potentially 1) understand, 2) accept, and 3) find useful for decision making. What that decision may or may not be is for another post.
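For what it’s worth, that data-to-information step often boils down to a few lines of grouping and aggregating. Here is a rough sketch of the idea; the column names (milk_available, water_trips_per_day) are invented for illustration and are not the actual survey variables:

```python
import pandas as pd

# Hypothetical household records; real column names and villages' data differ.
records = pd.DataFrame({
    "village": ["Tuka", "Tuka", "Arganne", "Arganne", "Danmbi"],
    "milk_available": [1, 0, 0, 0, 1],       # 1 = household reports milk for children
    "water_trips_per_day": [2, 3, 4, 5, 2],  # trips to fetch water
})

# Collapse household rows into one summary row per village --
# the shape a decision maker can actually read.
summary = (
    records.groupby("village")
    .agg(households=("milk_available", "size"),
         pct_with_milk=("milk_available", "mean"),
         avg_water_trips=("water_trips_per_day", "mean"))
    .round(2)
)
print(summary)
```

The point is less the code than the translation: raw survey rows are not “information” until they are rolled up into something a drought task force can understand, accept, and act on.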

A little Sly and the Family Stone’s Everyday People can only help….



Wading in the weeds of digital data- Part 2

In a prior post I mentioned merging various digital data collection files. Exploring this data has been an interesting experience, particularly prior to heading to the villages where the data is being collected. There are hundreds of ways to look at the data, but the first pass was taking a cursory look at data quality. One common metric that many folks like to report is how frequently a survey question is answered.

And the missing data here is the village name (aka PA). So we can count all the beans, or have a computer count those beans for us. And we can produce all of these tables and graphs like the one above. But the core question in my opinion is what that “2” above really means and why. So in many ways it’s important to walk away from the “data” for a moment and go back to the context. Is the missing village information because…

  • the digital data collection form has a glitch?
  • the question seems unclear to the women answering it, so they just move on to the next one?
  • the data collector is chatting with a respondent from another village that is not on the list, and despite reporting on her home village she is unsure of what to check?
  • or they are in a rush and just need to finish the survey and get back to their daily activities, so they move quickly to the next question?
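The bean counting itself is the easy part. As a rough sketch (the column names here are made up; the real export uses different ones), a per-question missing-value count and response rate take only a few lines:

```python
import pandas as pd

# Hypothetical survey export; "village" stands in for the PA field.
surveys = pd.DataFrame({
    "village": ["Tuka", None, "Danmbi", "Arganne", None],
    "rainfall_observed": ["yes", "no", "yes", None, "no"],
})

# Missing answers per question -- the kind of "2" a report might show.
missing = surveys.isna().sum()

# Response rate per question, as a percentage of all surveys.
response_rate = (1 - surveys.isna().mean()) * 100
print(missing)
print(response_rate.round(1))
```

The computer produces the number in seconds; understanding *why* the number is what it is takes a trip to the village.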

Some digital data collection tools can help improve data quality… to a certain degree. For example, many tools include a function which requires an answer prior to proceeding to the next survey question. It’s not a full fix (as future posts will discuss) but may help decrease missing data, which is just one marker of quality.
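In form tools this is typically just a “required” flag on the question. As an illustration of the same idea outside any particular tool (the function and field names below are invented, not from a real form runner), the logic is simply: refuse to advance until the answer exists:

```python
def can_advance(question, answer):
    """Mimic a 'required' constraint: block the next screen until answered."""
    if question.get("required") and (answer is None or str(answer).strip() == ""):
        return False, "Sorry, this response is required!"
    return True, ""

# A required village question, like the PA field discussed above.
village_q = {"name": "village", "required": True}

ok_blank, msg = can_advance(village_q, None)    # blocked: no answer given
ok_filled, _ = can_advance(village_q, "Tuka")   # allowed: answer present
```

Note what this does and doesn’t fix: it eliminates blanks, but a rushed data collector can still tap any answer just to get past the screen, which is exactly why the contextual questions above still matter.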


A long day before departure.

Today is my last day in Addis before heading to the field. It will likely take us half a day by 4WD to get halfway down to the southern tip of Ethiopia, and then approximately another half day to get to a town of about 20,000 people. Then the next day to one of the two small project sites. To the villages themselves… another short drive.

Much of yesterday was in the office working on logistics: arranging a driver and hotel accommodations, and sketching out a more detailed travel plan.

  • depart early morning or midday?
  • stay overnight in Awasa, or head straight through and make a full run of it?
  • what is the best hotel in the first small town?
  • is there electricity in the second town, or should we stay 45 km outside?

I went to the local grocery store to pick up some treats for the road, and will pick up more tomorrow. Those, along with a few Clif bars, should do the trick.


My brain yesterday was slightly fried from doing a preliminary analysis of 298 household surveys, and I couldn’t muster the energy to dive back in. But the timing was good, because today I embarked upon a more detailed framework for field interviews, focus group discussions and potential surveys. This may be my only feasible opportunity to print paper surveys, so that took top priority. (why not tech? that’s for another post)

From a bird’s eye view it’s a patchwork methodology for capturing lessons learned and measuring the potential early effects of integrating digital data collection into a community-based early warning system. While this is not a research project, many research methodologies are leveraged to achieve the evaluation’s goals. What is remarkable to me are the challenges we face in meshing research approaches with programmatic project goals. To some traditional researchers it looks messy, “lacks rigor,” and even haphazard, but I believe this is one area where research meets practice. Or, more aptly said in this context, where humanitarian practice touches points with research.

A large portion of the field work will  lean upon qualitative methods; semi-structured interviews and focus group discussions. The questions may change based upon who is engaged in the conversation. And the language may also vary from English to Amharic to Afaan Oromoo. We may be in a local government office, at a local restaurant, in a hut, or under the shade of a tree.


From a researcher’s standpoint we can challenge the “validity” or “accuracy” of this approach, but is the purpose broad generalizability from integrating one type of digital data collection into two small areas of southern Ethiopia? Maybe (and likely) that’s not the goal. Maybe the goal is to gain learning and knowledge of how to creatively move forward with a multitude of actors. So data and knowledge extraction is not the predominating factor (hopefully) for a manuscript or conference, but rather information and knowledge sharing for a group of people who may not care at all about peer review.

First Steps in evaluating ICT – Mixed Fruit & Paper

So I’m back again in Ethiopia to evaluate a project which has begun to integrate digital data collection into a community-based early warning system. My prior blog posts here and here briefly describe the system, which I helped design in 2007.


I started out the day in Addis scribbling on a piece of paper, drinking a mixed fruit juice drink which I’ve greatly missed since my last trip in 2012. The piece of paper is a brainstorm of pictures, arrows and images, rather than any “data,” “numbers,” columns or even rows of a database. I’m partial to little pieces of paper and scribbled-out pictures because they help me stay closer to the 90/10 rule. In short, it’s the people, processes, partnerships, and trust networks that really push an ICT project forward. That’s the 90%. The remaining 10% (others would say 20) is the technology itself. I know this brainstorming session will be drastically changed by the insightful and ever powerful conversations with both the local partner and the community women. It’s just a start for me to catch up with them.

Wading in the weeds of digital data- Part 1

It’s a well-known conversation in the humanitarian technology sector that digital data collection is one of the most powerful humanitarian technologies to date. (see NOMAD, IFRC 2013 World Disasters Report) But I would say the devil’s in the details, particularly in such a complex, ever-changing environment.


I spent the rest of the afternoon working on “data”. I think the automated part of digital data collection creates this idea that once you collect the data it all of a sudden becomes “synced” and transformed into one grand standardized file for analysis. But in reality, I think for many organizations this “standardized database” just doesn’t really exist in the way that we envision it.

In the process of gathering monthly data from 10 villages and a total of 30 individuals over the past six months, I’ve come to realize that a limited file management system can create a lot of data chaos. And to be transparent, many of us struggle with it… here is a screenshot of my desktop which embarrassingly exposes my recent personal file management system! Ahh, chaos!

But with a few hours of blood, sweat, and tears I managed to get 298 individual household surveys into one somewhat standardized file for analysis. (The survey had been changed multiple times based upon the feedback of the women and the local NGO coordinator.) And for those of us who like to count beans: that’s 298 surveys with ~81 variables for a total of ~24,138 points of data. In a future post (I actually have to run to the office right now) I’ll talk about the first pass of analysis… but only a sliver of knowledge and understanding. Ciao for now.
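Much of that merging work comes down to reconciling export files whose columns drifted as the survey changed. A minimal sketch of that step, with hypothetical columns and months (not the actual survey variables), using pandas’ concat, which aligns shared columns and fills the gaps with NaN instead of erroring out:

```python
import pandas as pd

# Two hypothetical monthly exports; the survey changed between them,
# so the column sets only partially overlap.
month1 = pd.DataFrame({"village": ["Tuka", "Danmbi"], "goats": [4, 7]})
month2 = pd.DataFrame({"village": ["Arganne"], "goats": [2], "camels": [1]})

# Stack them into one file; columns absent from one export become NaN.
merged = pd.concat([month1, month2], ignore_index=True, sort=False)

# Keep track of which export each row came from, so later analysis
# can tell a true missing answer from a question that didn't exist yet.
merged["source_month"] = ["2013-01", "2013-01", "2013-02"]
print(merged.shape)
```

The source tag matters: without it, a NaN from “question not yet on the form” is indistinguishable from a respondent skipping the question, which would quietly corrupt the missing-data counts discussed in the last post.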

Heading back to Ethiopia

I’m heading back to Ethiopia today, and spending some time before departure at Sauce and Bread, a great local coffee shop/restaurant in my neighborhood of Edgewater. In the flurry of packing, downloading documents, and uploading software, this hiatus is warmly welcomed. In upcoming posts I’ll chat about the project that I’m working on, musings on ICT in humanitarian settings, and what it means to me as a doctor, a humanitarian and just me. For now it’s a wonderful sandwich, good conversation, coffee and staring out the window. Chicago is finally sunny again.


Community based early warning

So as I mentioned in another blog post, I have been asked to return to Ethiopia to help assess feasible approaches to integrating (I)nformation (C)ommunication (T)echnology (ICT) into a drought early warning program.

While for many projects the T often supersedes the I & C, I’d like to think that this project has evolved in a bit of a different way.

Back in 2006, during my emergency medicine residency training at HAEMR, I had the opportunity to support an international NGO with a regional office in Addis Ababa. A severe drought had just affected the southern pastoralist communities and all but decimated their primary livelihood- their livestock. So how does this affect the health of communities? And for those of us who think primarily in terms of human health, how does this connect?

Well, it’s all an intertwined ecosystem: access to water, rainfall, growth of pasture, the ability of animals to produce milk for human consumption, and, during times of stress, the ability of people to buy and sell in the markets to purchase grain for food. And access to food, or the lack thereof… spins a dangerous web not only for the health of already undernourished children but for pregnant women, the elderly and communities as a whole.

For this project, a public health lens for drought early warning needs to come from the perspective of the communities. And as I learned, it’s just as much about the pasture as it is about the number of visits to the health center.

One of the reasons this project came about was to complement the existing early warning systems. In the minds of many, the existing, more traditional EWS often struggle to collect timely and actionable information to prevent the deleterious outcomes of droughts and famine.

And so the approach began with conversations (focus groups) with the women in Tuka, Arganne, and Danmbi, learning about the communities’ perspectives on early signs of drought, coping mechanisms and ongoing community needs.

