
AGENDA: Credible Research Methodologies Dialogue - March 16th, 2015



To: Participants - Credible Research Methodologies - March 16th, 2015, BBC Media Action, BBC, London

From: Warren Feek - Executive Director - The Communication Initiative

Resending the agenda below as there is a clarification of the final session at 4-30pm. The agenda has been edited to include the new description. We have also added the reception on Monday evening. If you have not already done so, please send your two questions/comments. This link is a good place to start. Look forward to seeing everyone at the BBC Media Centre, White City on Monday. At reception please ask for Catharine Buckell - BBC Media Action - 07889-264-198 (mobile)

Many thanks for participating in this meeting on the 16th March, 2015. It will be excellent to have your knowledge and insights as we critique the presentations and collectively seek to build stronger research and evaluation related to the vital communication and media/social and behavioural elements for effective development action across all of our priorities.

The agenda follows.

If you log on to this community of practice platform you will be able to access background information/profiles on people who will attend or are otherwise involved in this event.

AGENDA - Research Methodologies - Communication and Media for Development; Social and Behavioural Change

Date: 16 March, 2015

Venue: Media Centre Boardroom, BBC Media Action, BBC London

CONTEXT: Local, national, and international development requires impact evidence. Measuring and reporting the impact of the social, cultural, and personal dynamics related to development challenges - for example, community engagement strategies on health issues or media engagement on democracy and governance questions - can be extremely difficult.

REQUIREMENT: Credible data requires credible research methodologies. How the data is collected and analysed is subject to scrutiny by people both within and outside the communication and media for development/social and behavioural change field. Those outside our field, such as economists and epidemiologists, can be particularly harsh!

PURPOSE: To critically review, discuss and debate credible methodologies for research on communication and media for development/social and behavioural change themes.


  1. Individual participants will leave the meeting with two new ideas for ways to strengthen the research and evaluation work of their organization related to communication and media; social and behavioural change themes.

  2. Collectively the participants will contribute to the development of a short (two pages) paper that provides a critical overview of the credibility of communication and media; social and behavioural change research.

MEETING STRUCTURE: This one-day meeting will be structured around these principles -

  1. Substantive presentation of three research and evaluation methodologies, including an insight into the results obtained from those methodologies.

  2. Significant time for the critical review by participants of those three research and evaluation methodologies.

  3. We will do all that we can to avoid filling the day with a large number of short presentations and little dialogue. The emphasis is on substantive presentations and time for critical review.

  4. Drawing on the perspectives outlined in the presentations and the participant review, a critique of the BBC Media Action paper Reframing the Evidence Debates: A View from the Media for Development Sector.

  5. The day will be facilitated by Warren Feek, Executive Director of The Communication Initiative.

AGENDA for 16th March, 2015

8-30am - Participants gather - please arrive before 8-30am as you need to be escorted!

8-45am - Welcome by the hosts

8-50am - Overview of the day (Warren Feek)

9-00am - Quick pairs discussion to establish individual goals and partnerships


9-15am - Presentation by Dr Seb Taylor - will outline a methodology drawing on Qualitative Comparative Analysis (QCA) for a major research initiative on how households make health decisions in northern Nigeria with specific reference to polio and routine immunization

10-15am - Critical review of this methodology by participants.

11-30am - Coffee!


11-45am - Presentation by Dr Sue Goldstein (Soul City) - on the major Soul City evaluation that combined cross-sectional analytic studies, a national household survey, and stratified cluster samples.

12-30pm - Lunch

1-15pm - Critical review of this methodology by participants.

2-15pm - Quick Coffee!


2-30pm - Presentation by Simon Cousens and Roy Head (LSHTM and DMI) on the Randomized Controlled Trial methodology implemented in Burkina Faso, with an indication of the initial results.

3-30pm - Critical review of this methodology by participants (with rolling coffee as this is happening).


4-30pm - What is evidence and who is it for? A perspective from applied research. Presenters James Deane and David Jodrell - BBC Media Action.

5-00pm - Critical review by participants.

5-30 pm - Reception

BBC Media Action cordially invite you to join them for a drinks reception immediately following the 16 March research methodologies meeting. The reception will take place at BBC Media Centre starting at 17.30.


Warren Feek
Executive Director
The Communication Initiative Network and Partnerships
Mobile: 1-250-588-8795


Critical discussion of the predominant demands for RCT

Thanks for convening this meeting on Credible Research Methodologies, Warren. I am hoping that there will be some critical discussion of the predominant demand for Randomized Controlled Trials (RCTs). We recognise that there are situations where an RCT is not an appropriate part of the methodology, but I'm not sure we know how to write up results for evaluations conducted without the RCT standard in a way that stands up to criticism.

James Deane’s discussion of evidence in his BBC Media Action policy paper gives a description of different notions of evidence. This might serve as a starting point for discussing how evidence other than that from the academic paradigm (“evidence as the findings from surveys, experimental or quasi-experimental studies that support, or reject, a conclusion”) might be written to standards such that “coherence, defensibility, credibility and consistency (Spencer et al., 2003)” become as fully accepted as the validity and reliability implied by RCTs now seem to be.

Julie Levy

Agenda - Research Methodologies

The agenda looks good. Given the intensity and the need for critical analysis, it would be great if the presentations were available in advance to allow for richer, more engaged discussion.


Rosa Ongeso

How will the presentations be used?

Agenda: Looks great! I wanted to ask how the feedback on the different presentations will then be used. Will the researchers use the discussion points to further refine their papers?

Apoorva Mishra

Mixed Methods

I have started to look through the BBC Media Action paper on "Reframing the evidence debates" (by Kavita, Anna and Zoe), and I found the different ways the issue of "mixed methods" is treated very interesting. On the one hand, mixed methods are named in the conclusion as a quality criterion for evidence; on the other hand, among the best examples of evidence (p12-13) there are some that apply a single method only. And Philip Davies suggests combining formative evaluation, summative evaluation, theory of change, and economic appraisal as "a powerful mixed methods approach to evaluation" (p. 7). Perhaps it would be of interest to discuss and differentiate the levels at which mixed methods are of importance in communication and media for development; when and how this mixing process is best carried out; what the underlying assumptions, theories, or paradigms are; and how to take practical limitations into consideration.

Jan Lublinski


Please share your two comments or questions in advance of the meeting

The Agenda for the meeting

Methodology One Background - Seb Taylor

Methodology Two Background - Sue Goldstein

Methodology Three Background - Roy Head and Simon Cousens

Using the online platform


Link to Reframing the Evidence paper

Hi - in his comment just posted Jan refers to the BBC Media Action paper Reframing the evidence debates. FYI a summary can be reviewed at this link with access to the full paper. BBC Media Action will feature this paper in the final session of the day. Please do submit comments and questions here when logged in or just reply by email to this note. Thanks - Warren

Re: [Research Methodologies] AGENDA: Credible Research Methodologies Dialogue - March 16th, 2015

Hello Warren and James - greetings from Bogotá. For those of us not participating in situ (sadly, The Communication Initiative Latin America - CILA - will not be present due to lack of travel funds), is streaming being coordinated?
This would be of huge value to the network members. Food for thought for our BBC Media Action colleagues, as this is a strategic discussion for the field and our region. Language should not be a barrier for southern network members, as these are extremely valuable questions and challenges we all share. We would like to translate and summarise the presentations, the key discussion points, and recommendations through a Son de Tambora special bulletin in Spanish - if we manage to find funding!
Warm regards, and have a great meeting.
Adelaida Trujillo
La Iniciativa de Comunicación - Imaginario/Citurna



Using a variety of data sets?

In polio, we have seen the value of using a variety of data sets (program and research) and looking for discordance and convergence. The GPEI uses a combination of surveillance, pre-campaign dashboards, intra-campaign monitoring, post-campaign monitoring, LQAS, polling, DHS/MICS/SMART surveys, QCA (in Nigeria), and, increasingly, SMS-based rapid questionnaires.

Are there lessons for other programs on how to best select a combination of approaches to inform decisions or make mid-course corrections? What is useful in the short term vs longer term trends? How can these methods be used to focus on equity (who is left out and why?)

For LQAS specifically - is there a role for LQAS in BCC research?

Ellyn Ogden

RE: [Research Methodologies] AGENDA: Credible Research Methodologies Dialogue - March 16th, 2015

Dear Ellyn,

Thanks for this. I think there are a whole lot of people coming to the meeting who will have much to say on this question, but thought I'd step in with a few ideas. I'll put them in as bullets as a (probably hopeless) attempt at brevity:
* One thing we've seen is a burgeoning array of operational data collection and associated research processes in polio over the last 15 years. This is positive in many respects, but different data and analysis processes have not always been used in a sufficiently complementary manner (e.g. the separation for a long time, and in some respects continuing, between social and epidemiological data). So one lesson is for the coordinating entities of other programmes, from an early operational stage, to develop methodologies for bringing together diverse research approaches and types of data to collectively answer core operational questions (and to do this as close to real-time, for course correction, as possible).
* Second, as a programme (especially one with an implicit focus on population coverage and hence equity) progresses, PEI shows us the need to generate better data at more localised levels - to get a more granular level of analysis and, associated with this, complementing population sampling with purposive sampling methods. We found that the availability and reliability of data below Ward level in Nigeria was really quite weak, making it harder to pin down increasingly small residual areas of programme under-performance.
* Third, programme research needs to be consciously adaptive - a slightly odd term, but what I mean is, the categories of analysis with which we start to think about the behaviour of individuals and groups of interest to an intervention are likely to change as the intervention itself progresses and we understand more about our target groups. The case of 'missed children' in polio is instructive insofar as the programme has at times been slow to expand its understanding of the range of categories of causation lying behind the phenomenon of un- or under-vaccinated children. Programmes should constantly be challenging the continuing veracity and utility of their analytical categories, asking whether we need to adjust them as the programme shows us new and emerging forms of behaviour in residual (and arguably increasingly specialised) groups and communities.
* Regarding LQAS, I think the adoption and expansion of these as a standard practice in PEI has been good for the programme up to a point. The problem is over-reliance on what is often a partially-deployed, aggregative, and essentially ex-post instrument for analysis. Over-reliance on the generalised and quite limited findings of LQAS can become the basis for over-confidence in general programme performance, whilst missing localised variance which gets submerged. I think LQAS can be productively used as a reference point, to pick out such areas of variance at a relatively broad level, but that this then acts as the starting point for further, more localised investigation. And again, that brings us back to the value of combining quantitative and qualitative methods and data types, something that has already been valuably commented on in this platform discussion.
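For participants less familiar with the mechanics behind the points above, the core LQAS classification rule is simple enough to sketch in a few lines of Python. This is an illustrative sketch of the standard binomial decision rule only; the specific numbers used (19 sampled, decision value 13, 80% and 50% coverage thresholds) are a commonly cited textbook configuration, not the parameters of any particular PEI survey.

```python
from math import comb

def binom_tail(n: int, d: int, p: float) -> float:
    """P(X >= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

def lqas_risks(n: int, d: int, p_upper: float, p_lower: float):
    """Misclassification risks for an LQAS decision rule: classify a lot
    (e.g. a ward) as 'acceptable' if at least d of n sampled individuals
    show the behaviour of interest (e.g. are vaccinated).

    alpha: risk of rejecting a lot whose true coverage is p_upper.
    beta:  risk of accepting a lot whose true coverage is p_lower.
    """
    alpha = 1 - binom_tail(n, d, p_upper)  # P(X < d  | p = p_upper)
    beta = binom_tail(n, d, p_lower)       # P(X >= d | p = p_lower)
    return alpha, beta

# Textbook illustration: sample 19, accept the lot if >= 13 are covered,
# distinguishing ~80% coverage from ~50% coverage.
alpha, beta = lqas_risks(n=19, d=13, p_upper=0.8, p_lower=0.5)
print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
```

With this rule both misclassification risks come out under 10%, which is why rules of this shape are attractive for rapid field classification; the broad, aggregative character of the resulting yes/no lot verdicts is exactly what the caution above about submerged localised variance refers to.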




RE: [Research Methodologies] AGENDA: Credible Research Methodologies Dialogue - March 16th, 2015

Dear Ellyn, Seb and colleagues:


On LQAS, UNICEF has used this method over the past three years as part of a programmatic monitoring approach to systematically identify bottlenecks and barriers and adjust our interventions. This approach has been applied in several countries across different programme areas (Health; Nutrition; Child Protection; Education; HIV). As Seb points out, from the very beginning it was clear that data generated through LQAS needed to be complemented with additional data, particularly qualitative data, in order to better understand bottlenecks and barriers across four domains - enabling environment; supply; demand; quality - and to inform programmatic adjustments. We’ve seen that in most cases bottlenecks and barriers are associated with issues such as access, behaviours, norms, practices, stigma and discrimination, etc., that demand consistent SBCC interventions. We just released a compendium that captures different examples/case studies of how UNICEF country programmes have applied this monitoring approach. One of them focuses on SBCC (C4D) and nutrition in Guatemala (in collaboration with partners - the Government of Guatemala; USAID; Plan International; WFP). This experience includes the use of mixed methods (LQAS, KAP, participatory communication plans, MSC, etc.) and has led to more focused, data-driven interventions with very promising preliminary results.









Request for compendium title and URL

Hello Rafael,

I would be very interested in exploring your compendium and am wondering if you could kindly send me the title and/or URL.

Many thanks,
Kier Olsen DeVries

Use of LQAS

Response from Adelaida Trujillo to this contribution from Rafael Obregon

CILA is also very interested in this compendium - is there anything in Spanish and/or Portuguese?



Agenda - Research methodologies

I am looking forward to listening to these presentations and exploring how we can incorporate some of these approaches in our efforts to monitor and evaluate the impact of our communications activities.

Paul Neate - CTA

A general question

All the organisations in which I work have generally invested in communication only as an afterthought. The same is true of impact assessment. Given this situation, how can one persuade rural research and development organisations to invest in the type of research described in these presentations?

Paul Neate

Response from Simon Cousens

Response from Simon Cousens to this question from Paul Neate

Good question. I don't know the answer.


General questions

Dear all

First a huge thank you Warren indeed for organising this, I look forward to it!

In general, I've been studying the differences in the assumptions and theories underpinning quantitative versus qualitative methodologies, and I've found many cases where misunderstandings arise (some of you have raised this in questions about when RCTs are appropriate). The two approaches are generally recognised as complementary, yet they are rarely used that way in practice. It would be good if we could discuss this as we go through the methodologies, or perhaps towards the end of the day if possible?

Best wishes, Jessica Romo, Monitoring and Evaluation Coordinator at SciDev.Net

measuring the effectiveness of communication

It would be great to hear from others how (and if) they measure their communication activities (i.e., impact on behavior, attitude changes, etc.).

Sharon Felzer - World Bank

Standardised questions

Thanks Warren for this opportunity and for the interesting discussion already taking place. I would be interested in knowing if others have systems in place to monitor and evaluate communications as well.

In addition, I have a few questions: (1) Are there good examples of organisations or projects that use standard questions and evaluation methods that cut across themes or areas of work? (2) What types of tools are used (e.g., Excel, dashboards) and how often are they used (yearly, at the end of the project, etc.)? What has worked best, and why?

I look forward to the discussions and presentations on Monday.



General Questions

How do these methodologies deal with (identify, analyse) unexpected outcomes and impacts? How are these integrated into their M&E systems?


General question - dealing with iterative feedback

A comment from Sarah Cardey related to Agenda: Credible Research Methodologies Dialogue referring to Irela's question

In a similar vein to Irela's question, how is iterative feedback dealt with within each methodology? My experience with research that seeks to understand communication processes and impacts is that, aside from it being relatively messy, there are unanticipated processes and issues that need to be fed back into the investigation in order to understand fully the interlinked nature of communication, and to ask the "right" questions when we are trying to understand the impact of communication. It would be helpful to understand, or hear reflections, on how this happens in each of these approaches. Kind regards, Sarah

Dr. Sarah Cardey
Lecturer in International Development
School of Agriculture, Policy and Development, University of Reading, Whiteknights, Reading, RG6 6AH, UK
Telephone: +44 (0) 118 378 6594


I think the agenda for the day is well-structured and offers a really substantive programme of engagements that'll make it a worthwhile experience.





Long term behavioral change in WASH

Hi everyone! Sorry to miss this meeting in person, but I'm looking forward to hearing about the discussions. The questions posted have been thought-provoking. I would be curious to know if there are good examples of studies that track long-term behavior change in WASH (water, sanitation, and hygiene), and/or that measure behavior change (quantitatively) at the individual and community levels beyond self-report.

And has anyone come across research that cogently examines behavior change at the individual, family, and community levels, and their interrelations? Are there established methodologies (even ones looking across just two levels of context) for doing that in behavior change communication?

The BBC Media Action report has been helpful in framing the debates about what constitutes evidence (and who gets to decide?). Sesame Workshop (Sesame Street) is working with the University of Maryland to evaluate our WASH pilot project in three countries--Bangladesh, India, and Nigeria. Because of differences in program implementation and settings in each country, it's been challenging to conduct community-level RCTs (for which 60-90 communities would need to be randomly assigned to intervention and control groups), particularly for a pilot project that is, by definition, conducted on a small scale.
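The point above about needing 60-90 communities reflects the standard design-effect arithmetic of cluster randomisation: intra-cluster correlation (ICC) inflates the individual-level sample size, and the inflated total is then spread across communities. A minimal sketch of that calculation follows; every parameter value in it is an assumption chosen purely for illustration, not a figure from the Sesame Workshop/University of Maryland study.

```python
from math import ceil

def clusters_per_arm(p1: float, p2: float, m: int, icc: float,
                     z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Rough number of clusters per arm for a two-arm cluster RCT
    comparing proportions p1 vs p2, with m individuals observed per
    cluster and intra-cluster correlation icc (defaults give 80% power
    at 5% two-sided alpha). Uses the standard design-effect inflation
    of the individual-level sample size."""
    # Individual-level sample size per arm for comparing two proportions
    n_ind = ((z_alpha + z_beta) ** 2
             * (p1 * (1 - p1) + p2 * (1 - p2))
             / (p1 - p2) ** 2)
    deff = 1 + (m - 1) * icc  # design effect
    return ceil(n_ind * deff / m)

# Illustrative only: detect a rise from 30% to 45% in a target behaviour,
# observing 30 households per community, with an assumed ICC of 0.05.
k = clusters_per_arm(p1=0.30, p2=0.45, m=30, icc=0.05)
print(f"{k} communities per arm, {2 * k} in total")
```

Even with these fairly generous assumptions the trial needs 14 communities per arm (28 in total); with a smaller detectable difference or a higher ICC the requirement climbs quickly toward the 60-90 range cited above, which is why a small multi-country pilot struggles to support community-level randomisation.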


Lit review looking at behaviour change in handwashing

Dear June,

I recently worked on a (brief) lit review looking at behaviour change in handwashing. There are not that many examples in the literature (we found 8 'success stories'), and all were conducted in different contexts, at different levels (small C-RCTs vs. national programmes), and using different outcome measures. The papers we found report between a 13% and 64% increase in HWWS (handwashing with soap) measured by observation (change measured at different lengths of time post-intervention, ranging from 45 days to 18 months), and between a 4% and 46% increase in HWWS measured by self-report.

I worked on a C-RCT in India where we measured behaviour using structured observation at baseline, 6 weeks, 6 months, and 1 year post-intervention; behaviour change was sustained over this period (Biran et al., 2014, Lancet Global Health). We faced many difficult decisions during this study and had many questions, for example: do we keep the same enumerators or change them? [We changed them, so had to ensure adequate training each time.] We wished to minimise reactivity, but how do we know that the intervention hasn't resulted in differential reactivity?

We do not know the answer to this but have subsequently thought that we could assess this qualitatively by asking participants in the intervention arm whether they were aware of what we were measuring i.e. did they link the observation to the intervention. I still wonder about this problem and would be keen to know if anyone has any other ideas of how to deal with this.

Katie Greenland

Research Fellow
Environmental Health Group
London School of Hygiene and Tropical Medicine

Agenda and dialog

The methodology presentations are well selected to raise very important issues that I look forward to discussing. I would also like to express the hope that methods related to studying the impact of dialog might come up. Warren's manifesto draft emphasizes dialog, engagement, and so on in a number of ways. Media Action has done a lot of work on projects employing mediated political debate, public discussion, and so on. And Soul City's theory of change references Paulo Freire's ideas. Lots of territory to cover here. But if it isn't piling too much on, perhaps the relationship between dialog processes and media campaigns can be explored, or other topics that address this element of the manifesto.


A more general question

A comment from Sarah Cardey related to Agenda: Credible Research Methodologies Dialogue

A more general question: One of the challenges, if trying to provide evidence of communication across disciplines, is simply how to communicate different research perspectives and definitions of what constitutes "valid" and "reliable" evidence to different perspectives. At times, different disciplines speak different languages of research -- so it isn't only a matter of finding methodologies that work to establish rigorous evidence, but learning how to communicate different types of evidence between different stakeholders. It would be interesting to hear reflections on this from different perspectives tomorrow.

Kind regards, Sarah

Dr. Sarah Cardey
Lecturer in International Development
School of Agriculture, Policy and Development, University of Reading, Whiteknights, Reading, RG6 6AH, UK
Telephone: +44 (0) 118 378 6594
