We All Use Data, But How Do We Learn?

By: Charity Troyer Moore, Emily Myers, Charlotte Tuminelli

 

This article originally appeared on The SEEP Network Blog on October 18, 2018. 


SEEP Conference participants discuss ways to improve data and use of evidence in organizations during the session, Using Data and Evidence to Drive Policy and Program Improvement. Photo: © mari matsuri

For those of us working in development, the need for data and evidence-informed decision-making is no longer controversial. There are, of course, ongoing debates – such as whether randomized trials are leading development research in the right direction, and what types of data to use when such methods are not feasible. But by now, most groups have internalized the value of evidence, and a wide array of data collection methods is available.

So increasingly, we have data, but how are we using them to learn? And perhaps more crucially, how can we leverage the important multiplier effect our organizations have (as donors, researchers, and catalysts) to make the best use of the growing evidence base? At Evidence for Policy Design (EPoD) at Harvard Kennedy School, this question is central to our mission of uniting research and practice for smart policy. During our lunch dialogue at the 2018 SEEP Annual Conference, we explored these questions with participants in an interactive session focused on using data to inform decisions. The surrounding discussion revealed ongoing challenges and frustrations – but also concrete ideas for improvement.

Why is using data so challenging?

When we polled SEEP conference-goers about barriers to data and evidence use in their organizations, individual capacity gaps (lack of capacity for data analysis) topped the list, but organizational barriers also loomed large (lack of organizational resources and incentives, and problems with the format of existing datasets). These results resonate for us based on our surveys of – and experience working with – civil servants in South Asia, who cite similar barriers to evidence use in their organizations.


Responses from SEEP conference-goers on top barriers to use of evidence in their organizations.

Data quality also remains a major challenge. A host of technological solutions can help maintain high standards, such as digital data collection with built-in audit functionality, but they require organizational investment. And even seemingly simple fixes – like using common unique identifiers to link records across multiple datasets – often go unadopted for lack of coordination across departments or agencies.
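To make the identifier point concrete, here is a minimal sketch in Python using pandas. The file names, columns, and the beneficiary_id field are all hypothetical, standing in for two departmental extracts that happen to share a common unique identifier:

```python
import pandas as pd

# Hypothetical extracts from two departments; file names and columns
# are illustrative, not from the article.
enrollment = pd.read_csv("enrollment.csv")      # beneficiary_id, village, enrolled_on
outcomes = pd.read_csv("survey_outcomes.csv")   # beneficiary_id, income, savings

# With a shared unique identifier, linking is a single, auditable merge.
linked = enrollment.merge(
    outcomes, on="beneficiary_id", how="inner", validate="one_to_one"
)

# Without a common ID, teams resort to fuzzy matching on names and
# locations, which is error-prone and hard to reproduce.
print(f"Linked {len(linked)} of {len(enrollment)} enrollment records")
```

The one-line merge is the whole point: once departments agree on a shared key at collection time, linking becomes routine rather than a bespoke matching project.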

Addressing these barriers can seem daunting, especially when most of our organizations work on complex problems where achieving precise measurement and setting metrics for success is difficult. For example, in our own work building policy organizations’ capacity to use data and evidence, we face challenges both in defining objectives (what does ‘capacity to use evidence’ look like?) and in tracking participants to understand how they integrate new knowledge into practice.


What can we do to make better use of all that data?

SEEP conference-goers had no shortage of good ideas when it came to small steps that they or their organizations could take to improve data use. Some of these ideas tied back to aligning organizational incentives – such as setting explicit targets and incentives for staff related to data collection and use, better communicating about available data and evidence, and building in more collaboration at the outset to define the questions different stakeholders within the organization need data to answer.

Participants noted that long reports often go unread, but short summaries, dashboards, and central repositories combined with visualization tools allow users at all levels to extract relevant information. Visual formats make patterns easier to grasp at a glance, and in our work we’ve seen that converting data into simple visualizations has great potential for expanding data use.
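As a rough illustration of how little code a simple visualization requires, here is a sketch using matplotlib. The barrier categories echo the poll discussed above, but the counts are invented for the example:

```python
import matplotlib.pyplot as plt

# Illustrative counts only; the categories mirror the poll above,
# but these numbers are made up for the example.
barriers = {
    "Lack of capacity for data analysis": 18,
    "Organizational resources/incentives": 12,
    "Format of existing datasets": 9,
}

fig, ax = plt.subplots(figsize=(7, 3))
ax.barh(list(barriers), list(barriers.values()))
ax.invert_yaxis()  # largest bar on top
ax.set_xlabel("Responses")
ax.set_title("Top barriers to evidence use (illustrative)")
fig.tight_layout()
fig.savefig("barriers.png")  # ready to drop into a one-page summary
```

A chart like this, regenerated automatically as new data arrive, is the kind of lightweight artifact that can replace an unread report in a staff meeting.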

Through our work with government agencies, we’ve seen how small changes in organizational systems can lead to greater data use, with significant effects on outcomes. Our pilot projects, in which we collaborate directly with policymakers to give them hands-on experience with data and evidence, provide several examples. Through this work, we’ve come to recognize the value of integrating systems for data use (not just data collection and reporting) into the organizational processes of government ministries and development groups alike. This might mean setting standards for data collection so that data systems and datasets stay synchronized, building data use into standard practices (like putting a data dashboard on the standing agenda of a regular staff meeting), and investing in good knowledge management (perhaps hiring someone who can serve as a ‘data champion’ to support data use across the organization). To get the most out of the data we collect, aligning incentives across the organization is key.

Some participants at #SEEP2018 noted that their organizations would benefit if staff as a whole – and not just a few key analysts – had a better understanding of how to use data. Here we can offer some concrete guidance: EPoD has recently made two online modules, on using Descriptive Evidence and on Impact Evaluation, openly available, and we invite readers to enroll and share them with colleagues.