Rachel Glennerster and Michael Kremer point out that interventions in health and education need to complement much more complex machinery: human behavior. They are right. And their argument can go a step further. Engaging local stakeholders in the design of policies and solutions can boost the innovative behavior of the people whose well-being we evaluate.

At MIT’s D-Lab we believe that users in the developing world have the potential to be the everyday inventors of their own solutions. In a Nicaraguan hospital, a nurse might quietly create neonatal UV protectors from layers of surgical gauze. Around the corner in the operating room, surgeons can be found trading sutures for fishing line and drainage valves for cut-up soda bottles that work just as well. These inventive behaviors are often hidden. The designs are remaches, improvisations, hacks. Not exactly the stuff of professional associations. But this is only because they lack the last bit of formal engineering that would reveal them as the brilliant solutions they in fact are.

Traditionally, technology designers who focus on the developing world try to create affordable solutions adaptable to the local environment. They might develop efficient water pumps that run on pedal power, cell phones with longer ranges and smarter features, and syringes that are safer and more accessible, with retractable needles that automatically disable them. Our approach is to encourage co-creation in the design process: we want to empower locals to invent, so they can be collaborators, not just clients. In our fieldwork we teach students to look for inventive behaviors, and many of our interventions have originated with users. Cultivating inventiveness and the tools of invention among the poor is our priority.

Of course, not all inventions will be effective, even if they result from the collaboration we promote. A great design may not translate into a great development intervention. This is where Glennerster and Kremer come in. Their work in empirical evaluation has three critical implications for those involved in designing interventions in the developing world.

First, we need to educate designers of technologies to use evaluation as a key parameter of design itself. We can come up with a fantastic design for a lab-on-a-chip that tests for several conditions with a single finger prick. That design will surely pass through half a dozen reviews and undergo small pilots to gauge adoption. In the end, though, proper evaluations will be the judge, not just of our inventiveness and skill, but also of our impact. And that is what we want to do—impact research: novel discoveries, applications, designs, applied with scientific rigor to produce direct or indirect benefits to society, particularly those living in poverty, within a generation.

Second, the velocity of technology transfer often depends on the behavior of users, particularly when the technologies are products of the user-driven invention philosophy we prize. Behavioral economics may help us figure out how to diffuse an innovative solution, even if it originates with, say, a nurse who does not receive recognition from her superiors or peers for what is perceived as an idiosyncratic behavior.

In our initial observations during deployment of MEDIK prototyping kits—essentially, modular construction sets for medical devices—among medical personnel, we found that users were less likely to engage in disseminating behavior—blog postings, uploading pictures to the Internet, asking for or offering help—if they had to pay for Web access. Despite this, they had no problem purchasing a snack for twice the cost of an hour of Internet access ($0.30). While it is clear that they could afford to inform others online, it would help us, as design educators, to understand which levers can be pulled to encourage disseminating behavior. Accelerating the velocity of technology and technique transfer among the poor requires more fundamental insights into the economics of information sharing. This lies beyond the physical mechanisms we can design in the lab, but meticulous experimental research into behavior can make headway on the problem.

Third, behavioral economics offers important tools in designing incentives for behavior change itself. In this area we have collaborated with Glennerster and her colleagues at the Abdul Latif Jameel Poverty Action Lab on increasing medication compliance among tuberculosis patients. Our relatively inexperienced team used a combination of chemical diagnostics and mobile telephony that applied theories of incentives to on-the-ground behavioral change. We call our solution XoutTB.

Had we not identified incentive theory as a tool, our approach to the problem might have been substantially different from the one being tested—of course, in a randomized evaluation—in Pakistan this month. In the initial XoutTB pilot in Nicaragua, we planned to use a microfinance strategy for incentives. That was a moment when the designers’ assumptions dominated. Working with an early cohort of would-be implementers, the team identified a more effective incentive for the context: providing cell-phone minutes.

Glennerster and Kremer champion development strategies that take into account the economic behavior of everyday users, the application of incentives to change behavior, and the power of randomized evaluations to keep us honest about impact. We should also create room for local design of these development strategies, room for something beyond the implementation phase. This means letting go of total control from the start, and that is hard. Designers may at first find it frustrating to collaborate with non-professionals. In the end, however, we may find that strategies grown from a collaborative approach will allow users to attack problems in novel ways. For the XoutTB team, cell-phone minutes were an elegant solution; that incentive propelled the project to where it is today. Let’s use randomized evaluations to do more of this, to launch collaborative innovation for impact research.