Panoply Digital

Ethics In Mobile For Development (M4D): My Take At Panoply

My particular focus is learning in both developing and developed contexts, and I tend to extend the inviolability of the teacher-student relationship into other aspects of my work here at Panoply, namely research, evaluation, and development projects. That teacher-student relationship is sacred to me, and my colleagues take it just as seriously; as such, I apply the same weight to my research relationships, my evaluation relationships, and any relationship I have within the work at Panoply Digital. There are things you can and cannot ethically do in this context. Corporate maxims like Don't Be Evil don't extend nearly as far as I would like when dealing with actual people in development contexts. Do no harm, while instructive, feels incomplete.

So we are left with the evolving role of ethics: the ethics of dealing with live people in live contexts, where participation can reasonably produce both benefit and harm. Linda Raftree has written convincingly about this (and with much more nuance than I can present here) in the context of open development; John Traxler has written extensively on ethics in the mobile context. Tim Unwin and Matt Haikin have both proven instructive in my work as well. I refer often to BERA's Ethical Guidelines for Research and the AERA's Research Ethics. As I am based in Korea, I follow its research ethics developments closely. There are many more writers and resources I could point to, but ultimately ethics involves the practical application of idea into activity.

So this post is essentially about the preliminary steps, as I see them, towards establishing a baseline for ethical research in the development context, a baseline that is apparently a lot cloudier than I had realized. Many of the following are more wishlist than actual recommendation, but bear with me.

Systems & Organizations

  1. Stop avoiding the ethical discussion. It is upon us, and we can't, and shouldn't, try to avoid it. Address it head on as a community. Negotiate a more robust ethical framework as a community. Demand that it be used as a community. At the organizational level, have the conversation and invest the time to let it evolve. Ethics is an active negotiation of context, an active discussion. Putting the teacher hat on for a second, ethics is an ongoing reflection on right action in a particular context. Set aside time for it.

  2. Invest in tools, systems, and platforms with great caution. Again with the teacher hat on, for open learning I point to the Courseras of the world here (or any tool with fuzzy data collection practices), whose privacy policies are actually relatively clear about what types of data they collect. Distinguishing between non-personal and personally identifiable information feels like legalese for saying "we collect all information." Saying they value confidentiality and privacy highly, and then listing all the ways they share this information, is standard legal copy, but it is copy we should be aware of as educators, researchers, and the like. There is no excuse for us to be ignorant of its existence. We should know that they collect and use a lot of the data we provide. Whether that is wrong or ethically unsound is for us to decide. We shouldn't claim ignorance after the fact, though.

  3. By all means work with tools, applications, or organizations if you feel that provides an opportunity for you as an organization to extend the impact of your education. These are tools in a toolkit. But they are also aggregations of more than just code or machinery. They are tangible aggregations of values, intent, and purpose. Sometimes these values and intentions will merge with yours; sometimes they won't. But in this instance, the ethics involved isn't as much about us as development professionals as it is about the environments or people we are, for lack of a better word, trying to help 'develop'. Who is speaking for them?

  4. Dictate to the technology providers what you want ethically. Let the technology become a breathing extension of your ethical policy. Don't slam the square peg of your ethical policy into the technological circle hole. At the end of the day, these platforms cannot exist without your explicit approval, whether in terms of use, content, or reputation. In short, don't invest in platforms that aren't bound by your ethics. There are plenty of options to get behind that provide a relatively transparent structure: FrontlineSMS, the tools developed by Ushahidi, and so on. There are many examples. Just do the research to see if these tools, and the values they represent, mesh with your ethical policy.

  5. Be pragmatic, but responsible. Yes, we are working towards pragmatic ends in development, but that doesn't equate to some sort of free pass in terms of data collection, anonymity or confidentiality, or informed consent, all in the name of a perceived greater good. With each shortcut or ethical breach we implicitly signal our disrespect, our value judgment as to who deserves what treatment.

Individually

  1. Working with commercial platforms/vendors/options/technology does involve entering a domain where you, an ethical practitioner, won't have full control over the data being collected or made visible to parties other than the individual who created it. It is a fact. However, that doesn't mean you waive your right to input ethics into the discussion. It doesn't mean you waive your responsibility to investigate the tools being used, to understand what risks they pose to the individuals involved, to understand what data is being collected and by whom it is being seen. We no longer live in a world where we can play dumb on these technology issues. I might not know how to code as well as I should, but I sure am going to know what I am exposing participants to if I ask them to use a particular application or technology. Ethics begins right here. Evaluate the tools you are going to use.

  2. Evaluate the potential for full and absolute disclosure when using mobile or educational technology (or both). Is it possible to receive informed consent from the participant? If not, truly ask whether you should be doing it. Others might find ambiguity in this issue and come to a different conclusion (and that might make more sense in other fields). Ethics are made visible in applied activity. Informed consent is a mandate, not a suggestion. Sure, there is response bias, and sure, it colors the results, but that is a cost worth paying. Your desire as a researcher or teacher or practitioner to get an answer does not trump an individual's right not to be manipulated without their consent. If someone is in your study, they need to know and agree to being in your study. If someone is participating in your development project, make sure they know.

  3. Beyond do no harm, we need to reorient our research and development focus to protection. We establish guidelines, we generate projects that enact those guidelines, we get informed consent. This is good, but there is a step beyond this. We need to protect our participants as journalists protect sources. By taking these research or development projects on, we implicitly assume this role. We champion their right to privacy; we expose, very publicly, attempts to thwart their right to confidentiality. We protect them at every turn. If my work depended on doing something I felt was exposing my participants, I would stop doing that work. It really isn't integrity if it isn't tested, is it? This will never be the case for the vast majority of us, but we should consider it a potential outcome of any research we undertake.

All of these are places to start. The ethical discussion can't be avoided, so consider having it now.