Aid agencies, public health bodies, and health innovators are harnessing rapidly accelerating improvements in data capabilities to deliver better health and wellbeing outcomes for service users and beneficiaries. Increasingly, smaller organisations can gather, process, analyse, and act on larger datasets with attractively small investments of time and capital. Ostensibly, the calculus is clear: if gathering large quantities of personal data to inform stronger decision-making is becoming easier, it would be irresponsible for an organisation not to build databases with the intention of improving outcomes.
Yet this era of year-on-year emergence of new, reality-changing tools has demonstrated an unavoidable truth: technology is never neutral. Technology used in aid contexts is usually developed far from where it is deployed, and can carry implicit biases that distort its utility and curb its benefits. Equally, improvements in technological capabilities in the hands of healthcare and aid providers can serve, at least initially, to further widen inequalities between those with access to innovative tools and those without. In aid, these inequalities often manifest clearly along provider/recipient lines.
The ability to gather large sets of personal data is an acute example of this divide. Take, for example, healthcare and aid providers working in low-resource settings. If they choose to harness large-scale data gathering and processing tools to build datasets of personal information about local beneficiaries, they are at once equipped with technological capacity that is likely inaccessible locally and entrusted with highly sensitive material relating to many local individuals. It is incumbent on actors at the privileged end of such a power disparity to use their position with the utmost responsibility.
This is where data protection becomes paramount. Many humanitarian actors are now subject to the European Union’s General Data Protection Regulation (GDPR). The donor community, including EU Humanitarian Aid, now requires partners to demonstrate good data practices, including carrying out Data Protection Impact Assessments (DPIAs) for projects that may process, store, or share personal data; this includes names, photographs of people, and even CVs. Data ethics goes beyond the procedural implementation of safeguards, and several guidelines and frameworks exist that can help build projects and teams on solid ethical foundations.
With the growing complexity of data-driven services and the risk of social exclusion inherent in opting out of various technologies, individuals are at a disadvantage when asked to give informed consent for their data to be collected and used. This gulf between uptake and understanding has been met by legal frameworks from governments and intergovernmental organisations (such as the EU’s GDPR), aimed at regulating data practices and enabling individuals to trust that their information is handled responsibly. We are ready to support you in implementing the tools and frameworks most appropriate to your operations: analysing your systems and applying the most relevant adaptations smoothly and efficiently, without disrupting your day-to-day work. We help your organisation anticipate these needs and actively shape its data ecosystems to meet them.
For more information, see our complete DPIA Service offering here.
If you would like to collaborate with Outsight International, please use our contact form to get in touch.