
Personal data in FCAS and development contexts

  • Mar 26, 2018

We work on disaggregated, near-real-time data, because that is how we can support the poor efficiently, with the level of assurance that partners need. But it also means we need to be particularly careful to get this right.

We know we don’t have all the answers, so we have specialist information security, assurance and governance advisers checking our approach and helping us make a plan (https://www.crookford.co.uk/about-us/, in case you’re interested).

Our core approach/intention is as follows:

  • For data above the level of personal data, transparency by default remains our approach: see http://charliegoldsmithassociates.co.uk/tech/
  • Say what information you would like to collect, and why, and ask for consent, in person/on the form/on the app, in words people can reasonably be expected to understand
  • Be cautious about asking person x to share data about person y, even if they are a close relative
  • For personal data, anonymise by default, and segment and permission access properly: very few people have a legitimate need to work on collections of non-anonymised data beyond a local level (the first sketch after this list illustrates pseudonymising identifiers with a keyed hash)
  • Wherever possible, set up analyses “on the system”. Where ad hoc analyses are genuinely needed, anonymise by default, and, for the rest, password protect any personal data moving on email or removable media (the second sketch after this list shows one way to do this)
  • Take ‘privacy by design and default’ into account: ensure data is encrypted in transit and, where relevant, at rest, and maintain audit logs of changes
  • Use secure, professionally managed hosting, and follow security best practice where hosting is self-managed
  • In different countries, different kinds of information can be used to identify someone, different incentives may exist, and attempts to surveil data in transit may differ in intensity: be aware, and adapt
  • Store biometrics in a securely hashed way
  • Treat everyone’s data with equal respect, whether or not it is gathered or held within the EU
  • Clear data governance: agree in contracts who owns and manages the system, and how it will be transitioned
  • Sunset: keep personal data only while it might reasonably be needed
  • We do share code etc. on GitHub, but are careful to sanitise it when we do
  • We involve external advisers periodically, to make sure we haven’t “groupthought” into a vulnerability
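
To make the “anonymise by default” point above more concrete, here is a minimal sketch in Python of pseudonymising direct identifiers with a keyed hash before data leaves a local system. It illustrates the general technique rather than our production code: the key handling, field names and function names are illustrative assumptions.

```python
import hashlib
import hmac
import os

# Illustrative only: the key must be held separately from the data,
# for example in a key management service, never alongside it.
PSEUDONYM_KEY = os.environ["PSEUDONYM_KEY"].encode()


def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a stable keyed hash: the same
    input always maps to the same token, so records can still be linked
    for analysis, but the token cannot be reversed without the key."""
    return hmac.new(PSEUDONYM_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()


def analysis_copy(record: dict) -> dict:
    """Return a copy of a record with direct identifiers removed and a
    pseudonymous token added in their place. Field names are illustrative."""
    direct_identifiers = ("name", "phone", "national_id", "notes")
    cleaned = {k: v for k, v in record.items() if k not in direct_identifiers}
    cleaned["person_token"] = pseudonymise(record["national_id"])
    return cleaned
```

A keyed hash (HMAC) rather than a plain hash matters here: identifiers such as phone or ID numbers come from small enough spaces that unkeyed hashes can be recovered by brute force, so the key has to be kept away from the data it protects.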
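
And for the point about password protecting ad hoc extracts that move on email or removable media, the sketch below shows one way to encrypt a file with a passphrase-derived key, using the Python cryptography package. The function name, iteration count and file layout are our own illustrative choices; an established tool such as GPG or 7-Zip’s AES encryption does the same job, and the passphrase should always travel by a separate channel from the file.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def encrypt_file(path: str, passphrase: str) -> str:
    """Encrypt a file with a key derived from a passphrase and write
    the result next to it as <path>.enc."""
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase.encode("utf-8")))
    with open(path, "rb") as f:
        token = Fernet(key).encrypt(f.read())
    out_path = path + ".enc"
    with open(out_path, "wb") as f:
        # The salt is stored with the ciphertext so the recipient can
        # re-derive the key from the shared passphrase.
        f.write(salt + token)
    return out_path
```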

Over the next few months, we’ll be saying more about this: it is important, and we want not only to get it right, but to set a good lead on which country counterparts can build.


26.iii.18
