Balance

A.I.-, data-, and algorithm-driven technologies have raised many ethical, moral and legal questions over the past few years.

Our experience applying digital technology to good causes around the world convinces us that, with the right approach and balance, it can be a force for good and for civil society.


Data protection

We support and help form policies that protect fundamental human rights globally, including the right to privacy and the protection of personal information.

We see the European Union’s GDPR as a first step in the right direction, but more needs to be done.

For instance, considering that A.I. has the power to profile people by association, protecting an individual’s data may not be sufficient if their friend or relative has chosen to make theirs public.


Algorithmic biases

Stereotypes and biases can result from human errors of judgment. But algorithms can, even unintentionally, perpetuate them at scale, leading to costly harms, from discrimination and inequality to human rights violations.

Through our analysis and applied work we try to uncover these biases and engage with other technology practitioners and innovators to promote transparent, if not entirely bias-free, programs.
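
To make this concrete, one widely used bias audit is the "disparate impact" ratio (the so-called 80% rule), which compares favourable-outcome rates between groups. The sketch below is a minimal, hypothetical illustration of that check, not a description of our own tooling; the data, group labels and the 0.8 threshold are assumptions chosen for clarity.

```python
def disparate_impact(outcomes, groups, protected_group, reference_group):
    """Ratio of favourable-outcome rates: protected group vs. reference group."""
    def rate(group):
        selected = [o for o, g in zip(outcomes, groups) if g == group]
        return sum(selected) / len(selected) if selected else 0.0
    return rate(protected_group) / rate(reference_group)

# Hypothetical approval decisions (1 = favourable outcome) and applicant groups.
outcomes = [1, 1, 1, 0, 1, 1, 0, 0, 1, 0]
groups   = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

ratio = disparate_impact(outcomes, groups, protected_group="B", reference_group="A")
print(f"Disparate impact ratio: {ratio:.2f}")  # ratios below ~0.8 are often flagged for review
```

A check like this does not prove or disprove discrimination on its own, but it offers a transparent starting point for discussing how an algorithm treats different groups.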


Data-driven political advertising

Powerful algorithms determine content visibility on both paid and unpaid digital media channels.

Without the ability to advertise through paid social media channels, political newcomers would have a much weaker voice compared to highly funded incumbents who have already built long-standing influence on unpaid (or “earned”) digital media algorithms.

Unlike business marketing cycles, election campaigns are often time-restricted by law.

Our work suggests that without paid advertising, newcomers, grassroots organizations and minority voices would be disadvantaged, especially when their communication goals are time-sensitive.

Banning paid-channel political advertising would be like banning paid electoral posters.

Discuss with us

Do you agree with us, or think differently?

We would like to hear every viewpoint and consideration, and foster an exchange of opinions, so please share your thoughts and concerns.

Or make an enquiry if you would like to discuss how we could work together, contribute to your research or provide training in any of our areas.

Also feel free to follow us on Twitter and LinkedIn.
