SXSW Review Part 1 – Ethical Digital


It may have been my imagination, but SXSW had a more serious tone this year. There were politicians making their case for next year's presidential race, a lot less chat about making millions with blockchain, and a lot more about female empowerment and ethics.

In fact that’s the first key trend I observed – Ethical Digital.

Ethical Digital
Since the Facebook/Cambridge Analytica scandal, people have woken up to how their personal data may not be as private as they thought, and to the ethical grey areas that social media platforms can inhabit. The idea that people who create technical platforms might have some responsibility for how they are received, the messages they send or the ways they are used is only now coming into the public consciousness. Combine that with an upcoming election, political investigations and the #metoo movement, and it's no surprise that there was a strong stream of presentations this year focused on ethics, specifically on how ethics must be built into digital experiences, with some striking examples of how it isn't. But, this being SXSW, the focus was on networked homes, IoT, AI and even delivery systems like AR.

Privacy? What Privacy?
Garry Kasparov (yes, the chess player who was beaten by IBM's Deep Blue) is now working with security firm Avast. He spoke about how insecure consumer-facing AI and IoT systems are, and how difficult it is to make them secure because so much rests with the end user. He highlighted how the manuals provided are either extremely brief or extremely detailed. Think about an Apple or Google Home start-up booklet: six pages that help you get going but never give you the information you need to secure your home system.

Avast then used a "Russian hacker" to demonstrate how easy it is to hack a smart home, from playing music at 4am to getting right into Spotify and taking financial data. Scary.

But there is hope. He pointed out that because many home assistants are AIs, they could easily learn typical owner behavior patterns to improve security. In fact, Avast takes a positive approach to AI: a belief that AIs are more efficient but humans more intelligent, and so we complement each other. But they are also clear that the agencies that create AIs or devise IoT networks bear responsibility for how humans use them, and he was clear that humans have the responsibility for action or inaction.

Katrina Dow, founder of meeco.co, a start-up managing personal data, highlighted the flaws in current developments. For example, the clear bias built into AI assistants like Siri, Alexa and Google Assistant, which start off with a female voice and persona. Why? Because they are serving us. Yes, you can change the voice, but do you know how?

She also explored the kind of legacy we may be building for our children by including them in our social posts from before their birth. We’re providing data about birthdays, lifestyles, spending patterns and so on that could be used to determine credit scores, school applications and criminal potential – right now. We are signing away rights to their images and data on educational platforms, like SeeSaw, and we have no idea if or when that data will be sold to governments, health providers or insurance companies.

But there is a flipside: we could also be providing data that enables better health care or personalized education. Her focus was primarily on making the user the edge, having them deliver data to apps as required, rather than warehousing everything in giant cloud-based systems that require you to sign away your rights.

Ethical principles
And there are companies taking ethical development seriously. Microsoft spoke about the work they are doing creating AR training systems in factory settings with HoloLens. It turns out that Microsoft has six ethical principles for the development of MR or AR devices:
1. How are we treating everyone fairly? E.g. is the data biased?
2. How do we build systems that perform safely? Holograms are layered over the real world, after all.
3. Privacy and security – e.g. how do you opt out?
4. Inclusive technologies should empower everyone – HoloLens is very visual, but what about people whose main experience is auditory?
5. Transparency – for instance, being able to change MR guides to improve them.
6. Accountability – can you co-develop with customers?

Art installations
Finally, there were plenty of VR cinema experiences to be had exploring current issues in fully immersive worlds: for example, Border Stories, which placed viewers along the US-Mexico border; Girl Icon, about girls inspired to enter education by Malala Yousafzai; and Last Whispers, about dying languages.

Ethics is always a theme at SXSW, but this year it felt higher up the agenda and emerged in more accessible places. I doubt it will make the headlines in the same way as the more fun immersions and musical acts, but it will impact us far more deeply in the long term if we get it wrong.