[Part 2] Our Data Driven Approach to Creativity: Considerations
Data Scientist
Our approach to data analysis is grounded in a few key principles, all of which we've learned along the way.
As we discussed in Part 1 of this series, data has become intrinsically embedded in our daily lives, and the same can be said about Wonderland. While I was initially hired to conduct user research via Google Analytics, this quickly expanded into numerous other projects, from calculating the carbon emissions of our digital activities to analysing the sentiment our target audience holds towards a competitor. We will explore some of the more specific examples in Part 3: Applications. But before any applications or analyses can be done, we felt it was important to share a few of the key principles that we’ve learned along the way, and which now ground our daily data practices.
Null results are results all the same
Anyone who has dabbled in academia has probably come across the ‘publish or perish’ mentality - a structure that cares more about publishing novel results in scientific journals than about the quality or importance of the scientific finding itself. Results that don’t fit the researchers' hypothesis are often stuck in a file drawer, never to be read by other researchers or to contribute to further scientific publications. This is not the case within Wonderland. For us, the only thing that matters is the end-user, and a negative or null result is still a result all the same. For example: if we run a survey to see how users feel about a certain logo or colour palette, and there is little emotional response - people feel neutral towards it - then we know we can do better. It's a null result, but it still makes for an important action point.
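To make the null-result idea concrete, here is a minimal, hypothetical sketch - the ratings, the 1-7 Likert scale, and the neutral midpoint of 4 are all invented for illustration, not drawn from an actual Wonderland survey. It runs a one-sample t-test against the neutral midpoint, and the non-significant outcome still yields an action point:

```python
# Hypothetical example: Likert ratings (1-7) of emotional response to a logo,
# where 4 is the neutral midpoint. Data and threshold are illustrative only.
from scipy import stats

ratings = [4, 3, 5, 4, 4, 3, 4, 5, 4, 4, 3, 5, 4, 4, 4]

# Test whether the mean rating differs significantly from neutral (4).
t_stat, p_value = stats.ttest_1samp(ratings, popmean=4)

if p_value >= 0.05:
    # A null result: respondents feel neutral about the logo.
    # Still an action point - the design isn't provoking any emotion.
    print(f"No significant deviation from neutral (p = {p_value:.2f})")
else:
    print(f"Significant emotional response (p = {p_value:.2f})")
```

Here the mean rating sits exactly at the neutral midpoint, so the test cannot reject "people feel nothing" - which is precisely the kind of finding that would go back to the design team rather than into a file drawer.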
Grounding findings in something real
This was the hardest lesson to learn when joining Wonderland. In other contexts (I’m looking at you, academia), a significant result is a significant result, and you can build the story around that finding. At Wonderland, if you find something interesting in the data, you’ll be met with an onslaught of whys. For example: a handful of the client’s competitors’ Twitter accounts have seen statistically significant growth over time. Cool. But that in itself isn’t valuable. What is valuable is understanding what happened beforehand. What did they do to make the brand grow in that way? What type of content contributed to that growth? What were they talking about? How were people reacting to that content? All of this can be answered with further research, where we use web scraping techniques and natural language processing to dive deeper and decipher the initial finding. Saying “Here’s my result!” isn’t good enough, and an interesting data point or statistically significant result doesn’t get to be in the presentation or interpreted as an action point unless it can be connected to events and insights from the real world - insights that inform brand performance.
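As an illustration of that "dig into the why" step, here is a toy sketch of lexicon-based sentiment scoring over already-scraped tweets. The word lists and tweet texts are invented, and a real pipeline would use a proper NLP library; this only shows the shape of the idea - turning reactions to competitor content into a number you can reason about:

```python
# Illustrative sketch only: a toy lexicon-based sentiment score over
# (already-scraped) competitor tweets. Word lists and texts are invented.
POSITIVE = {"love", "great", "amazing", "excited"}
NEGATIVE = {"hate", "awful", "boring", "disappointed"}

def sentiment(text: str) -> int:
    """Net score for a text: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

tweets = [
    "Love the new campaign, amazing work",
    "Honestly a boring rebrand, disappointed",
    "Excited to see what comes next",
]

scores = [sentiment(t) for t in tweets]
print(scores)  # → [2, -2, 1]
```

Aggregated over hundreds of posts and aligned with a timeline of campaigns, even a crude score like this starts to answer "what were they doing, and how did people react?" rather than stopping at "the account grew".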
Hypothesis testing and a good research question are still important
Sometimes it’s fun to throw your data at the wall and see what sticks (or, in this case, throw it through some fun analytics). For instance, putting your survey results into a network just to see which questions correlate with one another, or scraping hundreds of news articles to gauge the general sentiment or emotional tone towards an event. While this is fun, and can technically be deemed “exploratory analysis”, you can’t generate any real-world insights or confirmatory claims without asking a research question, formulating a hypothesis, and then testing it on the data. It’s the basis of the scientific method, and it can be easy to forget when the deadlines are short and the opportunities are endless. It’s hard to ignore the masses of data available, and the ability to share cool/novel/interesting findings with the team.
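The confirmatory workflow described above - question, hypothesis, test - can be sketched in a few lines. The survey data and variable names below are invented for illustration; the point is that the hypothesis is stated before looking at the numbers:

```python
# A minimal sketch of confirmatory analysis (illustrative data and names).
# Research question: do respondents who rate the brand as "trustworthy"
# also report higher purchase intent?
# H0: no correlation between the two survey questions.
from scipy import stats

trust = [2, 3, 3, 4, 5, 5, 6, 6, 7, 7]   # Q1 ratings, 1-7 scale
intent = [1, 2, 3, 3, 4, 5, 5, 6, 6, 7]  # Q2 ratings, 1-7 scale

r, p_value = stats.pearsonr(trust, intent)
print(f"r = {r:.2f}, p = {p_value:.4f}")

# Only because the hypothesis was stated up front does a significant
# correlation support a confirmatory claim, rather than being one of many
# edges spotted while browsing a correlation network.
```

The same correlation found by trawling a network of every question against every other question would need to be treated as exploratory and re-tested, since testing many pairs at once inflates the chance of false positives.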
A human-first approach
Not to scare you, but you may be surprised by how easy it is to track who you are and everything you’re doing on the internet. A pretty horrifying example I heard from a friend in the past week involved analytics software that screen-recorded all of the sessions that occurred on a website. This practice is common and is used to see how users navigate the website in order to improve the UI. However, this website also asked users to upload photos to the app. When users would open their photos on their phone, this was also being recorded.* What seemed like an innocent acceptance of the app’s terms and conditions led to all of your embarrassing screenshots, nudes, and photos of your cat ending up in the hands of some data analyst at a company you’ve never heard of. Thankfully, one of the first conversations during my interview process at Wonderland was about how we can care more about people than the potential data points they generate. On top of the obvious (and necessary) GDPR compliance, we also take a common-sense, human-focused approach to data collection and web scraping. Pulling from an amalgamation of Open Science practices, AI ethics courses (Kaggle has a nice one), that time in my bachelor’s when I thought I was going to switch from a psychology degree to moral philosophy, experience navigating the legal grey area of scraping, the University of Amsterdam’s Good Research Practices course, and respect for fellow humans - we can still find exciting insights without objectifying the people and programs who generate them.
In conclusion
Data is everywhere and anywhere—and we can harness it to create better brands, do our best for the environment, and make better products for the people who use them. But before even starting to scan or clean the data (often the first step in every ‘How to Data Science’ tutorial), we need to ground ourselves in a data process that aligns with our values—finding real, true results that add value to our team and our clients’ ambitions, all while respecting people, products, and the planet.
In Part 3, we’ll take you through some of our actual applications of data science in creativity. It’s been a process to create some of these systems and methodologies, but the results have eventually led to engaging and actionable insights.