Adam's blog - formerly game development, now just casual madness

The Scroll of Declining Mental Health

I recently stumbled over the term “doomscrolling”, and in case you’re as clueless as I am, it means “obsessively reading social media posts about how utterly fucked we are”. And with the climate crisis, an ongoing pandemic and general politics, I can totally see where that’s coming from - but the first association I had was actually something quite different.

The software drug

A reduced attention span is one of the things often connected with extended social media use. This is nothing new. What I was shocked to realize is how affected I am myself and how far these effects extend, despite only limited exposure. I don’t post a lot, or care too much about things that are posted online. In fact, I don’t participate much in the social aspect of social media at all, and mainly use it as a way to pass time and occasionally as a source of information. What I would have identified as the primary feedback loop of social media platforms lies dormant in my case. But I’m still hooked.

The typical social media platform is built to throw a steady stream of disjoint information nuggets at you. Some of them may interest you, some of them do not. Regardless of the quality of their algorithm, it’s essentially random which of the two the next info nugget will be. If it’s interesting, great! You learned something new - briefly, without internalizing it, but the gratification is still there. If it’s not, just keep scrolling - maybe the next nugget will do better.

This combination of randomness and instant reward is essentially low-key gambling. A mind trick, employed in games, apps, notification mechanisms. Endless scrolling acts as an enhancer, like the opaque window blinds on a sketchy casino: Temporally disorienting you, making it easier and more inviting to continue what you’re doing. In the end you’ll wake up from a mindless state of indeterminate duration, wondering how you got there and whether it was worth it.

I’ve noticed a similar effect in some of my favourite games as well: Civilization is a prime example of being caught up in a stream of simulated events, micro feedback and reward mechanisms, all designed to overlap and interlock perfectly, evoking a constant state of “flow”. The specific urge to play “just one more turn” in this game is well known to the point of reaching meme potential. But it’s nothing bad, right? It’s just a really good game, and quite fun. Or is it a little too much fun?

While Civilization is still in a fringe area for me, there are other games, or even genres, that definitely cross a line. Roguelikes are on top of that list, where the concept of collecting randomized rewards into a perfect storm of combined effects, inevitably failing, and then starting over again is a bit closer to actual gambling than I am comfortable with. Don’t get me wrong: I’ve played and enjoyed The Binding of Isaac, FTL and Heroes of Hammerwatch - but in at least two cases, I actually uninstalled the game after a while because I felt I was doing something deeply unhealthy to myself.

I don’t mean to place guilt, or insinuate that any of this is deliberate. It may be, or it may not be - I actually think it is very possible in the software world to design a drug by accident. And this is not the point either, so let’s just assume that this assortment of various apps, games and social media platforms exists, and we now have to find out how to deal with it.

Optimizing for the wrong goals

There are different ways to engage with people and content, and the likelihood of them happening is shaped by the system in which they take place. Changing this system will change how users behave and interact. It seems that so far, a prominent goal of optimization for social media platforms, consumer apps and games has been to maximize user engagement and retention, i.e. motivating users to interact often and stay for a long time - a goal that is understandable from a business standpoint, but also eerily similar to what I’d imagine slot machines to be designed after.

I’m guessing the issue doesn’t even go away when ignoring the business aspects and removing the need to gain and retain users: Chances are, developers will still try to make great software, and making it more fun and engaging is generally perceived as a good thing. In some cases, we might stumble upon one of those mind tricks and unknowingly employ them, simply because our app is just more rewarding to use that way, or our game is more engaging to play.

It’s like adding more salt, fat and sugar to a meal to make it more delicious, but inadvertently making it less healthy at the same time. It works at any scale, from cooking at home, to the takeaway next door, all the way up to the food industry, where unhealthy processed food and sugary drinks are widely known issues. And I’d guess that they’re the result of the same optimization process that is happening to some large scale software and platforms.

Eating consciously

Being delicious is not the most important metric by which food is valued, and it shouldn’t be the most important one for software either.

Unfortunately though, it’s still the one that leaves the most immediate, direct impression on us, while the notions of nutritional value, or “how healthy it is” seem way more abstract. For food at least, there’s a huge amount of culture, science and anecdotal knowledge on how it can improve or harm physical health, and it’s a topic that I think most people are at least tangentially aware of.

So, what about software? Based on how closely and repeatedly people interact with various instances of it these days, I think it is possible that its impact on mental health could be as measurable as a certain diet’s effect on physical health. And to be clear, I mean this in the same broad sense: Mental health is not the absence of mental illness, but a different layer of well-being.

But how would one even start to evaluate how software can affect the mental health of its users? How can someone get an idea of the risks (and benefits?) before deciding to buy, subscribe, register? How would one choose between different products based on a non-apparent metric?

In the case of food, there are various regulations and labels designed to give consumers an intuitive connection to health impacts - either without having to extensively research each individual product, or at least enabling them to do that in the first place. As an example for the former, this one popped up in supermarkets around here a while ago:

It provides a super simple overview of which products are harmless from a health point of view, and which ones might need a bit more scrutiny.

As someone who tends to be on the lazy side when it comes to researching food before buying, I think that’s pretty awesome: It’s a quick good / okay / bad impression that I would have no way of obtaining by other means with the same efficiency, and it also acts as a passive reminder to be aware.

Could something similar be done for software?

The mental nutrition sheet

This is something that would be better off done by someone who has an actual idea of psychology, and a better grasp on the topic of mental health than I do. In fact, most of what I’m writing in this post probably would. I’m going to take a shot at this anyway, but be warned that I’m not a professional. At all. With that said, let’s return to the main topic and explore some ideas.

I think there are at least two different sections that a mental nutrition sheet for software could have.

Engagement spectrum

Based on personal experience, I’d say that there are different ways in which I can engage mentally, and that they could be seen as different aspects of experience. Sometimes I might want to seek out one aspect more than another, but what is actually experienced through any software can certainly be different from what its marketing might suggest. So why not categorize this a bit to give a rough overview?

Some examples off the top of my head:

  • Reasoning could be a scale that describes how much a user would be occupied with logic, critical evaluation, planning or restructuring concepts, applying rules and making judgements.
  • Creativity could describe open-minded exploration, recombining the known into the new, and stream-of-consciousness type creation.
  • Imagination could refer to any kind of evoking senses based on inner concepts, description or incomplete input.
  • Empathy could describe how much a user is encouraged to empathize with other beings, or to see things with their eyes.
  • Introspection could be the level of inward attention that is encouraged in a user.
  • Social could refer to the presence of interaction with others, social cues and subtleties, roles and status.
  • Dexterity could describe how much time is spent with quick reactions, movement patterns and building muscle memory.
  • …and so on.
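To make the idea a bit more tangible, here’s a minimal sketch of what such a spectrum could look like as a data structure. All the scale names, the 0-to-1 range and the example values are my own invention for illustration, not any real standard:

```python
from dataclasses import dataclass, fields

@dataclass
class EngagementSpectrum:
    """Hypothetical rating sheet: each field is a 0.0-1.0 scale for
    how strongly a piece of software exercises that aspect."""
    reasoning: float = 0.0
    creativity: float = 0.0
    imagination: float = 0.0
    empathy: float = 0.0
    introspection: float = 0.0
    social: float = 0.0
    dexterity: float = 0.0

    def dominant(self, n=3):
        """Return the n most prominent aspects, strongest first."""
        scores = [(f.name, getattr(self, f.name)) for f in fields(self)]
        return sorted(scores, key=lambda s: s[1], reverse=True)[:n]

# Guessed-at values for a Civilization-like strategy game:
civ_like = EngagementSpectrum(reasoning=0.9, imagination=0.5, creativity=0.4)
print([name for name, _ in civ_like.dominant(2)])  # → ['reasoning', 'imagination']
```

Even a crude summary like `dominant()` would let two products be compared at a glance, similar to how the front-of-package food label condenses a full nutrition table into one signal.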

Games are relatively easy to describe on those (or similar) scales, because they are designed to be self-contained entertainment packages. But for apps and platforms, this can be a lot harder: They are a bit like tools and places, where there is a lot of outside context, and a purpose that is partially defined by the user. Is a pair of scissors inherently creative, because it can be used for crafting? Is a park itself social, because it can be where people meet up? Is it imaginative or introspective, because it invites people to sit on a bench and let their minds wander?

I don’t have a great answer for this yet, but I do think that platforms and even some apps have a certain “spin” that stems from their design and defines which higher level behaviours and experiences emerge within their user base: Twitter feels different from Facebook, which feels different from Instagram. Their DNA might not give it away directly, but I’m guessing it would still grow the same kind of beast every time.

Feedback attributes

While I’d imagine the engagement spectrum to be mostly a matter of FYI, preference or balanced mental diet, feedback attributes are where potential health concerns are buried. They’ll distinguish a Journey tea from a Candy Crush soft drink.

Let’s explore some ideas on different metrics:

  • Reward density, describing the accumulated intensity of positive feedback events per user action. Games, for example, are the software genre most easily over-saturated with reward signals, with every action juiced up to the max.
  • Reward instability, describing the statistical variance in reward density when repeating the same user action. A very high value on this metric could be concerning, since it might encourage an unreasonable amount of repetition.
  • Anticipation link-back, the ratio of rewards that immediately set up anticipation for the next one. A very low value could mean on-demand usage only, a medium score could represent a good-natured sense of flow, and a very high one could indicate a setup for compulsive behaviour.
  • User agency index, a measure for the difference in reward density between the perceived best and worst outcome of a series of actions, multiplied by available choice variety and range of execution quality. In other words, how well can a user make choices or apply skills in a way that affect reward outcome?
  • Passive loss, describing how fast (or if at all) previously rewarded progress is lost or invalidated due to a user’s inactivity, or by quitting.

Going back to the initial example of a social network throwing info nuggets at you, it would probably score a high reward instability (who knows what’s next), high anticipation link-back (next nugget is always in view) and a low user agency index (can’t control the algorithm). I’d imagine you could even get a bit of passive loss when those info nuggets are actually from a social circle where you’re out of the loop if you didn’t read the most recent news.

Metrics like this could help provide a different, more user-friendly axis to optimize against, in addition to just user engagement and retention. If they’re visible to both developers and end users, the goal could be a compromise between encouraging healthy behavior and still providing user incentives. Just like there are mechanisms that can fuel compulsive behavior, maybe there are mechanisms that can help avoid it, too.

I’m on various platforms and occasionally play games - and don’t get me wrong, I still want all of them to be engaging. But I want them to be engaging in a way that doesn’t feel like I’m stuck in a mental loop that clever design and UX elements installed in my mind.

Closing words

This post deals with a topic that I really have no clue about, so take it all with a grain of salt. Don’t read it as advice, professional insight or really anything beyond a hopefully reasonable stream of consciousness. If you’re ahead of me (or I managed to get you thinking) I’d like to hear your thoughts on this as well - feel free to comment or contact me.

tl;dr: There are labels to show how healthy food is. Do we need the same for software?
