The role of technology in pregnancy loss and its impact on mental health

In this blog, Dr Jennifer Chubb talks about how online algorithms can affect families experiencing baby loss and the impact this can have on mental health. Jenn calls for primary care teams and developers to consider baby loss in the design and use of algorithms in pregnancy, and hopes to open up the conversation about how technology can be improved to help families.

Guest blog by Dr Jenn Chubb, Researcher

Technology plays such a huge role in society, and we have seen this so clearly over the past year of the Covid-19 pandemic. Technology has connected us. It’s been a friend and a confidant – allowing us to stay in touch with friends and family at a time when we were physically far apart. 

For many people who have experienced pregnancy loss, technology, social media and apps have come to the rescue – allowing communities to share experiences and support one another – but there are downsides too for mental health.

Many platforms rely on data and algorithms – sets of rules a computer follows to make calculations, solve problems and make predictions. When algorithms learn from data in this way, they fall under what is more generally known as Artificial Intelligence (AI). 

Sometimes, this technology plays a deliberate role in high-stakes scenarios, where there is a greater risk of harm to the user. Think, for instance, about algorithms used to make decisions about your CV and your suitability for a job, or your place at university. These can affect our lives in significant ways.

Another significant or ‘high stakes’ life event is pregnancy.  

Technology, data and pregnancy

It's very common to use apps and social media when we are pregnant or even hoping to become pregnant.

By making a deliberate choice to download apps that track our fertility or pregnancy journey, we give away so much about ourselves to the algorithms behind these platforms. But what happens when we need to reverse those effects? 

Apps are handed out at midwife appointments, sometimes with little advice as to what to do if anything goes wrong. But given Tommy’s most recent findings, published in the Lancet, showing that miscarriage doubles the risk of depression and anxiety, it's important to consider the role of technology in supporting the mental health of bereaved families. 

Let’s for a moment consider an app like Natural Cycles, which tracks your fertility. By teaching the algorithm your personal cycle, you are told which days your fertility is at its highest and which it is not. For example, the colour of the app changes from green to red so you can make deliberate choices based on these algorithms. When you become pregnant, the colour changes to blue. You are congratulated by the app – perhaps just you and the app share this intimate secret. 

But what happens if that changes? How easy is it to access the ‘no longer pregnant’ button? All social media has a memory, enforced by the algorithm – and this can affect people's mental health. 

The ethics of algorithms

At the moment, the world is increasingly focused on the ethics of algorithms, seeking to balance risks and harms against the benefits. 

Recently, I have been thinking about the burden of AI and its effects on our mental well-being after loss – that is, when AI and algorithms have to be retrained to account for a change in circumstances.

It’s a personal choice to use online platforms – but are we truly aware of how to reverse their effects if things go wrong? We need those clever algorithms to adapt quickly when we're faced with loss, and we need more education to support users who may be triggered by predictive content afterwards.

In the case of pregnancy, sadly, circumstances may mean that a positive pregnancy test has resulted in miscarriage, or that a longed-for and planned-for baby has arrived too early and too tiny to survive. In many cases, your technology shares this memory with you which can bring added pressure.  

Sophie’s story

I thought I would describe this issue using a story. Stories, after all, help us make sense of the world and can bring awareness to difficult subjects.

This is Sophie's story. Her experiences show the burden of retraining algorithms at a time of loss and suggest there's more work to be done to make sure that people in her position don't have to endure any extra trauma when they're already struggling after loss.

Sophie lost a pregnancy in her second trimester. In the early weeks of the pregnancy, she did what most expectant mothers do – at least of her generation – and found the best app to log her symptoms, which fed her news and articles about ‘what to expect’ and when. She was even advised by a midwife at her first visit to download an app that would tell her what size the baby was and what stage of development it had reached, offering advice and even discounts for online baby-related shopping. 

This was her first pregnancy. Although she really wanted to tell people about it, she knew it is often advised that you don’t announce your pregnancy in the early stages, ‘just in case’. Instead, she would save that news for after the first scan at 12 weeks. After all, 1 in 4 pregnancies end in miscarriage. So she didn't tell anyone.

Like many of us, Sophie turned to social media and apps to track her pregnancy. She registered for sites, downloaded apps and gathered information online.

But then the worst happened. Sophie went for a scan and there was no heartbeat – the pregnancy was no longer 'viable', they said. The shock and pain were hard enough, but little did she realise there was another thing she would have to deal with. 

Instagram offered Sophie pregnancy accounts to follow. Friends’ new babies? "No thanks," she thought as she deleted the app.

She went onto the site that women use to track their fertility. She needed to find a way to tell it to stop tracking the pregnancy. She found it; the blue dots turned to red, and the app sent her a note saying it was sorry she was no longer pregnant. Her first ‘sorry’, from an algorithm!

As her brain came around from the initial shock of what had happened, she realised that just as she had trained algorithms to feed her information about pregnancy, she now needed to tell them the opposite.

She hadn’t thought about it until then, but it was an added burden at a deeply traumatic time. The issue wasn’t telling her workplace colleagues or her friends, but telling the algorithm.

She started thinking about charities and tried to get help for her trauma – her history went from baby advice to support for bereaved families. Maybe the technology was starting to learn, she thought, as Facebook fed her an invitation to a ‘Festival of Grief’.

Sophie isn’t the only person who feels this way

You might delete apps (as Sophie did), but how much work is it to retrain them to stop sending you pregnancy and baby content? It is exhausting and upsetting.

It turns out Sophie wasn’t alone. Left thinking about the deep burden technology can leave after trauma, she looked through charity testimonials and found many mentions of people being ‘hounded’ or ‘trolled’ by algorithms after pregnancy loss.

One social media user, replying to a fertility tracking app's post, said: 'If you know miscarriages are so common, why isn't it more straightforward to log this in your app? Having to select "end pregnancy" really stings – why can't you just use the word miscarriage to help end the stigma?'

Weighing the gains technology brought during that happier time, would she have preferred the privacy of just a couple of well-thumbed books, hidden away after a loss? Less information at her fingertips, but less continuing pain, too? Most probably. 

Naturally, when something happens in life, we want to tell someone. Those we choose to tell about our personal lives may not say the right thing, but they are there for us.

So what do we do if something bad happens – something personally traumatic, a death or a loss? Who do we tell then? 

The reality is that, just like Sophie, we talk to algorithms every day. We talk to apps and social media with every ‘like’, every click. We create memories for our devices to share with us: every screenshot of the baby’s development ‘forever’ in the phone’s camera roll; the purchase of folic acid on Amazon; the social media sites; the read message on Mumsnet. All saved in the browser history. All available to the search engine’s recommendation algorithm. All there and never forgotten. But for Sophie, so much had changed. She needed the algorithms to speed up and learn, fast.

She was retraining technology to learn she had miscarried.

Protecting mental health

It's worth mentioning that social media platforms are taking steps to protect our mental health in these situations. On Facebook, for example, you can choose to hide ads that have upset you and request not to see them in future – but there is still a long way to go. More research is needed, and there is certainly more work to be done by developers and technology companies to prevent unnecessary pain for parents. Health providers, too, might consider ways to better advise patients when handing out apps, and families might carefully consider their approach to their technology.

But going forward, I am left contemplating what trade-offs we make when we let algorithms into our personal lives, and to what extent this might add to a sense of loss at a time of trauma when things go wrong – how deeply impersonal it can seem. Technology is not a bad thing, far from it, but there is more to do to mitigate harm. 

Miscarriage is still unbelievably taboo in society, so there is even more work to be done to unravel this specific case.

As the world places ever greater emphasis on the role of AI and digital technologies in our everyday lives, the need for deeper research and ethical reflection on the social impacts of technology – accounting for a range of circumstances, including loss and change – only increases.

If you have been affected by this story, please head to our support pages.

You can follow Jenn on Twitter and find out more about her work on her website.

Jenn is writing this piece independently of her professional affiliation.