Why artificial intelligence needs to consider the unique needs of older women

by Surbhi Kalia

Artificial intelligence (AI) is making headlines everywhere. Yet AI applications and implications for older adults, particularly older women, have not been adequately contemplated.

It’s no longer a moonshot idea from a science fiction movie. AI is already part of our daily lives: Apple’s Siri, Amazon’s Alexa, self-driving cars. And now there is ChatGPT, an AI chatbot that holds human-like conversations, composes music, creates art and writes essays. It has disrupted the world as we know it. Pundits who are not easily impressed often describe these advancements as “scary good.”

Many leaders have asked for a pause on AI development until we gain a better understanding of its impact. This is a good idea — but for reasons well beyond those often identified.

We need to ask: How can we ensure that AI considers the unique needs of different populations? For example, many countries are becoming super-aged societies in which women make up the majority of the older population. Is AI taking the needs of older adults into account?

Without thinking through these questions, we may leave older adults, particularly women, and other marginalized populations, open to discriminatory outcomes.

The needs of older women are often invisible to decision-makers. Older women are a distinct population, and gendered ageism, discrimination based on both age and sex, often causes their needs to be neglected. Research has already demonstrated that older women are more likely to experience adverse health outcomes and to face poverty and discrimination based on age and sex.

AI perpetuates this discrimination in the virtual world by replicating discriminatory practices in the real world. What’s worse is that AI automates this discrimination — speeds it up and makes the impact more widely felt.

AI models use historical data. In healthcare, large data sets composed of personal and biomedical information are currently being used to train AI, but these data have, in many cases, excluded older adults and women, making technologies exclusionary by design.

For example, AI has a valuable use in drug research and development, which relies on massive data sets, or “big data.” But AI is only as good as the data it gets, and much of the world has not collected drug data properly. In the United States, women and minorities were not required to be included in studies funded by the National Institutes of Health (NIH) until the 1990s. And older adults were not required to be included in NIH-funded studies until 2019, leaving a gap in our understanding of the health needs of older women in particular.

Excluding older women from drug data collection has been particularly detrimental because they are more likely to have chronic conditions that may require medication, and more likely to experience harmful side effects from drugs.

AI-powered systems are also often designed on the basis of ageist assumptions. Stereotypes, such as the notion that older adults are technophobes, result in their exclusion from the design of advanced technologies.

For example, women make up the majority of residents in long-term care homes. A study found that biases held by technology developers towards older adults hindered the appropriate use of AI in long-term care.

There also needs to be further thought given to the loss of autonomy and privacy, and to the effects of limiting human companionship because of AI. Older women are more likely to experience loneliness, yet AI is already being used in the form of companion robots. The impact of these robots on older women’s well-being, especially the loss of human contact, is not well studied.

This is how older women are left out of the benefits of technological advancement.

The World Health Organization’s (WHO) timely policy brief, Ageism in Artificial Intelligence for Health, outlines eight important considerations to ensure that AI technologies for health address ageism. These include the participatory design of AI technology with older people and age-inclusive data.

We would add the need to consider differences between women and men throughout. All levels of government also need to think about how AI is affecting our lives and get innovative with policy and legal frameworks to prevent systemic discrimination.

Ethical guidelines and the ongoing evaluation of AI systems can help prevent the perpetuation of gendered ageism and promote fair and equitable outcomes.

It’s time we rethink our approach and reimagine our practices, so that everyone can participate in and take advantage of what AI has to offer.
