Is AI Prejudiced Against You?

You might find AI fascinating or you might find it abhorrent, but you have to admit it is remarkable. Take generative AI, for instance: you can write a prompt, get a response in seconds and move on with your life. Or, if you want an AI-generated image, you can describe what you have in mind and a tool will produce it for you.

But what about the output itself? Are the results really the best one can get? And can a system supposedly far more intelligent than human beings still perpetuate prejudice or discrimination? There have been reports that some AI tools rate images of women as more sexually suggestive, even when the women are simply pregnant or exercising. Think of all the biases, not just from 2023 but from 1973 or 1873, embedded in the databases and online archives these systems learn from, carrying outdated and archaic ideas about gender, race and sexuality. So it is important to acknowledge and address AI's limitations and risks. Sure, it can augment creativity and productivity, but it may not be able to replace human judgment.

Then again, human judgment can itself be filled with bias, prejudice and discrimination, so what happens when that is what the training data reflects? Generative AI has become an even bigger deal since OpenAI released ChatGPT in late 2022, and AI is increasingly informing important decisions that affect social, political and economic rights. That is why it is imperative that the AI systems being built uphold principles of fairness, accountability and transparency. Incomplete, non-standardized or poorly collected data can distort reality, and data that reflects long-standing social inequalities may perpetuate those very inequalities in the future.
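As a rough illustration, and not something described in the reports above, here is a minimal Python sketch of the kind of audit developers might run on historical decision data before training on it: it compares selection rates across groups and computes a disparate-impact ratio, using the common four-fifths rule of thumb. The records and group labels are entirely hypothetical.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (group, was_hired).
records = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 0), ("group_b", 1), ("group_b", 0), ("group_b", 0),
]

# Count outcomes per group.
totals = defaultdict(int)
positives = defaultdict(int)
for group, hired in records:
    totals[group] += 1
    positives[group] += hired

# Selection (positive-outcome) rate per group.
rates = {g: positives[g] / totals[g] for g in totals}
print("Selection rates:", {g: round(r, 2) for g, r in rates.items()})

# Disparate-impact ratio: lowest rate divided by highest rate.
# The four-fifths rule of thumb flags ratios below 0.8 for review.
ratio = min(rates.values()) / max(rates.values())
print(f"Disparate-impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Below the four-fifths threshold: review the data before training on it.")
```

An audit like this does not fix the underlying inequality, but it makes the inherited skew visible before a model trained on that data quietly repeats it.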

So one approach is for developers to question and correct for the impact of systemic social inequalities in their data and models, which could help mitigate bias. And while representation and diversity have been talked about forever, a genuinely diverse team does make it easier to identify bias in the first place. UNESCO has also launched Women4Ethical AI, a platform where leading female experts from various fields share research, contribute to best practices and promote non-discriminatory algorithms and data sources, while aiming to incentivize the participation of girls, women and underrepresented groups in AI.
