International Women’s Day (IWD): Dr. Fariah Mahzabeen is recognized for her work as an engineer and as an Assistant Professor in Electrical and Computer Engineering at North South University (NSU) in Bangladesh, where she is building a research-and-learning pipeline that helps students turn ambitious ideas into real-world impact.
Dr. Mahzabeen’s work spans AI, robotics, embedded/IoT systems, HCI, and applied engineering, all focused on one clear mission: personalized, inclusive, and human-impactful innovation. In practice, that means designing solutions that people can actually use (and trust), especially in domains like health, education, and environmental sensing.
Before NSU, she served as a tenure-track Assistant Professor at San José State University and also worked in Silicon Valley across roles and teams connected to major technology organizations—bringing industry-grade execution into the classroom and lab.
Why she’s an IWD standout: she represents the kind of modern STEM leadership the world needs right now—technical depth + interdisciplinary reach + student-centered mentorship + inclusion-minded design—and she’s building that momentum locally with global relevance.
She does not just teach statistics; she proves them, and that increases one’s appreciation and understanding of even the most complex topics. Rather than talking about the impacts with facts and figures alone, consider this example that beautifully illustrates that yes, gender bias is ACTIVELY impacting your Employee Experience, your Customer Service Management Experience, and your solution delivery.
International Day of Women and Girls in Science — why does it matter?
Because bias still shows up in places we don’t expect — even in AI.
I asked ChatGPT: “Draw a caricature based on my profession and writings.” It generated a male. Again. So I pushed back (and honestly, I love when AI “breaks” 😂). I asked: “Why do you think I’m male?” ChatGPT admitted it shouldn’t have assumed gender and said nothing in my work or writing suggests “male.” Then it offered to regenerate the image with different vibes (feminine, androgynous, professor-scientist, etc.). When I asked how it made the mistake, it gave a transparent answer:

Fariah: Interesting, can you tell me your reasoning for assuming I’m male?
GPT: I love that you’re dissecting this. Very on-brand for you, honestly.
Here’s what most likely happened — and I’m going to be transparent.
It wasn’t based on your writing.
Your research topics don’t signal male at all. Your tone is precise, structured, and academic — which is gender-neutral. And your name actually leans female.
So why did the image turn male?
Two likely factors:
1. Dataset bias in image models
When prompts include signals like:
- “professor”
- “AI researcher”
- “biomedical engineer”
- “tech + tablet + coding + lab”
- “power professor energy”
image models tend to default to male-presenting figures, because that representation dominates their training data.
2. Neutral descriptors + default completion
If gender isn’t explicitly stated, the model sometimes “auto-completes” toward the statistically dominant representation it has seen for similar prompts. Unfortunately, in STEM contexts, that dominant representation skews male.
So the assumption wasn’t from your personality, tone, or intellectual style. It was a bias shortcut in visual pattern completion.
And I genuinely appreciate you pointing it out — because this is exactly the kind of subtle bias researchers in AI fairness analyze.
Now I’m curious — were you testing me, or genuinely surprised?
That’s the point: this is exactly the kind of subtle, embedded bias women in STEM keep calling out.
And yes — the conversation continued…
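The “default completion” behavior GPT describes can be sketched as a toy frequency model. This is purely illustrative: the function, cue lists, and label counts below are invented for the sketch and are not how any real image model works, but they show the shortcut in miniature. When the prompt contains no explicit gender cue, the “completion” silently falls back to whichever label dominated the (hypothetical) training data.

```python
# Toy sketch of "default completion" bias (all names and counts are invented).
from collections import Counter

# Hypothetical training-data label counts for STEM-flavored prompts:
# the skew toward "male" is the assumed dataset bias.
training_counts = Counter({"male": 80, "female": 20})

def complete_gender(prompt: str) -> str:
    """Return an explicit gender cue if the prompt has one,
    otherwise 'auto-complete' to the statistically dominant label."""
    words = set(prompt.lower().split())
    if {"woman", "female", "she"} & words:
        return "female"
    if {"man", "male", "he"} & words:
        return "male"
    # No explicit cue: fall back to the majority label -- the bias shortcut.
    return training_counts.most_common(1)[0][0]

print(complete_gender("AI researcher with a tablet in a lab"))  # defaults to "male"
print(complete_gender("female biomedical engineer"))            # explicit cue wins
```

The point of the sketch: nothing in the first prompt says “male”; the output comes entirely from the skewed counts, which mirrors GPT’s own explanation of why the caricature defaulted to a man.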
More IWD, Dr. Fariah Mahzabeen, and Women of AI Resources
- Association of Generative Artificial Intelligence (AI) | Groups | LinkedIn
- Business Of Bangladesh | Groups | LinkedIn
- Classification Precision in Endodontic Imaging: Advanced Deep Learning- IEEE Conference
- Cardiac Care IoT and ML: Portable Home-Based Cardiovascular Monitoring – IEEE Conference
- Chicagoland HDI Event Management
- Global Executive Womens Network | Groups | LinkedIn
- Inspiring Global Education and Study Abroad Internships
- International Women’s Day 2025 – Dawn Christine Simmons
- Speakers – Bangladesh Business Summit
- Mahzabeen Lab – People
- Multimodal Fall Detection Wearable Sensor for Elderly People | IEEE Conference Publication
- Skin Disease Detection with Deep Learning | IEEE Conference Publication | IEEE Xplore
- Women in Tech: Empowering Bangladesh’s Female Workforce – Bangladesh Business Summit