The Invisible Algorithms That Decide Who Gets Hired, Fired & Paid

AI is quietly reshaping hiring, pay, and layoffs - but India’s rules lag behind.

Rituraj Das
Tech News
Invisible algorithms are making workplace calls, often without rules (Image created using Midjourney, prompt by author)


In Up in the Air (2009), Anna Kendrick plays a freshly minted MBA, Natalie, who proposes a sleek new idea: fire people over video calls. It’s efficient, she says. Scalable. Cost-effective. George Clooney's Ryan, who’s made a career out of letting people down gently in person, is horrified. But the company bites. And soon enough, strangers start hearing the words “Your position is no longer available” from a screen, not a human being.

That was supposed to be the dystopia.

Today, even the screen has disappeared. You don’t get a conversation, just a dashboard. Or worse, silence - the eerie kind that follows when an algorithm decides you’re redundant and nobody thought to loop you in.

Across industries, artificial intelligence has slipped into places it was once considered too impersonal for - hiring, performance reviews, promotions, layoffs. Once used to sort warehouse inventory, it’s now quietly sorting people.

And unless you know what to look for, you may not even realise it’s there.

Back in 2018, Amazon quietly shut down an internal AI recruiting tool after it learned to penalise resumes containing the word “women’s” - as in “women’s chess club captain.” The model had trained on a decade of male-dominated hiring data and, in the absence of supervision, internalised the same bias. It’s a story most HR tech people know by now. A cautionary tale from a younger, dumber AI.
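
To see how that failure mode arises, here is a minimal sketch - toy data, hypothetical resumes, nothing from Amazon’s actual system - of a classifier trained on biased historical outcomes. The bias lives in the labels, and the model simply learns it:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Toy history: resumes mentioning "women's" were mostly rejected -
    # not on merit, but because the past decisions being learned from
    # were themselves biased.
    resumes = [
        "captain women's chess club, python, sql",
        "chess club captain, python, sql",
        "women's coding society lead, java, aws",
        "coding society lead, java, aws",
        "women's hackathon winner, c++, linux",
        "hackathon winner, c++, linux",
    ]
    hired = [0, 1, 0, 1, 0, 1]  # biased historical labels

    vec = CountVectorizer()  # tokenises "women's" down to "women"
    X = vec.fit_transform(resumes)
    model = LogisticRegression().fit(X, hired)

    # The model dutifully reproduces the bias: the learned weight on
    # "women" comes out negative, penalising otherwise identical resumes.
    idx = vec.vocabulary_["women"]
    print("weight on 'women':", model.coef_[0][idx])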

But the world moved on. The tech matured. Models today come with fairness checks, explainability layers, and audit trails. Some even claim to "reason" like a human. At least, that’s the promise. And in corporate settings - where efficiency matters more than emotion - that’s often enough.
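
What does a “fairness check” actually look like? One common form is a selection-rate audit: compare how often each group clears the screen. The sketch below is illustrative only - the data is made up, and the threshold is the informal “four-fifths rule” used in US hiring-discrimination analysis, not any vendor’s pipeline:

    from collections import defaultdict

    def selection_rates(decisions):
        # decisions: iterable of (group, was_selected) pairs
        totals, picked = defaultdict(int), defaultdict(int)
        for group, selected in decisions:
            totals[group] += 1
            picked[group] += int(selected)
        return {g: picked[g] / totals[g] for g in totals}

    def adverse_impact_ratio(decisions):
        # Lowest group's selection rate divided by the highest.
        # Below ~0.8 is the conventional red flag.
        rates = selection_rates(decisions)
        return min(rates.values()) / max(rates.values()), rates

    # Made-up screening outcomes for two groups, "A" and "B"
    outcomes = [("A", True), ("A", True), ("A", False),
                ("B", True), ("B", False), ("B", False)]
    ratio, rates = adverse_impact_ratio(outcomes)
    print(rates)            # selection rate per group
    print(round(ratio, 2))  # 0.5 here - well below 0.8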

These systems don’t just shortlist resumes anymore. They calculate “cultural fit.” They monitor keystrokes. They assess tone in emails. They nudge managers toward action, or inaction, depending on what the model spits out. It’s all very clean. Very objective. Very hard to interrogate.

You may never know that a model flagged you as a retention risk. Or that your bonus was docked because a predictive score dipped two quarters ago. There's no angry boss, no awkward meeting - just silence, punctuated by a form letter.

In the EU, the regulatory response has been clear. The new AI Act classifies tools used for hiring, employee monitoring, and performance evaluation as “high-risk,” requiring transparency and human oversight. In the US, the Federal Trade Commission has warned companies against deploying “black-box algorithms” in high-stakes decisions without documentation or redress.

India, meanwhile, is still workshopping its opening statement.

The Digital Personal Data Protection Act, 2023 lays down a framework for consent, data handling, and breach notifications. But it doesn’t say much about automated decision-making. There's no ‘right to explanation’. No rule that says workers need to be told when AI is involved in a call about their future. If a model has a say in your hiring or firing, your only real option is to guess.

Many Indian startups in the HR and procurement space now offer plug-and-play AI tools for “talent optimization” and “contract intelligence.” The marketing language is full of words like smart, seamless, frictionless. Missing, often, is the word accountable.

This isn’t to say the systems are inherently flawed. Some companies do get it right. Fairness-aware training data. Regular red-teaming. Internal audits. Human-in-the-loop models where the software assists - but doesn’t dictate. The tools have improved. So have the safeguards.
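
“Human-in-the-loop” can be as concrete as a gate in the code: the model may flag, only a named person may act, and both steps land in an audit trail. A rough sketch, with hypothetical names throughout:

    from datetime import datetime, timezone

    AUDIT_LOG = []  # in practice: an append-only store

    def log(event, **details):
        AUDIT_LOG.append({"at": datetime.now(timezone.utc).isoformat(),
                          "event": event, **details})

    def model_flag(employee_id, score):
        # The model may flag; it never decides.
        log("model_flag", employee=employee_id, score=score)
        return {"employee": employee_id, "score": score,
                "status": "needs_review"}

    def human_decision(flag, reviewer, approve, reason):
        # No action without a named human and a written reason.
        if not reviewer or not reason:
            raise ValueError("a reviewer and a reason are required")
        flag["status"] = "approved" if approve else "rejected"
        log("human_decision", reviewer=reviewer, approve=approve,
            reason=reason, employee=flag["employee"])
        return flag

    flag = model_flag("E1042", score=0.91)
    human_decision(flag, reviewer="a.sharma", approve=False,
                   reason="score driven by a stale metric")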

But a well-trained model is still a model. And the real issue isn’t just how these systems work - it’s how comfortably we let them replace human judgment.

In Up in the Air, at least there was someone on the other side of the call, stumbling through a script, trying not to make it worse. Today, that script might be automated. And the call may never come at all.

