Five years back, Shilpa’s dreams of opening a tailoring shop kept hitting the same wall. Bank after bank turned her down. No credit score. No property to pledge. No wealthy relative to guarantee her loan. The answer was always no.
Then something changed.
Last year, Kotak Mahindra Bank approved her business loan in less than a week. No collateral needed. No endless paperwork. Their “artificial intelligence system” had looked at things no loan officer ever cared about before—her UPI payment patterns, her shop’s location data, even how regularly she paid her electricity bills. The algorithm decided she was trustworthy enough for a few lakh rupees.
Shilpa, 34, finally opened her tailoring business in East Bengaluru. Her dream came true.
But her phone hasn’t been quiet since.
“Secure your future! Loan-linked insurance at low premiums!” one notification screams. “Turn your business into a brand!” shouts another. Videos about mutual funds for entrepreneurs pop up in her feed. Messages about systematic investment plans arrive like clockwork. The bank seems to know exactly what she needs before she knows it herself.
“Earlier, I only saw generic ads for home loans and car loans—stuff I could never afford anyway,” Shilpa said, scrolling through her phone. “But these new ones… they actually relate to my business. The insurance one seems genuinely useful.”
Here’s what Shilpa doesn’t fully realize yet: the bank isn’t just being helpful. It’s being strategic.
The Data That Opened Doors Now Keeps Them Ringing
What changed wasn’t Shilpa’s financial situation. It was what her bank decided to do with her information.
The same data points that convinced the AI she deserved credit—her transaction history, spending patterns, business location, payment reliability—are now being fed into another AI system. This one doesn’t decide if she gets a loan. It decides what else the bank should sell her.
And this isn’t just happening at Kotak.
Across India’s financial sector, major banks like HDFC, ICICI, and Axis, along with microfinance companies like Chaitanya India Fin Credit and CreditAccess Grameen, are quietly repurposing their artificial intelligence engines. The technology that was supposed to democratize lending is now becoming a sophisticated advertising machine.
“We built these systems to assess risk and expand financial inclusion,” admitted a senior executive at a major private bank, speaking on condition of anonymity. “But the data we collected became incredibly valuable for understanding customer behavior. It would be foolish not to use it.”
That last sentence should worry everyone who’s recently gotten an AI-approved loan.
How AI Changed Everything About Banking
Ten years ago, getting a business loan was straightforward but brutal. You walked into a bank. A loan officer looked at your collateral, your credit score, maybe your salary slips. If you had none of these, you walked out empty-handed. Millions of Indians, especially women entrepreneurs like Shilpa, were locked out of formal banking.
Then came the fintech revolution and artificial intelligence.
Banks started feeding their AI systems alternative data—mobile phone payments, utility bill histories, online shopping patterns, even social media activity in some cases. The algorithms could spot patterns human loan officers missed. Someone who paid their phone bill religiously for three years might be a better credit risk than someone with property but erratic payment behavior.
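In spirit, these models boil down to weighting non-traditional signals into a repayment-likelihood score. The sketch below is purely illustrative: the feature names and weights are invented for this article, and real bank models are far more complex (typically gradient-boosted trees or neural networks trained on millions of records), but it shows how someone with no collateral can still score well.

```python
# Hypothetical alternative-data credit score. All features and weights
# are invented for illustration; no bank's actual model is shown here.

def alt_data_score(applicant: dict) -> float:
    """Combine non-traditional signals into a rough 0-1 repayment likelihood."""
    weights = {
        "on_time_utility_ratio": 0.35,  # share of utility bills paid on time
        "upi_txn_regularity":    0.30,  # consistency of digital payments
        "months_of_history":     0.20,  # length of the data trail, normalised
        "location_stability":    0.15,  # same business address over time
    }
    return sum(weights[k] * applicant.get(k, 0.0) for k in weights)

# Someone like Shilpa: no property, no credit score, but reliable bills
applicant = {
    "on_time_utility_ratio": 0.95,
    "upi_txn_regularity":    0.85,
    "months_of_history":     0.70,  # roughly three years of data, normalised
    "location_stability":    1.00,
}
print(alt_data_score(applicant))  # a high score despite zero collateral
```

The point of the sketch is the inputs, not the arithmetic: none of these signals would have counted for anything at a traditional loan desk.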
This was genuinely revolutionary. People who’d been invisible to the banking system suddenly became visible.
A 2023 report by the Reserve Bank of India noted that AI-based lending helped banks extend credit to nearly 12 million previously “unbankable” Indians. Women entrepreneurs, small traders, and rural business owners finally got access to capital.
But there was a catch nobody talked about much.

To make these AI systems work, banks had to collect enormous amounts of personal data. Transaction histories. Location patterns. Spending habits. Bill payment regularity. Online behavior. Everything that could indicate whether someone would repay a loan.
This data was supposed to be used for credit assessment. Just credit assessment.
That’s not what happened.
When Your Risk Profile Becomes Your Shopping Profile
Shilpa’s data tells a story. And her bank knows how to read it.
The AI knows she runs a tailoring business. It knows her monthly revenue patterns—which months are good, which are slow. It knows she took a business loan, so she’s probably worried about financial security. It knows her transaction sizes suggest she’s serving middle-class customers, not luxury buyers. It knows she’s probably working long hours because her UPI payments happen at odd times.
This information is gold for targeted advertising.
When the bank’s marketing AI spots that Shilpa’s account balance dips at month-end, that’s when the insurance notifications increase. When her business transactions spike during wedding season, that’s when the mutual fund messages arrive, suggesting she “invest her surplus wisely.”
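Stripped of the machine learning, this kind of targeting is a set of triggers matched to products. The rules and thresholds below are invented for illustration, not any bank's actual logic, but they capture the pattern Kumar describes: watch the account, fire the offer at the moment of stress or surplus.

```python
# Hypothetical trigger-based cross-sell logic. Rule names and thresholds
# are invented for illustration only.

def pick_offer(profile: dict):
    """Match a customer's current financial state to a product nudge."""
    # Month-end balance dip -> pitch financial-security products
    if profile["balance_vs_monthly_avg"] < 0.5:
        return "loan-linked insurance"
    # Seasonal revenue spike -> pitch ways to invest the surplus
    if profile["revenue_vs_seasonal_avg"] > 1.5:
        return "mutual fund SIP"
    return None  # no trigger fired; stay quiet

# A tailoring business mid-wedding-season: revenue well above average
print(pick_offer({"balance_vs_monthly_avg": 0.9,
                  "revenue_vs_seasonal_avg": 1.8}))  # -> mutual fund SIP
```

What makes the real systems potent is not the rules themselves but the data feeding them: the bank already holds the transaction stream that tells it exactly when each trigger fires.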
“The targeting is incredibly precise now,” explained Rajesh Kumar, a fintech analyst based in Mumbai. “Banks don’t just know if you’re creditworthy anymore. They know your vulnerabilities, your aspirations, your financial stress points. And they’re using AI to hit you with offers at exactly the right psychological moment.”
Some of this targeting is genuinely helpful. Shilpa admitted the insurance product actually made sense for her situation. But there’s a troubling question underneath: Did she buy it because she needed it, or because an algorithm knew exactly when she’d be most likely to say yes?
The Consent Problem Nobody’s Talking About
Here’s where things get legally and ethically murky.
When Shilpa applied for her loan, she signed multiple consent forms. Like most people, she didn’t read all the fine print. Somewhere in those documents, buried in legal language, there was probably a clause saying the bank could use her data for “various purposes” or “improving customer experience.”
But did she specifically agree to let her credit assessment data be repurposed for targeted advertising? That’s unclear.
“The consent frameworks in Indian banking are outdated,” said Priya Menon, a digital rights lawyer in Delhi. “They were written for a world where banks just held your money and gave you loans. They weren’t written for a world where banks are running sophisticated AI systems that extract insights from your behavior to manipulate your purchasing decisions.”
The Reserve Bank of India’s data protection guidelines require consent, but they’re vague about how specific that consent needs to be. Banks argue that if customers agreed to data processing, that covers everything. Privacy advocates disagree.
“There’s a huge difference between ‘We’ll analyze your data to see if you qualify for a loan’ and ‘We’ll analyze your data to psychologically profile you and bombard you with targeted offers,’” Menon argued. “Those should require separate, explicit consent. They don’t.”
The New Banking Business Model: You’re Not the Customer, You’re the Product
Traditional banking was simple. Banks made money from the difference between what they paid depositors and what they charged borrowers. Maybe they earned some fees on transactions.
AI has flipped this model.
Now, customer data itself is becoming a major revenue source. Banks aren’t just financial institutions anymore—they’re data platforms. They know more about your financial behavior than you probably know about yourself.
And they’re monetizing that knowledge.
Some banks are using the data internally, to cross-sell insurance, mutual funds, and other financial products. Others are going further, partnering with third-party companies and sharing customer insights (theoretically in anonymized form) for advertising purposes.
“We’re seeing the same surveillance capitalism model that Facebook and Google pioneered now migrating into banking,” warned Arun Mohan, a technology researcher at a Bangalore university. “Your bank is becoming an advertising platform. The AI that decides your creditworthiness is the same AI that decides what ads to show you.”
This shift is happening remarkably fast and with very little public debate.
What Banks Say (And What They’re Not Saying)
When contacted for this story, several major banks defended their AI-driven marketing practices.
“Our personalization efforts are designed to ensure customers receive relevant financial products that meet their actual needs,” a Kotak Mahindra Bank spokesperson said in a written statement. “This is far superior to generic advertising and actually helps customers make better financial decisions.”
HDFC Bank’s response was similar: “We use advanced analytics to understand customer requirements and offer suitable products at the right time. This is beneficial for customers and is done in full compliance with regulatory requirements.”
Notice what they’re not saying.
They’re not explaining exactly what data is being used for marketing versus credit assessment. They’re not clarifying whether customers can opt out of the marketing use while still accessing loans. They’re not acknowledging the power imbalance inherent in the system—the bank knows everything about you, but you know almost nothing about what the bank does with that information.
“The transparency is minimal,” said Kumar, the fintech analyst. “Banks will tell you they’re using AI. They won’t tell you exactly how, what data feeds the system, or how you can control it.”
The Double-Edged Sword
Here’s the complicated part: this system isn’t entirely bad.
Shilpa genuinely found some of the targeted offers useful. The insurance product made sense for her situation. The educational content about systematic investment plans taught her things she didn’t know. Without AI-based credit assessment, she never would have gotten the loan in the first place.
“I’m grateful the loan came through,” she said. “And honestly, some of these suggestions have been helpful. I just wish I understood better how they know so much about me.”
That ambivalence is probably healthy. This isn’t a simple story of evil banks exploiting innocent customers. It’s more nuanced.
AI has genuinely expanded financial inclusion in India. Millions of people now have access to credit who never did before. The same data-driven systems that enable hyper-targeted marketing also enabled their loan approval in the first place.
But access came with strings attached—strings made of data, analyzed by algorithms most people don’t understand, used for purposes that were never clearly explained.
What Happens Next?
India passed a comprehensive Digital Personal Data Protection Act in 2023, and the rules being framed under it could force banks to be more transparent about how they use customer information. The law could require explicit, separate consent for different uses of data—meaning you could agree to data use for loan assessment but opt out of marketing.
But implementation is years away, and banks are moving fast.
In the meantime, customers like Shilpa are navigating a financial system that knows them intimately but reveals very little about itself.
“Sometimes I wonder if I should just turn off the notifications,” Shilpa said, looking at her phone. “But then I worry I might miss something actually important. And I don’t want to seem ungrateful to the bank that finally trusted me with a loan.”
That sentence captures the power dynamic perfectly. The bank leveraged AI to take a risk on her when others wouldn’t. Now it’s leveraging that same AI—and her gratitude—to keep selling.
Is this personalization or manipulation? Financial inclusion or surveillance capitalism? A better banking experience or a troubling erosion of privacy?
Probably all of the above.
The AI that opened doors for millions of Indians is the same AI that now watches them walk through those doors—and tries to sell them something at every step.
Whether that’s progress or a problem might depend on how much you trust your bank with not just your money, but your behavioral data, your psychological vulnerabilities, and your financial future.
For now, Shilpa’s phone keeps buzzing. And she keeps reading the notifications, never quite sure if she’s being helped or targeted.
Maybe that uncertainty is exactly the point.