I had a problem that I think many people in Bangladesh can relate to: my wallet is full of credit cards, and I have absolutely no idea what most of their offers are. Between BRAC Bank, SCB, and others, there are hundreds of “Buy One, Get One” deals, discounts, and perks scattered across dozens of websites and Facebook pages. It’s a full-time job to keep track of them.
So, I decided to do what any tech enthusiast would do: build an AI agent to do it for me. My dream was simple: a single website, updated automatically every day, showing me every credit card offer in one clean, searchable place. How hard could it be?
As it turns out, it was a brutal, frustrating, and incredibly rewarding journey. This is the real, unfiltered story of building the “Offer Hunter”—a tale of blocked scrapers, confused AIs, broken deployment pipelines, and the final, glorious breakthrough. If you’ve ever wanted to build a real-world AI application, this is what it actually looks like.
See the final result first!
The live, autonomous application is running at:
my-final-offer-app.vercel.app
The Grand Plan: The Three Parts of an AI Agent
Every autonomous agent, from the simplest bot to the most complex AI, follows a three-step loop: Perception → Reasoning → Action. My plan was to build three “specialist” tools to do just that (a minimal sketch of the loop follows the list):
- The Hunter (Perception): A Python script that would visit a list of bank websites and gather all the raw, messy text about their offers.
- The Analyst (Reasoning): An AI brain, powered by Google’s Gemini model, that would read the messy text and extract the key information into a clean, structured format (like a spreadsheet row).
- The Reporter (Action): A system to save this clean data to a permanent database (Supabase) and display it on a public website (Vercel).
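Before diving into the chaos, here’s the whole agent in one conceptual sketch. This is illustrative, not the project’s actual code: the function names, placeholder URL, and stubbed bodies are mine, and the real implementations are sketched in the sections below.

```python
# agent.py — a minimal sketch of the Perception → Reasoning → Action loop.
# Function bodies are stubs; later sections sketch plausible implementations.

BANK_PAGES = [
    # Placeholder target; the real list of bank offer pages lives in the project.
    "https://example-bank.example/card-offers",
]

def hunt(url: str) -> str:
    """Perception: fetch the raw, messy text of an offer page."""
    raise NotImplementedError  # see the Selenium sketch under Problem #1

def analyze(raw_text: str) -> list[dict]:
    """Reasoning: ask Gemini to turn messy text into structured offers."""
    raise NotImplementedError  # see the chunking sketch under Problem #2

def report(offers: list[dict]) -> None:
    """Action: persist clean offers to Supabase for the website to read."""
    raise NotImplementedError

def run_agent() -> None:
    offers: list[dict] = []
    for url in BANK_PAGES:
        offers.extend(analyze(hunt(url)))
    report(offers)

if __name__ == "__main__":
    run_agent()
```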
The final piece was to make it autonomous—to have it run on a schedule every single day, forever. For this, I’d use GitHub Actions.
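To make that concrete, a scheduled GitHub Actions workflow looks roughly like this. It’s a sketch, not my exact setup: the file name, secret names, and 3:00 AM UTC schedule are all assumptions.

```yaml
# .github/workflows/daily-hunt.yml — run the agent once a day, forever.
name: daily-offer-hunt

on:
  schedule:
    - cron: "0 3 * * *"   # every day at 03:00 UTC
  workflow_dispatch:       # also allow manual runs for debugging

jobs:
  hunt:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: python agent.py
        env:
          GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
          SUPABASE_URL: ${{ secrets.SUPABASE_URL }}
          SUPABASE_KEY: ${{ secrets.SUPABASE_KEY }}
```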
The Brutal Reality of Building in the Real World
The plan was solid. The execution was a nightmare. I hit a brick wall at every single stage.
Problem #1: The Hunter Gets Blocked
My first attempt at scraping the websites used a simple Python library called `requests`. It was immediately met with failure. Websites are smart, and they don’t like being scraped by simple bots.
Error Log: `403 Forbidden` & `404 Not Found`
Some sites slammed the door in my face, identifying my script as a bot. Others had simply moved their offer pages, leaving me with dead links. My Hunter was blindfolded.
The Fix: I had to upgrade my Hunter. I switched to Selenium, a powerful tool that automates a real, invisible Chrome browser. Instead of politely asking for data, my agent could now visit a website just like a human, wait for all the content to load, and then read the page. It was slower, but it was a master of disguise.
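Here’s roughly what the upgraded Hunter looks like. This is a minimal sketch: the wait condition, timeout, and flags are my assumptions about what a script like this needs, not the project’s exact code.

```python
# hunter.py — sketch of the Selenium-based Hunter.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def hunt(url: str) -> str:
    """Load a page in an invisible Chrome browser and return its visible text."""
    options = Options()
    options.add_argument("--headless=new")  # run Chrome without a window
    options.add_argument("--no-sandbox")    # typically needed on CI runners
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        # Wait up to 20 seconds for the page body to render,
        # JavaScript-loaded content included.
        WebDriverWait(driver, 20).until(
            EC.presence_of_element_located((By.TAG_NAME, "body"))
        )
        return driver.find_element(By.TAG_NAME, "body").text
    finally:
        driver.quit()
```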
Problem #2: The AI Analyst Gets Confused
With Selenium, my Hunter was now bringing back mountains of text. But it was a mess of menus, ads, and legal jargon. When I fed this to my AI Analyst, it got overwhelmed.
Error Log: `Expecting value: line 1 column 1 (char 0)`
I had commanded the AI to respond in a strict JSON format. But when it got confused, it would just send back a simple sentence like “I can’t analyze this.” My code, expecting JSON, would immediately crash.
The Fix: This required a two-part upgrade to the Analyst’s brain. First, I used a technique called **“chunking”** to break the giant wall of text into smaller, more manageable paragraphs. The AI would analyze one paragraph at a time. Second, I dramatically improved its instructions (the prompt), even giving it examples of a good answer. This “few-shot prompting” made it far more reliable.
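In code, the shape of that fix looks something like the sketch below, using the `google-generativeai` library. The model name, chunk size, prompt wording, and offer fields are all illustrative assumptions.

```python
# analyst.py — sketch of chunking + few-shot prompting + defensive JSON parsing.
import json
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])  # assumption: key in env
model = genai.GenerativeModel("gemini-1.5-flash")      # assumption: model choice

# Few-shot prompt: strict format instructions plus an example of a good answer.
PROMPT = (
    "Extract credit card offers from the text below. Respond ONLY with a JSON "
    "array, no prose. Example of a good answer:\n"
    '[{"bank": "BRAC Bank", "merchant": "Pizza Hut", '
    '"offer": "Buy One Get One", "card": "Visa Platinum"}]\n\nTEXT:\n'
)

def chunk_text(text: str, max_chars: int = 4000) -> list[str]:
    """Chunking: split a giant wall of text into paragraph-aligned pieces."""
    chunks, current = [], ""
    for para in text.splitlines(keepends=True):
        if current and len(current) + len(para) > max_chars:
            chunks.append(current)
            current = ""
        current += para
    if current.strip():
        chunks.append(current)
    return chunks

def analyze(raw_text: str) -> list[dict]:
    offers = []
    for chunk in chunk_text(raw_text):
        response = model.generate_content(PROMPT + chunk)
        try:
            offers.extend(json.loads(response.text))
        except json.JSONDecodeError:
            # The model answered in prose ("I can't analyze this"):
            # skip this chunk instead of crashing the whole run.
            continue
    return offers
```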
Problem #3: Deployment Hell
With a working agent in my Colab “science lab,” I was ready to build the public website. I chose Vercel for its simplicity. It turned into my biggest nightmare. Every “one-click” template I tried was broken. I was met with a soul-crushing series of cryptic build failures.
The Fix: I had to abandon the easy buttons and build it like a pro. I created a blank project and manually built the website file by file on GitHub, pasting in the correct code for each part. It was tedious, but it worked. It was a powerful lesson: when the “easy” way fails, the “manual” way is often the most reliable.
The Breakthrough: A Live, Autonomous Agent
After days of debugging, I manually triggered the final, autonomous agent from its new home on GitHub Actions. I watched the live log with my heart pounding. It visited the first site. It analyzed the text. It found offers. And then, the magic words appeared:
*** Success! Saved 158 offers to the database. ***
I rushed to my live website, refreshed the page, and there it was. A beautiful, clean, tabbed interface, populated with hundreds of real offers, all gathered and organized by the AI agent I had built. The “Last Synced” timestamp was from just a few minutes ago.
Limitations and The Road Ahead
The Offer Hunter is alive, but its journey is just beginning. It’s not perfect. The AI sometimes misinterprets data or misses an offer. The biggest limitation is that web scrapers are fragile; if a bank redesigns its website tomorrow, my Hunter might get lost. Here’s the roadmap for making it even better:
- Duplicate Detection: Teaching the agent to check if it’s already saved an offer before adding a new one (see the sketch after this list).
- Better Error Reporting: Creating a system that emails me when a specific website fails to scrape, so I can update the Hunter’s instructions.
- Expanding Sources: Carefully adding more high-quality bank and lifestyle websites to the target list.
- Tackling Facebook: The ultimate challenge! Instead of scraping, I’ll need to learn and integrate Facebook’s official Graph API.
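For duplicate detection, one likely approach is a database upsert keyed on a fingerprint of each offer. Here’s a sketch using the `supabase-py` client; the table name, columns, and unique constraint are assumptions about the schema, not what’s deployed today.

```python
# reporter.py — sketch of duplicate-safe saving via upsert.
import hashlib
import os
from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])

def fingerprint(offer: dict) -> str:
    """A stable hash of the fields that make an offer unique."""
    key = f"{offer['bank']}|{offer['merchant']}|{offer['offer']}"
    return hashlib.sha256(key.encode()).hexdigest()

def report(offers: list[dict]) -> None:
    rows = [{**offer, "fingerprint": fingerprint(offer)} for offer in offers]
    # Assumes the offers table has a UNIQUE constraint on "fingerprint":
    # a row with an already-seen fingerprint updates in place
    # instead of creating a duplicate.
    supabase.table("offers").upsert(rows, on_conflict="fingerprint").execute()
```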
What I Learned on This Journey
This project taught me that building a real AI agent is 10% about the cool idea and 90% about persistent, frustrating, and incredibly satisfying debugging. Every error was a lesson. Every failure was a step forward.
The final, working application is more than just a tool. It’s a testament to the power of pushing through when everything seems broken. And that, I think, is the most valuable offer of all.