Show HN: I built a tool that watches webpages and exposes changes as RSS

67 points - today at 4:21 PM


I built Site Spy after missing a visa appointment slot because a government page changed and I didn’t notice for two weeks.

It watches webpages for changes and shows the result like a diff. The part I think HN might find interesting is that it can monitor a specific element on a page, not just the whole page, and it can expose changes as RSS feeds.

So instead of tracking an entire noisy page, you can watch just a price, a stock status, a headline, or a specific content block. When it changes, you can inspect the diff, browse the snapshot history, or follow the updates in an RSS reader.
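Site Spy's internals aren't shown here, but the core mechanic, extracting one element's text and diffing snapshots of it, can be sketched with only the Python standard library. Everything below (the id-based extractor, function names, sample markup) is an illustrative assumption, not Site Spy's actual API:

```python
from html.parser import HTMLParser
import difflib

VOID_TAGS = {"br", "hr", "img", "input", "link", "meta"}  # tags with no closing tag

class ElementTextExtractor(HTMLParser):
    """Collect the text inside the first element with a given id."""
    def __init__(self, target_id):
        super().__init__()
        self.target_id = target_id
        self.depth = 0        # > 0 while we are inside the target element
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in VOID_TAGS:
            return
        if self.depth:
            self.depth += 1
        elif dict(attrs).get("id") == self.target_id:
            self.depth = 1

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth and data.strip():
            self.chunks.append(data.strip())

def element_text(html, target_id):
    """Return the text content of the element with id=target_id."""
    parser = ElementTextExtractor(target_id)
    parser.feed(html)
    return "\n".join(parser.chunks)

def diff_snapshots(old, new):
    """Render two snapshots of the watched element as a unified diff."""
    return "\n".join(difflib.unified_diff(
        old.splitlines(), new.splitlines(),
        fromfile="previous", tofile="current", lineterm=""))
```

Watching a price then reduces to fetching the page on a schedule, calling `element_text`, and diffing the result against the stored snapshot.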

It’s a Chrome/Firefox extension plus a web dashboard.

Main features:

- Element picker for tracking a specific part of a page

- Diff view plus full snapshot timeline

- RSS feeds per watch, per tag, or across all watches

- MCP server for Claude, Cursor, and other AI agents

- Browser push, Email, and Telegram notifications
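On the RSS side, "one feed per watch" maps naturally onto a small RSS 2.0 document where each detected change becomes an `<item>`. A stdlib-only sketch; the function name and field choices are assumptions, not Site Spy's actual output format:

```python
import xml.etree.ElementTree as ET
from email.utils import format_datetime
from datetime import datetime, timezone

def changes_to_rss(watch_name, watch_url, changes):
    """Build an RSS 2.0 feed string from (timestamp, summary) change events."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"Changes: {watch_name}"
    ET.SubElement(channel, "link").text = watch_url
    ET.SubElement(channel, "description").text = f"Change feed for {watch_url}"
    for ts, summary in changes:          # newest first, by convention
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = summary
        ET.SubElement(item, "link").text = watch_url
        ET.SubElement(item, "pubDate").text = format_datetime(ts)  # RFC 2822 date
        ET.SubElement(item, "guid").text = f"{watch_url}#{ts.isoformat()}"
    return ET.tostring(rss, encoding="unicode")
```

Per-tag and all-watches feeds are then just the same function over a larger list of change events.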

Chrome: https://chromewebstore.google.com/detail/site-spy/jeapcpanag...

Firefox: https://addons.mozilla.org/en-GB/firefox/addon/site-spy/

Docs: https://docs.sitespy.app

I’d especially love feedback on two things:

- Is RSS actually a useful interface for this, or do most people just want direct alerts?

- Does element-level tracking feel meaningfully better than full-page monitoring?

Comments

iamflimflam1 today at 8:25 PM
Something I was planning on building but never got round to. If anyone wants to build it, feel free to use this idea.

Lots of companies really have no idea what JavaScript is being injected into their websites: marketing teams add all sorts of crazy scripts that never get vetted by anyone, are often loaded dynamically, and can be changed without anyone knowing.

The idea: a service that monitors a site and flags up when the code changes; even better if it actually scans for and flags up malicious code.
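A minimal baseline for that kind of script-integrity check could look like this sketch: record the external script URLs and hash the inline script bodies, then compare snapshots. (Regex-based HTML parsing is a deliberate simplification here, and all names are illustrative.)

```python
import hashlib
import re

def script_fingerprints(html):
    """Return (external script URLs, SHA-256 hashes of inline script bodies)."""
    srcs = set(re.findall(r'<script[^>]*\bsrc="([^"]+)"', html))
    inline = [body for body in re.findall(r'<script[^>]*>(.*?)</script>', html, re.S)
              if body.strip()]
    hashes = {hashlib.sha256(body.encode()).hexdigest() for body in inline}
    return srcs, hashes

def script_changes(baseline_html, current_html):
    """Diff the script fingerprints of two page snapshots."""
    old_srcs, old_hashes = script_fingerprints(baseline_html)
    new_srcs, new_hashes = script_fingerprints(current_html)
    return {
        "added_srcs": new_srcs - old_srcs,
        "removed_srcs": old_srcs - new_srcs,
        # any inline script whose hash appears or disappears counts as a change
        "changed_inline": bool(new_hashes ^ old_hashes),
    }
```

Flagging *malicious* code is the genuinely hard part; this only tells you that something changed and deserves a human look.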

ahmedfromtunis today at 8:04 PM
As a (former) reporter, site monitoring was a big part of what I did on a daily basis, and I used many, many such services.

I can attest that, at least from the landing page, this seems to be a very good execution of the concept, especially the text-based diffing to easily spot what changed and, most importantly, how.

The biggest hurdle for such apps, however, is 'JS-based browser-rendered sites' or whatever they're called nowadays. How does Site Spy handle such abominations?

tene80i today at 7:35 PM
RSS is a useful interface, but: "Do most people just want direct alerts?" Yes, of course. RSS is beloved but niche. It depends who your target audience is. I personally would want an email, because that's how I get alerts about other things. RSS, to me, is for long-form reading, not for notifications I must notice. The answer to any product question like this depends entirely on your audience and their normal routines.
xnx today at 6:29 PM
I like https://github.com/dgtlmoon/changedetection.io for this. Open source, and free to run locally or use their SaaS service.
nicbou today at 8:22 PM
Buddy I love you!

I have wanted this for so long! My job relies on following many German laws, bureaucracy pages and the like.

In the long run I want specific changes on external pages to trigger pull requests in my code (e.g. to update a tax threshold). This requires building blocks that don't exist, and that I can't find time to code and maintain myself.

I currently use Wachete, but for over a year now it has been triggering rate limits on a specific website, so I just can't monitor German laws anymore. No tool seems to have a debounce feature, even though I only need to check for updates once per month.
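The debounce being asked for is a small gate in front of the notifier: suppress an alert if the previous one fired too recently. A sketch, with illustrative names:

```python
from datetime import datetime, timedelta, timezone

def should_notify(last_notified, min_interval=timedelta(days=30), now=None):
    """Return True if enough time has passed since the last notification.

    last_notified: datetime of the previous alert, or None if never notified.
    """
    now = now or datetime.now(timezone.utc)
    return last_notified is None or now - last_notified >= min_interval
```

Changes detected during the quiet window could still be recorded in the snapshot history; only the notification is held back.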

dev_at today at 8:07 PM
There's also AnyTracker (an app) that gives you this information as push notifications: https://anytracker.org/
enoint today at 6:25 PM
Quick feedback:

1. RSS is just fine for updates. Given the importance of your visa use-case, were you thinking of push notifications?

2. Your competition does element-level tracking. Maybe they use XPath?

reconnecting today at 7:57 PM
I remember there was something called Visualping many years ago, and the real issue was that when a website changed its structure, it broke the comparison.

Did you solve this?

hinkley today at 7:40 PM
Back in 2000 I worked for a company that was trying to turn something like this into the foundation for a search engine.

Essentially, instead of having a bunch of search engines and AI crawlers spamming your site, the idea was that they would get a feed: you would scan your own website.

As crawlers grew from an occasional visitor to an actual problem (an inordinate percent of all consumer traffic at the SaaS I worked for was bots rather than organic traffic, and would have been more without throttling) I keep wondering why we haven’t done this.

Google has already had to solve the problem of people lying about their content, because with RSS feeds or user-agent sniffing you can still bear false witness about your site's content and purpose. But you'd only have to be scanned when there was something to see. And you could even play games with time delays on the feed to smear bot traffic out over the day if you wanted.

bananaflag today at 6:58 PM
Very good!

This is something that existed in the past and that I used successfully, but services like this tend to disappear.

makepostai today at 6:11 PM
This is interesting, gonna try it on our next project! Thumbs up.
digitalbase today at 6:47 PM
Cool stuff. You should make it OSS and charge a one-time fee for it. I would run it on my own infra but pay you once(.com).
pwr1 today at 6:19 PM
Interesting... added to bookmarks. Could come in handy in the future