Recently, I started a hobby project in Rust that collects news from several websites and creates a weekly digest for me.
Since I'm making it for fun, I started by developing a scraper.
However, a simpler option would be to use the websites' RSS feeds, so I looked for those as well, but few of the sites have them.
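To show how little work consuming a feed is compared to scraping, here is a minimal sketch of pulling item titles out of an RSS document using only the Rust standard library. The feed snippet is hypothetical, and a real project would likely use a crate such as `rss` or `feed-rs` (and fetch the XML over HTTP) rather than hand-rolled string matching:

```rust
// Extract the <title> of each <item> in an RSS 2.0 feed.
// Naive string matching for illustration only; a real parser
// would handle CDATA, entities, and malformed XML.
fn item_titles(feed_xml: &str) -> Vec<String> {
    let mut titles = Vec::new();
    let mut rest = feed_xml;
    // Walk through each <item> block and grab the text of its <title>.
    while let Some(start) = rest.find("<item>") {
        rest = &rest[start..];
        let end = rest
            .find("</item>")
            .map(|i| i + "</item>".len())
            .unwrap_or(rest.len());
        let item = &rest[..end];
        if let (Some(t0), Some(t1)) = (item.find("<title>"), item.find("</title>")) {
            titles.push(item[t0 + "<title>".len()..t1].to_string());
        }
        rest = &rest[end..];
    }
    titles
}

fn main() {
    // Hypothetical feed standing in for a real site's RSS.
    let feed = r#"<rss version="2.0"><channel>
        <title>Example Blog</title>
        <item><title>First post</title><link>https://example.com/1</link></item>
        <item><title>Second post</title><link>https://example.com/2</link></item>
    </channel></rss>"#;
    for title in item_titles(feed) {
        println!("{title}");
    }
}
```

Note that the channel-level `<title>` is skipped because the search only looks inside `<item>` blocks.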
Now, with the advent of agentic AI, RSS feeds seem like an old-fashioned approach that isn't needed anymore.
What do you think?
torunar•11h ago
Why have an easy way to organise a data feed that will work even on the cheapest clients and servers, when you can simply boil a tank of water to get a half-assed hallucination loosely based on the source, right?
I have seen people trying to bury RSS for the past 15 years — it hasn't happened yet. It is still widely used by blog enthusiasts — that's basically how I get information about new posts from my favourite old-tech authors.