Revision [938]
This is an old revision of CachingRSS made by DreckFehler on 2004-08-08 18:54:04.
Given a URL, http://domain.com/feed.xml, I'd like to extract the domain and the feed name, and then join them together as domainfeed.
This way all RSS requests would be cached by default, which would alleviate some of the bandwidth concerns.
In the RSS action, I'll check to see if the user has specified a cache file, and if not, I'll set it to domainfeed.xml.
What do you think?
Update: Never mind the plea. I'm going to use PHP's parse_url().
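A minimal sketch of what that could look like — the variable names and the exact way the host and file name are combined are assumptions, not the original code:

```php
<?php
// Hypothetical sketch: derive a default cache filename from a feed URL
// with parse_url(). One plausible reading of "domainfeed": concatenate
// the host (dots stripped) with the basename of the path.
$url = 'http://domain.com/feed.xml';
$parts = parse_url($url);
$host = str_replace('.', '', $parts['host']);     // "domaincom"
$name = basename($parts['path'], '.xml');         // "feed"
$cachefile = $host . $name . '.xml';              // "domaincomfeed.xml"
```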
that's what i would have proposed ;)
if ridiculous filenames don't bother you, there is even no need for that juggling with the url chunks. you can do it in one line:
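(the code sample that belonged here is not preserved in this revision; a hedged sketch of such a one-liner, with assumed variable names, could look like this:)

```php
<?php
// Assumed reconstruction of the md5 one-liner, not the original code:
// hash the whole feed url into a fixed-length, filesystem-safe name.
$url = 'http://domain.com/feed.xml';
$cachefile = md5($url) . '.xml';   // e.g. "a1b2...f0.xml", 32 hex chars + ".xml"
```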
due to the nature of the md5 algorithm this is not guaranteed to produce distinct results, but the chances of a conflict with other cache files are fairly low (to be accurate, 1:2^36).
as a matter of principle, defining the cache file should never be left up to the user. your approach seems much better (and not only as a fallback mechanism). imagine you wanted to use the feed action somewhere on my wiki: you wouldn't have any overview of which cache files are already in use (actually zero, 'cause i am caching into a database table ;) but that won't help you with the onyx-lib). although you would most likely use a name that is somehow connected to the site name or url, the risk of interference would in some cases be much higher than with the md5 method above or with a modified url string.
instead, it is a good idea to define caching timeframes individually for each feed, since some feeds are updated on a daily basis and really don't need a refresh every 30 minutes, whereas a snapshot of others is obsolete within seconds.
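a per-feed timeframe like that can be sketched as a staleness check on the cache file's modification time. this is an illustration only; the `$feeds` map, `is_stale()` helper, and example urls are hypothetical names, not from the original:

```php
<?php
// Illustrative sketch of individual caching timeframes per feed:
// refresh a feed only when its cache file is older than that feed's
// own maximum age. is_stale() and $feeds are hypothetical names.
function is_stale($cachefile, $maxAgeSeconds) {
    return !file_exists($cachefile)
        || (time() - filemtime($cachefile)) > $maxAgeSeconds;
}

$feeds = array(
    'http://daily.example.com/feed.xml'  => 86400, // daily feed: refresh once a day
    'http://ticker.example.com/feed.xml' => 60,    // fast-moving feed: every minute
);

foreach ($feeds as $url => $maxAge) {
    $cachefile = md5($url) . '.xml';
    if (is_stale($cachefile, $maxAge)) {
        // fetch $url here and rewrite $cachefile with the fresh content
    }
}
```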