[fluffy] I’m trying to debug an issue with WebSub subscriptions on feed-on-feeds, and the websub.rocks distribution test suite seems to only support h-feed. Is there any way to get it to provide Atom or RSS instead?
aaronpk websub.rocks was really only meant to test the WebSub subscription mechanism, not the content that is delivered over that mechanism. it'd be nice to expand it to more use cases but that's all it does for now
[fluffy] yeah, it’s just that FoF still only supports RSS/Atom, and adding h-feed would be a rather enormous amount of work that I don’t want to deal with, and testing the WebSub subscription mechanism is precisely what I’m trying to do.
[fluffy] I noticed it breaking when I moved to nginx, but that shouldn’t have anything to do with it. I wonder if the `php://input` pseudofile works differently under nginx or something.
lahacker oh gee.. ok i swear this'll be the last thought experiment; i set the headless firefox to sync with the user's firefox account; no password, but still skirting the API; yay or nay? ;P
[snarfed] lahacker we probably won’t be the best judges for straw men or splitting hairs 😁 hopefully we’ve communicated the spirit of the idea! which you can then take into consideration
[fluffy] oh, so now you’re supposed to use `file_get_contents('php://input')`, which is what I was doing all along. still dunno why this isn’t working. argh
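For reference, `php://input` is a PHP-level stream, so reading it this way should behave the same under nginx + php-fpm as under Apache. A minimal sketch of reading and logging the raw ping body in a callback script might look like the following; the logging is made up here purely for debugging and is not anything from feed-on-feeds itself:

```php
<?php
// Hypothetical excerpt from a WebSub callback such as websub.php.
// Reading php://input should be server-independent; the logging below
// is illustrative only.

$body = file_get_contents('php://input');

if ($body === false || $body === '') {
    // Empty body: log the relevant headers so a genuinely empty ping can be
    // told apart from a body that was lost somewhere along the way.
    error_log('websub ping: empty body, content-type=' .
        ($_SERVER['CONTENT_TYPE'] ?? 'none') .
        ' content-length=' . ($_SERVER['CONTENT_LENGTH'] ?? 'none'));
} else {
    error_log('websub ping: received ' . strlen($body) . ' bytes');
}
```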
[snarfed] another argument is that maintaining scraping over time is a horrible arms race: it constantly breaks, is painful to debug and fix, and you never win. i know, i’ve spent way too much time in it. APIs avoid that
aaronpk yeah, that's the thing about scraping: these sites make no promise about maintaining the HTML, whereas they do have a schedule for maintaining and deprecating API features, plus they actively fight against scraping
[schmarty] i do get confused by the delivery part of WebSub. is it required that the POST body contain the updated content? or is it possible that it's only notifying you that updated content can now be fetched? (not sure how it would even communicate that URL if the POST body is empty. 🤔 )
[fluffy] the POST body is supposed to contain the updated content, but if it doesn’t contain any content, the presumption is that the subscriber will then fetch the original URL
[fluffy] and so when the hub sends a ping it goes to websub.php, which parses out the feed ID (123) and a ‘secret’ (aklfdsetc.), which is just a very basic validation thing
[schmarty] makes sense so far. sounds like maybe the publisher is not sending the updated content and is therefore expecting the subscriber to fetch, as you described.
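Putting that description together, a rough sketch of that kind of callback endpoint could look like this. The query parameter names and the lookup/update helpers are hypothetical, not FoF's actual code:

```php
<?php
// Hypothetical WebSub callback along the lines described above.
// lookup_feed(), update_feed_from_content() and update_feed_from_url()
// stand in for whatever the subscriber application actually does.

$feed_id = (int)($_GET['feed'] ?? 0);
$secret  = $_GET['secret'] ?? '';

$feed = lookup_feed($feed_id);
if (!$feed || !hash_equals($feed['secret'], $secret)) {
    http_response_code(403);   // unknown feed or bad secret: reject the ping
    exit;
}

$body = file_get_contents('php://input');

if ($body !== false && $body !== '') {
    // "Fat" ping: the hub delivered the updated feed content in the body.
    update_feed_from_content($feed, $body);
} else {
    // "Thin" ping: nothing delivered, so re-fetch the feed's own URL.
    update_feed_from_url($feed, $feed['url']);
}

http_response_code(200);
```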
[fluffy] but now superfeedr seems to have stopped sending out push notifications for changes to my feed entirely, probably because I hit some API limit or something. whatevs. I’ll just change FoF to treat all pings as thin pings.
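In terms of the sketch above, treating every ping as a thin ping would just collapse the two branches into one, roughly:

```php
// Simplification: ignore whatever body the hub sent and always re-fetch
// the feed from its canonical URL (same hypothetical helper as above).
update_feed_from_url($feed, $feed['url']);
http_response_code(200);
```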
[schmarty] it's been so long since i set up superfeedr, but this does trigger a memory that they aggressively dedupe? like they won't send out updates if the content hasn't changed. wouldn't surprise me too much if they put a timeout on it. 😩
[schmarty] GWG: i occasionally think it would be interesting to use refbacks as a source for webmention discovery. i don't think i would collect and try to display them outright.
[schmarty] i think it makes some sense. i am tempted to say that webmentions found this way should get a special flag, but webmention as a spec already says that webmentions can be reported by anyone. 🤔
[tantek] using it as a source of webmentions (and flagging them as such, good idea) seems reasonable; then apply all your usual webmention processing logic
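A hedged sketch of that flow, where an incoming refback (the Referer header on an ordinary page request) is fed into normal webmention verification. The domain and the process_webmention() handoff are hypothetical stand-ins for whatever the receiving site already uses:

```php
<?php
// Hypothetical refback handler: treat a request's Referer as a candidate
// webmention source and run it through the usual verification pipeline.

$source = $_SERVER['HTTP_REFERER'] ?? '';
$target = 'https://example.com' . strtok($_SERVER['REQUEST_URI'] ?? '/', '?');

if ($source !== '' && filter_var($source, FILTER_VALIDATE_URL)) {
    // Same check a webmention receiver performs: fetch the source and
    // confirm it actually links to the target before accepting it.
    $html = @file_get_contents($source);
    if ($html !== false && strpos($html, $target) !== false) {
        // Hand off to the existing webmention processing, flagged so the
        // refback origin can be surfaced (or filtered) later.
        process_webmention($source, $target, ['via' => 'refback']);
    }
}
```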