#dev 2021-07-26
2021-07-26 UTC
jeremycherfas, samwilson, KartikPrabhu, stevestreza, Saphire, capjamesg and hendursa1 joined the channel
[pfefferle] joined the channel
#
[pfefferle] it is not “have to”

#
[pfefferle] as far as I know, the spec recommends doing it asynchronously because of DoS attacks

#
[pfefferle] the WordPress plugin is currently processing synchronously

#
[KevinMarks] The slow (or rather unknown, async appropriate) part is fetching the source link and parsing it. The DoS problem is sending you a link to a large or slow source that you need to fetch and parse.
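
A minimal sketch of the "accept now, verify later" pattern being discussed, assuming Flask and an in-process queue (both illustrative choices, not anyone's actual endpoint): the synchronous part only does cheap URL checks and queues the pair, so a large or slow source page can't tie up the request.

```python
# Sketch: accept the webmention quickly, defer the expensive fetch/parse.
# Flask and the in-process queue are illustrative choices only.
from queue import Queue
from urllib.parse import urlparse

from flask import Flask, abort, request

app = Flask(__name__)
verification_queue = Queue()  # drained asynchronously by a worker


@app.route("/webmention", methods=["POST"])
def receive_webmention():
    source = request.form.get("source", "")
    target = request.form.get("target", "")
    # Only cheap, synchronous validation here; the source is NOT fetched yet.
    if urlparse(source).scheme not in ("http", "https"):
        abort(400)
    if urlparse(target).scheme not in ("http", "https") or source == target:
        abort(400)
    verification_queue.put((source, target))
    # 202 Accepted: the mention will be verified asynchronously.
    return "", 202
```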

hendursaga and chenghiz_ joined the channel
KartikPrabhu joined the channel
#
superkuh async makes it a lot easier to detect spam.
#
superkuh Batching it.
justache joined the channel
shoesNsocks joined the channel
#
@askRodney Here's how you can bring your own data in Gatsby, adding it to the GraphQL data layer. You see an example with the Webmentions. You can ♻️ upcycle the code here for other APIs. Hope you find it useful. https://rodneylab.com/add-data-gatsby-graphql/ #askRodney #gatsbyjs #Webmentions @GatsbyJS #GraphQL (twitter.com/_/status/1419692974905217032)
[jacky] joined the channel
#
[pfefferle] [capjamesg] that is how WP-Cron works… when a webmention comes in, you schedule an event that will be triggered on the next (wp-)cron run

#
[pfefferle] and scheduling an event means adding an entry to the MySQL db

#
[pfefferle] for example

#
[pfefferle] we planned the async feature for webmentions the same way… save a comment to the db with only source and target, and schedule an event that does the parsing afterwards
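
Roughly what that deferred step could look like outside of WordPress (WP-Cron itself is PHP; this is only a Python sketch with illustrative names): a worker later picks up the stored source/target pair, fetches the source, and checks that it really links to the target.

```python
# Sketch of the deferred verification job: fetch the stored source and confirm
# it links to the target. Names and the 10s timeout are illustrative only.
import requests
from bs4 import BeautifulSoup


def verify_webmention(source: str, target: str, timeout: float = 10.0) -> bool:
    """Fetch the source document and check that it links to the target."""
    resp = requests.get(source, timeout=timeout,
                        headers={"User-Agent": "webmention-verifier"})
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    return any(a.get("href") == target for a in soup.find_all("a"))


def process_queue(queue):
    while True:
        source, target = queue.get()
        try:
            if verify_webmention(source, target):
                pass  # store/render the mention, e.g. as a pending comment
        except requests.RequestException:
            pass  # schedule a retry rather than blocking the queue
        finally:
            queue.task_done()
```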

#
[KevinMarks] A timeout is sensible in any case, yes. If you go to a queue model you may still need that to avoid the queue being blocked by one bad URL. Retry with exponential backoff is a common pattern too.
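
A sketch of that retry pattern, with arbitrary example constants: a hard per-fetch timeout so one bad URL can't block the queue, and an exponentially growing (jittered) delay between attempts.

```python
# Timeout plus exponential backoff, as described above. Constants are examples.
import random

FETCH_TIMEOUT = 10    # seconds; cap on any single source fetch
BASE_DELAY = 60       # seconds before the first retry
MAX_ATTEMPTS = 5      # give up after this many tries


def retry_delay(attempt: int) -> float:
    """Delay before retry number `attempt` (1-based), with a little jitter."""
    return BASE_DELAY * (2 ** (attempt - 1)) * random.uniform(0.5, 1.5)
```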

#
[KevinMarks] Aeons ago, when I wrote the Technorati crawler in Python, I used Twisted for the async stuff, but there may be easier ways now.

#
[KevinMarks] I'd put a number of tries in the db for the url as well, so you can schedule retries after the fresher ones, and give up after a few attempts.
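
One way to keep that per-URL attempt count, assuming a SQLite table whose schema and column names are made up for illustration: fresh mentions sort ahead of scheduled retries, and anything past a few attempts is dropped.

```python
# Sketch: track attempts per (source, target) so retries are scheduled after
# fresher mentions and abandoned after a few failures. Schema is illustrative.
import sqlite3
import time

MAX_ATTEMPTS = 5

db = sqlite3.connect("webmentions.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS pending_mentions (
        source TEXT, target TEXT,
        attempts INTEGER DEFAULT 0,
        next_try REAL DEFAULT 0
    )
""")


def record_failure(source: str, target: str, attempts: int) -> None:
    if attempts + 1 >= MAX_ATTEMPTS:
        # give up after a few tries
        db.execute("DELETE FROM pending_mentions WHERE source = ? AND target = ?",
                   (source, target))
    else:
        delay = 60 * 2 ** attempts  # same exponential backoff as above
        db.execute("UPDATE pending_mentions SET attempts = ?, next_try = ? "
                   "WHERE source = ? AND target = ?",
                   (attempts + 1, time.time() + delay, source, target))
    db.commit()


# Fresh mentions (next_try = 0) naturally sort before scheduled retries.
due = db.execute("SELECT source, target, attempts FROM pending_mentions "
                 "WHERE next_try <= ? ORDER BY next_try",
                 (time.time(),)).fetchall()
```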

rockorager joined the channel
#
rockorager capjamesg: Did you make an importer? Or how are you handling existing data?

rockorager joined the channel
#
vikanezrimaya oh well
#
vikanezrimaya my software seems to sometimes glitch out and start sending HTTP 500 errors instead of my homepage
#
vikanezrimaya I wonder if I should just make it restart whenever I get 4 of these in a row within a span of maybe a minute
#
vikanezrimaya or five minutes
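
The restart heuristic being mulled over here could look something like the following sketch (Kittybox itself is Rust; the thresholds and class name are made up): count 5xx responses in a sliding window and signal a restart once the streak is long enough.

```python
# Sketch of the "restart after N consecutive 500s in a window" idea.
# Thresholds and names are illustrative only.
import time


class ErrorWatchdog:
    def __init__(self, threshold: int = 4, window: float = 300.0):
        self.threshold = threshold        # e.g. 4 errors in a row
        self.window = window              # e.g. within 5 minutes
        self.errors: list[float] = []

    def record(self, status: int) -> bool:
        """Record a response status; return True when a restart is warranted."""
        now = time.time()
        if status < 500:
            self.errors.clear()           # any success breaks the streak
            return False
        self.errors = [t for t in self.errors if now - t <= self.window]
        self.errors.append(now)
        return len(self.errors) >= self.threshold
```
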
#
vikanezrimaya it looks like there's something wrong with the connection pool
#
vikanezrimaya I would need to add more debug logging temporarily to see what exactly is going wrong there and why it times out
rockorager and angelo joined the channel
#
Zegnat If you are looking for test webmentions, capjamesg, I guess you could loop over whatever webmentions you already have on webmention.io and resend them to your new endpoint. Remember: webmentions for any resource can be sent by anyone :) Depending on what your endpoint ends up doing for work, that might actually be the easiest way to transfer information
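
Re-sending a stored mention really is just a form-encoded POST of source and target to the new endpoint, per the Webmention protocol; a small sketch with a placeholder endpoint URL:

```python
# Replay a stored webmention against a new endpoint (URL is a placeholder).
import requests


def resend(source: str, target: str,
           endpoint: str = "https://example.com/webmention") -> int:
    resp = requests.post(endpoint,
                         data={"source": source, "target": target},
                         timeout=10)
    resp.raise_for_status()
    return resp.status_code  # typically 200, 201 or 202 depending on the endpoint
```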

rockorager joined the channel
[schmarty] joined the channel
#
[schmarty] GWG: why would it be the same?

#
Loqi aaronpk: GWG left you a message 1 day, 12 hours ago: https://github.com/indieweb/microsub/issues/24 I assume Monocle supports this, which you note you added to Aperture and is in Yarns...

shoesNsocks1 and [fluffy] joined the channel
odinho, jeremycherfas, [aciccarello] and rockorager joined the channel
#
rockorager Zegnat: That's what I did, and it made me realize I had to implement the authorship spec in order to cover all the different ways to find an h-card for an h-entry
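
A much-simplified sketch of that lookup order (author on the h-entry itself, then the enclosing h-feed's author, then rel=author), operating on an already-parsed canonical mf2 dict; the real authorship spec has more steps, such as fetching the author page and matching its h-card.

```python
# Simplified authorship lookup over a canonical parsed-mf2 dict (as produced by
# an mf2 parser). Not the full authorship algorithm, just the discovery order.
def find_author(parsed: dict, entry: dict):
    def author_of(item):
        authors = item.get("properties", {}).get("author", [])
        return authors[0] if authors else None

    # 1. author property on the h-entry (embedded h-card, URL, or name)
    author = author_of(entry)
    if author:
        return author

    # 2. author property on the h-feed that contains the entry
    for item in parsed.get("items", []):
        if "h-feed" in item.get("type", []) and entry in item.get("children", []):
            feed_author = author_of(item)
            if feed_author:
                return feed_author

    # 3. fall back to a rel=author link on the page
    rel_author = parsed.get("rels", {}).get("author", [])
    return rel_author[0] if rel_author else None
```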

angelo and nertzy_ joined the channel
#
@RubygemsN authorio (0.8.2): Rails engine to add IndieAuth authentication endpoint functionality https://rubygems.org/gems/authorio (twitter.com/_/status/1419804363002843153)
#
vikanezrimaya it sounds interesting but I do not understand what it means
#
vikanezrimaya wait, so a micropub client subscribes to websub notifications to be able to see when the post creation is done?
#
vikanezrimaya wow
#
vikanezrimaya sounds very complicated but interesting
#
vikanezrimaya and also unavailable for those who can't use WebSub
#
vikanezrimaya I am currently working on a hack to let me quickly import all my old webmentions from Webmention.io into Kittybox for safekeeping
#
vikanezrimaya I exported all of them and now I'm gonna reuse some of my WIP webmention endpoint code as a console app that slurps a JSON list of sources and targets, fetches the webmention as it should, and then sends it on as if it were my webmention endpoint
#
vikanezrimaya the Webmention.io API was very helpful for exporting: I just set a ridiculously large pagination size, got a 338KB JF2 file, and processed it with jq to strip extraneous info
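
A sketch of that export-and-reduce step in Python rather than jq; the query parameter and property names ("per-page", "wm-source", "wm-target") are from memory, so check them against the webmention.io API docs.

```python
# Pull one oversized page of mentions from webmention.io's JF2 API and keep
# only source/target pairs. Parameter and property names are from memory.
import json

import requests

TOKEN = "..."  # webmention.io API token

resp = requests.get("https://webmention.io/api/mentions.jf2",
                    params={"token": TOKEN, "per-page": 5000},
                    timeout=30)
resp.raise_for_status()

pairs = [{"source": child.get("wm-source"), "target": child.get("wm-target")}
         for child in resp.json().get("children", [])]
print(json.dumps(pairs, indent=2))
```
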
#
vikanezrimaya Ugh. Until I figure out a Rust MF2 parser (which will take a long time since I am lazy), I'll have to use Python
#
vikanezrimaya it's a good thing I still remember a bit of Python
#
vikanezrimaya wait there is now?!
#
vikanezrimaya All my work on writing a Python webmention endpoint wasted?!!
#
vikanezrimaya aaaaaaaa
#
vikanezrimaya why
#
vikanezrimaya well it's still a perfectly good webmention endpoint
#
vikanezrimaya i can't allow it to go to waste
#
vikanezrimaya so I will at least publish it on Gitlab
#
vikanezrimaya it's somewhat Kittybox-specific tho since it uses Micropub edits to publish Webmentions for everyone to see
#
vikanezrimaya but any software can be adapted to it without proprietary extensions
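
A guess at what publishing a verified mention via a Micropub edit could look like: a standard Micropub JSON update that adds the mention to a property on the target post. The "comment" property name and the payload shape are assumptions, not Kittybox's actual format.

```python
# Hypothetical Micropub update adding a verified mention to the target post.
# The "comment" property and mention shape are assumptions for illustration.
import requests


def publish_mention(micropub_endpoint: str, token: str,
                    target: str, mention: dict) -> None:
    update = {
        "action": "update",
        "url": target,                   # the post that was mentioned
        "add": {"comment": [mention]},   # e.g. a parsed h-cite of the source
    }
    resp = requests.post(micropub_endpoint, json=update,
                         headers={"Authorization": f"Bearer {token}"},
                         timeout=10)
    resp.raise_for_status()
```
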
#
vikanezrimaya ugh, trying to remember if the new version contains any extensions to the Micropub protocol that I might've invented myself
#
vikanezrimaya I was planning to do something like that but I don't remember if I abandoned these plans
#
vikanezrimaya OK, I'm reading the library source code and it looks like it's untyped JSON
#
vikanezrimaya which is kinda good since it's flexible but kinda bad if you need strong type-checking
#
vikanezrimaya I remember that my draft contained a very neat type system for common MF2 objects and properties
#
vikanezrimaya you might actually remember it? I think we've discussed this before
#
vikanezrimaya it was months ago tho