#sknebellist of URL -> "words the redirect target has to contain" to loop through?
#petermolnartest: no, not yet, and I really should, but it's funky; my generator is python, the thing that handles the request is php, and the ruleset is in multiple json files
#petermolnarcurrently the request loops through all of them; eventually I'll try to make them smarter, but it's not simple at all
#petermolnareg. for images, I have a glob() to see if I can find a substitute
#petermolnarand that's way faster and lighter than looping through thousands of regexes
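(A minimal Python sketch of the pattern petermolnar describes above: cheap exact-match and glob() checks first, the regex loop only as a last resort. The file names, paths, and rule shapes are invented for illustration; his actual handler is PHP.)

```python
# Illustrative only: exact-match table and a filesystem glob before any regexes.
import glob
import json
import re

REDIRECT_MAP = json.load(open("redirects.json"))    # hypothetical: old path -> new path
REGEX_RULES = json.load(open("regex_rules.json"))   # hypothetical: list of [pattern, replacement]

def resolve(path):
    # 1. exact matches are a dict lookup, effectively free
    if path in REDIRECT_MAP:
        return REDIRECT_MAP[path]

    # 2. for images, look for a substitute file on disk instead of running regexes
    stem = path.rsplit("/", 1)[-1].rsplit(".", 1)[0]
    candidates = glob.glob(f"media/**/{stem}.*", recursive=True)
    if candidates:
        return "/" + candidates[0]

    # 3. only now fall back to the expensive loop over regex rules
    for pattern, replacement in REGEX_RULES:
        new_path, hits = re.subn(pattern, replacement, path)
        if hits:
            return new_path

    return None  # nothing matched; 404/410 handling would go here
```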
#@isellsoap@gRegorLove How is your workflow when posting a note in ProcessWire and automatically posting it to Twitter via Bridgy? Where do you add the Bridgy webmention endpoint URLs? Or do you use another method? (twitter.com/_/status/1271425693671399424)
#cweiskethere is master-slave replication for DBMS
#jeremycherfasIndeed. And long ago I seem to remember master-slave for disk drives. But in this sense it means "primary" or "default" or something like that, not that it has power over others. Still, why not change it.
#cweiskeit actually has power over it because it declares what the slave has to do
#sknebelpetermolnar: yeah, guess you'd need to run such a test against the full stack
#jeremycherfasYes, in the DBMS it does. But not in the git repo.
[schmarty] joined the channel
#[schmarty]There are sufficient alternatives to that terminology for database replication which still accurately describe the behavior. For example "leader/follower".
#[schmarty]^^ I like primary and replica even more because they describe how the DB instances are used
#[KevinMarks]It's sending a stream of commands that have been given to it first. And you can chain them, so upstream/downstream maybe.
[LewisCowles] joined the channel
#[LewisCowles][sknebel] apparently the trick to handling thousands of regexes is to combine them into one large state machine. That was at least how fastrouter was achieving it some years ago.
#[KevinMarks]So now you have a compiler for a language you can't read?
#[LewisCowles][KevinMarks] I believe it is compiled once per process from discrete rules, if that was about the regexes.
#petermolnar[jgmac1106]: if you ever need help with rewrites and migration, I have a reasonably good understanding of them by now
#petermolnar[LewisCowles]: for routing, this works. For redirects, it's trickier, because the target is not part of the matched elements.
#petermolnarhowever, a combined regex could be used to match any known redirect, with a lookup table to identify the target, but at that point I'm not certain how much better it would be.
#[LewisCowles][petermolnar_] come again? The routing still has to match to a target handler. I think they do use a lookup table / hashmap of sorts
#[jgmac1106]petermolnar I might take you up on that offer and erase much of my htaccess file and start over
#petermolnar[LewisCowles]: I followed the idea and reverted to the previous iteration, where I have static lists and arrays as the first line of tests and only fall back to complex regexes if there's no other way
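(For the combined-regex-plus-lookup-table idea discussed above, a rough Python illustration: every known redirect becomes a named alternative in one big pattern, and a table maps whichever group matched back to its target. The rules here are made up, and the router [LewisCowles] mentions uses a related but more elaborate compilation scheme.)

```python
# One pass over a single combined regex instead of a loop over many patterns.
import re

RULES = {
    "r0": (r"/old-blog/\d{4}/\d{2}/[\w-]+/?", "/archive/"),   # made-up examples
    "r1": (r"/gallery/[\w-]+\.jpe?g", "/media/"),
}

COMBINED = re.compile(
    "^(?:" + "|".join(f"(?P<{name}>{pattern})" for name, (pattern, _) in RULES.items()) + ")$"
)

def redirect_target(path):
    m = COMBINED.match(path)
    if not m:
        return None
    # groupdict() only contains the named alternatives, so the one that is
    # not None tells us which rule matched
    for name, value in m.groupdict().items():
        if value is not None:
            return RULES[name][1]
    return None
```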
#jeremycherfasMy redirects are fine. The problem I have is in hand-migrating the posts. When they get there, the rewrites will find them, but they aren't all there yet.
swentel, [jgmac1106], nickodd, geoffo, [tantek], [chrisaldrich], [KevinMarks] and gRegorLove joined the channel
#jackyI haven't gotten the real time bits down yet (though those counts will update in real time)
#jackythis _should_ work for any site that at least has IndieAuth set up. I might need to add some sort of fallback to a provider if I can't find anything
#jacky(or lean on a provider to do that for me! soon)
twomanytacos and geoffo joined the channel
#[schmarty]jacky: i've got some example data from webmention.io's webhook notifications in Morris (which just catches and stores that data where my Hugo build can parse it for my templates to use) https://github.com/martymcguire/morris
#Loqi[martymcguire] morris: PHP webhook for caching webmention.io webmentions for static sites.
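(For context, a rough Python sketch of what a Morris-style webhook does: accept webmention.io's notification POSTs and cache them where a static build can read them. The real Morris is PHP, and the payload field names, secret handling, and paths below are assumptions; check webmention.io's webhook documentation for the exact format.)

```python
# Tiny webhook: verify a shared secret, then append each notification to a JSON
# file that the static-site build can pick up on its next run.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

CACHE_FILE = "data/webmentions.json"   # hypothetical path read by the site build
SHARED_SECRET = "change-me"            # assumed to match the secret configured upstream

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")

        if payload.get("secret") != SHARED_SECRET:   # field name is an assumption
            self.send_response(403)
            self.end_headers()
            return

        try:
            cached = json.load(open(CACHE_FILE))
        except (FileNotFoundError, ValueError):
            cached = []
        cached.append(payload)
        with open(CACHE_FILE, "w") as fh:
            json.dump(cached, fh, indent=2)

        self.send_response(202)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), WebhookHandler).serve_forever()
```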
#jackyactually [schmarty] lemme pick your brain a bit more
#jackymy hope was that, in order to make it seamless, once you sign in we'd find the feeds you expose on your site as well as the WebSub endpoint
#jackyI was planning to have lighthouse ask to subscribe to "live changes" to your site as well as listening to your feeds
#jacky(that being websub subscription to your domain as well as either polling against feeds or the websub endpoints for each feed)
#jackydoes that sound like something that'd reduce a bit of work for you w.r.t. setup? (b/c you're in the static space and some other people I'm targeting for Lighthouse are in that space as well)
#[schmarty]jacky: that sounds exactly like something i have been daydreaming about for literal months (years??)
#[schmarty]there are so many things you can do when you can easily discover and robustly watch a feed.
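(A loose Python sketch of the discovery step jacky describes above: fetch a homepage and collect the <link> elements that advertise feeds and WebSub hub/self endpoints. A real implementation would also check HTTP Link headers and h-feed markup; this only looks at <link> tags.)

```python
# Collect rel="alternate" feeds and rel="hub"/rel="self" WebSub links from a page.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.feeds, self.hubs, self.self_links = [], [], []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        attrs = dict(attrs)
        rels = (attrs.get("rel") or "").lower().split()
        href = attrs.get("href")
        if not href:
            return
        if "alternate" in rels:
            self.feeds.append((attrs.get("type", ""), href))
        if "hub" in rels:
            self.hubs.append(href)
        if "self" in rels:
            self.self_links.append(href)

def discover(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    return {"feeds": collector.feeds, "hubs": collector.hubs, "self": collector.self_links}
```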
#[schmarty]when i had bigger POSSE dreams i started work on a thing i called Syndication Party. i ended up building just the IndieAuth parts on Glitch and getting distracted with other projects.
#[schmarty]but the idea was for it to poll (websub) for new posts and look for ones that i have flagged for syndication
#jackythat plus some out of the box custom Web components for presenting reactions and comments in a Disqus-y way
#[schmarty]then it would interface with bridgy or whatever API to syndicate each post
#[schmarty]jacky: that was my hope! i have another handful of fun ideas to do with it but never came back to sit with the effort required to get a good robust feed watcher and websub implementation going.
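(A very rough Python sketch of the Syndication Party idea: take posts flagged for syndication and hand them to Bridgy Publish by sending a webmention. The "syndicate" flag and post shape are hypothetical, and the Bridgy endpoint and target URLs are assumptions that should be verified against Bridgy's own documentation.)

```python
# Webmentions are just form-encoded POSTs of source and target.
from urllib.parse import urlencode
from urllib.request import Request, urlopen

BRIDGY_ENDPOINT = "https://brid.gy/publish/webmention"   # assumption: Bridgy Publish endpoint
BRIDGY_TARGET = "https://brid.gy/publish/twitter"        # assumption: per-silo target URL

def publish(post_url):
    body = urlencode({"source": post_url, "target": BRIDGY_TARGET}).encode()
    req = Request(BRIDGY_ENDPOINT, data=body,
                  headers={"Content-Type": "application/x-www-form-urlencoded"})
    with urlopen(req) as resp:
        return resp.status, resp.read()

def syndicate_new_posts(posts):
    # `posts` would come from the feed watcher; each is assumed to be a dict
    # with a "url" and a "syndicate" flag set by the author
    for post in posts:
        if post.get("syndicate"):
            publish(post["url"])
```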
#petermolnarso, this is what came out of my redirect/gone/etc refactoring from today: https://pastebin.com/K7h19R70 - the 404.php ; https://pastebin.com/uRvKG63m - the actual dataset it iterates over with various methods. Some parts of me are proud, because it's working quite nicely, but deep down I'm quite ashamed that I need it.
#petermolnarI could have used a github gist for this instead..
GWG, [LewisCowles], ben_thatmustbeme, KartikPrabhu and [KevinMarks] joined the channel; nickodd left the channel
#petermolnaryeah, I jumped on the "let's have a short domain + short slug for every entry" idea
#petermolnarthat was when I still thought it could be useful
[chrisaldrich] joined the channel
#aaronpkyeah i switched mine to stop autogenerating short URLs for everything, now I have a URL shortener app at aaronpk.com that i can use to make short links to whatever, not just my own posts
[tantek] and [Paulo_Pinto] joined the channel
#[Paulo_Pinto]If I generate webmentions from an RSS feed through an IFTTT action that runs once a day, at a particular time, what happens to webmentions that were already sent? Are they going to be sent again, creating spam? I tried to find the answer but I am confused.
#petermolnar[Paulo_Pinto]: they will be sent again, which is OK - re-sent webmentions usually indicate something has changed in the article, but they are not forbidden.
#petermolnaras for how they will be treated, that's up to the receiver
#[Paulo_Pinto][petermolnar_] thanks. That means unchanged articles are still sent. Ok.
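(If the daily re-sends from IFTTT are a concern, one possible approach is to remember a hash of each entry and only send webmentions when the content actually changes. A Python sketch; the cache file name and the shape of the entries are assumed.)

```python
# Only return entries that are new or changed since the last run.
import hashlib
import json

CACHE = "sent_webmentions.json"   # hypothetical cache of url -> content hash

def entries_to_send(entries):
    """entries: iterable of dicts with "url" and "content" keys (assumed shape)."""
    try:
        seen = json.load(open(CACHE))
    except (FileNotFoundError, ValueError):
        seen = {}

    to_send = []
    for entry in entries:
        digest = hashlib.sha256(entry["content"].encode("utf-8")).hexdigest()
        if seen.get(entry["url"]) != digest:
            to_send.append(entry)           # new or changed since the last run
            seen[entry["url"]] = digest

    with open(CACHE, "w") as fh:
        json.dump(seen, fh, indent=2)
    return to_send
```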
#jackykey=agdicmlkLWd5cmULEg1QdWJsaXNoZWRQYWdlIj5odHRwczovL3YyLmphY2t5Lnd0Zi9wb3N0L2VjMGY5ZDA4LWY0MmMtNDZhYS04YTNlLTRlYjI0NTdlZGEwNAwLEgdQdWJsaXNoGICAkKWYt4AJDA) for mastodon
#jackylooking at the web logs, that link might be broken
#gRegorLoveOoh, just noticed in Monocle the "someone liked..." when it can't find the author
#@bixtweetsWell shit. This db plugin won’t let me save an edit to this comment to change the Comment-Type to webmention because another field is required, despite obviously not being required when the comment was created. Shit. (twitter.com/_/status/1271579919189807106)
#[tantek]Even those who should know better make really bad design errors
#jackyI've been spending more time on my webmention thingy (namely on the relay support)
#[snarfed]schmarty lol yes! but this is a like of a post
#@bixtweets↩️ And there isn’t. Of course there isn’t. This should be working, as the post is properly displaying a webmention already, so why not this second one? (twitter.com/_/status/1271584902421417985)