#dev 2022-04-03

2022-04-03 UTC
p1, gRegor, ShinyCyril, nertzy, nertzy_, cybi, angelo, mro, tetov-irc and Christian_Olivie joined the channel
#
beler
good morning, do I understand this right?:
#
beler
the relme auth needs only HTML from the site of the user who wants to log in
#
beler
and in comparison, IndieAuth needs POST APIs on both sides (server and user)
#
beler
so server and user communicate using API endpoints instead of parsing the HTML and doing the rel-me thing.
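For illustration, a minimal sketch (Python, standard library only, placeholder URLs) of the HTML-only half of that comparison: the consumer just fetches the user's homepage and collects its rel="me" links, with no POST endpoint needed on the user's side.

```python
# Sketch of RelMeAuth-style discovery: fetch the user's homepage and
# collect its rel="me" links. Nothing is required of the user's site
# beyond serving this HTML.
from html.parser import HTMLParser
from urllib.request import urlopen


class RelMeParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rel_me_links = []

    def handle_starttag(self, tag, attrs):
        if tag not in ("a", "link"):
            return
        attrs = dict(attrs)
        rels = (attrs.get("rel") or "").split()
        if "me" in rels and attrs.get("href"):
            self.rel_me_links.append(attrs["href"])


def discover_rel_me(profile_url):
    """Return the rel="me" URLs advertised on a user's homepage."""
    html = urlopen(profile_url).read().decode("utf-8", errors="replace")
    parser = RelMeParser()
    parser.feed(html)
    return parser.rel_me_links


# discover_rel_me("https://example.com/") might return
# ["https://github.com/example", "https://twitter.com/example"]
```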
cybi and jacky joined the channel
#
jacky
so now webmention information can appear in feeds (though it's exposing a bit of a jam in the async processing queue) https://imgur.com/gsH4Zau
#
jacky
things like OPD and PTD run during processing so it'd (ideally) go from "Sent a Webmention" to "Liked x's post"
#
jacky
next thing would be allowing for a Webmention to be resent from that feed with a button (and/or example cURL code for 'advanced' users)
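For reference, the resend described here is just the normal Webmention delivery again: a form-encoded POST of source and target to the receiver's endpoint. A rough sketch, with placeholder URLs and the endpoint assumed to be already discovered:

```python
# What a "resend" button would boil down to: a form-encoded POST of
# source and target to the receiver's Webmention endpoint. The endpoint
# and URLs below are placeholders.
from urllib.parse import urlencode
from urllib.request import Request, urlopen


def resend_webmention(endpoint, source, target):
    data = urlencode({"source": source, "target": target}).encode()
    req = Request(endpoint, data=data, method="POST")
    with urlopen(req) as resp:
        return resp.status  # typically 201/202 on acceptance


# resend_webmention("https://example.com/webmention",
#                   "https://theirsite.example/post",
#                   "https://mysite.example/note/1")
```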
cybi and nertzy_ joined the channel
#
@qubyte
I have a big backlog of received webmentions to process for my blog, but the edge cases are holding me back. I’m especially concerned about names. They change for a variety of reasons, and I want to make sure mentions are always up to date.
(twitter.com/_/status/1510619040540209161)
#
jacky
^I feel like that's a valid concern
#
jacky
I thought about pulling out author info and keeping it as a URL so I can resolve it independently and merge it in as I go along (like when showing a feed of webmentions, for example)
#
jacky
but that'd break aaronpk's custom profile image based on emoji
#
jacky
there's a special case when it could be an embedded h-card or just a URL and if it's only a URL, resolve as necessary
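A small sketch of that branch, assuming the mention has already been parsed into MF2 JSON; fetch_hcard is a hypothetical helper that fetches and parses the author URL only when resolution is actually needed.

```python
# Sketch of the special case described above: the "author" value may be
# an embedded h-card (a dict) or just a URL string. fetch_hcard is a
# hypothetical stand-in for resolving the URL with an MF2 parser.
def resolve_author(author, fetch_hcard):
    if isinstance(author, dict):   # embedded h-card: use as-is
        return author
    if isinstance(author, str):    # bare URL: resolve lazily
        return fetch_hcard(author)
    return None
```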
#
sknebel
the author can "push" an update to you by resending the WMs
#
jacky
true!
#
sknebel
although using a reference to a URL as a sign that it should be kept up to date if an update is encountered is interesting
P1000[d] joined the channel
#
jacky
yeah - because it isn't necessarily the 'full' form of what an h-card (with photo + name + url) would be
#
jacky
if there's no need for resolving (like if plain-text is okay) then it could be passed through as is
#
sknebel
pointer to an updated page instead of locally baked-in data feels like the one case where you have an indication of how the author intends it to be used
#
sknebel
for an on-the-page card you can't directly tell if that's going to stay or going to be updated
#
jacky
that's a good point
#
jacky
like why else would they go through the "trouble" of publishing all of that info?
#
sknebel
I guess one could also attempt to detect it. i.e. if you see "same author, different information", you could spot-check 1-2 older references
#
sknebel
and see if those have changed
#
sknebel
but that's pretty advanced
#
jacky
heh yeah
#
jacky
I _could_ do something like that since I convert incoming Webmentions into MF2+JSON
#
jacky
do a JSONPath query lookup to get values that changed
#
jacky
(from the stored to new ones)
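A rough sketch of that comparison, assuming both the stored and incoming webmentions are MF2+JSON dicts; a JSONPath library could express the lookups, but a plain recursive diff shows the idea.

```python
# Rough sketch of the "what changed?" step: walk two MF2+JSON documents
# (stored vs newly received) and report differing leaves. Paths are
# given in a JSONPath-ish dotted form.
def diff_mf2(old, new, path="$"):
    """Yield (path, old_value, new_value) for every leaf that differs."""
    if isinstance(old, dict) and isinstance(new, dict):
        for key in sorted(set(old) | set(new)):
            yield from diff_mf2(old.get(key), new.get(key), f"{path}.{key}")
    elif isinstance(old, list) and isinstance(new, list):
        for i in range(max(len(old), len(new))):
            o = old[i] if i < len(old) else None
            n = new[i] if i < len(new) else None
            yield from diff_mf2(o, n, f"{path}[{i}]")
    elif old != new:
        yield (path, old, new)


# diff_mf2(stored_entry, incoming_entry) might yield
# ("$.properties.author[0].properties.name[0]", "Old Name", "New Name")
```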
#
jacky
don't know if there's anything to surface to the user tho (like if the author's updated - should the webmention bump _up_ in the feed?)
cybi joined the channel
#
jacky
(probably not - only if there's a new `updated_at` or `published_at`)
#
sknebel
would agree with that, yes
mro joined the channel
#
jacky
so I have a slight issue
#
jacky
I want to support /fragmention with /Lighthouse - I had this idea about highlighting parts of my site with Webmentions (so it's possible to interact with parts of content)
#
jacky
which is also part of a bigger plan to make another form of post type that works as a natural form of threads (a /collection of /note posts)
#
jacky
but the spec encourages the discarding of fragment URLs
#
jacky
at least in request verification
#
jacky
at least I'm interpreting "Note that a target URL may contain a fragment identifier, and if the receiver limits which URLs can receive Webmentions, the fragment SHOULD be ignored when checking if the URL is supported." as that
#
aaronpk
"if the receiver limits..."
#
aaronpk
and yes ignoring it only for checking if the URL is supported as one of the limited URLs that can receive webmentions
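A sketch under that reading, with supported_targets as a hypothetical predicate: strip the fragment only for the check of whether the URL is an accepted target, and keep the full target, fragment included, for everything downstream (such as fragment-level highlighting).

```python
# Strip the fragment only for the "is this URL one we accept webmentions
# for?" check; keep the full target URL for storage and display.
# supported_targets is a hypothetical predicate over fragment-less URLs.
from urllib.parse import urldefrag


def accept_target(target_url, supported_targets):
    base, fragment = urldefrag(target_url)
    if not supported_targets(base):   # limit check ignores the fragment
        return None
    return {"target": target_url,     # keep the full URL downstream
            "fragment": fragment or None}
```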
#
jacky
hm okay, then I need to tweak my logic to support them properly
#
jacky
oh never mind
#
jacky
it looks like I do already but fragments aren't shown in the UI
mro joined the channel
#
[snarfed]
re ^ qubyte's idea of mass updating previously sent mentions, I like it too, I haven't seen much work or discussion of it before
#
[snarfed]
the easiest answer is probably just to iterate through all received webmentions and "pull" them by resending their webmentions to yourself
nertzy_ and mro joined the channel
#
sknebel
I think something like "recrawl all mentions from this site" would be a good feature to have
#
[snarfed]
could be an interesting service: if you have a feed with rel-prev links and webmentions marked up with mf2, it could crawl your feed and each post and resend all wms
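A sketch of the simpler "pull" variant, with stored_mentions and receive_webmention as hypothetical hooks into a site's own storage and receiving code:

```python
# Walk the already-stored (source, target) pairs and re-run the normal
# receiving code against them, so each mention is re-fetched and
# re-verified with fresh author data. Both arguments are hypothetical
# hooks into the site's storage and receiver.
def refresh_received_webmentions(stored_mentions, receive_webmention):
    for mention in stored_mentions:
        try:
            receive_webmention(mention["source"], mention["target"])
        except Exception as err:
            # a dead source is useful information too: the receiver can
            # mark or delete the mention instead of updating it
            print(f"refresh failed for {mention['source']}: {err}")
```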
#
sknebel
yeah. although tbh it's cleaner "just" against the storage, so I'd encourage it there too/more
#
jacky
[snarfed]: I can see that being useful for sending services!
nertzy joined the channel
#
[snarfed]
(all _received_ wms)
#
[snarfed]
sknebel yup! tradeoff is, that can't be a generic service
#
sknebel
yeah. wasn't meant to say "don't have a service", but in reverse: even if a service exists it likely makes sense to build it as a feature :)
chenghiz_ and ShinyCyril joined the channel
#
@simonw
This morning I learned that the app logo form field in this form for setting up a Google OAuth "consent screen" is a trap! Do not upload an app logo here. It will trigger a multi-day verification flow before your app can go live https://pbs.twimg.com/media/FPbhlGpVsAEYGTC.jpg
(twitter.com/_/status/1510642840149250055)
cybi joined the channel
#
@schnarfed
↩️ Easiest way would be to iterate through your received mentions and rerun your wm receiving code. Could even be built as an independent service that any site could use. Feel free to jump into https://indieweb.org/discuss and/or update https://indieweb.org/Webmention-developer with your experience!
(twitter.com/_/status/1510646585012543488)
mro_ joined the channel
#
aaronpk
[KevinMarks]: yeah that's one of the things they've done to counter the oauth phishing attacks where people would upload like the google drive logo to their own oauth app
omz13 joined the channel
#
omz13
I have been doing things to my IndieAuth server involving tickets. And then I got thinking.
#
Loqi
omz13: gRegor left you a message on 2021-12-08 at 3:31am UTC: I was trying out https://toolbox.imoxia.com/#authmetadisco and got an error when it was trying to redeem the authorization code. Looks like it didn't send `grant_type`
#
omz13
I came up with a cunning piece of choreography and orchestration that essentially lets a reader (third-party) get a token to use on-behalf-of a subject (first-party) at an audience (second-party).
#
omz13
It is, perhaps, like AutoAuth but using tickets.
#
omz13
I have written up a spec for it (ac-obo grant). And since it needed ticket stuff too, it includes ticket send/exchange/want as well.
#
omz13
And this is all practical not theoretical. As of last week my reader could quite happily get private things from fluffy's site using it to get bearer tokens in this weird and wonderful way.
#
omz13
I have updated my toolbox so the resource reader at https://toolbox.imoxia.com/#fetchsomething will use this mechanism if it's available. It shows trace messages, so if anybody else implements it, it should help.
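For context, a rough sketch of the ticket redemption step from the community TicketAuth draft that this builds on (not omz13's ac-obo choreography itself); the token endpoint URL is a placeholder and error handling is omitted.

```python
# Rough sketch of ticket redemption per the community TicketAuth draft:
# the holder exchanges a received ticket at the issuer's token endpoint
# for a bearer token. Endpoint and response shape are assumptions from
# that draft, not from the ac-obo spec discussed above.
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen


def redeem_ticket(token_endpoint, ticket):
    data = urlencode({"grant_type": "ticket", "ticket": ticket}).encode()
    req = Request(token_endpoint, data=data,
                  headers={"Accept": "application/json"}, method="POST")
    with urlopen(req) as resp:
        # expected to look like {"access_token": "...", "token_type": "Bearer", ...}
        return json.loads(resp.read())
```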
mro, cybi, omz13 and adstew joined the channel
#
@qubyte
↩️ No I think that was the micropub stuff only. All I have is http://webmentions.io sending webhooks which get turned into GitHub issues for me to figure out.
(twitter.com/_/status/1510679617149054986)
#
jacky
going to have to check this out :)
mro joined the channel
#
jacky
been having a bit of a 'brain blast' with websub
#
jacky
I can see how one could use it to build a 'trending' feed of sorts
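A sketch of the WebSub side of that idea, with placeholder URLs: subscribe to a topic at its hub so new and updated entries get pushed to a callback, which a trending feed could then aggregate.

```python
# Standard WebSub subscription request: a form-encoded POST to the hub
# with hub.mode, hub.topic, and hub.callback. URLs are placeholders.
from urllib.parse import urlencode
from urllib.request import Request, urlopen


def subscribe(hub, topic, callback):
    data = urlencode({
        "hub.mode": "subscribe",
        "hub.topic": topic,
        "hub.callback": callback,   # the hub will GET this to verify intent
    }).encode()
    with urlopen(Request(hub, data=data, method="POST")) as resp:
        return resp.status          # 202 Accepted if the hub takes it


# subscribe("https://hub.example/", "https://theirsite.example/feed",
#           "https://mysite.example/websub/callback")
```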
#
jacky
the indieweb doesn't own the WebSub spec, no?
#
jacky
oh I guess we do in a way
ShinyCyril, mro, wagle, tetov-irc and [fluffy] joined the channel