grantcodesaaronpk: does your site keep retrying homepage mentions if they don't work? I've not set anything up yet for saving homepage mentions
tantek, [snarfed], [kevinmarks] and leg joined the channel
AngeloGladdinghey guys i'm trying to do something that seems like it requires a two-step webmention but i don't think that's right.. before i go any further down this path anyone care to give an opinion?
AngeloGladdingso what i'm currently thinking is to have Alice send an additional webmention back to Bob's issue referencing the "canonical" issue page on Alice's site
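For readers following along: the "additional webmention" described here would just be a second, ordinary Webmention from Alice's canonical issue page back to Bob's issue. A minimal sketch of that send, with hypothetical URLs and assuming the requests and beautifulsoup4 libraries:

```python
# Rough sketch (mine, not Angelo's code) of the "additional webmention":
# discover Bob's endpoint, then POST source/target, where source is the
# canonical issue page on Alice's site. URLs below are hypothetical.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def discover_endpoint(target):
    """Return the Webmention endpoint advertised by `target`, checking
    the HTTP Link header first, then <link>/<a> rel=webmention in HTML."""
    resp = requests.get(target)
    if "webmention" in resp.links:
        return urljoin(resp.url, resp.links["webmention"]["url"])
    soup = BeautifulSoup(resp.text, "html.parser")
    el = soup.select_one('link[rel~="webmention"], a[rel~="webmention"]')
    return urljoin(resp.url, el.get("href")) if el else None

source = "https://alice.example/issues/42"       # Alice's canonical issue page (hypothetical)
target = "https://bob.example/project/issues/7"  # Bob's issue page (hypothetical)
endpoint = discover_endpoint(target)
if endpoint:
    requests.post(endpoint, data={"source": source, "target": target})
```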
@dustywebIf you have a text based format you want humans to look at and occasionally write by hand and you don't permit comments, you're making a terrible mistake. JSON, I'm looking at you. (Markdown, I'm also looking at you for having ugly comments as an extension.) (twitter.com/_/status/973239603942608896)
aaronpkthe "expected" column is my own expected result, not necessarily what the spec actually says right now, since i'm also not convinced the spec says the right thing yet
aaronpkthe problem is when a bunch of entries get added to the channel all within the same 1-second interval, they all share the same timestamp, so then a <= comparison no longer works right
snarfedeg my reader (newsblur) allows reversing order, which i care about, but i don't really care if new entries get interpolated or are always at the end
aaronpkbut it turns out rss feeds are low-res enough that when a bunch of entries are discovered at the same time, they often also share the exact same published date
aaronpk(I do have a special case when you add a new feed, those entries get interpolated based on their published date so that it doesn't flood the channel with new posts)
aaronpkthat would solve the problem of trying to use the published date to disambiguate because the published date is definitely less distinct than the other timestamp
[kevinmarks]that is always hard. You need to make the next/prev links pass in an absolute offset of some kind, so that adding new entries doesn't throw it off
aaronpkso in that example, let's say the first page ends at entry_id=5; the paging ID returned to the client will be encoded(2018-03-13 07:53:26, 1)
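A sketch of how that kind of paging cursor could work (my reading of aaronpk's description, not his actual code): encode the timestamp the page ended on plus how many entries with that exact timestamp were already served, so ties within the same second are disambiguated without relying on a plain <= comparison.

```python
# Hypothetical cursor-based paging over a channel whose entries share
# second-resolution "added" timestamps (names and structure are my own).
import base64, json

def encode_cursor(timestamp, offset):
    # e.g. encode_cursor("2018-03-13 07:53:26", 1)
    return base64.urlsafe_b64encode(json.dumps([timestamp, offset]).encode()).decode()

def decode_cursor(cursor):
    timestamp, offset = json.loads(base64.urlsafe_b64decode(cursor))
    return timestamp, offset

def page_after(entries, cursor, page_size=20):
    """entries: list of dicts with an 'added' timestamp string, sorted
    newest-first. Returns (page, next_cursor)."""
    start = 0
    if cursor:
        ts, offset = decode_cursor(cursor)
        # skip entries newer than the cursor timestamp...
        while start < len(entries) and entries[start]["added"] > ts:
            start += 1
        # ...then skip the `offset` entries already served within that same second
        start += offset
    page = entries[start:start + page_size]
    if not page:
        return page, None
    last_ts = page[-1]["added"]
    # how many entries sharing the last timestamp have now been served
    served = sum(1 for e in entries[:start + len(page)] if e["added"] == last_ts)
    return page, encode_cursor(last_ts, served)
```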
[kevinmarks]the other approach may be to have epochs based on published time, so you append old entries to those epochs (like tantek's BIMs), but then you need to cross epochal boundaries
[kevinmarks]with technorati we had recency based shards, so when you searched for keywords it could work back through the recent ones first before going deep if it was a rarer word
MylesBraithwaite👋, I'm currently in the process of developing my own IndieWeb application. Would it be okay if I created a wiki page on the IndieWeb site for my notes? Or is that only for completed projects?
mylesb.cacreated /User:Mylesb.ca/Amalfi (+1302) "Created page with "'''Amalfi''' is an IndieWeb application built using [[Python]] and [[Flask]] that is currently in development by {{Myles}}. People using it on their own site: * {{Myles}} |..."" (view diff)
LoqiIt looks like we don't have a page for "DNS TXT record" yet. Would you like to create it? (Or just say "DNS TXT record is ____", a sentence describing the term)
LoqiIt looks like we don't have a page for "DNS records" yet. Would you like to create it? (Or just say "DNS records is ____", a sentence describing the term)
[eddie]!tell swentel regarding tokens, etc. not optimal, but how Indigenous handles it currently is: when you log out, it will send a token revocation request. Besides that, it assumes tokens are valid, but will surface an error if your Micropub request fails authentication. It is assumed for now that if you hit an authentication error, you’ll log out of and back into Indigenous manually
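For reference, the revocation request described here is the form-encoded POST of action=revoke plus the token to the site's token endpoint, as IndieAuth describes it. A minimal sketch, where the endpoint URL and token value are placeholders:

```python
# Minimal sketch of an IndieAuth token revocation request on logout;
# the endpoint URL and token are placeholders, error handling omitted.
import requests

def revoke_token(token_endpoint, access_token):
    resp = requests.post(token_endpoint, data={
        "action": "revoke",
        "token": access_token,
    })
    return resp.ok  # treat any 2xx response as "revoked"

revoke_token("https://example.com/token", "xxxx-no-longer-needed-token")
```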
dgold"Warning: The lock file is not up to date with the latest changes in composer.json. You may be getting outdated dependencies. Run update to update them."
aaronpkWhoa really? What sort of sources? That should only have happened in very specific cases like when a large batch of entries was suddenly found and they had drastically different published dates
Loqi[Peter Stuifzand] I have been building a microsub server. It's not perfect, but it works. It works with Monocle. The code is open source and can be found on Github here: https://github.com/pstuifzand/microsub-server/
But sadly it seems I can't use my own authorizatio...
ZegnatKartikPrabhu, JSON Schema (like an XML schema) can be used to check the validity of a document. In this case the JSON that you get from parsing microformats from HTML.
ZegnatKartikPrabhu when you are accepting random JSON from a possibly untrusted source (e.g. any Micropub client) it is good practice to filter such input and make sure what you are getting is what you expect. Like any other outside input.
ZegnatNot sure if you gain anything by specifying every mentioned property though. I guess it is nice because you get to force the uri format on some of them...
ZegnatI think I made a remote rsvp once; that’s not going to get past your schema for one, [cleverdevil], as you only accept 4 fixed strings for rsvp.
ZegnatThere is a bit of a conflict between validating generic mf2 objects (which is what my schema tries to do) and validating for an actual use case (e.g. accepting blobs for Micropub).
ZegnatI can definitely see why you would want a collection of URLs for the photo property on a micropub server. And from all of the h-entry documentation we have, that’s what [cleverdevil] is specifically validating: URLs.
ZegnatActually, I think I found an mf2 example in the micropub spec that didn’t validate against my generic schema... Oh well. Another issue for tomorrow! Nighty-night.
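To make the photo/rsvp points above concrete, here is a toy example (not Zegnat's or [cleverdevil]'s actual schema) of validating an incoming Micropub JSON h-entry with Python's jsonschema, constraining photo to an array of URL strings and rsvp to the four usual values:

```python
# Illustrative schema only; a real server would also whitelist properties
# per post type and handle nested mf2 objects.
from jsonschema import validate, ValidationError

H_ENTRY_SCHEMA = {
    "type": "object",
    "required": ["type", "properties"],
    "properties": {
        "type": {"const": ["h-entry"]},
        "properties": {
            "type": "object",
            "properties": {
                "content": {"type": "array", "items": {"type": "string"}},
                "photo": {
                    "type": "array",
                    # "format": "uri" is only enforced if a FormatChecker is supplied
                    "items": {"type": "string", "format": "uri"},
                },
                "rsvp": {
                    "type": "array",
                    "items": {"enum": ["yes", "no", "maybe", "interested"]},
                },
            },
        },
    },
}

post = {
    "type": ["h-entry"],
    "properties": {
        "content": ["Hello world"],
        "photo": ["https://example.com/photo.jpg"],
        "rsvp": ["maybe"],
    },
}

try:
    validate(post, H_ENTRY_SCHEMA)
    print("accepted")
except ValidationError as err:
    print("rejected:", err.message)
```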
aaronpkbut certainly there could be use cases where you'd want to strictly validate the vocabularies too. I just think they are totally different concerns