[tantek] Yeah, vocab validation is likely a waste of time, and possibly going to cause you trouble when people start using / publishing new terms and/or old terms in new ways
loqi.me edited /giving-credit (+193) "Zegnat added "[http://curatorscode.org/ The Curator’s Code] tried to get people to “attribute discovery” and standardise on hat-tip and via as terms. (With unicode symbols ↬ and ᔥ respectively.)" to "See Also"" (view diff)
aaronpk dot-separators in the channel IDs could definitely work; clients could recognize that and group them based on that, and clients that don't understand it would still just treat them as opaque strings
aaronpk cweiske: hm, previously I said that channel uids can be any URL-safe character, which means there aren't a lot of good options left for a separator
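The grouping aaronpk describes can be sketched in a few lines. This is only an illustration of the convention being discussed, not anything from the Microsub spec; the channel uids and names below are made up.

```python
# Sketch: group Microsub channels whose uids use a dot-separated prefix
# (e.g. "social.twitter"). A client unaware of the convention would
# simply show every uid flat, since uids remain opaque strings.
from collections import defaultdict

def group_channels(channels):
    """Group channel dicts by the prefix before the first dot in uid.

    Channels without a dot land in a top-level "" group.
    """
    groups = defaultdict(list)
    for channel in channels:
        prefix, dot, _ = channel["uid"].partition(".")
        groups[prefix if dot else ""].append(channel)
    return dict(groups)

channels = [
    {"uid": "notifications", "name": "Notifications"},
    {"uid": "social.twitter", "name": "Twitter"},
    {"uid": "social.mastodon", "name": "Mastodon"},
]
grouped = group_channels(channels)
```

Because clients that don't understand the dot just see ordinary URL-safe uids, the scheme degrades gracefully, which is the appeal noted above.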
[cleverdevil] One challenge that surfaced in the test suite for the way I've done the schema is when parts of the document are *multiple* types at the same time.
Zegnat Ah. Hmm. I think I have seen some feature about using/extending multiple formats. If that allows you to mix the h-event and h-card schemas inside the schema, you would be able to validate it.
Zegnat Especially with mf2 not having vocabs, there's no way to know which properties someone wanted to go on the card and which on the event. Or maybe duplicate the data? No clue how this is supposed to work out.
Zegnat That education example does pass against my schema, btw. Which shows why you might want vocab-aware validation on your server, in case you want to stop stuff like that from being stored.
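One way a server could do the vocab-aware check Zegnat mentions is a simple property whitelist per type, taking the union of vocabularies for multi-type items. This is a minimal sketch; the property sets below are a small illustrative subset of h-card and h-event, not the full vocabularies.

```python
# Sketch of vocab-aware validation for parsed mf2 JSON: flag properties
# that don't belong to any of the item's claimed types.
# KNOWN_PROPERTIES is an illustrative subset, not a complete vocabulary.
KNOWN_PROPERTIES = {
    "h-card": {"name", "url", "photo", "org", "job-title"},
    "h-event": {"name", "url", "start", "end", "location", "summary"},
}

def unknown_properties(item):
    """Return property names not defined by ANY of the item's types.

    An item with multiple types (e.g. both h-card and h-event) gets the
    union of both vocabularies -- one possible answer to the multi-type
    question discussed above.
    """
    allowed = set()
    for item_type in item.get("type", []):
        allowed |= KNOWN_PROPERTIES.get(item_type, set())
    return set(item.get("properties", {})) - allowed

item = {
    "type": ["h-card"],
    "properties": {"name": ["Jane"], "start": ["2018-01-01"]},
}
flagged = unknown_properties(item)
```

Here `start` is an h-event property showing up on a plain h-card, so a server doing vocab-aware validation could reject or strip it before storing.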
Loqi A resumé or curriculum vitae (CV) is a document that represents a person's background and skills, commonly used to secure employment https://indieweb.org/resume
[eddie] !tell aaronpk: Before I disappeared the other day, I mentioned my webmention notifications are duplicating, because of salmention and because I literally just forward every webmention my webhook receives into the Micropub API for that Microsub channel. I guess what I probably need to do is just append the source URL of any post received and sent into a file, then check for that URL in the file, and if it’s found, don’t send it again in the
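The append-and-check dedup [eddie] describes can be sketched directly. The file name and function names here are made up for illustration; the idea is just a flat file of already-forwarded source URLs.

```python
# Sketch of webmention dedup: record each forwarded source URL in a
# file, and skip any webmention whose source is already recorded.
import os

SEEN_FILE = "forwarded-sources.txt"  # illustrative file name

def already_forwarded(source_url, seen_file=SEEN_FILE):
    if not os.path.exists(seen_file):
        return False
    with open(seen_file) as f:
        return source_url in (line.strip() for line in f)

def mark_forwarded(source_url, seen_file=SEEN_FILE):
    with open(seen_file, "a") as f:
        f.write(source_url + "\n")

def forward_once(source_url, forward, seen_file=SEEN_FILE):
    """Call forward(source_url) only the first time a source is seen."""
    if already_forwarded(source_url, seen_file):
        return False
    forward(source_url)
    mark_forwarded(source_url, seen_file)
    return True
```

This would stop the salmention re-sends from duplicating into the Micropub channel, at the cost of also suppressing legitimate updates to a previously seen source.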
bear ok, so according to my data, the following sites all use some form of h-as-*: ['mlncn.withknown.com', 'acegiak.net', 'boffosocko.com', 'dym.cx', 'notizblog.org', 'david.shanske.com', 'www.tombruning.com', 'raretrack.uk', 'achangeiscoming.net', 'tombruning.com', 'kartikprabhu.com', 'unrelenting.technology', 'caseorganic.com', 'jeena.net', 'www.ashersilberman.com', 'www.marcus-povey.co.uk', 'marcus-povey.co.uk', 'glennjones.net', 'tantek.com', 'j4y.co',
aaronpk dgold: yeah, if you re-send them to the webmention.io endpoint it will store them for you again. Of course, the site has to still be online for the webmention to validate, so any webmentions from sites that have gone down will not be transferred to webmention.io
aaronpk dgold: actually, if you are sending the webmentions yourself, you can just send them to the webmention.io endpoint before changing your advertised endpoint
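Re-sending a webmention is just the standard form-encoded POST from the Webmention spec, aimed at the webmention.io endpoint instead of the advertised one. The endpoint URL below is an illustrative example; webmention.io endpoints follow the pattern `https://webmention.io/<your-domain>/webmention`.

```python
# Sketch: build the form-encoded POST a Webmention receiver expects,
# aimed directly at a webmention.io endpoint. ENDPOINT is illustrative.
from urllib.parse import urlencode
from urllib.request import Request

ENDPOINT = "https://webmention.io/example.com/webmention"

def build_webmention(source, target, endpoint=ENDPOINT):
    """Build the source/target POST defined by the Webmention spec."""
    body = urlencode({"source": source, "target": target}).encode()
    return Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

req = build_webmention(
    "https://theirsite.example/post",
    "https://example.com/my-post",
)
# urllib.request.urlopen(req) would deliver it; as noted above, the
# source page must still be online for webmention.io to verify it.
```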
Loqi It looks like we don't have a page for "microformats validation" yet. Would you like to create it? (Or just say "microformats validation is ____", a sentence describing the term)
Loqi It looks like we don't have a page for "json schema" yet. Would you like to create it? (Or just say "json schema is ____", a sentence describing the term)
[cleverdevil] My understanding from what I am reading is that `rel-urls` would primarily show up in MF2 JSON as the result of parsing HTML, not created by a Micropub client.
[cleverdevil] microformats validation is the process of inspecting a document that is marked up with microformats2 to see if it complies with the specification for the format, and potentially with the known vocabularies... {{cleverdevil}} is working on an attempt to make this easier here https://github.com/cleverdevil/microformats2
dgold WMIO_WEBHOOK_TOKEN - a secret string chosen by you that webmention.io will include in webhook payloads in order to verify that they came from webmention.io
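The check on the receiving end can be sketched as follows. This assumes webmention.io delivers the configured secret in a `secret` field of the webhook's JSON payload; a constant-time comparison avoids leaking the token through timing.

```python
# Sketch: verify an incoming webhook by checking the secret field in
# the JSON payload against the token you configured. The token value
# here is a placeholder.
import hmac
import json

WMIO_WEBHOOK_TOKEN = "chosen-secret-string"  # the value you configured

def verify_webhook(raw_body, token=WMIO_WEBHOOK_TOKEN):
    """Return True only if the payload carries the expected secret."""
    payload = json.loads(raw_body)
    supplied = payload.get("secret", "")
    return hmac.compare_digest(supplied, token)

good = verify_webhook(json.dumps({"secret": WMIO_WEBHOOK_TOKEN,
                                  "source": "https://example.com/post"}))
bad = verify_webhook(json.dumps({"secret": "wrong-token"}))
```

Any payload failing the check should be dropped before the notification is forwarded anywhere, e.g. into a Microsub channel.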
[cleverdevil] I suppose there should be a way for the Microsub server to ping the frontend/CMS, have it generate the HTML for the published content, and then redirect there.
[eddie] I’ve had the Facebook timeline feed in a general “Facebook” channel for a while (so it’s already fetched a bunch of stuff). I added the same feed URL to a Family Facebook channel after adding my family members’ names as the required keywords and bingo! It’s literally just a feed of my family’s stuff
[eddie] The Facebook Atom feed puts the words “X shared X” when someone reposts something on Facebook. So on my overall Facebook feed, I just added an exclusion for the word “Shared”. There might be some false positives lost where someone actually uses the word “shared” in their post, but it’s worth it, since Facebook is really just for me to skim through occasionally
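The two filters [eddie] describes (required keywords for the family channel, an excluded keyword for the general one) amount to a simple include/exclude pass over feed items. The names and item shape below are illustrative.

```python
# Sketch of keyword filtering for a feed channel: keep items matching
# any "include" keyword and drop items matching any "exclude" keyword.
import re

def matches_any(text, keywords):
    """Case-insensitive whole-word match against any keyword."""
    return any(re.search(r"\b" + re.escape(k) + r"\b", text, re.IGNORECASE)
               for k in keywords)

def filter_feed(items, include=None, exclude=None):
    kept = []
    for item in items:
        text = item.get("content", "")
        if include and not matches_any(text, include):
            continue
        if exclude and matches_any(text, exclude):
            continue
        kept.append(item)
    return kept

items = [
    {"content": "Alice shared a link"},
    {"content": "Alice posted a photo"},
    {"content": "Bob posted an update"},
]
family_feed = filter_feed(items, include=["Alice"])     # names as keywords
general_feed = filter_feed(items, exclude=["shared"])   # drop reposts
```

The whole-word match also shows the false-positive trade-off mentioned above: a post that genuinely uses the word “shared” would be dropped from the general feed too.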
Loqi It looks like we don't have a page for "collector.githubapp.com" yet. Would you like to create it? (Or just say "collector.githubapp.com is ____", a sentence describing the term)