#[kevinmarks][eddie] you can feed unmung from itself, so can do rss to h-feed to mf2 json in 2 passes
#[eddie][kevinmarks] good to know! I thought that might be possible, but wasn't 100% sure of the pathway. Since XRay can do it straight from RSS, I'll probably stick with that: I host my own XRay server, so I can depend on it being up and running whenever my server is up
#[eddie]SSR comes up particularly in the Angular and React communities because people can build with a front-end framework and then turn on server-side rendering
#[eddie]I guess the one gotcha that isn't a requirement of SSR, but is sometimes assumed, is that once the page is loaded a front-end framework takes over, so the second page navigation doesn't require a full refresh
#jackyI'm leaning toward it so I can do simpler component fetching of post types
snarfed, micahsilverman, [tantek], stevestreza, ichoquo0Aigh9ie, tbbrown, koddsson, cweiske, KartikPrabhu, [kevinmarks], swentel, [svandragt], [Vanessa], leg and [jdpinto1] joined the channel
#[jdpinto1]Anyone have any idea why brid.gy isn’t picking up the Twitter `u-syndication` links from my `h-feed`? Looking at the crawl report, it looks like it’s not even finding the `u-url` links on my microblog posts (but for some reason has no trouble with those on my blog posts)! My site and h-feed: https://juanpinto.me
[jgmac1106] joined the channel
#sknebel[jdpinto1]: this doesn't work: <article class="micropost h-entry e-content">, you need to put the e-content on a nested tag. what you have here looks like the *h-feed* has a property e-content that contains an h-entry, which is not what you want
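The fix sknebel describes, sketched as before/after markup (class names other than `h-entry`/`e-content` are copied from the message; the content itself is illustrative):

```html
<!-- Before: e-content sits on the same element as h-entry, so the parser
     reads the h-entry as an e-content property of the surrounding h-feed -->
<article class="micropost h-entry e-content">...</article>

<!-- After: e-content moved to a nested element inside the h-entry -->
<article class="micropost h-entry">
  <div class="e-content">...</div>
</article>
```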
[svandragt] joined the channel
#[jdpinto1][sknebel] What a simple fix. Thank you, everything is now working as it should 🤣
KartikPrabhu and ichoquo0Aigh9ie joined the channel
#[eddie]!tell wow, adding podcast artwork is not an easy problem in xray! 😆 I thought I would do it and send a PR. It turns out, PicoFeed doesn't seem to have any functions to get that info, so PicoFeed has to be patched, THEN it has to be used within xray to add it to featured
#[eddie]!tell aaronpk wow, adding podcast artwork is not an easy problem in xray! 😆 I thought I would do it and send a PR. It turns out, PicoFeed doesn't seem to have any functions to get that info, so PicoFeed has to be patched, THEN it has to be used within xray to add it to featured
#Loqiwow: [eddie] left you a message 1 minute ago: podcast artwork is not an easy problem in xray! 😆 I thought I would do it and send a PR. It turns out, PicoFeed doesn't seem to have any functions to get that info, so PicoFeed has to be patched, THEN it has to be used within xray to add it to featured
#sknebel(and got told off by nickserv for using someone elses registered nick)
snarfed joined the channel
#aaronpk[eddie]: ohhh yeah that is probably why i haven't done it yet!
#Loqiaaronpk: [eddie] left you a message 9 minutes ago: wow, adding podcast artwork is not an easy problem in xray! 😆 I thought I would do it and send a PR. It turns out, PicoFeed doesn't seem to have any functions to get that info, so PicoFeed has to be patched, THEN it has to be used within xray to add it to featured
#[eddie]Haha, yep, that's probably why. I even looked through PicoFeed to see if I could arbitrarily search the XML for the right pattern, but it seems you have to adjust the parser, the feed-item class to hold it as an attribute, etc.
#[eddie]yeah, I ran into that 404'd repo as well haha
#aaronpki vaguely remember trying this at one point
#aaronpkthis is a perfect example of why i don't like libraries that hard-code accessing properties. too hard to extend.
#aaronpklike just give me a method to find arbitrary stuff in the XML doc, that's what XML is for
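The kind of direct XML access aaronpk is asking for is easy to sketch outside the library. A minimal example (in Python rather than XRay's PHP, and with an inline feed string where real code would fetch over HTTP) pulling the `itunes:image` artwork straight from an RSS document:

```python
# Pull podcast artwork straight out of the RSS XML, sidestepping
# the feed library entirely.
import xml.etree.ElementTree as ET

ITUNES = "http://www.itunes.com/dtds/podcast-1.0.dtd"

def artwork_url(rss_xml):
    """Return the channel-level itunes:image href, or None if absent."""
    root = ET.fromstring(rss_xml)
    image = root.find("./channel/{%s}image" % ITUNES)
    return image.get("href") if image is not None else None

feed = """<rss xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
  <channel>
    <title>Example Show</title>
    <itunes:image href="https://example.com/art.jpg"/>
  </channel>
</rss>"""

print(artwork_url(feed))  # https://example.com/art.jpg
```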
#aaronpkanyway shouldn't be too hard to do a patch for accessing the artwork, and i'm already using my fork of this library so we'll just treat it as canonical now
#gRegorLoveyikes to just deleting the repo. glad you have a fork.
#[keithjgrant]What does it take to add one of those Deploy to Heroku buttons for a node app? Do I just add the button & it works, or is there more work required?
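For reference, the Deploy to Heroku button needs slightly more than the badge: an `app.json` manifest committed to the repo root describing the app. A minimal sketch (field values are illustrative; see Heroku's app.json schema for the full field list):

```json
{
  "name": "my-node-app",
  "description": "Example manifest for the Heroku Deploy button",
  "repository": "https://github.com/example/my-node-app",
  "keywords": ["node"]
}
```

With that committed, the README button is markdown like `[![Deploy](https://www.heroku.com/deploy/button.svg)](https://heroku.com/deploy)`, and Heroku detects a Node app from `package.json`.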
[schmarty], micahsil_, benwerd and [kim_landwehr] joined the channel
#[tantek]As GitHub's feature is, though "Unarchive" is a horrible verb
KartikPrabhu joined the channel
#[tantek]well with two visible examples like that in UIs, we probably should figure out a way to incorporate that additional meaning into the /archive dfn and page
KartikPrabhu, micahsilverman, micahsil_, snarfed, benwerd, cjwillcock, [eddie] and bdesham joined the channel
#bdeshamhi all. I'm looking into copying my twitter posts to my own site, starting from the "export your data" download.
#bdeshamI notice that, prior to october 2011 or so, all of the tweets in the archive show a time of 00:00:00, even though the real time is shown in the web interface.
#bdeshamdoes anyone know of existing tools to patch the data download with the tweets' actual times?
[schmarty] joined the channel
#bdesham(some kind of service that would accept a tweet id and return the tweet's time, whether from the website or the api, would also work)
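One API-free option (an aside, not something raised in the channel): for tweets created after late 2010, the tweet ID itself is a Twitter "snowflake" that encodes a millisecond timestamp, so the time can be recovered offline. Older tweets have sequential IDs, so this only covers part of the pre-October-2011 gap. A sketch:

```python
# Recover a tweet's creation time from its snowflake-era ID.
# Snowflake IDs store a millisecond offset from Twitter's epoch
# (2010-11-04T01:42:54.657Z) in the bits above the low 22.
from datetime import datetime, timedelta, timezone

TWITTER_EPOCH_MS = 1288834974657  # 2010-11-04T01:42:54.657Z

def tweet_time(tweet_id):
    """Return the UTC creation time encoded in a snowflake tweet ID."""
    ms = (tweet_id >> 22) + TWITTER_EPOCH_MS
    return datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(milliseconds=ms)

print(tweet_time(0))  # 2010-11-04 01:42:54.657000+00:00 (the snowflake epoch)
```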
#donpdonpbetter constructed than some general purpose languages :)
micahsilverman joined the channel
#tw2113gRegorLove nudged me over here, and I'm going to re-word my original question a little. Which do y'all tend to prefer more? Microformats or Schema/Structured Data?
#snarfedbdesham: twitter offers two archive formats, json and html. sounds like you got the html one. the json one includes timestamps in tweets earlier than 2011.
#bdeshamsnarfed: hmm. is it possible to choose which format you get?
#bdeshamwow, "not obvious" is right! thanks, I'll take a look
micahsilverman, snarfed and KartikPrabhu joined the channel
#gRegorLovetw2113, schema seems to mostly be about making your stuff show up in Google a certain way, which is a low priority for me personally. mf2 has given me really cool things on my site like federated commenting (with webmention) and publishing via Micropub
#[eddie]yay! Instead of having to manually create posts on my website when I publish a new podcast episode, I now have a node.js script I can run that (for now) uses unmung to convert the XML/RSS to mf2 HTML and then to mf2 JSON. It then uses the mf2 JSON to build a Micropub request to my website and posts the podcast episode.
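The last step of that pipeline, mapping an mf2 JSON item onto a Micropub create request, can be sketched like this (eddie's script is node.js; this is Python for illustration, the episode data is made up, and the property names follow the microformats2 JSON shape):

```python
# Sketch: turn one h-entry item from an mf2 JSON feed into a
# Micropub JSON create payload.
def mf2_to_micropub(item):
    """Map an mf2 JSON h-entry to a Micropub JSON create payload,
    keeping only the properties we want to publish."""
    props = item["properties"]
    wanted = ("name", "content", "published", "audio", "photo")
    return {
        "type": item["type"],
        "properties": {k: props[k] for k in wanted if k in props},
    }

episode = {
    "type": ["h-entry"],
    "properties": {
        "name": ["Episode 12"],
        "published": ["2018-05-21T09:00:00Z"],
        "audio": ["https://example.com/ep12.mp3"],
    },
}

payload = mf2_to_micropub(episode)
print(payload["properties"]["audio"])  # ['https://example.com/ep12.mp3']
```

Posting it is then a single authenticated request to the site's Micropub endpoint, e.g. `requests.post(endpoint, json=payload, headers={"Authorization": "Bearer " + token})` (endpoint and token being your own).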
#gRegorLoveI find mf2 easier to comprehend, too, and more DRY
#[eddie] I was able to save probably 20 minutes across creating 3-4 episodes by running the new script. It doesn't run automatically (and doesn't really need to, since I don't do episodes every day), but on any day I've posted a new podcast episode I can manually trigger the script and it imports any episodes created that day
#[eddie]Tomorrow's project, get [cleverdevil]'s Overcast script working on my website and/or maybe convert it to node.js because python 🤢lol
#sknebel[eddie]: lol, and I'm always tempted to see how many *different* languages I can use in my site, and need to remind myself that has downsides :D
#[cleverdevil]LMK if you do decide to use my script. Happy to help you get it installed and running and to show you how to revise it for a standard Micropub publish vs. my custom craziness.
#sknebel(I can also always be asked for help with Python stuff)