Zegnat!tell [Rose],GWG I was just catching up on #indieweb. If you want to send email from PHP I recommend looking at https://stampie.github.io/Stampie/ as it will allow you to swap out the actual sender-service provider at a later date
LoqiGWG: Zegnat left you a message 2 minutes ago: I was just catching up on #indieweb. If you want to send email from PHP I recommend looking at https://stampie.github.io/Stampie/ as it will allow you to swap out the actual sender-service provider at a later date
[jgmac1106]but I know how to ski and if it gets me down the mountain...not gonna spend time learning to snowboard...even if it takes one plank instead of two
jeremycherfasDates and times are hard. Is there a tool, for example, that would turn 2019-01-16T02:27:34-05:00 into 16-01-2019 02:27 or do I need to break it apart and reassemble it?
jeremycherfasI'm not sure, to be honest. In the YAML there is a value for 'date' that takes that form and that is then used in the templates for display.
ZegnatI am just saying that if Grav is using a YAML parser that does that sort of thing, it is suddenly super important to use a date format specifically supported by YAML. Even though it is a serialization format meant to be hand-edited, it is sometimes a little hard to understand how to put data in
[Rose]> Dates in the m/d/y or d-m-y formats are disambiguated by looking at the separator between the various components: if the separator is a slash (/), then the American m/d/y is assumed; whereas if the separator is a dash (-) or a dot (.), then the European d-m-y format is assumed.
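A minimal sketch of the conversion jeremycherfas asks about, assuming PHP (what Grav runs on); the built-in DateTime class accepts the ISO 8601 string directly, so there is no need to break it apart and reassemble it by hand:

```php
<?php
// Parse the ISO 8601 timestamp and reformat it; no manual splitting needed.
$dt = new DateTime('2019-01-16T02:27:34-05:00');
echo $dt->format('d-m-Y H:i'); // prints: 16-01-2019 02:27
```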
[eddie]Hmm yeah, I'm not sure what's going on. I manually trigger Bridgy Fed for now because when I tried to add one more webmention process to my server it seemed like the server would crash
[eddie]then I needed to get posts into my Jekyll site so I built a Node.js Micropub endpoint. Then I wanted to rebuild my site so I had my Node.js site occasionally run the CLI for Jekyll on the server. Then eventually I wanted parts to be dynamic so I funneled all traffic through Node.js and, if available, showed the static HTML files. And now everything is dynamic
[eddie]I think the answer to this is sitting down and mapping out the current functionality and then thinking through what order this stuff should happen in and what things can be tweaked by adding new functionality
Zegnat[eddie], if you find a good queue solution, let me know. I want a queue that is both beanstalkd and gearmand at the same time. Beanstalkd supports delays (i.e. “start job 60 seconds from now”) whereas Gearman supports unique jobs (i.e. “do not add this job if it is already on the queue”).
[eddie]tantek: So for readers to do this type of simple filtering, I guess we need a couple things. Encourage people to have feeds with all their posts available on or from their homepage.
[eddie]Aperture can filter channels on its own currently, but if we want to do author posts by default and everything else gets turned on manually, we'd need that to be part of the Microsub spec so people can do that inside their readers
ZegnatEven for something like receiving webmentions I would like those two options in my job queue. I can delay the webmention checking for, say, 5 seconds (which hopefully means that if you send 20 webmentions I will come in later than other verifiers and so won't contribute to a verification DDoS), and if a specific webmention has already been received I do not need to add a verification job for it multiple times
ZegnatDoes that mean something like inotify is watching, or a homerolled thing constantly requesting the dir listing? (I wouldn’t really want to build that, me thinks.)
[schmarty]discussion in main room reminded me: does anyone have an instagram-to-their-indiereader flow that results in the posts having u-photo on them?
[schmarty]i think i could probably run that through granary to get an mf2html feed, but aperture won't find any feeds until there is content, and the feed i'm using comes back empty unless there are new posts since the last time it was polled.
sknebelZegnat: yeah, inotify for most (I think one just checks every 30s or so whether there's a file; written before I figured out how to use the notify stuff)
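A tiny sketch of the simpler, non-inotify variant sknebel mentions (checking a drop directory every 30 s or so); the directory path and JSON payload format here are made up for illustration:

```php
<?php
// Hypothetical worker: poll a drop directory for job files roughly every 30 seconds.
$dir = '/var/spool/jobs'; // made-up drop directory
while (true) {
    foreach (glob($dir . '/*.json') as $file) {
        $job = json_decode(file_get_contents($file), true);
        // ... handle the job here ...
        unlink($file); // remove the file once it has been processed
    }
    sleep(30);
}
```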
ZegnatI just feel like there should be a more elegant solution for this that has me spending less time on the infrastructure and more time on just the how-to-handle-the-job part
sknebel(not necessarily advocating for that approach, it has clear downsides, but it's an option. same with homebrew around a database table or something - at our scale, you can do a lot)
ZegnatI already have beanstalkd powering my web archiver. So I can homebrew a lot. It just doesn’t give me the combination of those two things I mentioned: delaying and deduping
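For illustration, one way the "homebrew around a database table" idea could cover both features Zegnat wants (delaying and deduping); the table name, columns, and SQLite backend are hypothetical, not the schema of any existing tool:

```php
<?php
// Hypothetical job table: a primary key on job_key gives Gearman-style dedup,
// an available_at timestamp gives beanstalkd-style delays.
$db = new PDO('sqlite:jobs.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS jobs (
    job_key      TEXT PRIMARY KEY,   -- e.g. "verify:" plus a hash of source+target
    payload      TEXT NOT NULL,
    available_at INTEGER NOT NULL    -- unix timestamp when the job becomes runnable
)');

// Enqueue with a delay; INSERT OR IGNORE silently drops duplicate job_keys.
function enqueue(PDO $db, string $key, string $payload, int $delay = 0): void {
    $stmt = $db->prepare(
        'INSERT OR IGNORE INTO jobs (job_key, payload, available_at) VALUES (?, ?, ?)'
    );
    $stmt->execute([$key, $payload, time() + $delay]);
}

// Worker side: pick up one job whose delay has expired.
function reserve(PDO $db): ?array {
    $stmt = $db->prepare('SELECT * FROM jobs WHERE available_at <= ? LIMIT 1');
    $stmt->execute([time()]);
    $job = $stmt->fetch(PDO::FETCH_ASSOC);
    return $job ?: null;
}
```

Deleting the row once the handler succeeds (or bumping available_at to retry later) would be the obvious next step; at the scale discussed here, polling this table every few seconds is probably plenty.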
@jgmac1106Another webmention badge shipped. Look at the record I get from Telegraph. Time to own #Open Badges from your own Domain....Forget all the third party nonsense (twitter.com/_/status/1106267421034332160)
iasai, KartikPrabhu and [tantek] joined the channel
[tantek]It encourages faster UX iteration and experimentation, which we need in order to understand what kinds of filters we actually want/need to represent, instead of just pie-in-the-sky academic "gimme all the possible filters I can invent" thinking
[eddie]tantek the issue with that is a Microsub client only has the posts it has already received from the server, so unless it stores the posts locally in a cache, it has to request those filters from the server
[eddie]So when you open Indigenous or Together or any of the other readers, they query the Microsub server for the posts in the timeline. It typically returns 20 posts
[eddie]You would either have to intentionally retrieve a LOT of posts in one go, or have the app send multiple of those requests to get the last 100 posts or something and cache all the posts
[eddie]But based on the current Microsub server-client model, it is not incredibly easy to just do fast UX iterations without it being supported at the server level
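A rough sketch of what that "retrieve a lot of posts and cache them" approach might look like against a Microsub server, assuming the timeline action and its `after` paging cursor behave as in the current Microsub draft; the endpoint URL, token, and channel uid are placeholders:

```php
<?php
// Page backwards through a Microsub channel timeline and cache the entries
// locally, so the client could apply its own experimental filters to them.
$endpoint = 'https://aperture.example/microsub'; // placeholder endpoint
$token    = 'XXXX';                              // placeholder access token
$channel  = 'home';                              // placeholder channel uid

$cached = [];
$after  = null;

do {
    $query = http_build_query(array_filter([
        'action'  => 'timeline',
        'channel' => $channel,
        'after'   => $after, // paging cursor from the previous response
    ]));
    $context = stream_context_create(['http' => [
        'header' => "Authorization: Bearer $token",
    ]]);
    $page = json_decode(file_get_contents("$endpoint?$query", false, $context), true);

    $cached = array_merge($cached, $page['items'] ?? []);
    $after  = $page['paging']['after'] ?? null;
} while ($after !== null && count($cached) < 100); // e.g. cache the last ~100 posts
```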
snarfedand if they do, we optimize, cache, etc. in most cases you don't really want optimization to be a driving factor in architecture/protocol decisions, unless it's a really dominant concern
[grantcodes]The other valid reason for having filtering (and other features) on the server vs the client is the shared experience between readers. E.g. otherwise filters would have to be set up separately on mobile and on desktop
[eddie]Yeah, I think one issue with that is I have different people's feeds in different channels and different channels for various post types, etc. So sometimes I group people by channel and other times I group post types by channel
[eddie]One challenge with filters living only in a single client is that if I build Indigenous like this so I can use it, the experience on another client will be bad
[kevinmarks]At the other end of the task-size extreme, I wrote a text file import and munge tool in Node today, and it was hard to know when it could terminate.
[grantcodes]Yes, personal opinion is that having something that *permanently* filters content you never want to see is better on the server, to prevent the sort of cross-reader issues that would be very jarring, and we want to try and prevent lock-in
[grantcodes]I effectively have filtering in Together with the different layouts, like the gallery view filters out posts without a photo or video, but that makes more sense there on the client because it is obvious visually that it couldn't show text posts in that view
[grantcodes]At the point of syncing it between clients, the server is already having to store the option, so it's not a huge step for the server to also apply that option
[eddie]Yeah I think the challenge is tantek is talking about UX, which really comes down to the clients having to show the filtering controls as opposed to the server
[eddie]Filtering in Microsub servers like Aperture or a proxy server is more like a permanent filtering. It feels like the type of filtering we’re discussing when it comes to UX iteration is really at the client level where individual users are going to live