#LoqiSwentel has 29 karma in this channel over the last year (50 in all channels)
iasai and [Rose] joined the channel
#Zegnat!tell [Rose],GWG I was just catching up on #indieweb. If you want to send email from PHP I recommend looking at https://stampie.github.io/Stampie/ as it will allow you to swap out the actual sender-service provider at a later date
#[Rose]It happens several times, looks like a forced wrap. Just not sure if it's intended
#GWG I write in readme.txt. There's a script that creates the MD file from it automatically
#LoqiGWG: Zegnat left you a message 2 minutes ago: I was just catching up on #indieweb. If you want to send email from PHP I recommend looking at https://stampie.github.io/Stampie/ as it will allow you to swap out the actual sender-service provider at a later date
[jgmac1106] joined the channel
#[Rose]Technically Markdown will ignore a single line break, it's just weird reading it
#[Rose](Also, the hashes at the end of titles are unnecessary, but that was already present in the original document so matching styling makes sense)
#ZegnatAs I rarely read readme files in graphical tools like a browser, I personally do prefer some text based thing like markdown over HTML
#[jgmac1106]okay thx, just wanted to know the reason
#[jgmac1106]I always think about it like skiing and snowboarding: same trip, just a slightly different vehicle
#[jgmac1106]but I know how to ski and if it gets me down the mountain...not gonna spend time learning to snowboard...even if it takes one plank instead of two
iasai and [Zegnat] joined the channel
#jeremycherfasDates and times are hard. Is there a tool, for example, that would turn 2019-01-16T02:27:34-05:00 into 16-01-2019 02:27 or do I need to break it apart and reassemble it?
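For the specific conversion asked about here, plain PHP can do it without breaking the string apart; a minimal sketch (just PHP's DateTime, nothing Grav-specific is assumed):

```php
<?php
// DateTime parses the ISO 8601 string, including the -05:00 offset, directly.
$date = new DateTimeImmutable('2019-01-16T02:27:34-05:00');
echo $date->format('d-m-Y H:i'); // prints: 16-01-2019 02:27
```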
#jeremycherfasI'm not sure, to be honest. In the YAML there is a value for 'date' that takes that form and that is then used in the templates for display.
#ZegnatI am just saying that if Grav is using a YAML parser that does that sort of thing, it is suddenly super important to use a date format specifically supported by YAML. And while it is a serialization format meant to be hand-edited, it is sometimes a little hard to understand how to put data in
#jeremycherfasThis is all a bit beyond me; maybe [Rose] has an insight
[Rose] joined the channel
#[Rose]So, Grav is really fussy about the date formats
#[Rose]If you use slashes it insists it must be American (despite those being used in the UK), so m/d/yyyy, and - is always dd-mm-yyyy
#[Rose]> Dates in the m/d/y or d-m-y formats are disambiguated by looking at the separator between the various components: if the separator is a slash (/), then the American m/d/y is assumed; whereas if the separator is a dash (-) or a dot (.), then the European d-m-y format is assumed.
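The quoted rule reads like PHP's own date-format documentation, so assuming Grav ultimately hands the string to PHP's parser, the separator really does flip the interpretation:

```php
<?php
// Quick illustration of the separator rule quoted above.
echo date('Y-m-d', strtotime('03/04/2019')), PHP_EOL; // 2019-03-04 (slash: American m/d/y)
echo date('Y-m-d', strtotime('03-04-2019')), PHP_EOL; // 2019-04-03 (dash: European d-m-y)
```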
#GWGIf you are overloading your server, that's a bigger issue
#GWGI am interested because I am always looking to refine my process
#[eddie]Hmm yeah, I'm not sure what's going on. I manually trigger Bridgy Fed for now because when I tried to add one more webmention process to my server it seemed like the server would crash
#[eddie]it would reboot of course, but it wouldn't complete whatever was happening since it would crash
#GWGsnarfed, by the way, I was going to work on your Webmention issues
#[eddie]heh, yeah I think I'm in this position because originally my site was jekyll
#[eddie]then I needed to get posts into my jekyll site so I built a Node.js Micropub endpoint. Then I wanted to rebuild my site so I had my Node.js site occasionally run the CLI for Jekyll on the server. Then eventually I wanted parts to be dynamic so I funneled all traffic through node.js and if available showed the static html files. And now everything is dynamic
#[eddie]but the architecture of my site is probably not great if I were to sit down and try to build a dynamic CMS
#LoqiGWG has 43 karma in this channel over the last year (173 in all channels)
#[eddie]I think the answer to this is sitting down and mapping out the current functionality, then thinking through what order this stuff should happen in and what things can be tweaked by adding new functionality
#[eddie](like not having to call Bridgy every time if I have reply contexts)
#GWGsnarfed, I have three PRs waiting for pfefferle to look at now, I will be sending more
#GWGI didn't want to overload him, so I started small
#Zegnat[eddie], if you find a good queue solution, let me know. I want a queue that is both beanstalkd and gearmand at the same time. Beanstalkd supports delays (ie. “start job 60 seconds from now”) where Gearman supports unique jobs (ie. “do not add this job if it is already on the queue”).
#[eddie]tantek: So for readers to do this type of simple filtering, I guess we need a couple things. Encourage people to have feeds with all their posts available on or from their homepage.
[grantcodes] joined the channel
#snarfedGWG: eh don't guess at people's workloads, just send everything and let them handle it as they see fit
#[eddie]tantek: step 2: Microsub probably needs to work on adding the ability to filter channels to the spec
#[eddie]Aperture can filter channels on its own currently, but if we want to do author posts by default and everything else gets turned on manually, we'd need that to be part of the Microsub spec so people can do that inside their readers
#[eddie]Zegnat: ohhh interesting. Yeah, I'm not sure what i'll do for it, but I'll let you know when I get there 😄
#GWGsnarfed, you may regret that statement one day
#Loqigrantcodes has 22 karma in this channel over the last year (40 in all channels)
#ZegnatEven for something like receiving webmentions I would like those two options in my job queue. I can delay the webmention checking for say 5 seconds (which hopefully means if you send 20 webmentions I will come later than other verifiers and that way will not help the verification ddos) and if a specific webmention has already been received I do not need to add a verification job for it multiple times
#ZegnatSo anyone who has done queueing in PHP, do let me know if you have a solution :)
[schmarty] joined the channel
#sknebeluses the good old "files in a folder" queue quite a bit
#ZegnatDoes that mean something like inotify is watching, or a homerolled thing constantly requesting the dir listing? (I wouldn’t really want to build that, me thinks.)
iasai joined the channel
#[schmarty]discussion in main room reminded me: does anyone have an instagram-to-their-indiereader flow that results in the posts having u-photo on them?
#[schmarty]i've tried instagram-atom.appspot.com but of course the resulting atom feed puts an <img> tag in the content (as it would)
#[schmarty]i think i could probably run that through granary to get an mf2html feed, but aperture won't find any feeds until there is content, and the feed i'm using comes back empty unless there are new posts since the last time it was polled.
#sknebelZegnat: yeah inotify for most (I think one just every 30s or so checks if there's a file, written before I figured out how to use the notify stuff)
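A rough sketch of the "files in a folder" queue described here, in its simple polling form (the directory name and the 30-second interval are made up for illustration):

```php
<?php
// Producers drop one file per job into a directory; this worker polls it.
// An inotify-based worker would react to new files instead of sleeping.
$dir = __DIR__ . '/queue';

while (true) {
    foreach (glob($dir . '/*.job') as $file) {
        $payload = file_get_contents($file);
        // ... process the job payload here ...
        unlink($file); // remove the file so the job is not picked up again
    }
    sleep(30); // polling variant; inotify avoids the fixed wait
}
```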
#ZegnatI used to work with “the notify stuff” at my previous job. Where we needed to act on files being delivered by clients over FTP.
#ZegnatI just feel like there should be a more elegant solution for this that lets me spend less time on the infrastructure and more time on just the how-to-handle-job part
#sknebelI felt like all the queue things I looked at would take me more time to understand than just hacking that together
#sknebelso while a better queue is on the todo list, this works for now
#sknebel(not necessarily advocating for that approach, it has clear downsides, but it's an option. same with homebrew around a database table or something - at our scale, you can do a lot)
#ZegnatI already have beanstalkd powering my web archiver. So I can homebrew a lot. It just doesn’t give me the combination of those two things I mentioned: delaying and deduping
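For the combination Zegnat wants (delay plus dedup), one hedged sketch is the database-table approach sknebel mentioned: a UNIQUE column handles deduplication and a not_before timestamp handles the delay. The table layout and helper names below are invented for illustration:

```php
<?php
$db = new PDO('sqlite:queue.db');
$db->exec('CREATE TABLE IF NOT EXISTS jobs (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    payload TEXT NOT NULL UNIQUE,   -- dedup: an identical job is only queued once
    not_before INTEGER NOT NULL     -- delay: unix timestamp the job may start after
)');

function enqueue(PDO $db, string $payload, int $delaySeconds = 0): void {
    // INSERT OR IGNORE skips the insert if the same payload is already queued
    $stmt = $db->prepare('INSERT OR IGNORE INTO jobs (payload, not_before) VALUES (?, ?)');
    $stmt->execute([$payload, time() + $delaySeconds]);
}

function reserve(PDO $db): ?array {
    // Grab the oldest job whose delay has expired, then remove it from the queue
    $stmt = $db->prepare('SELECT id, payload FROM jobs WHERE not_before <= ? ORDER BY id LIMIT 1');
    $stmt->execute([time()]);
    $job = $stmt->fetch(PDO::FETCH_ASSOC);
    if ($job) {
        $db->prepare('DELETE FROM jobs WHERE id = ?')->execute([$job['id']]);
    }
    return $job ?: null;
}

// e.g. delay webmention verification by 5 seconds, as discussed above
enqueue($db, 'verify https://example.com/source -> https://example.net/target', 5);
```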
iasai and leg joined the channel
#@jgmac1106Another webmention badge shipped. Look at the record I get from Telegraph. Time to own #Open Badges from your own Domain....Forget all the third party nonsense (twitter.com/_/status/1106267421034332160)
iasai, KartikPrabhu and [tantek] joined the channel
#[tantek][eddie] filtering is a UI thing, why does it need to be in a protocol spec like Micropub at all?
#[tantek]More like in a UX guide for Reader developers
#[tantek]Unless you're proposing saving some sort of "filter state" along with a subscription in the list of subs in the Microsub server?
#[tantek]And then requiring clients to respect that filter on that subscription when showing it?
#[tantek]That could be interesting. However I'd first want to see Reader UI with such filters before talking about standardizing anything about them
#[tantek]Ok if filters only work in a particular reader to start with (e.g. state is kept within the reader)
#[tantek]It encourages faster UX iteration and experimentation, which we need in order to understand what kinds of filters we actually want/need to represent, instead of just pie-in-the-sky academic "gimme all the possible filters I can invent" thinking
[eddie] joined the channel
#[eddie]tantek the issue with that is a Microsub client only has the posts the server sends it, so unless it stores the posts locally in a cache, it has to rely on the server to apply those filters
#[eddie]So when you open Indigenous or Together or any of the other readers, they query the Microsub server for the posts in the timeline. It typically returns 20 posts
#[eddie]You would either A: have to intentionally retrieve a LOT of posts, or have the app send multiple of those requests to get the last 100 posts or something and cache all the posts
#[eddie]then you could have the client do the filtering through the UI
#[eddie]But based on the current Microsub server-client model, it is not incredibly easy to just do fast UX iterations without it being supported at the server level
#[eddie]which could be a challenge with how we are doing things
#[eddie]I am planning on adding caching support into Indigenous for iOS though so if I do that then I could experiment with filtering client side
iasai joined the channel
#snarfed[eddie]: this kind of sounds like premature optimization
#snarfedat least the concern of a microsub client fetching "too many" posts that would be filtered instead of displayed
#snarfedin practice i kind of doubt either the data volume or number of requests would cause problems very often
#snarfedand if they do, we optimize, cache, etc. in most cases you don't really want optimization to be a driving factor in architecture/protocol decisions, unless it's a really dominant concern
iasai joined the channel
#[eddie]Okay, that’s good to remember. I’ll see if, next time I work on Indigenous, I’m able to get it shifted in this direction
#[eddie]So the general idea is to just pull in a bunch of posts for each channel, to ensure that when you filter things you still have posts available
#ZegnatOr keep pulling in until you reach a screen full of filtered items?
#ZegnatYou probably do not need to fetch a whole bunch before starting to filter
#[tantek]Agreed with snarfed. Just have the client request more posts as needed
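A hedged sketch of what that client-side approach could look like: keep paging the Microsub timeline (action=timeline with the paging.after cursor, per my reading of the Microsub draft) until enough posts survive the filter. The endpoint, token, channel uid and the filter rule itself are all placeholders:

```php
<?php
$endpoint = 'https://example.com/microsub'; // hypothetical Microsub endpoint
$token    = 'ACCESS_TOKEN';
$channel  = 'home';

$kept  = [];
$after = null;
while (count($kept) < 20) { // fetch until a screenful survives the filter
    $url = $endpoint . '?action=timeline&channel=' . urlencode($channel)
         . ($after !== null ? '&after=' . urlencode($after) : '');
    $context = stream_context_create([
        'http' => ['header' => "Authorization: Bearer $token"],
    ]);
    $page = json_decode(file_get_contents($url, false, $context), true);

    foreach ($page['items'] as $item) {
        // Example filter: hide likes and reposts, keep everything else
        if (!isset($item['like-of']) && !isset($item['repost-of'])) {
            $kept[] = $item;
        }
    }

    $after = $page['paging']['after'] ?? null;
    if ($after === null) {
        break; // no more pages to fetch
    }
}
```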
iasai, [kevinmarks] and [grantcodes] joined the channel
#[grantcodes]The other valid reason for having filtering (and other features) on the server vs client is the shared experience between readers. E.g. otherwise they would have to be set up separately on mobile and on desktop
#[grantcodes]Allowing ordering of channels could also be classified as a UI feature, and that is already built into the spec
[eddie] joined the channel
#[eddie]Yeah, I think one issue with that is I have different people's feeds in different channels and different channels for various post types, etc., so sometimes I group people by channel and other times I group post types by channel
#[eddie]One challenge with filters only in a single client is that if I build Indigenous like this so I can use it, the experience on another client will be bad
#[eddie]You are an Indigenous user and then you log into Monocle and your feed is filled with aaronpk’s drink and ate posts
#[kevinmarks]On the other end of the tasks extreme, I wrote a text file import and munge tool in node today, and it was hard to know when it could terminate.
#[grantcodes]Yes, personal opinion is that having something that *permanently* filters content you never want to see is better on the server, to prevent the sort of cross-reader issues that would be very jarring, and we want to try and prevent lock-in
#[grantcodes]I effectively have filtering in Together with the different layouts; the gallery view filters out posts without a photo or video, but that makes more sense there on the client because it is obvious visually that it couldn't show text posts in that view
#[eddie]I would even be okay with display filters on a channel (as opposed to permanent storage) as long as there is a way to sync between clients
#[eddie]Yeah that view does work well, and it’s not surprising if you go to a different layout and get additional posts
#[grantcodes]At the point of syncing it between clients that is already the server having to store the option, so it's not a huge step for the server to also apply that option
#snarfedthat would let us iterate on filtering without the spec, and also share the filters across clients
#snarfedbuuuut that wouldn't allow filter experiments in clients, and aperture filtering may be good enough for server side experiments, so...meh
snarfed1 and [eddie] joined the channel
#[eddie]Yeah I think the challenge is that tantek is talking about UX, which really comes down to the clients having to show the filtering controls, as opposed to the server
#[eddie]Filtering in Microsub servers like Aperture or a proxy server is more like a permanent filtering. It feels like the type of filtering we’re discussing when it comes to UX iteration is really at the client level where individual users are going to live