#dev 2019-03-14

2019-03-14 UTC
KartikPrabhu and iasai joined the channel
#
GWG
Just noted an idea to invalidate IndieAuth bearer tokens when someone resets their password
#
GWG
Does anyone do that?
eli_oat, iasai, [tantek], [jgmac1106], tw2113, gRegorLove and jjuran joined the channel
#
@kheeleesi
It seems I've really gotten tired of our microsub stuff (Cebuano)
(twitter.com/_/status/1106040857445064705)
snarfed and Rixon joined the channel
#
@pospome
So there's a thing called IndieAuth, huh. (Japanese)
(twitter.com/_/status/1106043140962545664)
iasai, gRegorLove, jjuran, anth_x, KartikPrabhu, gRegorLove_, cweiske, barpthewire, swentel, swentie and [jgmac1106] joined the channel
#
[jgmac1106]
Swentel++
#
Loqi
Swentel has 29 karma in this channel over the last year (50 in all channels)
iasai and [Rose] joined the channel
#
Zegnat
!tell [Rose],GWG I was just catching up on #indieweb. If you want to send email from PHP I recommend looking at https://stampie.github.io/Stampie/ as it will allow you to swap out the actual sender-service provider at a later date
#
Loqi
Ok, I'll tell them that when I see them next
#
[Rose]
Interesting, thanks Zegnat++ !
#
Zegnat
Always nice when you can abstract away platform specific things :D
#
[Rose]
@GWG On line 36 of your readme.md in the pull request linked in General are the extra line breaks intentional?
#
[Rose]
36-38 to be precise
#
[Rose]
It happens several times, looks like a forced wrap. Just not sure if it's intended
#
GWG
I write in readme.txt. There's a script that creates the MD file from it automatically
#
Loqi
GWG: Zegnat left you a message 2 minutes ago: I was just catching up on #indieweb. If you want to send email from PHP I recommend looking at https://stampie.github.io/Stampie/ as it will allow you to swap out the actual sender-service provider at a later date
[jgmac1106] joined the channel
#
[Rose]
Technically Markdown will ignore a single line break, it's just weird reading it
#
[Rose]
(Also, the hashes at the end of titles are unnecessary, but that was already present in the original document so matching styling makes sense)
#
GWG
WordPress readme style
#
GWG
Either way, I should see if there is an update for that converter script
#
GWG
I just hate having to edit two types of readmes
#
[Rose]
Which makes sense
#
Zegnat
Apropos nothing: I just learned the power of jq and I do not want to go back https://gist.github.com/Zegnat/f75a6090b02936eaa186790816b7d6ba#usage
#
cweiske
I once used jq to create SQL statements from a json file ;)
iasai and eli_oat joined the channel
#
[Rose]
What is JQ?
#
Loqi
jQuery is a popular JavaScript framework https://indieweb.org/jq
#
[Rose]
Good, it does stand for Jquery
#
[Rose]
Urgh, capitalisation. You can tell I'm running on very little sleep
#
sknebel
no it doesn't...
#
[Rose]
Then Loqi is confused
#
sknebel
it's a swiss army knife tool for json munging: https://stedolan.github.io/jq/
#
[Rose]
Maybe we should correct Loqi? Thanks for the link sknebel++
#
Loqi
sknebel has 42 karma in this channel over the last year (106 in all channels)
#
sknebel
What is jq?
#
Loqi
It looks like we don't have a page for "jq" yet. Would you like to create it? (Or just say "jq is ____", a sentence describing the term)
#
[Rose]
(Also, justification for indiewebbing at work! We need something like this!)
#
sknebel
jq is https://stedolan.github.io/jq/, a commandline tool for manipulating JSON data.
#
sknebel
... Kaja? asleep?
#
Zegnat
Yeah, jQ != jq
jeremych_, [frank], [keithjgrant], anth_x2 and iasai joined the channel
#
[keithjgrant]
looks like I'm due for another burst of activity on Omnibear
#
[jgmac1106]
is there any problem writing readme files in HTML instead of markdown? I hate googling stuff I already know how to write
#
cweiske
people tend to write readmes in the format they find most easy to write
#
cweiske
many use .md
#
cweiske
i personally use .rst
#
Zegnat
As I rarely read readme files in graphical tools like a browser, I personally do prefer some text based thing like markdown over HTML
#
[jgmac1106]
okay thx, just wanted to know the reason
#
[jgmac1106]
I always think about like skiing and snowboarding same trip just slightly different vehicle
#
[jgmac1106]
but I know how to ski and if it gets me down the mountain...not gonna spend time learning to snowboard...even if it takes one plank instead of two
iasai and [Zegnat] joined the channel
#
jeremycherfas
Dates and times are hard. Is there a tool, for example, that would turn 2019-01-16T02:27:34-05:00 into 16-01-2019 02:27 or do I need to break it apart and reassemble it?
#
swentel
which language?
#
jeremycherfas
Looks like I have to go through strtotime a couple of times.
#
swentel
date('d-m-Y H:i', strtotime($date));
#
swentel
should be fine
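Swentel's one-liner is PHP-specific (`date()` plus `strtotime()`), but the conversion itself is easy to sanity-check in other languages; here is a minimal Python sketch of the same transform, using jeremycherfas's example timestamp:

```python
from datetime import datetime

# Same conversion as PHP's date('d-m-Y H:i', strtotime($date)):
# parse an ISO 8601 timestamp (offset included) and re-emit it in
# the d-m-Y H:i layout Grav expects.
iso = "2019-01-16T02:27:34-05:00"
dt = datetime.fromisoformat(iso)           # Python 3.7+
formatted = dt.strftime("%d-%m-%Y %H:%M")
print(formatted)  # → 16-01-2019 02:27
```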
#
cweiske
aaaaaaaaaaaaaaaaaaaaaaaaah
#
cweiske
d-m-y?
#
cweiske
that's the worst
#
cweiske
americans have m-d-y
#
cweiske
sensible people have d.m.y
#
cweiske
ISO standard is y-m-d
#
cweiske
oh, americans is m/d/y
#
cweiske
using "d-m-y" mixes things up
#
swentel
not sure what you mean
#
swentel
the input is fine
#
swentel
that's iso standard
snarfed joined the channel
#
jeremycherfas
Thanks. Let me try that.
#
jeremycherfas
It works! (I should have known it would.)
#
swentel
oh wait, you mean that the output is confusing I guess, that's true :)
#
Loqi
Swentel has 30 karma in this channel over the last year (51 in all channels)
#
jeremycherfas
Yes, the output is non-standard, but it is what Grav expects, for some reason.
#
cweiske
I mean the output
#
swentel
jeremycherfas, interesting, for storage?
#
jeremycherfas
I'm not sure, to be honest. In the YAML there is a value for 'date' that takes that form and that is then used in the templates for display.
#
Zegnat
YAML should be able to take ISO
#
cweiske
yaml is primarily a serialization format. what you do with it is your own thing
#
Zegnat
Unless they depend on the YAML parser doing the data handling into a PHP object ;)
#
Zegnat
Because YAML
#
Zegnat
That is how you get the whole base-60 issue for numbers-that-look-like-times
#
Zegnat
https://docs.docker.com/compose/compose-file/#ports - “you may experience erroneous results when using a container port lower than 60, because YAML parses numbers in the format xx:yy as a base-60 value” - cweiske ;)
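For context on that quirk: YAML 1.1's integer type accepts sexagesimal (base-60) values, so unquoted colon-separated digit pairs can silently become numbers. A small illustration of the Docker `ports` case from that link:

```yaml
# YAML 1.1 resolves unquoted xx:yy tokens as base-60 integers,
# so a port mapping below 60 gets mangled:
ports:
  - 22:22     # parsed as the integer 22 * 60 + 22 = 1342
  - "22:22"   # quoting keeps it the intended string
```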
#
cweiske
that's the yaml spec I think
#
Zegnat
I am just saying that if Grav is using a YAML parser that does that sort of thing, it is suddenly super important to use a date format specifically supported by YAML. This while it is a serialization format meant to be hand-edited, it is sometimes a little hard to understand how to put data in
#
jeremycherfas
This is all a bit beyond me; maybe [Rose] has an insight
[Rose] joined the channel
#
[Rose]
So, Grav is really fussy about the date formats
#
[Rose]
If you use slashes it insists it must be American (despite those being used in the UK), so m/d/yyyy, and - is always dd-mm-yyyy
#
[Rose]
> Dates in the m/d/y or d-m-y formats are disambiguated by looking at the separator between the various components: if the separator is a slash (/), then the American m/d/y is assumed; whereas if the separator is a dash (-) or a dot (.), then the European d-m-y format is assumed.
[schmarty], jackjamieson and [eddie] joined the channel
#
jeremycherfas
[Rose]++ for making that clear. I just knew I had to munge the date I had. :)
#
Loqi
[Rose] has 8 karma in this channel over the last year (21 in all channels)
iasai, gRegorLove_, snarfed, [tantek], KartikPrabhu, [jgmac1106] and [eddie] joined the channel
#
[eddie]
Regarding reply contexts, etc. My biggest issue is my Micropub spaghetti code
#
GWG
Okay, dev question?
#
[eddie]
I need a better event system
#
[eddie]
Because when I receive a Micropub request a BUNCH of stuff happens
#
GWG
Event as in h-event or event handling?
#
[eddie]
and I feel like it can cause issues with my server
#
GWG
Stuff?
#
[eddie]
event handling
#
GWG
Can you summarize?
#
[eddie]
For example webmentions
#
[eddie]
I think I need to add an asynchronous component
#
GWG
Can you add a queue?
#
GWG
Ahead of me there
#
[eddie]
Yeah I think that's what I need
#
[eddie]
essentially to say "I need to do these 10 things after a Micropub request is received"
#
[eddie]
but only do them one at a time so my server doesn't get overwhelmed
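A minimal sketch of that serialize-the-side-effects idea, assuming a single in-process worker; the task names are hypothetical stand-ins for the real follow-up work (webmention sending, Bridgy pings, etc.):

```python
import queue
import threading

# One FIFO queue of follow-up jobs, drained by a single worker
# thread, so only one job ever runs at a time.
jobs = queue.Queue()

def worker():
    while True:
        task = jobs.get()
        task()              # run exactly one job at a time
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

# On each Micropub request, enqueue the follow-up work:
results = []
jobs.put(lambda: results.append("send webmentions"))
jobs.put(lambda: results.append("ping bridgy"))
jobs.join()                 # wait until the queue drains
print(results)  # → ['send webmentions', 'ping bridgy']
```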
#
GWG
I still process Webmentions synchronously
#
[eddie]
because when I started trying to add more stuff I kept bringing my server down
#
[eddie]
because with Webmentions, they have to fetch my page content
#
[eddie]
so I think sometimes I've DDos'd myself
#
GWG
Although with a 60 second delay
#
[eddie]
ohhhh!
#
[eddie]
see you're smart
#
[eddie]
Each webmention is 60 seconds after the previous one?
#
GWG
Not really, I have to deal with wp-cron
#
sknebel
[eddie]: the typical post doesn't send that many webmentions though, that seems a bit odd
#
[eddie]
Sooo actually reply contexts would probably help
#
GWG
The 60 seconds is mostly to handle the possibility of syndication links
#
[eddie]
because I'm probably also DDoSing bridgy too
#
snarfed
bring it on
#
GWG
If my post is generated over Micropub, the post will have no context for a few seconds before it does
#
[eddie]
so essentially what I'm doing (which is not good) is every post with a reply or a like gets sent to bridg
#
[eddie]
bridgy
#
GWG
Also the reason for the delays
#
[eddie]
in case they have a github or twitter syndication link
#
[eddie]
but I guess if I add reply contexts, then I would KNOW if there is a syndication link
#
[eddie]
which means less webmention sending to Bridgy
#
GWG
Sounds like you need to make a list of the flow you have and refactor it
#
[eddie]
and less hits on my server as well
#
[eddie]
That is very true
#
[eddie]
pen and paper time? lol
#
snarfed
[eddie]: DoSes are generally measured in Gbps or qps (requests per second). let me know when you start posting at those kinds of volumes 😎
#
[eddie]
snarfed I think bridgy is safe then
#
GWG
If you are overloading your server, that's a bigger issue
#
GWG
I am interested because I am always looking to refine my process
#
[eddie]
Hmm yeah, I'm not sure what's going on. I manually trigger Bridgy Fed for now because when I tried to add one more webmention process to my server it seemed like the server would crash
#
[eddie]
it would reboot of course, but it wouldn't complete whatever was happening since it would crash
#
GWG
How big is your server? Spec wise?
#
[eddie]
It was like 8gb and stuff
#
[eddie]
it was actually way too big so I shrunk it recently and it's still working fine as is
#
[eddie]
so I definitely wasn't maxing out the server
#
[eddie]
I think it was something inside the node.js server
#
GWG
Multithreading issues?
#
[eddie]
That's a good question. I haven't dug into that much on the Node.js side
#
[eddie]
I've just been letting it do what it does
#
[eddie]
either multithreading or not
#
GWG
I am a php person, so can't help you there
#
GWG
snarfed, by the way, I was going to work on your Webmention issues
#
[eddie]
heh, yeah I think I'm in this position because originally my site was jekyll
#
[eddie]
then I needed to get posts into my jekyll site so I built a Node.js Micropub endpoint. Then I wanted to rebuild my site so I had my Node.js site occasionally run the CLI for Jekyll on the server. Then eventually I wanted parts to be dynamic so I funneled all traffic through node.js and if available showed the static html files. And now everything is dynamic
#
[eddie]
but the architecture of my site is probably not great compared to if I were to sit down and try to build a dynamic CMS
#
Loqi
GWG has 43 karma in this channel over the last year (173 in all channels)
#
[eddie]
I think the answer to this is sitting down and mapping out the current functionality, then thinking through what order this stuff should happen in and what can be tweaked by adding new functionality
#
[eddie]
(like not having to call Bridgy every time if I have reply contexts)
#
GWG
Refactoring is good
#
snarfed
eh sometimes
#
GWG
On the other hand, have been cleaning my apartment for 18 years and it isn't done
#
GWG
snarfed, exactly
#
GWG
I miss you reviewing my PRs for that reason
#
GWG
Someone telling me not to do something
#
snarfed
happy to review code any time
iasai joined the channel
#
GWG
snarfed, if I did any new features, I would
#
GWG
I haven't since Micropub 2.0
#
GWG
I fixed some bugs
#
GWG
Changed the code to parse geo URIs into geo objects to cover swentel's use of geo URIs in Indigenous for Android
#
snarfed
oh btw GWG odd possible semantic-linkbacks bug: an emoji reaction here seemingly got interpreted as both a reply and a reaction? https://snarfed.org/2019-03-13_36399#comment-2709212
#
GWG
But everything else is the same
#
GWG
snarfed, I have three PRs waiting for pfefferle to look at now, I will be sending more
#
GWG
I didn't want to overload him, so I started small
#
Zegnat
[eddie], if you find a good queue solution, let me know. I want a queue that is both beanstalkd and gearmand at the same time. Beanstalkd supports delays (ie. “start job 60 seconds from now”) where Gearman supports unique jobs (ie. “do not add this job if it is already on the queue”).
#
[eddie]
tantek: So for readers to do this type of simple filtering, I guess we need a couple things. Encourage people to have feeds with all their posts available on or from their homepage.
[grantcodes] joined the channel
#
snarfed
GWG: eh, don't guess at people's workloads, just send everything and let them handle it as they see fit
#
[eddie]
tantek: step 2: Microsub probably needs to work on adding the ability to filter channels to the spec
#
[eddie]
Aperture can filter channels on its own currently, but if we want author posts by default and everything else turned on manually, we'd need that to be part of the Microsub spec so people can do that inside their readers
#
[eddie]
Zegnat: ohhh interesting. Yeah, I'm not sure what i'll do for it, but I'll let you know when I get there 😄
#
GWG
snarfed, you may regret that statement one day
gRegorLove_ joined the channel
#
[eddie]
grantcodes++ awesome
#
Loqi
grantcodes has 22 karma in this channel over the last year (40 in all channels)
#
Zegnat
Even for something like receiving webmentions I would like those two options in my job queue. I can delay the webmention checking for say 5 seconds (which hopefully means if you send 20 webmentions I will come later than other verifiers and that way will not help the verification ddos) and if a specific webmention has already been received I do not need to add a verification job for it multiple times
#
Zegnat
So anyone who has done queueing in PHP, do let me know if you have a solution :)
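Not a PHP answer to Zegnat's question, but the two features he wants (beanstalkd-style delays plus Gearman-style unique jobs) are language-agnostic; here is a toy in-process sketch of a queue combining both, using a heap keyed on ready-time and a pending set for dedup:

```python
import heapq
import time

class DelayedUniqueQueue:
    """Toy queue: delayed jobs (beanstalkd-style) + unique jobs (Gearman-style)."""

    def __init__(self):
        self._heap = []        # (ready_at, job_id), ordered by ready time
        self._pending = set()  # job ids currently queued

    def put(self, job_id, delay=0):
        """Enqueue job_id unless it is already pending (dedup)."""
        if job_id in self._pending:
            return False
        self._pending.add(job_id)
        heapq.heappush(self._heap, (time.monotonic() + delay, job_id))
        return True

    def get_ready(self):
        """Pop the next job whose delay has elapsed, or None."""
        if self._heap and self._heap[0][0] <= time.monotonic():
            _, job_id = heapq.heappop(self._heap)
            self._pending.discard(job_id)
            return job_id
        return None

q = DelayedUniqueQueue()
q.put("verify:https://example.com/post")   # queued immediately
q.put("verify:https://example.com/post")   # duplicate: ignored
print(q.get_ready())  # → verify:https://example.com/post
print(q.get_ready())  # → None (the duplicate was never queued)
```

A real deployment would persist the heap and set (a database table works at this scale) so jobs survive restarts.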
[schmarty] joined the channel
#
sknebel
uses the good old "files in a folder" queue quite a bit
#
Zegnat
Does that mean something like inotify is watching, or a homerolled thing constantly requesting the dir listing? (I wouldn’t really want to build that, me thinks.)
iasai joined the channel
#
[schmarty]
discussion in main room reminded me: does anyone have an instagram-to-their-indiereader flow that results in the posts having u-photo on them?
#
[schmarty]
i've tried instagram-atom.appspot.com but of course the resulting atom feed puts an <img> tag in the content (as it would)
#
[schmarty]
i think i could probably run that through granary to get an mf2html feed, but aperture won't find any feeds until there is content, and the feed i'm using comes back empty unless there are new posts since the last time it was polled.
#
[schmarty]
ooh, i happened to catch it at the right time and get it added to aperture
#
[schmarty]
please disregard the above, lol
#
[grantcodes]
Yeah granary feeds work for me too
KartikPrabhu joined the channel
#
sknebel
Zegnat: yeah inotify for most (I think one just every 30s or so checks if there's a file, written before I figured out how to use the notify stuff)
#
Zegnat
I used to work with “the notify stuff” at my previous job. Where we needed to act on files being delivered by clients over FTP.
#
Zegnat
I just feel like there should be a more elegant solution for this that lets me spend less time on the infrastructure and more time on just the how-to-handle-a-job part
#
sknebel
I felt like all the queue things I looked at would take me more time to understand than just hacking that together
#
sknebel
so while a better queue is on the todo list, this works for now
#
sknebel
(not necessarily advocating for that approach, it has clear downsides, but it's an option. same with homebrew around a database table or something - at our scale, you can do a lot)
#
Zegnat
I already have beanstalkd powering my web archiver. So I can homebrew a lot. It just doesn’t give me the combination of those two things I mentioned: delaying and deduping
iasai and leg joined the channel
#
@jgmac1106
Another webmention badge shipped. Look at the record I get from Telegraph. Time to own #Open Badges from your own Domain....Forget all the third party nonsense
(twitter.com/_/status/1106267421034332160)
iasai, KartikPrabhu and [tantek] joined the channel
#
[tantek]
[eddie] filtering is a UI thing, why does it need to be in a protocol spec like Micropub at all?
#
[tantek]
More like in a UX guide for Reader developers
#
[tantek]
Unless you're proposing saving some sort of "filter state" along with a subscription in the list of subs in the Microsub server?
#
[tantek]
And then requiring clients to respect that filter on that subscription when showing it?
#
[tantek]
That could be interesting. However, I'd first want to see reader UI with such filters before talking about standardizing anything about them
#
[tantek]
Ok if filters only work in a particular reader to start with (e.g. state is kept within the reader)
#
[tantek]
It encourages faster UX iteration and experimentation, which we need in order to understand what kinds of filters we actually want/need to represent, instead of just pie-in-the-sky academic "gimme all the possible filters I can invent" thinking
[eddie] joined the channel
#
[eddie]
tantek the issue with that is a Microsub client only gets the posts it receives, so unless it stores the posts locally in a cache, it has to request those filters from the server
#
[eddie]
So when you open Indigenous or Together or any of the other readers, they query the Microsub server for the posts in the timeline. It typically returns 20 posts
#
[eddie]
You would either have to intentionally retrieve a LOT of posts, or have the app send multiple of those requests to get the last 100 posts or something and cache all the posts
#
[eddie]
then you could have the client do the filtering through the UI
#
[eddie]
But based on the current Microsub server-client model, it is not incredibly easy to just do fast UX iterations without it being supported at the server level
#
[eddie]
which could be a challenge with how we are doing things
#
[eddie]
I am planning on adding caching support into Indigenous for iOS though so if I do that then I could experiment with filtering client side
iasai joined the channel
#
snarfed
[eddie]: this kind of sounds like premature optimization
#
snarfed
at least the concern of a microsub client fetching "too many" posts that would be filtered instead of displayed
#
snarfed
i kind of doubt either the data volume or the number of requests would cause problems in practice very often
#
snarfed
and if they do, we optimize, cache, etc. in most cases you don't really want optimization to be a driving factor in architecture/protocol decisions, unless it's a really dominant concern
iasai joined the channel
#
[eddie]
Okay, that’s good to remember. I’ll see whether, next time I work on Indigenous, I’m able to get it shifted in this direction
#
[eddie]
So the general idea is to just pull in a bunch of posts for each channel, to ensure that when you filter things you still have posts available
#
Zegnat
Or keep pulling in until you reach a screen full of filtered items?
#
Zegnat
You probably do not need to fetch a whole bunch before starting to filter
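Zegnat's keep-pulling-until-full approach might look like the sketch below; `fetch_page` is a hypothetical stand-in for a Microsub timeline request that returns a list of items plus a paging cursor:

```python
def filtered_screenful(fetch_page, keep, screen_size=20):
    """Request timeline pages and filter client-side until a
    screenful of items survives, or the server runs out of pages."""
    shown, cursor = [], None
    while len(shown) < screen_size:
        items, cursor = fetch_page(cursor)
        shown.extend(item for item in items if keep(item))
        if cursor is None:       # no more pages on the server
            break
    return shown[:screen_size]

# Toy usage: two pages of posts, filtering out "drink" posts.
pages = {
    None: ([{"type": "note"}, {"type": "drink"}], "p2"),
    "p2": ([{"type": "photo"}], None),
}
result = filtered_screenful(lambda c: pages[c],
                            keep=lambda i: i["type"] != "drink",
                            screen_size=2)
print(result)  # → [{'type': 'note'}, {'type': 'photo'}]
```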
#
[tantek]
Agreed with snarfed. Just have the client request more posts as needed
#
[tantek]
Zegnat is right too
iasai, [kevinmarks] and [grantcodes] joined the channel
#
[grantcodes]
The other valid reason for having filtering (and other features) on the server vs the client is the shared experience between readers. E.g. otherwise they would have to be set up separately on mobile and on desktop
#
[grantcodes]
Allowing ordering of channels could also be classified as a UI feature, and that is already built into the spec
[eddie] joined the channel
#
[eddie]
Yeah, I think one issue with that is I have different peoples feeds in different channels so that I have different channels for various post types, etc. so sometimes I group people by channels other times I group post types by channels
#
[eddie]
One challenge with filters only in a single client is that if I build Indigenous like this so I can use it, the experience on another client will be bad
#
[eddie]
You are an indigenous user and then you log into Monocle and you feed is filled with aaronpk’s drink and ate posts
#
[eddie]
Or my play and listen posts
#
[kevinmarks]
The appengine queueing model is nice. You schedule POSTs to your endpoints and it deals with deferring and retrying. https://cloud.google.com/appengine/docs/standard/php/taskqueue/push/
#
[eddie]
It does really start to break down the great inter-app experience we have going on
#
[eddie]
I use Indigenous, Monocle or together depending on what device I’m on and what I want to do. This would make me only want to use indigenous
#
[eddie]
Which would be hard on a desktop lol
#
[kevinmarks]
On the other end of the tasks extreme, I wrote a text file import and munge tool in node today, and it was hard to know when it could terminate.
#
[grantcodes]
Yes, my personal opinion is that having something that *permanently* filters content you never want to see is better on the server, to prevent those sorts of cross-reader issues that would be very jarring, and we want to try to prevent lock-in
#
[grantcodes]
I effectively have filtering in together with the different layouts, like the gallery view filters out posts without a photo or video, but that makes more sense there on the client because it is obvious visually that it couldn't show text posts in that view
#
[eddie]
I would even be okay with display filters on a channel (as opposed to permanent storage) as long as there is a way to sync between clients
#
[eddie]
Yeah that view does work well, and it’s not surprising if you go to a different layout and get additional posts
#
[grantcodes]
At the point of syncing it between clients that is already the server having to store the option, so it's not a huge step for the server to also apply that option
DenSchub joined the channel
#
GWG
Enjoying the reader conversation
[tantek] joined the channel
#
[tantek]
Eddie, grantcodes the downsides of client only filtering are accurate
#
[tantek]
However the trade off is between rapid innovation to better understand actual desired UX, and premature standardization
#
[tantek]
Also, trying to keep code everywhere in sync will slow down iteration and the ability to try new & different ideas
#
[tantek]
Something like user-centric filters is still very new territory so it deserves lots of rapid iteration
#
GWG
Couldn't the endpoint support the filtering functionality without the client knowing about it?
#
GWG
As sort of an interim solution?
#
GWG
At least if you write your own Microsub endpoint
iasai joined the channel
#
[kevinmarks]
It could create channels that were filtered
#
[grantcodes]
Pretty sure aperture already can filter stuff on the server
[schmarty] joined the channel
#
[schmarty]
aperture can indeed filter stuff on the server! like photo/video only channels, or excluding (or only including) likes, reposts, etc.
#
GWG
I am only using Yarns
#
GWG
I am eating the cooking from the restaurant that prepares it using food from my garden
#
snarfed
sounds like someone should write a microsub proxy server that adds filtering, with a minimal UI
#
GWG
I might be taking this analogy too far
#
snarfed
that would let us iterate on filtering without the spec, and also share the filters across clients
#
snarfed
buuuut that wouldn't allow filter experiments in clients, and aperture filtering may be good enough for server side experiments, so...meh
snarfed1 and [eddie] joined the channel
#
[eddie]
Yeah, I think the challenge is tantek is talking about UX, which really comes down to the clients having to show the filtering controls, as opposed to the server
#
[eddie]
Filtering in Microsub servers like Aperture or a proxy server is more like a permanent filtering. It feels like the type of filtering we’re discussing when it comes to UX iteration is really at the client level where individual users are going to live
[kevinmarks] and gRegorLove joined the channel