strugee: !tell tantek I seem to recall you mentioning a while back that Google still parses mf2 even though they've replaced their proprietary markup like 3 times. do you remember where that was from? not sure if it was on a web page or in this IRC channel. context: https://github.com/brentsimmons/JSONFeed/issues/20#issuecomment-302654477
Loqi: [strugee] @ttepasse https://w3c-social.github.io/social-web-protocols/ may cover a lot of what you're looking for.
Full disclosure, I'm a member of the W3C SocialWG working on this stuff, but a lot of what ActivityStreams is trying to do is simplify OStatus...
aaronpk: csarven: if you read my post, it's not an IWC position, it's me personally deciding how to mark up my reviews to specifically be indexed by Google
aaronpk: basically I don't trust Google's current recommendation of their markup format, because based on the last 10 years of data, they are likely going to change their recommendation again in a couple of years. meanwhile they continue to index microformats just fine.
csarven: Experimented a lot with mf1 with SearchMonkey and they got a lot of stuff wrong... same goes for mf1 and RDFa patterns in Google Rich Snippets or whatever
strugee: also, got linked to https://xkcd.com/927/ from that Twitter thread. funny that it mentions character encodings, because AFAICT Unicode (UTF-8) has basically destroyed everything else
aaronpk: I do believe microformats2 is more future-proof than 1 because the parser doesn't need to know about vocabulary. also, microformats2 isn't exactly new either.
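aaronpk's vocabulary-independence point shows in how little a parser has to know: mf2 property names come from class-name prefixes (h-*, p-*, u-*, dt-*, e-*), not from a schema the parser ships with. A toy sketch in Python, deliberately not a conforming mf2 parser, just to illustrate the prefix rule:

```python
from html.parser import HTMLParser

class TinyMf2(HTMLParser):
    """Toy: collects p-* properties of the first h-* root it sees.
    Property names are derived from class prefixes alone, so unknown
    vocabularies parse exactly like known ones."""
    def __init__(self):
        super().__init__()
        self.item_type = None
        self.pending = []     # property names awaiting their text content
        self.properties = {}

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.item_type is None:
            roots = [c for c in classes if c.startswith("h-")]
            if roots:
                self.item_type = roots[0]   # e.g. "h-card", "h-entry"
            return
        # toy simplification: pending is reset on every nested tag
        self.pending = [c[2:] for c in classes if c.startswith("p-")]

    def handle_data(self, data):
        for name in self.pending:
            self.properties.setdefault(name, []).append(data.strip())
        self.pending = []

p = TinyMf2()
p.feed('<div class="h-card"><span class="p-name">Alice</span></div>')
# p.item_type == "h-card"; p.properties == {"name": ["Alice"]}
```

A real parser (mf2py, microformats-ruby, etc.) handles nesting, u-/dt-/e- parsing rules, and implied properties, but none of that requires knowing what an h-card *means*.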
ajordan: same for people who write "competing projects" that have the one shiny feature but are more terrible in every other respect. just add the shiny to upstream
sandro: Shall we go ahead and make a CG issue for agendas/meetings? I'm really missing having a mailing list for reminders, and that seems like the best substitute.
sandro: The only decentralized version I can think of is the chairs having special microblogging account(s) for group announcements. That could also work, although I don't know if the infrastructure is quite ready.
cwebber2: aaronpk: anything you want me to bring up on your behalf, then? You have the WebSub status update on there; do you want me to relay it or keep it for next week?
Loqi: tantek: strugee left you a message 6 hours, 29 minutes ago: I seem to recall you mentioning a while back that Google still parses mf2 even though they've replaced their proprietary markup like 3 times. do you remember where that was from? not sure if it was on a web page or in this IRC channel. context: https://github.com/brentsimmons/JSONFeed/issues/20#issuecomment-302654477
tantek: !tell strugee looks like you found your answer re: Google and microformats? They definitely still do parse microformats; the question is where they are with parsing mf2, since they have unofficially made positive remarks about it for a few years, and may just be waiting for some measure of critical mass (which has accelerated in the past 2 years)
sandro: still can't get mumble working on linux. can't decide whether to switch OS or install/debug from source. Oh well, android works, except for constant voice-overs that can't be shut off.
sandro: aaronpk: WebSub test suite finished! websub.rocks. you can go test your implementations. it'll act as a fake server, subscriber, and hub, so you can see how your implementation is doing
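For context on one of the pieces websub.rocks exercises: when a subscriber asks a hub to subscribe, the hub verifies intent with a GET to the callback URL, and the subscriber must echo `hub.challenge` only for a (mode, topic) pair it actually requested. A minimal sketch of that handshake logic (the `pending` set shape and function name are assumptions, not anything from a real implementation):

```python
from urllib.parse import parse_qs

# subscriptions this subscriber has actually requested from a hub
pending = {("subscribe", "https://example.com/feed")}

def verify(callback_query: str):
    """Answer a hub's verification-of-intent GET.
    Returns (status, body): echo the challenge to confirm, 404 to deny."""
    q = parse_qs(callback_query)
    mode = q.get("hub.mode", [""])[0]
    topic = q.get("hub.topic", [""])[0]
    challenge = q.get("hub.challenge", [""])[0]
    if (mode, topic) in pending:
        return 200, challenge   # body must be exactly the challenge string
    return 404, ""              # this (mode, topic) was never requested

status, body = verify(
    "hub.mode=subscribe&hub.topic=https://example.com/feed&hub.challenge=abc123")
# status == 200, body == "abc123"
```

Getting this echo wrong (or confirming topics you never requested) is exactly the kind of thing an automated suite like websub.rocks catches.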
sandro: cwebber2: My feelings have shifted since last meeting. I previously imagined I'd feel like I had failed if we didn't get AP to Rec before the group ended
cwebber2: sandro: w3c has a maturity process of Proposed Recommendation and Recommendation; some people take that very seriously and won't build things without a w3c spec. we can't make changes beyond the group charter, and that means freezing the spec, which has risks if there are problems. In that case, should we just not freeze it in stone, and keep it as a living document in the community group? we don't have to decide that immediately
cwebber2: sandro: what I was going to say: if you happen to be involved with a w3c member org, then by all means point them at the link I just pasted. that link is access controlled and won't work for anyone who isn't an advisory member, but part of the problem is getting people's attention, so I am urging people there
sandro: ... we can have the last CR say where you go instead, e.g. pointing to github for the spec. Implementors have learned to look for stuff like that. So it's not the gold seal of approval, but there is a path forward.
sandro: evan: I feel like Mastodon's recent popularity seems to indicate this space is going to be much more driven by what's out there. Which leads me to a living document. People will use what's in use.
cwebber2: sandro: that's exactly the argument I made. a few years ago there was business interest in OpenSocial and that died down, but I'm trying to make the argument that there's reason to see excitement, and the Mastodon stuff shows interest / value
ben_thatmustbeme: random aside, I released a new version of the microformats-ruby gem; it includes a console-based fetch and parse to json of any mf2 page, which works on most of the social sites here
sandro: evan: I've been playing around with HTTP Signatures and Linked Data Signatures. This is not my area of expertise. But once I figured it out, it was fairly straightforward. Unlike OAuth, which has a lot of question marks.
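The HTTP Signatures draft evan mentions signs a canonical "signing string" assembled from selected headers; building that string is the fiddly part and needs no crypto library (the actual signature over it does). A sketch of the string construction per the cavage draft (function and variable names are illustrative):

```python
def signing_string(method: str, path: str, headers: dict, signed: list) -> str:
    """Build the HTTP Signatures signing string: one 'name: value' line
    per signed header, with the pseudo-header (request-target) expanding
    to the lowercased method plus the path."""
    lines = []
    for name in signed:
        if name == "(request-target)":
            lines.append(f"(request-target): {method.lower()} {path}")
        else:
            lines.append(f"{name.lower()}: {headers[name]}")
    return "\n".join(lines)

s = signing_string(
    "POST", "/inbox",
    {"Host": "example.com", "Date": "Tue, 23 May 2017 17:00:00 GMT"},
    ["(request-target)", "Host", "Date"])
# s == "(request-target): post /inbox\nhost: example.com\ndate: Tue, 23 May 2017 17:00:00 GMT"
```

The RSA signature over this string, plus the `Signature` header with `keyId`, `headers`, and `signature` parameters, is where a crypto library comes in.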
DenSchub: (somewhat off the record, but I'd like to join any discussions, aaronpk and evan. the missing/undefined/imprecise definition of signing is one of the main issues we have right now)
cwebber2: sandro: I made this because there was a big thread on Mastodon a while ago... someone made a search engine that gathered stuff from public timelines and allowed search, which many of us found useful, but some people were extremely upset about it. The person who built it took it down again because they didn't want to upset people. In the github thread you see me going back and forth with one of these people to deal with it.
cwebber2: as a programmer I like to say "if I have access to this, why can't I index it", etc. But there are users who want this functionality; is there something we can do to balance what different parties want here?
sandro: ... posts sent to an instance, but ... only some interface ... some instance rules ... unless enforced by some kind of 'treaty', it works or doesn't if someone tries to abuse it
DenSchub: in such discussions about diaspora I always argue "it's clear what server you're sending to, so if you don't trust the server, do not send your messages there"
wilkie: it just needs to be clear that any extension that adds a form of e2e crypto or privacy creates messages that are ignored by implementations that do not understand them, which can be done by an extension that creates a new inbox for encrypted private messages
tantek: q+ to give a personal user anecdote example: using robots.txt to block bots from my blog for its first two years felt "good enough", and then afterwards I changed how/what I posted. would like per-post robots.txt controls. might just implement this in my own CMS.
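The per-post control tantek wants could be as small as emitting an X-Robots-Tag response header per URL instead of relying on a site-wide robots.txt. A hypothetical sketch of how a CMS might do it (the post-record fields and paths are assumptions):

```python
# illustrative per-post records; "allow_indexing" is an assumed field name
posts = {
    "/2017/note-1": {"allow_indexing": True},
    "/2017/private-ish": {"allow_indexing": False},
}

def robots_headers(path: str) -> dict:
    """Return extra response headers for a post: X-Robots-Tag gives
    per-URL crawler control, finer-grained than a site-wide robots.txt.
    Unknown paths default to indexable."""
    post = posts.get(path, {})
    if post.get("allow_indexing", True):
        return {}
    return {"X-Robots-Tag": "noindex, nofollow"}

# robots_headers("/2017/private-ish") == {"X-Robots-Tag": "noindex, nofollow"}
# robots_headers("/2017/note-1") == {}
```

A `<meta name="robots" content="noindex">` tag in the post's HTML achieves the same for HTML responses; the header version also covers non-HTML resources.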
sandro: evan: astronouth7303 made a good point. We could consider, maybe in an extension, that rather than auth'ing as a user when fetching an outbox, a search engine could have to provide some proof that it's the user it says it is. So a bad actor
Zakim: tantek, you wanted to give a personal user anecdote example using robots.txt to block bots from my blog for its first two years felt "good enough" and then afterwards I changed
sandro: ... sandro mentioned the possibility that someone could still implement a search used by abusers; it might be most desired by them; it could be a flag in the opposite direction. I think we have to do a lot of work on anti-abuse tooling.
sandro: evan: In terms of how there wasn't a race to the bottom in Diaspora: the problem isn't just technical. I think Mastodon is probably more lgbt / social justice aligned, which indicates to me it would be much more likely for a group of tech-savvy harassers to WANT to break in,
Zakim: As of this point the attendees have been tantek, sandro, geppy, DenSchub, evan, albino, cwebber, Rushyo, MMN-work, knutsoned, aaronpk, astronouth, ben_thatmustbeme
sandro: So, I made a very small stab at this when I wrote the Home Page News item for the ActivityPub CR. No one reads these things, but I included a value sentence:
sandro: "ActivityPub allows websites a direct social connection to user software, including Follow, Like, Share, and Comment, without an intermediate social network provider." https://www.w3.org/blog/news/archives/6302
sandro: So, I think news organizations, and maybe media companies more broadly, are sensitive to this. And maybe companies that advertise heavily. But I fear most tech companies won't really understand this message.
saranix: if anyone is interested, my company consults for small businesses on how to take advantage of decentralized social networking to better communicate with your customers and expand business
cwebber2: I think it's more about being able to be self-sovereign, but there's an impression in those words that you can own in the sense of physical property, and that risks making it sound like social DRM :)
sandro: astronouth7303, yeah, this is like, "They came to your website, and they're ready to engage, so keep them rather than sending them off to some social network"
saranix: there are many popular articles (big-name newspapers, Guardian, NYT, etc.) about how putting your business on Facebook costs you money and only increases FB's bottom line while taking away from your own
sandro: evan, on your devil's advocate point, I think it's enough to say "there are 600k [positive attributes] customers on Mastodon, and it'll only cost you $x to reach them as well, and you'll be early to that market so your share may increase as it grows." So not saying move off FB, just *also* do federation, because there are also key customers there.
evan: sandro: I think that argument is going to hold more weight as the fediverse grows. I think the lowest-hanging fruit right now is to popularize this among startups.
saranix: What I tell business owners, and it gets some traction (I'm still learning how to communicate best), is that bringing in new clientele from the internet isn't as important as getting meaningful communication with your existing loyal customer base
sandro: A rising tide lifts all boats. But it's a VERY hard sell to startups, who need a story about their exit, and THAT requires having customer lock-in, which we don't allow
saranix: That's one of the reasons I focus on small businesses and not VCs. Small business understands. VCs are still very centralization-minded. They want to "own the pie". Small businesses like having a marketplace full of pies. It's how we thrive :-)
sandro: So, one technical question is how well these standards can be integrated into a big media site. Can mediaCo actually have smooth UX for [Follow Us On Mastodon]? Right now, as far as I can tell, the answer is no.
sandro: tantek, RSS Feed buttons were always horrible, but I take your point that people can learn to work with stuff if the value is great enough. Still, the bar has been climbing on lower UX friction.
sandro: Oh, I remember my best idea here -- a polyfill for browser functionality. Some sites kind of make the social-share functionality look like it's coming from the browser, not the site (sliding in from the side), so maybe something like that, but more so. But there are some real technical/protocol challenges.
saranix: tantek: you can't force lemmings to stop lemming. I wouldn't worry about FB. Their business model will come crashing down as soon as decentralization hits a certain threshold.
saranix: about follow links: if a person is authenticated, it becomes easier. During the authentication process, you find out what software they are running on their end, and can generate a suitable follow link. Without authentication, I think the only way is with browser support and a protocol handler
saranix: it doesn't REALLY make sense to have a follow link unless authenticated anyway. Because who are you telling to follow? links are about telling a person where to go. I don't know where you want to go to follow me if I have no idea who you are.
saranix: If you are on your site (your own social stream), you have a connect box somewhere. Your software (generating your stream) should create a suitable follow link attached to my content that ended up in your feed, via the Link headers at the top of the content when it was served to your social software (if it's public)
saranix: I'd be wary about standardizing any sort of generic unauthenticated follow link either, because that would encourage browser makers to browse the web authenticated, which means you're leaving your identity everywhere and voluntarily tracking yourself. Come on. We know how browser makers think. They won't play nice.
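The Link-header discovery saranix describes boils down to turning an HTTP `Link` header into a rel → URL map, which the consuming software can then use to build a follow action. A simplified sketch (it ignores multi-valued rels and extra parameters, which a production parser must handle):

```python
import re

def parse_link_header(value: str) -> dict:
    """Parse an HTTP Link header into a rel -> URL map.
    Simplified: takes only the first rel token, ignores other params."""
    rels = {}
    for m in re.finditer(r'<([^>]+)>\s*;\s*rel="?([^",;]+)"?', value):
        rels[m.group(2)] = m.group(1)
    return rels

links = parse_link_header(
    '<https://example.com/hub>; rel="hub", '
    '<https://example.com/feed>; rel="self"')
# links == {"hub": "https://example.com/hub", "self": "https://example.com/feed"}
```

With a map like this, software that fetched the content can attach a rel-appropriate follow/subscribe affordance without the reader ever identifying themselves to the publisher.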
tantek: existing sites have +1 / like / tweet / share / follow-me-on-twitter links without being authenticated; therefore I submit that it DOES make sense, empirically, to have follow functionality without being authenticated
saranix: Most people, once they have a decentralized identity, will stay on their own home page anyway. They won't do much "browsing". Just like how they stay on FB's page all the time.
saranix: If I'm advocating decentralization, I'm not advocating that you jump on my platform. I'm advocating that you choose whatever option you like to be able to communicate with my platform.
saranix: My instinct is they probably just wouldn't adopt it. Or would find some way to make it Google+ by default and really difficult to assign a non-google id to it
saranix: A lot of people still hold out hope for firefox and smaller-name browsers (forks, really). But mozilla is corrupted by Silicon Valley, and the forked browsers suck, quite frankly.
saranix: without citing specific decisions, just look at how much they try to copy chrome. Even if it's by accident, they've utterly failed at being the libre option.
saranix: I remember way back in the early days of the web, during the "browser wars", it seemed like W3C was doing a terrible job. In the present day, I feel like social standards have really lagged behind as well. I'm now realizing that's as much my fault as anybody's, though. I never bothered to get involved. Looking around, W3C seems like one of the most functional and open standards bodies that exists.
saranix: https://www.vice.com/en_us/article/the-secret-ways-social-media-is-built-for-addiction I have to wonder, though, if the mechanics really matter. I remember ever since I was a little kid being excited when there was mail in the mailbox. Even today I get a little excited, hoping it's a long-awaited check from a client. How is that different? Sometimes I think people get all bent out of shape: "but it's on the INTERNET!"
aaronpk: Haha, just yesterday I was reminiscing about when I had to dial in to my email provider and download my email to read it offline. I didn't have an ISP, just email. I remember being so delighted when someone wrote me back and there was new email in my inbox after I dialed in to check it!
nightpool: hey everyone! sorry I missed the meeting today, I'm in a different timezone this week and totally screwed up the conversion. Is there anything important I missed that's not in the minutes?
sandro: aaronpk, fwiw, reading https://indieweb.org/private-webmention I'm really not convinced the 'extra step of exchanging an auth code for an access token' is warranted. At least, the argument given about it being more secure seems very weak.
aaronpk: I don't know about you, but I don't want to create tokens that can read private posts indefinitely and send them to servers that aren't expecting them
aaronpk: an optimization of the flow allows the receiver to reuse an existing token if the realm matches, so future private webmentions can skip the exchange step
sandro: Yeah, I like that, but the sender still needs to do more than seems necessary. You could let this play out by offering both: post either with code= or with access_token= & token_type & expires_in, as you like. code= for those more paranoid about their content. See how often that's actually done.
sandro: I mean, the only difference in security between the code and the access_token is a bit more time to exploit it. If you're worried about proxies and log files, well, the code can be stolen that way, too.
aaronpk: If you send an access token in the initial webmention request, then the spec should absolutely recommend that it have a very short expiration and no privileges other than fetching the one post
sandro: I don't agree, for the reason in my previous message. The extra round trip adds very little additional security; it just makes the time to exploit a little shorter.
sandro: i.e., if I find a place where I can see webmentions, then I set something up to watch them, and as soon as one appears with a code, I turn it into an access_token and keep that.
saranix: I have to agree here. If the reasoning is "When you send a Webmention, you are sending an unsolicited payload to the receiver. The authorization code is not requested by the receiver, so you cannot guarantee they will be protecting it if they aren't expecting it.", then an assumed-unprotected code can easily be exchanged for the token with no questions asked by anyone. it really isn't much different
sandro: Another approach would be to make a different rel and a different endpoint, rel=privatemention, but that seems kind of silly. I think webmentions just need to be treated as if some parameters in the mention might be highly sensitive -- only store/expose the parameters you know are safe.
sandro: macaroons look cool, but I think bearer tokens are fine for this. I just don't see upgrading from a short-lived bearer token to a longer-lived bearer token as adding enough security to be worth making the system like three times as complicated. It's still simple, of course, but it could be a lot simpler without the upgrade.
sandro: Oh, here's a cool hack: make the initial bearer token very short-lived -- like the code. If you see it used, then you know the receiver implements private webmentions. Remember that. Now, the next time you webmention that same receiver, you can send a long-lived one. Now you have your long-lived bearer token, but it's never sent to someone who didn't expect it.
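sandro's short-lived-then-long-lived idea could look like this on the sender side; a sketch only, with all names, state shapes, and lifetimes being illustrative assumptions rather than anything from the private-webmention write-up:

```python
import secrets
import time

# sender-side state: receivers that have demonstrably used a token
known_receivers = set()
issued = {}  # token -> (receiver, expiry timestamp)

SHORT, LONG = 60, 30 * 24 * 3600   # lifetimes in seconds (illustrative)

def token_for(receiver: str) -> str:
    """First contact gets a short-lived token (behaving much like a code);
    a receiver that has proved it implements private webmentions gets a
    long-lived one -- so a long-lived secret is never sent to a server
    that wasn't expecting it."""
    ttl = LONG if receiver in known_receivers else SHORT
    tok = secrets.token_urlsafe(16)
    issued[tok] = (receiver, time.time() + ttl)
    return tok

def saw_token_used(tok: str):
    """Call when a token shows up in a content fetch: that receiver
    evidently implements private webmentions, so remember it."""
    if tok in issued:
        known_receivers.add(issued[tok][0])
```

Tokens here should also be scoped to fetching just the one mentioned post, matching aaronpk's least-privilege point above.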
sandro: I'm also thinking about how this generalizes. Like, if this were deployed, I'd probably piggyback on it for other things that aren't exactly mentions. I think that's okay, but what exactly are the semantics? Like, when alice.example sends bob.example a private webmention, can bob use that bearer token for some other things? Like as a secret key for encrypting messages, or something. Of course it's really bob's webmention service that has the key,
sandro: hey saranix, maybe you know: speaking of tls, etc., another technique I've long wondered about in this space is whether I can use a tls server certificate as a tls client certificate, for the case where a server wants to talk to another server. as I look at the specs, they seem to have the same syntax, but no one seems to talk about this application.
saranix: I can't think of a reason why that wouldn't work. I use client certs for auth in my software, actually. But not server-to-server (yet). Interesting concept.
sandro: Not sure about that, but I think it's reasonable to have the https client code in your social web server run with the same privs as the https server code. (in my implementations, they're in the same process anyway)
saranix: yeah, that's what I do, but mostly for historical reasons rather than anything specific. For a second I thought it might be more convenient for setting up shared hosting, but then I thought nah, if I did that I'd probably go full-on containers and a per-user server process anyway.
saranix: It still makes me feel uneasy how there's so much talk about security on priv ports like 53 and 25 but for some reason no one cares about 443... the whole thing seems a bit arbitrary