<rhiaro> Cristina: this is the third meeting I've attended, and I'm still trying to understand better what activities are happening around this place. I know about AP from the conferences
<rhiaro> ... was wondering how many people who were involved in the fediverse debate now have the experience of the blogosphere 15 years ago, and the fears back then about the emergence of a fragmented space with echo chambers
<rhiaro> ... trying to get up to speed on what ideas people have about avoiding the bad things which happened in the blogosphere and splinternet 1.0
<rhiaro> ... if you have a large instance of 1mil users and half are fascists but the other half are within the democratic spectrum, then it is very hard to block the whole instance in terms of democracy and freedom of speech
<bobwyman_> You don't need algorithms to produce the effect of "filter bubbles." The same effect can be produced when people make individual decisions about the blogs they will follow, what email lists to join, etc. Filter bubbles are a problem whether or not you have algorithms.
<rhiaro> ... my proposal is that the very diverse AP implementers who have now blocked Gab, for example, come together to build another layer above AP
<rhiaro> ... my dream would be that, I know it is very hard to achieve, it begins with wanting electoral justice, where every human has exactly one vote, but apart from that I am proposing a governance model based on the Free City of Hamburg
<rhiaro> nightpool[m]: some background info here: when Sebastian talks about instances he means a server or collection of servers that share software, a mod team, and a database
<rhiaro> ... on Mastodon and Pleroma those are the main tools that admins have to block content and restrict connections, mostly in a blocklist-type situation: restricting content from other domain names from being processed
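A minimal Python sketch of the blocklist-style domain filtering described above. The BLOCKED_DOMAINS set and the shape of the activity dict are assumptions for illustration, not code from Mastodon or Pleroma:

    from urllib.parse import urlparse

    # Hypothetical admin-maintained blocklist of instance domains.
    BLOCKED_DOMAINS = {"blocked.example", "spam.example"}

    def should_process(activity: dict) -> bool:
        """Drop any incoming activity whose actor lives on a blocked domain."""
        actor = activity.get("actor", "")
        domain = urlparse(actor).hostname or ""
        # Also reject subdomains of blocked instances.
        return not any(
            domain == blocked or domain.endswith("." + blocked)
            for blocked in BLOCKED_DOMAINS
        )

    # A reply arriving from a blocked instance is simply not processed.
    incoming = {"type": "Create", "actor": "https://spam.example/users/alice"}
    assert should_process(incoming) is False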
<rhiaro> ... the instance-based approach with small instances is a good one, but I want to go in a slightly different direction, which is to ignore the instance and instead let everything be relative to each user
<rhiaro> ... we have that with users able to block other users; it doesn't scale well, but it could with additional tools like user-controlled algorithmic blocking or up/downvoting of content
<rhiaro> sl007: the reason to limit instance sizes was what nightpool[m] said: users new to the fediverse join based on topics they are interested in or because of their friends
<rhiaro> ... I want to avoid an instance like Gab becoming as large as it is; it should stay at the same level as local instances, like one for a village or something
<bobwyman_> I suggest that we should distinguish between 1) a user's desire or need to limit exposure to non-credible content, and 2) an ability to moderate the content that is seen by one or more users.
<rhiaro> nightpool[m]: one important thing is that the current situation, where the software you use is coupled to the server and the domain name, is not exactly one the AP spec provides for
<rhiaro> Cristina: thinking of the fediverse as a group of different communities with the core values of diversity, inclusion, and freedom of expression
<rhiaro> ... I was wondering if it could be technologically feasible, thinking also about the blockchain idea Sebastian mentioned, to define some sort of policy layer
<rhiaro> ... so when you as an admin peer with another instance you are showing your set of values, and if that other instance believes that they share those values, that instance can peer with you
<rhiaro> ... otherwise it's kind of impossible to envision a situation where you have decentralisation and you are also trying to centralise an entire way of doing things for all instances
<sandro> bobwyman_, I'm not quite following your distinction. Is it about user-for-themself vs someone-else-protecting-users, or is it about credibility vs other aspects of content quality?
<rhiaro> ... as users follow other people, they subscribe to their updates, and as those updates come in we can think of those follower connections as two instances connected by 3 followers
<rhiaro> ... there are definitely other ways you can learn about a post, if someone boosts it, or somebody can send you a piece of content out of nowhere; they can write a reply without your instance ever knowing anything about them
<rhiaro> ... Cristina was talking about filtering based on identity or history of individuals? Essentially blocklists; there is arbitrarily interesting technology there
<rhiaro> ... the work done currently is more about watching the types of software people implement; moderation seems to be based more on filtering users, because that's the pattern we have, looking at the types of moderation examples in the past
<rhiaro> ... we ban people from IRC rooms; Twitter bans people from its platform. If I'm on a Discord server with people, a person is kicked out, not some of their messages
<bobwyman_> Sandro, one view relies on users making their own choices, the other view delegates decision making to others. I prefer systems that allow users to craft their own "filters" rather than those that facilitate the ability of others (or software) to make decisions about what should be seen.
<rhiaro> annette_g: I want to start by circling back to what I was proposing on the email thread, which comes from the point of view of what happened with the US presidential race recently, where it took multiple platforms deciding to block Trump before they all did. There was a groundswell of decision before they decided they should do it
<rhiaro> ... the platform mods were probably holding back to see what the others would do, feeling that if they were the first to block they'd take a hit in terms of how attractive their platform is to their users
<rhiaro> ... to have the right set of people, with expertise in sociology, psychology, politics, all the things that W3C doesn't necessarily have currently
<rhiaro> ... but different groups will have different values, so maybe the best approach is to define levels: at level 1 a protection system will block given a particular stimulus, and at another level there is a higher bar that someone has to reach before you block them
<rhiaro> ... and it also occurs to me that defining these levels could be akin to what was suggested earlier of having different instances that share the same level of values
<rhiaro> ... it could be that we would want to define values as these different levels, and allow maybe more freedom, or if people from different levels are trying to connect then their posts are marked
<rhiaro> nightpool[m]: one thing to note about Twitter and Facebook is that both were watching each other act; Facebook made the first move and Twitter had to follow
<bobwyman_> You may detest my political views or "values," but still find listening to me useful if we are talking about software design, not social issues.
<rhiaro> ... imagine you come from a country like Romania and [??] becomes a dictator in the country; to join the fediverse and have a voice there he would have to agree to the human rights at least
<rhiaro> ... if we want to go into human rights, we need to discuss the topic and define it further, but what we can do, I believe, is define a set of best practices for moderating your own instance
<rhiaro> ... defining a set of values which are agreed on or not agreed on at the higher level, in terms of whether this instance will peer with that instance, and if they do not agree they do not peer in that situation
<rhiaro> ... a small remark about individuals: I would in general be a bit reluctant to promote censorship at their level, and would let them be free to do whatever they want as long as they agree to a certain code of conduct on the platform
<rhiaro> ... defining certain levels: the trouble with levels, as with any standard, is that Twitter had standards and ignored them when it came to Trump, until they had no choice
<rhiaro> ... there's an interpretation of the standard, does this content meet our standards or not, and two people can have different answers for the same content
<rhiaro> ... then people on the server presumably respect that level. If they see content that breaks that level of tolerance they can register a vote on it
<rhiaro> ... and the collective votes of the users on the instance inform the algorithm on the instance as to whether the content respects the server's stated level, and that affects whether it can travel to other instances
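A minimal sketch of how such vote-informed gating might work, assuming a hypothetical list of per-user votes (True meaning "breaks the level") plus made-up quorum and threshold parameters; none of these names come from an actual implementation:

    def respects_server_level(votes: list[bool], quorum: int = 10,
                              threshold: float = 0.5) -> bool:
        """Decide whether content respects the server's stated tolerance level.

        Each vote is True if a user flagged the content as breaking the level.
        Until a quorum of votes accumulates, content is assumed acceptable.
        """
        if len(votes) < quorum:
            return True  # too few votes to override the default
        return sum(votes) / len(votes) < threshold

    # Content flagged by 8 of 12 voters would be held back from federating.
    assert respects_server_level([True] * 8 + [False] * 4) is False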
<rhiaro> ... is anybody talking about using liquid democracy? Most people do not have time to set up filters and play with settings, but they might trust someone else
<rhiaro> nightpool[m]: when Mastodon first formed there were shared blocklists and chained blocklists, especially in the aftermath of the BlockTogether plugin; the initial queer and LGBTQ communities who formed Mastodon were on the receiving end of a lot of blocking due to conflicts with bigger social media personalities
<rhiaro> ... historically that is why there has been resistance to the liquid democracy subject; when things get out of hand there are a lot of failure modes
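One such failure mode shows up even in a minimal sketch of delegated filtering: if trust delegations form a cycle, a user's filter never resolves. All names here (delegates_to, own_filter) are hypothetical, not drawn from any fediverse implementation:

    # Hypothetical delegation graph: each user may delegate filtering to another.
    delegates_to = {"alice": "bob", "bob": "carol", "carol": "bob"}
    own_filter = {"dave": "strict"}  # users who configured their own filter

    def resolve_filter(user: str) -> str | None:
        """Follow the delegation chain to a concrete filter, detecting cycles."""
        seen = set()
        while user not in own_filter:
            if user in seen or user not in delegates_to:
                return None  # cycle or dead end: delegation never resolves
            seen.add(user)
            user = delegates_to[user]
        return own_filter[user]

    assert resolve_filter("dave") == "strict"
    assert resolve_filter("alice") is None  # bob -> carol -> bob loops forever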
<rhiaro> ... on a different topic, applying it: a friend of mine has refused to join the fediverse due to the inability to block all content from a stalker, no matter how it comes in
<rhiaro> nightpool[m]: totally, a valuable perspective. For Mastodon specifically, in all of the areas you mentioned we still block the user to prevent the content, but possibly there's a bug; we're a small team
<rhiaro> sl007: I would propose we do the session about .. we had a lot of policy meetings; we should do the generic servers and diverse clients problem together with Pleroma, Mastodon, kaniini who was interested, and immer.space
<rhiaro> gekk, we discussed moderation policies, how to automate that, whether to filter people and/or messages, how to personalise moderation vs having a group of moderators, that sort of thing
<jarofgreen> nightpool: can I ask a Mastodon question - I’m sending HTTP signed follow requests to Mastodon accounts but getting an HTTP error code back that indicates some access or permission problem.
<jarofgreen> nightpool: Is there a test instance that would give me more logs on the exact problem, or a guide for people who are trying to get their software to federate?
<nightpool[m]> if you're getting to the point where you're not getting an error message but things still aren't showing up, you can ask Claire (@Thibg@sitedethib.com), she's another core developer who's pretty easy to get ahold of and can spend some time debugging things with you
<jarofgreen> nightpool[m]: ok, got it - I realised I wasn’t checking for error messages properly. Now I can see: Mastodon requires the Digest header to be signed when doing a POST request
<GregoryKlyushnikov> Just wanted to add that this is a recent change: I didn't send this header and some instances started rejecting my requests even though nothing had changed on my side
<nightpool[m]> it's a pretty critical requirement: if you exclude it, anyone you send a payload to can impersonate you to anyone else within the next 5 minutes
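For anyone hitting the same wall: a minimal Python sketch of signing a POST the way Mastodon expects, with the Digest header included among the signed headers. It uses the cryptography library; the key, keyId, and inbox URL are placeholders, and this is an illustration of the scheme rather than a drop-in client:

    import base64
    import hashlib
    from datetime import datetime, timezone
    from urllib.parse import urlparse

    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def signed_post_headers(inbox_url: str, body: bytes, key_pem: bytes,
                            key_id: str) -> dict:
        """Build Host/Date/Digest/Signature headers for an inbox POST."""
        parsed = urlparse(inbox_url)
        # Digest of the exact request body; the receiver recomputes and compares it.
        digest = "SHA-256=" + base64.b64encode(
            hashlib.sha256(body).digest()).decode()
        date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")

        # Every header named in `headers=` below must appear in this
        # signing string, digest included, or the request is rejected.
        signing_string = "\n".join([
            f"(request-target): post {parsed.path}",
            f"host: {parsed.hostname}",
            f"date: {date}",
            f"digest: {digest}",
        ]).encode()

        key = serialization.load_pem_private_key(key_pem, password=None)
        signature = base64.b64encode(
            key.sign(signing_string, padding.PKCS1v15(), hashes.SHA256())
        ).decode()

        return {
            "Host": parsed.hostname,
            "Date": date,
            "Digest": digest,
            "Signature": (
                f'keyId="{key_id}",algorithm="rsa-sha256",'
                f'headers="(request-target) host date digest",'
                f'signature="{signature}"'
            ),
        }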
<jarofgreen> nightpool[m]: Well, I now get a 202 response from Mastodon, but if I follow a Mastodon account I don’t appear in “Followers” - if I installed my own Mastodon instance and looked in the local logs, would I see more?