
Wikimedia Foundation/Legal/Community Resilience and Sustainability/Conversation Hour 2024 04 30


You are invited to the quarterly Conversation hour led by Maggie Dennis, Vice President of Community Resilience and Sustainability, on 30 April 2024 at 18:00 UTC.

Maggie and others from the Community Resilience and Sustainability team will be available to address Trust and Safety, the Universal Code of Conduct, Committee Support, and Human Rights.

This conversation will happen on Zoom. If you are a Wikimedian in good standing (not Foundation or community banned), write to [email protected] at least one hour before the conversation to let us know you will be attending. Please place “CR&S” in the subject line. Someone will follow up with the Zoom details.

We also welcome advance sharing of questions, particularly for those who are unable to attend. You can share your questions at answers(_AT_)wikimedia.org with the subject line “CR&S.”

If you do choose to participate in this conversation, Maggie would like to bring some expectations to your attention:

I can't and won't discuss specific Trust and Safety cases. Instead, I can discuss Trust and Safety protocols and practices and approaches as well as some of the mistakes we've made, some of the things I'm proud of, and some of the things we're hoping to do.

I will not respond to comments or questions that are disrespectful to me, to my colleagues, or to anyone in our communities. I can talk civilly about our work even if you disagree with me or I disagree with you. I won't compromise on this.

You may view the conversation on YouTube and submit questions live in Zoom and on YouTube.

The recording, notes, and answers to questions not answered live will be available after the conversation ends, usually in one week. Notes from previous conversations can be found on Meta-wiki.

Notes

  • I voted for the U4C members. The voting interface seems complicated. It was overwhelming. Could we do something different next year?
Maggie: SecurePoll is not perfect and there are challenges. We are open to hearing how we might do it better. The Elections Committee agreed to support this election even though it was not part of their mandate. We’re looking to hear about how we can improve.
DerHexer: The number of options has to do with the number of candidates. They are not sorted, and it takes quite a while to vote.
  • There have been concerns about the [[Talk:Universal Code of Conduct/Coordinating Committee/Election/2024|AI use]] in responses by some U4C candidates. How do we plan to address this, especially for future elections?
Maggie: There is an assumption that this is wrong. We need to see feedback from the community and consider how to handle AI-assisted campaign materials in future elections. Right now, there is no policy against it.
  • Not all of the U4C candidates seem appropriate for the role. What happens if the group of candidates elected are not suitable for the purpose?
Maggie: That is a risk for any election. I am hopeful the process will bring us a good roster of candidates. If people are elected who are not appropriate for the role, the Foundation would have to work with the community to plan what to do, and we would have to reconsider the process and the materials.
Nasma: The U4C Charter could be updated moving forward if it is determined that changes are needed to the candidate application process and election procedure.
  • You lead a team that works on committee support. Does that mean you’d support the Global Council? Don’t you think the Global Council should have independent staff?
Maggie: We would support the Global Council if we were called to do so. We support committees which make governance decisions. We support committees which make decisions, like AffCom, and we support committees which enforce decisions, like the Ombuds Commission and the Case Review Committee. We support them, but we do not control them. If we support the Global Council, we will endeavor to remain as neutral as possible. At the same time, I find it challenging at times to deal with the assumption that we are not working on the same problems and the same issues. I do understand how this perception arises. At the end of the day, we are all part of the same movement. I think the Global Council should have staff to get done what they need to get done. If that turns out to be my people, we are going to do our best.
Risker: The MCDC has thought quite a lot about the cost and impact of running a Global Council. It would require somewhere between 3 and 5 staff members. The suggestion is to move the grants team from the WMF to the Global Council. There would be additional staff requirements for elections and processes. There may be staff we share with the Foundation; several positions would be required and could be shared, as with the Wikimedia Endowment. There are also equity costs. Bringing together the Global Council annually will cost about the same as the Wikimedia Summit, about $600,000 - $700,000. To run the Global Council, it will be a minimum of about $1 million USD annually. Ultimately there would be a Global Council staff that reports to the Global Council. We need to make sure that the staff are appropriately paid and are not losing money to go from one role to another, and that they are representative of our global movement. Those are big-picture things that are going to take 2 - 5 years to sort out.
DerHexer: The Wikimedia Summit outcomes included the note that the Global Council must have support staff. The statement "The Global Council must have directly managed staff, which must report directly to the Global Council Board." received the most support at the Summit. https://meta.wikimedia.org/wiki/Wikimedia_Summit_2024#Final_Outputs_of_the_Wikimedia_Summit_2024
Risker: If you have comments about the Charter, today is the day to make comments.
Note: Anyone can send emails to movementcharter(_AT_)wikimedia.org
  • Several years ago, you released a statement supporting the LGBT+ community and other editors who are targeted because of identity factors. What have you done on this front?
Maggie: One of the biggest things we have done on this front is setting up the UCoC and Enforcement Guidelines and establishing the U4C. This has taken longer than hoped. Our ecosystem is becoming more challenging for some, including LGBTQ+ editors. We were working on a peer support system; we closed that as we were not effective in that role, and communities are likely better positioned to support each other. I appreciate the partnership with the LGBTQ+ user group, who have been instrumental in flagging the challenges they encounter. The dangers in the world can range all the way to persecution and death. Some people may not understand how active bad actors can be. It’s been a very valuable partnership. Addressing those challenges is neither fast nor perfect. If it were up to me, we would already be in a world where every user is anonymous. I know that runs against the push in the movement for transparency, but I have seen too many people persecuted and targeted by their own governments to feel like being in this place under your legal name is a good idea. We continue to prioritize and work on this topic.
  • In light of that statement, how should we understand the situation in French Wikipedia?
Maggie: What we are talking about here is the use of deadnames on French Wikipedia. Our communities have had conversations about how to refer to people while being respectful and also encyclopedic. The Foundation is aware of what is going on and is looking into the situation. The movement is going to have to help us decide where we put our influence. The degree to which the Foundation gets involved in local policies and actions is a looming concern. I cannot support a system that allows one body to control everyone else. How do we get everybody into the discussion, and how do we provide a safe and secure environment for everyone to contribute to free knowledge?
  • Which state actors pose the greatest threat to editors currently, and how do you recommend editors working on pertinent topics avoid their ire?
Maggie: I’d like to suggest that every editor should consider that their state poses a threat to them. I’m not going to give you a list because I do not honestly know. The first human rights case I ever dealt with was in a western European country. It’s important to remember that if you’re going to be editing sensitive topics, which change over time, you might look into what your community’s policy is on creating alternative accounts to edit those topics. There is digital security training which the Human Rights Team has created. There are ways to protect yourself - VPNs, alternative accounts, and so on. I would encourage anyone to consider that they are not fully immune.
Diff series from the Human Rights team on selecting usernames.
Risker: One of the cultural values is radical transparency. That means that every edit can be attributed to an account or an IP address. It’s hard to get rid of that information once it’s there. It’s going to be impossible to remove it completely. Step one in digital security is picking a good username. We are trying to learn and shift to give up some of that transparency. We want to protect you.
  • (Live from YouTube) As a follow-up, is the UCoC meant to cover the content rather than just the community that is creating it?
Maggie: Content is created by the community. The UCoC covers some factors which relate to behavior which may touch on the content created by that community.
Nasma: The UCoC is meant to provide baseline behaviors for us all. It can connect with content. It’s hard sometimes to figure out where the line is. It’s predominantly focused on behaviors.
Maggie: English Wikipedia handled a group of articles which were developed outside of the community processes. This is where you see content and behavior coming together where people are working to subvert the community processes. The Trust and Safety team is equipped to handle certain situations, but it is far better to have community processes to handle this. It’s much better for the long term health of our movement.
Research best practices on privacy whitepaper draft
Link to content vandalism and abuse of the projects
  • Given past experiences where severe violations involving local Wikimedia chapters were not adequately addressed, particularly when these chapters had significant legal and resource advantages for local issues, how does the U4C plan to ensure fair and effective enforcement of the Universal Code of Conduct in such scenarios? Specifically, what mechanisms are in place to prevent local chapters from using their resources to influence or evade UCoC enforcement actions?
Maggie: I don’t know what plans may evolve. The U4C is still being elected. The first step will be training on what their accountabilities are, based on the charter and guidelines. They are going to be a busy committee. How those processes are going to work is something we will develop with them. They will be documented publicly, as they are meant to be transparent. This will likely come as the committee is created.
Nasma: We are in the process of electing the committee, and it will soon become clear what type of cases they will receive. It will be a cross-functional approach.
  • The vote of the summit participants seems way more concerned with getting power for affiliates than with representation for all movement members and diverse contributors. I note that “Hubs and affiliates must have a right to participate in developing core technology” got 78 supports versus 12 opposers. By contrast, “Processes must ensure that unorganized volunteers are significantly represented in regional batches of seats” got more than twice as many opposers (31), and “Processes must ensure that at least 40% of Global Council Assembly seats are occupied by non-male” got over three times as many opposers (38). Why didn’t you and other Foundation staff even participate in that poll?
Outputs from the Wikimedia Summit
Maggie: It’s a little difficult to assume from this poll what people’s priorities are. This was a poll focused on what the blockers were. It’s difficult to draw conclusions from this.
DerHexer: This event was an event for affiliates. It was not a gathering for the community. The community was invited to provide feedback on the Talk page and in other ways. It would be good to open the audience to more community members so there are more conversations between affiliates and community members. Perhaps this is something for the Global Council to fill in the future so there is more diversity in our governance.
Geert: The Foundation staff was not allowed to vote.
Kaarel: Most of the points have been made. The important part is to look at the context. The context was, “Is this a dealbreaker?” to point out if there needed to be more conversation around certain topics.
Challenging Disinformation in the Wikimedia Ecosystem
  • At the recent CAC call, you spoke to an attendee about your disinformation team. There’s been a lot of recent press about potential censorship on Wikipedia before the US election in 2020. What does that team do? Do you think you do enough? Do you think you do too much?
Maggie: The Disinformation team works similarly to the Trust and Safety Operations team. Concerns are raised when there is an organized effort to mislead, a group effort to bypass community processes in order to mislead. We have a small staff who look for signs of collaboration or cohesion or other evidence that people are trying to undermine those processes. When that is discovered, they surface it to the community. We have a limited ability to ban people, and often what we see does not rise to that level, so we rely on our regular connections to arbitration groups. We share what we found and leave it to them to manage. The only time we alter content is when we have a court order. There should not be a central body coordinating and managing disinformation.
Nasma: There are various workflows. There are the investigations and the times we support communities, and there is the disinformation task force focused on elections. This year we are seeing more elections than we have before. The Disinformation team is working with the communities to identify what they are seeing. There is a clear plan to work with the communities.
  • I heard from a Foundation-banned person that he didn’t receive any explanation from your team of why he was banned. How come? Do you think that’s fair?
Maggie: There is nuance to that question. It depends on whose safety and wellbeing we are prioritizing. A little bit of information is provided, identifying the Terms of Use provision that was violated. Sometimes we prioritize the safety and wellbeing of others and are not able to provide the level of information some people wish to receive.