Building Vital Communities Virtual & Actual


Post by purelyconstructive »

Since it has been suggested that we should be building communities, here are some general observations about creating them, both virtual and actual. Much of it is speculative. None of this is an attempt to tell people what they "should" or "shouldn't" do, just to share some ideas that they can contemplate, elaborate upon, or experiment with if they wish. Also, none of this is a criticism of the Yesterweb. I do not know all of the details behind its operation. But I am painfully aware that what will be needed may not always be clear right from the beginning of a project. Like most projects, I'm sure that its development was a learning process for all involved, and that the knowledge that was accumulated must be carefully integrated into whatever follows in order for it to be effective.

Please feel free to skip over this if it is not of interest to you, or share your ideas down below... Let's try to make communities together that serve everyone!

How Design Affects Interaction & The Integration of Ideals

The tool through which people communicate can influence the dynamic of a conversation. It seems to me that most "chat clients" (which includes Discord, Matrix, IRC, etc.) are structured in such a way that conversations are usually fast-paced and relatively ephemeral because many people use them to catch up with friends on a day-to-day basis. Conversely, "forums" are usually focused on elaborating upon specific topics, with a slower pace that is conducive to more thoughtful replies at sporadic intervals. Of course, these are generalizations that do not always hold true. I just want to emphasize the fact that every tool has its own strengths and weaknesses that must be considered whenever we make the decision to use it. Sticky notes or college-ruled lined paper? Bullet points or an essay?

Further, the design of a tool can elicit certain behaviors from people. As many here are already aware, quite a few "social media" platforms leverage that as a form of manipulation (a "dark pattern"), to get people addicted in order to extract as much attention and money from them as possible (e.g.: "likes"/"follows", "infinite scrolling", "targeted advertising", etc.). But it does NOT have to be that way! How can we design a forum so that it helps to build genuine connections with less work for those who maintain it? Or better yet, how can we make it so that it maintains itself by cultivating an environment of self-responsibility and collective encouragement for everyone both "inside" and "outside" of it?

Here are two interrelated ideas:

1. When the participants within a dialogue choose to be mutually respectful, then there will probably be less of a need for moderation by a neutral third party.

2. If certain standards are expected within a space, then it is important to make them plain from the outset, to reiterate them every time that they are updated, and if possible, to keep a record of why they were changed that is easily accessible to everyone.

Let's give an example:

One way to help implement both of these ideas is to require the Etiquette to be read in order to sign up to a forum, similar to when you have to agree to an "End-User License Agreement" (or EULA) when you install a piece of software. A check box signifying the user's consent only appears after they have scrolled through all of the text. Some may just continue without reading it, but it is a small change that could have a profound effect, a kind of "passive moderation". It sets the tone.
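Server-side, that gate could be sketched in a few lines. To be clear, everything here is invented for illustration (the field names and the version number are not from any real forum software); the scroll-to-the-end checkbox itself would live on the front end and merely set the `agreed` flag:

```python
# Minimal sketch of an "agree to the Etiquette before signing up" gate.
# ETIQUETTE_VERSION is a hypothetical revision counter: bumping it when
# the standards change forces everyone to re-acknowledge them, which
# doubles as reiterating the rules every time they are updated.
ETIQUETTE_VERSION = 3  # placeholder current revision

def can_register(agreed: bool, agreed_version: int) -> bool:
    """Allow account creation only if the applicant consented to the
    current version of the Etiquette (the checkbox that appears after
    scrolling through the full text would set `agreed`)."""
    return agreed and agreed_version == ETIQUETTE_VERSION
```

Tying the consent to a version number also supports idea #2 above: whenever the standards change, returning users would be shown the updated text and asked to agree again.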

There is another consideration that exists in complement: how can we assist others in communicating with more empathy and patience, so that they are able to reasonably enact that agreement?
We have to constantly keep in mind that much of how society is currently structured conditions people into enacting anti-social behaviors, such as the aforementioned "dark patterns" that reward extremes to garner more attention. As described within the Manifesto, people sometimes literally have to re-learn how to communicate with one another. Not everyone may be willing, or even able, to do that at the same time or in the same ways.

This is where things can get a little tricky, and where a lot of tension seems to exist for many people nowadays on the Internet...

Some use anonymity as an excuse to harass others. When coupled with an "anything goes" attitude toward moderation, it can create a very toxic and dangerous environment (e.g.: "cyberattacks", "cyberbullying", stalking, "doxing", death threats, "swatting", etc.). However, "censorship" (in the sense of attempts to control the types of information that other people have access to) never seems to accomplish what it is intended to do, for several reasons. A couple of these reasons are:
  • The Backfire Effect - when a person becomes more entrenched within a particular belief whenever they encounter something that seems to challenge it
  • The Streisand Effect - when a piece of information becomes more well-known because more people seek it out when they learn that others are attempting to hide it
People need to be able to speak about difficult problems openly if they are to be resolved, but not every context may be appropriate for sharing them. Also, some conversations are so emotional and personal that striking a balance can be challenging in any circumstance, whether online or face-to-face. For example:

It is not unreasonable to ask people to refrain from using slurs against one another, especially within a public forum where that rule is explicit and the users agree to it upon signing up, like here. It is easy to automate the consequences for violations of such a rule too (e.g.: through a wordlist). A rule set can also be formulated that is relatively forgiving: the first couple of uses give the person a warning and a time-out, while a third use results in an instant ban of the account.
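That forgiving rule set can be automated in a few lines. This is only a sketch under assumed names (the wordlist entries are placeholders, and a real forum would also need persistence, time-outs, and an appeals process):

```python
# Sketch of wordlist moderation with escalating consequences:
# two warnings, then a ban on the third violation.
BANNED_WORDS = {"slur1", "slur2"}  # placeholder entries, not a real list
violation_counts: dict[str, int] = {}

def check_post(user: str, text: str) -> str:
    """Return the action to take for this post: 'ok', 'warn', or 'ban'."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if not words & BANNED_WORDS:
        return "ok"
    violation_counts[user] = violation_counts.get(user, 0) + 1
    # First two uses give a warning (plus, in a real forum, a time-out);
    # a third use results in an instant ban.
    return "warn" if violation_counts[user] < 3 else "ban"
```

Even something this simple illustrates the trade-off discussed next: exact word matching is trivially evaded and understands nothing about intent.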

However, it can be difficult to gauge the intentions behind more subtle word choices when people have strong feelings and rigid ideas about a sensitive subject. Automation may not be able to accurately assess the nuances of human communication. Algorithms can have biases built into them, or a computer system might produce errors when encountering data outside of what it has been trained to recognize and respond to. [In the context of "machine learning", this is called "fragility".]

Therefore, there will always be some element of human discernment involved, personal choices that are unavoidable. This evokes a lot of questions, especially: How does one design a forum to be "fair" to everyone as much as possible, and without all conflicts having to be mediated by only a few?

It would be interesting to try to design the moderation of a forum around the concepts of "restorative justice", wherein people are encouraged to hash out their disagreements by learning of the effects that they have on others directly. One method of doing this is called a "Restorative Circle".

To quote the University of California San Francisco website:
A restorative circle is an approach to repairing harm that has been done within a community. Participants in a restorative circle are encouraged to be open and honest about their perspectives regarding a conflict, how they have been harmed, and how they think others might have been harmed. Participants also work together to come up with ways to fix the harm that was done and restore relationships. People external to the group who support someone in the group may also be included.
Can you think of any ways that this could be structured into a forum? Perhaps whenever one user logs a complaint against another, all associated accounts are put into a group chat that is designed to work as the digital equivalent of a Restorative Circle? Can account reinstatement be done through some process of reconciliation?

Rarely does punishment or shaming lead to resolution; rather, they tend to create bitterness and vindictiveness, and can also fragment a community further over time.

Increasing Efficiency

Besides automation, two other ways of decreasing workload are:
  • Streamlining - to make the processes required for maintenance more efficient and thus easier for people to do
  • Teamwork - to distribute the implementation of maintenance tasks across trusted parties in ways that are equitable
In order to do either of these things we need to account for everything that must be done, a list of specific actions.

Most organizations are highly "compartmentalized", their activities opaque or purposely hidden. It would be wonderful to see an organization where every process that is done as part of its operation is known publicly. Transparency aids accountability. Likewise, security and secrecy are not always equivalent. People cannot make informed decisions about joining a group if they are not made fully aware of its goals, and people are less inclined to offer help when it is not known what is needed to fulfill them.

Maybe tasks that are pending can be automatically posted onto something like a community "bulletin board" where people can volunteer to contribute how they want? Or maybe each account profile could have an input field for individual skills, and tasks that fit within that category could be suggested to that person through a "recommender system"? They can click "No" to opt out of that task, or "Yes" to agree to do it, in which case periodic reminders will be messaged to them until it is done. Instead of sorting algorithms being used to try to shape perceptions by "curating content", they can make teamwork easier to accomplish.
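A "recommender system" in this sense could start out very modestly, just matching pending tasks to the skills listed on a profile. The data model below is invented purely for illustration:

```python
# Sketch of a skill-based task recommender for a volunteer-run forum:
# suggest only the pending tasks whose required skill appears among the
# skills a user has listed on their profile.
def recommend_tasks(profile_skills: set[str],
                    pending_tasks: list[dict]) -> list[str]:
    """Return the names of tasks matching the given profile's skills."""
    return [t["name"] for t in pending_tasks
            if t["skill"] in profile_skills]

# Hypothetical community "bulletin board" of pending maintenance tasks.
tasks = [
    {"name": "fix stylesheet", "skill": "css"},
    {"name": "write newsletter", "skill": "writing"},
    {"name": "moderate chat", "skill": "moderation"},
]
```

Here, nothing is ranked or hidden; the sorting only narrows the bulletin board down to what a person said they could help with, and opting out ("No") would simply remove the suggestion.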

If there is a backlog of activity that needs to get done, then the rate of new sign-ups could be slowed. That way the community could stay open to new sign-ups, but still have the speed of its growth checked without anyone having to "gatekeep" who can be a part of it.
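One way to sketch that feedback mechanism: scale the number of sign-ups accepted per day inversely with the size of the maintenance backlog. The constants here are arbitrary placeholders, chosen only to make the behavior visible:

```python
# Sketch of a growth-limiting feedback loop: the more maintenance
# backlog there is, the fewer new sign-ups are accepted each day,
# but the door never closes completely.
def signup_slots_per_day(backlog: int, base_slots: int = 20) -> int:
    """Halve the available daily slots for every 10 items of backlog,
    never dropping below 1 so the community stays open."""
    slots = base_slots >> (backlog // 10)  # halve once per 10 items
    return max(slots, 1)
```

With no backlog all 20 slots are open; at 10 items it drops to 10, and under a heavy backlog it bottoms out at 1 per day rather than shutting sign-ups off, which keeps growth checked without any individual gatekeeping decisions.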

Generally, trying to design these types of feedback mechanisms could be useful in keeping things from spiraling out of control. If the source code for these designs is free and shared under a permissive license, then it will be easier to verify that they are actually doing what they are supposed to without bias, and their use can spread as more people choose to adopt them for their own projects.

Learning About Community Building

Many of the concepts and methods that are used to build communities offline can be adapted to virtual spaces.

An interesting one that may be worth looking into for this purpose is known as "Sociocracy". The name comes from the Latin "socius", like in the word "associate". It implies a connection between people who know one another. Contrast this with the word "Democracy", which comes from the Greek "demos", meaning "common people". It implies a mass of people who may not know each other.

Generally, Sociocracy is a way of structuring a group to keep everyone unified in purpose, while maintaining flexibility in activity. Communication flows all throughout, but tasks are distributed across semi-autonomous sub-groups that are more specific. There is no central authority. Roles are fluid and the use of voting is optional. Tasks and roles are decided by attempting to come to a temporary consensus through dialogue, with periodic reviews to gauge the suitability of a role or the effectiveness of a task so that they may be refined or removed if necessary. All association is voluntary.

The material which describes it might seem a little "dry" and "corporate" sometimes, but the ideas are interesting and potentially useful.

An Expansive Vision

While completely optional, it seems prudent to continue to expand upon the philosophical underpinnings of what we are doing here. The works of Lewis Mumford seem particularly relevant. Here is a collection of quotations from the associated Wikipedia page. They are organized with the intent of giving a good general overview.

Notice how all of it dovetails nicely with the practice of Permacomputing, and other concepts that connect online and offline concerns to an expansive vision in the long-term.
...For Mumford, technology is one part of technics. Using the broader definition of the Greek tekhne, which means not only technology but also art, skill, and dexterity, technics refers to the interplay of social milieu and technological innovation - the "wishes, habits, ideas, goals" as well as "industrial processes" of a society.
Mumford characterized his orientation toward the study of humanity as "organic humanism." The term is an important one because it sets limits on human possibilities, limits that are aligned with the nature of the human body. Mumford never forgot the importance of air quality, of food availability, of the quality of water, or the comfort of spaces, because all these elements had to be respected if people were to thrive. Technology and progress could never become a runaway train in his reasoning, so long as organic humanism was there to act as a brake. Indeed, Mumford considered the human brain from this perspective, characterizing it as hyperactive, a good thing in that it allowed humanity to conquer many of nature's threats, but potentially a bad thing if it were not occupied in ways that stimulated it meaningfully. Mumford's respect for human "nature", that is to say, the natural characteristics of being human, provided him with a platform from which to assess technologies, and technics in general...
Mumford believed that what defined humanity, what set human beings apart from other animals, was not primarily our use of tools (technology) but our use of language (symbols). He was convinced that the sharing of information and ideas amongst participants of primitive societies was completely natural to early humanity, and had obviously been the foundation of society as it became more sophisticated and complex. He had hopes for a continuation of this process of information "pooling" in the world as humanity moved into the future.
Mumford criticizes the modern trend of technology, which emphasizes constant, unrestricted expansion, production, and replacement. He contends that these goals work against technical perfection, durability, social efficiency, and overall human satisfaction. Modern technology, which he called "megatechnics," fails to produce lasting, quality products by using devices such as consumer credit, installment buying, non-functioning and defective designs, planned obsolescence, and frequent superficial "fashion" changes. "Without constant enticement by advertising," he writes, "production would slow down and level off to normal replacement demand. Otherwise many products could reach a plateau of efficient design which would call for only minimal changes from year to year."
A key idea [...] was that technology was twofold:
  • Polytechnic, which enlists many different modes of technology, providing a complex framework to solve human problems.
  • Monotechnic, which is technology only for its own sake, which oppresses humanity as it moves along its own trajectory.
Mumford also refers to large hierarchical organizations as megamachines - a machine using humans as its components.
Necessary to the construction of these megamachines is an enormous bureaucracy of humans which act as "servo-units", working without ethical involvement. According to Mumford, technological improvements such as the assembly line, or instant, global, wireless, communication and remote control, can easily weaken the perennial psychological barriers to certain types of questionable actions...
Mumford was deeply concerned with the relationship between technics and bioviability. The latter term, not used by Mumford, characterizes an area's capability to support life up through its levels of complexity. Before the advent of technology, most areas of the planet were bioviable at some level or other; however, where certain forms of technology advance rapidly, bioviability decreases dramatically. Slag heaps, poisoned waters, parking lots, and concrete cities, for example, are extremely limited in terms of their bioviability. Non-bioviable regions are common in cinema in the form of dystopias (e.g., Blade Runner). Mumford did not believe it was necessary for bioviability to collapse as technics advanced, however, because he held it was possible to create technologies that functioned in an ecologically responsible manner, and he called that sort of technology biotechnics. Mumford believed that biotechnic consciousness (and possibly even community) was emerging as a later stage in the evolution of Darwinian thinking about the nature of human life. He believed this was the sort of technics needed to shake off the suicidal drive of "megatechnics." While Mumford recognized an ecological consciousness that traces back to the earliest communities, he regarded emerging biotechnics as a product of neo-Darwinian consciousness, as a post-industrial form of thinking, one that refuses to look away from the mutually-influencing relationship between the state of the living organism and the state of its environment. In Mumford's mind, the society organized around biotechnics would restrain its technology for the sake of that integral relationship.

In Mumford's understanding, the various technologies that arose in the megatechnic context have brought unintended and harmful side effects along with the obvious benefits they have bequeathed to us. He points out, for example, that the development of money (as a technology) created, as a side effect, a context for irrational accumulation of excess because it eliminated the burdensome aspects of object-wealth by making wealth abstract. In those eras when wealth was not abstract, plenitude had functioned as the organizing principle around its acquisition (i.e., wealth, measured in grains, lands, animals, to the point that one is satisfied, but not saddled with it). Money, which allows wealth to be conceived as pure quantity instead of quality, is an example of megatechnics, one which can spiral out of control. If Mumford is right in this conceptualization, historians and economists should be able to trace a relationship between the still-increasing abstraction of wealth and radical transformations with respect to wealth's distribution and role. And, indeed, it does appear that, alongside its many benefits, the movement toward electronic money has stimulated forms of economic stress and exploitation not yet fully understood and not yet come to their conclusion. A technology for distributing resources that was less given to abstract hoarding would be more suitable to a biotechnic conception of living.

Thus Mumford argued that the biotechnic society would not hold to the megatechnic delusion that technology must expand unceasingly, magnifying its own power and would shatter that delusion in order to create and preserve "livability." Rather than the megatechnic pursuit of power, the biotechnic society would pursue what Mumford calls "plenitude"; that is, a homeostatic relationship between resources and needs. This notion of plenitude becomes clearer if we suggest that the biotechnic society would relate to its technology in the manner an animal relates to available food - under circumstances of natural satisfaction, the pursuit of technological advance would not simply continue "for its own sake."

Alongside the limiting effect of satisfaction amidst plenitude, the pursuit of technological advance would also be limited by its potentially negative effects upon the organism. Thus, in a biotechnic society, the quality of air, the quality of food, the quality of water, these would all be significant concerns that could limit any technological ambitions threatening to them. The anticipated negative value of noise, radiation, smog, noxious chemicals, and other technical by-products would significantly constrain the introduction of new technical innovation. In Mumford's words, a biotechnic society would direct itself toward "qualitative richness, amplitude, spaciousness, and freedom from quantitative pressures and crowding. Self-regulation, self-correction, and self-propulsion are as much an integral property of organisms as nutrition, reproduction, growth, and repair." The biotechnic society would pursue balance, wholeness, and completeness; and this is what those individuals in pursuit of biotechnics would do as well.
When Mumford described biotechnics, automotive and industrial pollution had become dominant technological concerns, along with the fear of nuclear annihilation. Mumford recognized, however, that technology had even earlier produced a plethora of hazards, and that it would do so into the future. For Mumford, human hazards are rooted in a power-oriented technology that does not adequately respect and accommodate the essential nature of humanity. Mumford is stating implicitly, as others would later state explicitly, that contemporary human life understood in its ecological sense is out of balance because the technical parts of its ecology (guns, bombs, cars, drugs) have spiraled out of control, driven by forces peculiar to them rather than constrained by the needs of the species that created them. He believed that biotechnics was the emerging answer and the only hope that could be set out against the problem of megatechnics. It was an answer, he believed, that was already beginning to assert itself in his time.
...he writes, "for those of us who have thrown off the myth of the machine, the next move is ours: for the gates of the technocratic prison will open automatically, despite their rusty ancient hinges, as soon as we choose to walk out." Mumford believed that the biotechnic society was a desideratum - one that should guide his contemporaries as they walked out the doors of their megatechnic confines (he also calls them "coffins"). Thus he ends his narrative, as he well understood, at the beginning of another one: the possible revolution that gives rise to a biotechnic society, a quiet revolution, for Mumford, one that would arise from the biotechnic consciousness and actions of individuals.
Thank you for reading!