Hopefully, you're familiar with the web3 product stack and why it's considered composable. If not, here's a quick refresher that explains the following layers:
Another helpful precursor for this article is understanding how web3 communities create tokens that become the membership structure platforms are built on, whereas web2 communities are forced into the membership structures of existing platforms.
Communities in web3 issue tokens, which are primitives for defining membership - and if platforms want communities to use them, they must support flexible membership structures.
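As a rough sketch of what "flexible membership structures" could mean in practice - not any real platform's API - here's a tier check driven entirely by data, with a hypothetical balance snapshot standing in for an on-chain ERC20 `balanceOf` call:

```python
from dataclasses import dataclass

# Hypothetical balance snapshot standing in for an on-chain ERC20 balanceOf call.
BALANCES = {
    "0xAlice": 120,
    "0xBob": 10,
}

@dataclass
class MembershipTier:
    name: str
    min_balance: int  # tokens required to qualify for this tier

# A platform that supports flexible membership treats tiers as community-supplied data,
# not hard-coded rules. These thresholds are made up for illustration.
TIERS = [
    MembershipTier("core", 100),
    MembershipTier("member", 25),
    MembershipTier("visitor", 0),
]

def tier_for(address: str) -> str:
    """Return the highest tier whose threshold the address's balance meets."""
    balance = BALANCES.get(address, 0)
    for tier in sorted(TIERS, key=lambda t: t.min_balance, reverse=True):
        if balance >= tier.min_balance:
            return tier.name
    return "visitor"
```

The point of the sketch: because membership is just "token balance vs. thresholds," any community's token can plug into the same platform by swapping out the tier data.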
Now if you follow all the concepts above, you'll start to realize that web3 communities already have two sources of contextual complexity:
That's a lot of context (data and storytelling) to play with! So what do community entry points look like? Well...
...that doesn't really help me understand the community or the token at all. It doesn't feel fun either - but that isn't FWB's or Uniswap's fault. After all, how is FWB supposed to know where I'm going to enter from, and how is Uniswap supposed to know (or care) which ERC20 tokens represent which communities, or what those tokens signify?
That brings me to the point of this article:
How do we comb through the complexity of web3 communities such that we can provide helpful context about what people are "buying" into when they go to app.uniswap.org or opensea.io or any other marketplace/exchange?
I'm not using the term "DAO" here, but you can mentally swap that with "web3 community" if you'd like.
Web2 communities are mostly stadiums, where everyone is fighting to appear on one surface (Twitter feeds, subreddits, GitHub repo PRs/issues, Medium publications, etc.). What happens on that one surface gives us most of the community interaction, with only one layer beneath it for discussion (sometimes it's a comment section, sometimes a separate Discord server).
The discovery surface is fairly shallow, but community managers/leaders have full admin control of these main surfaces on the platform. This "control" is a bit of a farce since they don't really have control of the rest of the community stack. Community leaders have limited say in the stability or features of tooling, and have no way to change payment infrastructure or support/incentive structures. As a result, the ability to cohesively bring together those different community subsets is rarely well supported. Even though the underlying relationship graph across communities is probably a fairly distributed web, that network pattern is only really leveraged by the algorithms.
In web3, there are many different community contexts (surfaces) - all linked by the token(s). Communities pool around different facets of the token, becoming more uniform as you move down the token context stack.
My working assumption is that token context looks like this:
Anyone can create a new application using "X" token, and it will be fully composable with the existing systems around "X" token. But that new application is now tied to that token ecosystem, where changes at the tokenomics/token/utility levels can affect the applications at the surface. Here are some examples:
Bottom-up management in token context is a key difference from the top-down management of traditional web2 communities. The team has control over treasury/token distribution economics, as well as the base membership definitions (token utility) that govern the community. However, while the team's decisions carry a lot of power, they are inherently low-discovery. With the discovery stack so deep, how do we choose and represent what to show?
The difficulty here is not just in capturing and displaying the elements in token context but in doing so concisely. Forefront DAO profiles do a great job of showing stuff like access benefits, historical events, and general token stats. But I think we also need to capture something more dynamic.
Let's say I'm deciding whether or not to move to a new city. Pretty pictures, testimonials from friends, and statistics on job opportunities/economics may convince me to visit - but they won't be the reason I choose to stay. What convinces me to stay is that first-week impression of the "pulse" of the city. You hear people say all the time that "the air is electrifying!" or that "you can just feel the pace, culture, and emotion" differ from one city to another. That "pulse" is the dynamic bit we want to try to capture and show about tokens.
Continuing with the city model, I'll suggest four themes to work with as research jump-off points:
Bridges/Doors: If I'm buying a token, I'm obviously curious about thresholds. But how many times have they been changed and applied? For what purposes? At what pace do I need to accumulate?
People/Politics: I want to know about the history and momentum of proposals, participation, and execution. What tools were used and token/delegation structures?
Public Parks/Spaces: What does the community composition look like and where do they live? What is the relationship graph of interactions across different surfaces?
Energy Consumption/Production: Which tokens are moving and which aren't? Are any of them interlinked/dependent on each other? What about the token's relationships with other communities' tokens?
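To make the "Energy Consumption/Production" theme concrete, here's a minimal sketch of two pulse signals computed from a transfer log - the log, addresses, and supply figure are all hypothetical, and a real version would pull `Transfer` events from chain data:

```python
from collections import Counter

# Hypothetical transfer log for a single token: (sender, receiver, amount).
TRANSFERS = [
    ("0xAlice", "0xBob", 50),
    ("0xBob", "0xCarol", 20),
    ("0xAlice", "0xDave", 30),
]
TOTAL_SUPPLY = 1000

def pulse_metrics(transfers, total_supply):
    """Compute two simple 'energy' signals over a window of transfers."""
    volume = sum(amount for _, _, amount in transfers)
    wallets = Counter()
    for sender, receiver, _ in transfers:
        wallets[sender] += 1
        wallets[receiver] += 1
    return {
        "velocity": volume / total_supply,   # share of supply that moved
        "active_wallets": len(wallets),      # distinct addresses touched
    }
```

Velocity (what share of supply moved) and distinct active wallets already separate "a whale shuffling bags" from "a community circulating its token" - the kind of distinction raw trading volume hides.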
Ultimately, these questions should get us closer to the pulse across the layers of a token's context, in a more concise way than just listing everything or joining a flooded Discord. With this data, we get community activation effects alongside more meaningful token trading volume:
Some of this sounds harsh, but I believe the worst case is when both the community and the token holders are stagnant. Web3 communities can't really go bankrupt like a typical company, but they can definitely still become zombies - no pulse at all. And if your token is trading actively but has no pulse/context - well, that's a clear warning sign too. Pulse and token context can help both initiate and revitalize communities!
On a more technical level, I think this means being able to map and track token activity (across all context layers) using some sort of changepoint or momentum algorithm. Maybe token context can be represented in pulse as a couple of values/models, maybe it can't 🤷‍♂️
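As a crude stand-in for a real changepoint or momentum algorithm, here's one way the idea could be sketched: flag any day whose activity count deviates sharply (by rolling z-score) from the trailing window. The counts below are invented for illustration:

```python
from statistics import mean, stdev

def momentum_flags(daily_counts, window=4, threshold=2.0):
    """Flag indices whose activity deviates sharply from the trailing window.

    A crude stand-in for a real changepoint algorithm: compare each day's
    count to the mean/stdev of the preceding `window` days.
    """
    flags = []
    for i in range(window, len(daily_counts)):
        trailing = daily_counts[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma == 0:
            sigma = 1e-9  # avoid dividing by zero on perfectly flat windows
        z = (daily_counts[i] - mu) / sigma
        if abs(z) >= threshold:
            flags.append(i)
    return flags

# Hypothetical daily counts (proposals, transfers, messages...): quiet, then a spike.
counts = [5, 6, 5, 7, 6, 5, 30, 6]
print(momentum_flags(counts))  # the spike on day 6 gets flagged
```

Run the same detector over each context layer (governance, transfers, chat) and the overlap of flagged days starts to look like a pulse reading.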
Now, I'd like to start a data guild for studying all this within Mirror DAO. Note that you don't need to agree with my viewpoints to join - in fact, I'd see disagreement as a plus!
There will be roughly three main goals, using a few case-study DAOs as our base research data:
On top of providing context to newer or inactive community members, having this data will also help us unlock more comprehensive community health dashboards and better context around decentralized digital identities.
Please fill out this form if you are interested in being a core contributor. I'm looking for anyone with DAO experience from a data, storytelling, design, or tool-liaison background!
Even if you don't make it into the core group, we'll still find ways for everyone who wants to be involved to contribute or lurk.
The form will stay open until November 26th, 6 pm EST.
For what it's worth, I think this is a strong step towards providing better context to decentralized digital identities as well.