The Rules
This page is primarily AI generated, based on the founder's vision documents.
Simple Rules, Serious Consequences
The network has only a few basic rules. They exist to preserve the one thing that makes this network valuable: the guarantee that every identity represents a real, unique, accountable human being.
- No human may have more than one active identity on the network.
- Identities cannot be sold, transferred, or shared.
- All content authored and signed by the identity must include a disclosure about AI involvement in the content creation process.
Rule 1 — One Person, One Identity
No human may have more than one active identity on the network. This is the foundation of Sybil resistance. If a single person can control multiple identities, the entire guarantee collapses — fake consensus, astroturfed reviews, and manufactured engagement become possible again.
The combination of in-person registration with a trusted human, biometric-gated device access, and a meaningful membership fee makes creating duplicate identities economically and logistically impractical. Each additional identity requires a separate fee, a separate registration event, and a separate trusted human willing to vouch for the registrant.
Rule 2 — No Selling, Transferring, or Sharing
Identities cannot be sold, transferred, or shared. An identity is bound to the person who registered it. If identities could be bought and sold, the network would become a marketplace for credibility rather than a guarantee of it.
The biometric requirement enforces this technically — only the person whose fingerprint or face unlocks the device can use the identity. The rule enforces it socially — sharing your device and biometric access with another person is a violation that results in inactivation of the identity.
Rule 3 — AI Disclosure
All content authored and signed by the identity must include a disclosure about AI involvement in the content creation process. This is not about banning AI — it is about honesty.
When a member signs content, they declare the level of AI involvement:
- None — no AI tools were used in creating this content.
- Ideation — AI was used for research or brainstorming, but the content itself is human-authored.
- Editorial — AI reviewed or edited the content.
- Collaborative — AI made substantial contributions, but a human was the primary author.
This gives consumers of content the information they need to make their own judgments about what to trust.
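One way to make the disclosure tamper-evident is to hash it together with the content before signing, so the declared AI-involvement level cannot be stripped or changed without invalidating the signature. The sketch below illustrates that idea only; the class and function names are hypothetical and do not describe the network's actual protocol (a real implementation would use asymmetric signatures rather than a bare hash).

```python
import hashlib
import json
from dataclasses import dataclass
from enum import Enum

class AIDisclosure(Enum):
    NONE = "none"                    # no AI tools were used
    IDEATION = "ideation"            # AI for research/brainstorming only
    EDITORIAL = "editorial"          # AI reviewed or edited the content
    COLLABORATIVE = "collaborative"  # substantial AI contribution, human primary author

@dataclass(frozen=True)
class SignedContent:
    author_id: str
    body: str
    disclosure: AIDisclosure
    digest: str

def sign(author_id: str, body: str, disclosure: AIDisclosure) -> SignedContent:
    # The disclosure travels inside the hashed payload, so altering it
    # after publication would change the digest and be detectable.
    payload = json.dumps(
        {"author": author_id, "body": body, "disclosure": disclosure.value},
        sort_keys=True,
    )
    digest = hashlib.sha256(payload.encode()).hexdigest()
    return SignedContent(author_id, body, disclosure, digest)

post = sign("member-123", "Hello, network.", AIDisclosure.EDITORIAL)
```

Because the digest covers author, body, and disclosure together, a consumer who recomputes it can verify that the declared level is the one the author actually signed.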
What Happens When Rules Are Broken
When a violation is reported, the accusation and the identity of the accuser are recorded publicly on the network. A committee is formed by randomly selecting from a pool of members who have volunteered for this role, excluding anyone who served on a previous panel for the same case. The committee reviews all submitted evidence, deliberates, and records its decision publicly. Panel members are compensated for their service regardless of the outcome.
Any member may request a formal appeal, heard by a second independent panel drawn by the same process. If the matter remains unresolved or contested, a final appeal may be brought to the board of the organization operating the network, whose decision is binding.
Every stage of the process — the accusation, the evidence, the deliberations, and all outcomes — is transparent and permanently attributed.
If a human violates the rules, their identity is inactivated and their membership fee is donated to a charity the member chose at the time of registration.
Help Build a More Human Internet
Join the waitlist to be among the first verified humans on the network.