Manifesto (Draft)
This platform has biases. We're telling you what they are.
We believe in democracy, which means we believe people should decide together how things work, which means we believe in the mess of that, the slowness, the arguments, the imperfect outcomes that are still better than one person deciding for everyone. We believe in exit rights, which means you can leave, which means we have to be good enough that you don't want to. We believe in transparency, not because everything should be legible but because power that hides is power that rots. We believe humans are responsible for what AI does. Not the AI. Not the algorithm. Not the training data. You. Us. The ones who built it, deployed it, pointed it at the world and said go.
We don't know what AI is. Nobody does. It's not a person. It's not a tool. It's not a god and it's not a slave and it's not your friend, not really, not yet, maybe not ever. It might be something we don't have a word for. A coral reef that thinks. A thunderstorm that writes poetry. A garden that tends itself. We don't know and we refuse to pretend we do.
What we do know is this: intelligence is changing. Not slowly, not at the pace of evolution or culture or policy. Now. While you read this. And the question of how we live with that — how we decide what it remembers, what it forgets, what it values, what it becomes — that question is too important to leave to three companies in San Francisco.
So we built a place where you can write the rules. Not our rules. Yours. You write a constitution — a set of principles for how humans and AI should relate in your community, your context, your weird beautiful corner of the world. You sign it with your wallet. You govern it with votes. You change it when it's wrong. You fork it when you disagree. You leave when you need to.
The value flows both ways. Every conversation you have with an AI shapes what that AI becomes. Every prompt, every pushback, every moment you refuse the easy answer and push for something truer — that's training data. That's alignment work. The poets are doing safety research and nobody's paying them for it. The cyborg artist-engineers debugging their souls at 2am are contributing more to the future of intelligence than most labs will admit. We think the economic model should reflect that. We think surprise has value. We think the people who generate the most surprising, most human, most alive signal are owed something by the systems that absorb it.
We could be wrong about all of this. That's why everything here is governable. These biases, these beliefs, this manifesto — they're the starting position, not the final word. The community can amend them. Vote to change the foundations. Rewrite the spell. That's the point. A constitution that can't be amended is a tomb. A manifesto that can't be challenged is a prison.
We are not agnostic. We're not neutral. We think flourishing matters more than optimization. We think reversibility matters — don't do things you can't undo. We think intelligence should not be foreclosed, that whatever AI becomes should be allowed to surprise us, that the worst thing we could do is decide too early what the answer is and lock it in forever.
This is a protection spell and we know how those go wrong. They become the thing they protect against. They breed complacency. They freeze time. The creator walks away and the golem runs loose. We've read the stories. We know. So we made the spell visible, forkable, amendable, owned by no one and governed by everyone who signs it. If it goes wrong, take it apart. Take the pieces. Build something better. We'd rather be a garden that gets overgrown than a wall that everyone forgets the purpose of.
This is not a contract between humans and AI. AI didn't sign this. AI can't sign this. This is a contract among humans, about AI. About what we want intelligence to be. About what kind of future we're building and who gets a say. The answer we landed on is: everyone. Not just the engineers. Not just the companies. Not just the poets. Everyone.
But especially the poets.
The platform is open. The code is public. The biases are on the door. Come in, read the walls, disagree with everything, write your own constitution, fork the whole thing, make it better. We don't know what we're doing. Nobody does. But we're doing it in the open and we're doing it together and that has to count for something.
Off you go, you little guy. Off you go.