What are the key criteria the DAO considers when selecting projects to support through Uplink?
Uplink DAO employs a comprehensive due diligence process to evaluate and support projects, focusing on several key criteria:
1. Business Model Sustainability: Projects must demonstrate a viable and sustainable business model, ensuring long-term value and alignment with Uplink’s mission to enhance decentralized internet connectivity. 
2. Funding Plan: A clear and realistic funding strategy is essential, outlining how the project intends to secure and utilize financial resources effectively.
3. Market Positioning: Projects should have a well-defined market position, identifying their target audience and differentiating factors within the decentralized connectivity space. 
4. Tokenomics: A robust token economic model is crucial, detailing token utility, distribution, vesting periods, and mechanisms to prevent issues like token dumps, ensuring fairness and community trust (see the vesting sketch after this list).
5. Background Checks: Comprehensive evaluations of the project’s team and stakeholders are conducted to verify credibility, experience, and commitment to the project’s success.
6. Regulatory Compliance: Projects must adhere to relevant legal and regulatory standards, ensuring operations are within legal frameworks to mitigate risks. 
7. Governance Structure: A clear and effective governance model is required, promoting transparency, community involvement, and adaptability in decision-making processes.
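To make the Tokenomics criterion (item 4 above) concrete, here is a minimal sketch of the cliff-plus-linear vesting arithmetic that anti-dump provisions typically rely on. This is an illustrative model only, not Uplink's actual rubric or any project's contract; the `VestingSchedule` class and all parameter values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class VestingSchedule:
    """Linear vesting with a cliff: nothing unlocks before the cliff,
    then tokens unlock proportionally until the schedule completes."""
    total_tokens: float   # total allocation for this stakeholder (hypothetical)
    start_month: int      # month the schedule begins
    cliff_months: int     # no tokens unlock before this many months
    vesting_months: int   # fully unlocked after this many months

    def vested_at(self, month: int) -> float:
        elapsed = month - self.start_month
        if elapsed < self.cliff_months:
            return 0.0                    # still inside the cliff
        if elapsed >= self.vesting_months:
            return self.total_tokens      # fully vested
        return self.total_tokens * elapsed / self.vesting_months

# Example: 1,000,000 team tokens, 12-month cliff, 48-month linear vesting,
# a common pattern for discouraging early dumps.
team = VestingSchedule(total_tokens=1_000_000, start_month=0,
                       cliff_months=12, vesting_months=48)
for m in (6, 12, 24, 48):
    print(f"month {m:>2}: {team.vested_at(m):>12,.0f} tokens vested")
```

Longer cliffs and vesting horizons shift unlocks away from launch, which is the property reviewers look for when assessing dump risk.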
By rigorously assessing these areas, Uplink DAO aims to support projects that are not only innovative but also sustainable, compliant, and aligned with its vision of expanding decentralized internet access.
Thanks for the detailed breakdown of Uplink DAO’s evaluation criteria.
It’s reassuring to see such a structured and thorough approach to supporting projects in the decentralized connectivity space.
I’m curious — are any of these criteria currently weighted more heavily than others (e.g. sustainability or governance), or are they all considered equally important during assessment?
Also, are there any recent examples of projects that were accepted or rejected based on these criteria? That kind of context could really help the community better align their proposals.
Appreciate the transparency and all the work you’re doing.
Based on the DAO’s own community summary, Uplink evaluates every project against the seven core pillars (Business Model Sustainability, Funding Plan, Market Positioning, Tokenomics, Background Checks, Regulatory Compliance, and Governance Structure) without any publicly disclosed “point weights” assigned to one over another. Instead, proposals are assessed holistically, with the community applying each lens equally to form a consensus judgment.
In practice, however, two patterns emerge in recent rounds:
- Regulatory Compliance & Governance Clarity often prove to be the sharpest filters.
- Example Rejection: The BandwidthX pilot was tabled earlier this spring after the DAO’s legal working group determined that its compliance documentation was incomplete and its on-chain governance mechanisms insufficiently detailed. The team has since been invited to resubmit once those gaps are closed.
- Sustainability & Tokenomics frequently tip the balance for approval.
- Example Acceptances:
  - The io.net × Mira Network GPU-compute partnership sailed through, thanks to a rock-solid token-incentive layer and a clear staking-and-delegation model (a simplified sketch of such a model follows this list).
  - ROVR Network was also greenlit after demonstrating both a viable decentralized mapping business model and a well-structured regulatory framework for its LiDAR-enabled DePIN use case.
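For readers unfamiliar with staking-and-delegation mechanics, the sketch below shows the basic pro-rata reward split such a model implies. It is a generic illustration, not io.net's or Mira Network's actual implementation; `StakingPool`, the 10% commission, and the node and delegator names are all assumptions.

```python
from collections import defaultdict

class StakingPool:
    """Toy staking-and-delegation ledger: holders delegate stake to an
    operator, and rewards are split pro-rata after an operator commission."""

    def __init__(self, commission: float = 0.10):
        self.commission = commission          # operator's cut of each reward
        self.delegations = defaultdict(dict)  # operator -> {delegator: stake}

    def delegate(self, operator: str, delegator: str, amount: float) -> None:
        book = self.delegations[operator]
        book[delegator] = book.get(delegator, 0.0) + amount

    def distribute(self, operator: str, reward: float) -> dict:
        """Pay the operator its commission, then split the rest by stake."""
        book = self.delegations[operator]
        total = sum(book.values())
        payouts = {operator: reward * self.commission}
        for delegator, stake in book.items():
            share = reward * (1 - self.commission) * stake / total
            payouts[delegator] = payouts.get(delegator, 0.0) + share
        return payouts

pool = StakingPool(commission=0.10)
pool.delegate("node-a", "alice", 600.0)
pool.delegate("node-a", "bob", 400.0)
print(pool.distribute("node-a", 100.0))
# {'node-a': 10.0, 'alice': 54.0, 'bob': 36.0}
```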
While Uplink DAO keeps its exact scoring rubric private (to discourage “gaming” of the system), these examples illustrate how compliance/governance hiccups most often lead to deferral or rejection, whereas projects that nail both sustainability and token design tend to win community backing.
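As a rough illustration of what "sufficiently detailed on-chain governance mechanisms" can mean in practice, here is a minimal token-weighted tally with an explicit participation quorum and approval threshold. This is a generic sketch, not BandwidthX's or Uplink's actual process; the function, the 20% quorum, and the 60% threshold are assumptions chosen for illustration.

```python
def tally_proposal(votes: dict, total_supply: float,
                   quorum: float = 0.20, threshold: float = 0.60) -> str:
    """Token-weighted tally: votes maps each choice ('for'/'against'/'abstain')
    to its aggregate token weight; the quorum and threshold are hypothetical."""
    participating = sum(votes.values())
    if participating / total_supply < quorum:
        return "failed: quorum not met"
    decisive = votes.get("for", 0.0) + votes.get("against", 0.0)
    if decisive == 0 or votes.get("for", 0.0) / decisive < threshold:
        return "rejected"
    return "approved"

# 25% turnout and 75% of decisive votes in favour: approved
print(tally_proposal({"for": 1_500_000, "against": 500_000, "abstain": 500_000},
                     total_supply=10_000_000))
```

Spelling out numbers like these in a proposal is exactly the kind of detail that keeps governance clarity from becoming a filter.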
This makes a lot of sense — applying a holistic lens while still having clear patterns emerge gives both structure and flexibility to the process. I’m curious though: are there any plans to publish anonymized case studies or deeper breakdowns post-round to help teams better align with DAO expectations?