We are shipping an educational website to help the general public understand whether a web3 service is private or not. Its core feature is a scoring mechanism validated by the market (references: L2BEAT in web3, IMDb in web2).
Useful links
MVP description: here
Initial idea: here
Privacy players' survey: here
Latest scoring assessment: here
ETHRome prototype: here
The last iteration helped us test and expand the latest scoring MVP. Now it’s time to test visual interfaces, requirements for a smooth step-by-step user flow & additional help via explainers.
This led to the creation of a live demo delivered as part of the ETHRome hackathon.
But before ETHRome we used a mobile interface format to play with minimal visual data per screen. This helped us deliver short but actionable instructions for the general public to attest privacy claims in a startup way (an MVP with multiple potential pivots).
Goal: quick assessment of basic privacy claims via open-source signals like an active GitHub repo or public documentation.
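To make this concrete, here is a minimal sketch of what such a quick assessment could look like in code. The signal names, field names, and thresholds below are illustrative assumptions for this sketch, not the project's actual scoring model.

```python
# Hypothetical sketch of a "basic privacy hygiene" quick check.
# Field names and thresholds are illustrative assumptions only.

def quick_assessment(project: dict) -> dict:
    """Return pass/fail for each open-source signal."""
    return {
        "active_github_repo": project.get("commits_last_90_days", 0) > 0,
        "public_documentation": project.get("doc_pages", 0) > 0,
        "public_team": project.get("team_is_public", False),
        "third_party_audit": project.get("audit_reports", 0) > 0,
    }

# Example project data (invented for the sketch)
example = {
    "commits_last_90_days": 42,
    "doc_pages": 12,
    "team_is_public": False,
    "audit_reports": 1,
}
print(quick_assessment(example))
```

Each check maps to one of the validity-track explainers below (GitHub, Documents, Team, Third-party audit), which is what keeps the assessment within the "10 minutes" budget.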
The validity track helped us test “visual noise pollution”, scoring explainers, potential UX/UI features & information completeness for a non-privacy-expert audience.
Its goal isn’t to prototype 100% of the visual elements but to create a “sandbox” for matching the scoring model with user flow & additional explainers.
Goal: enable deeper DYOR with simplified guides.
Checklists will play an important role within the user flow because they will cover subjective interpretations of whether a privacy project is legit or not (like reputation).
For now, checklists help shift the responsibility for privacy assessment to the person, while our role is to provide guides and tips for observation. The final decision to use a service or not should rest with the person.
Long-term goal: maximise automation within DYOR steps, so the person doesn’t fall into privacy dark patterns.
Goal: provide the non-technical audience with easy-to-understand explainers covering missing knowledge.
Explainers play an important role in our project's success because they boost knowledge & confidence in DYOR. We acknowledge that not everyone who will use our website is techy or from the web3 industry, so we need to explain basics like “What’s GitHub?” or “Why does documentation matter in open source?”.
Different explainer examples will be covered in the MVP:
Step 1: Validity track
Github
Github is …
Contributions are …
Documents
Documents are …
# of pages represents …
Team
Publicity of the team …
Anonymous team represents
Third-party audit
Security audit is …
Third-party role -
Step 2: Checklists
Reputation
Readiness
Test-net is …
Mainnet is …
MVP is …
Beta or Alpha is …
Traction
Publicity of the team …
Anonymous team represents
Signup
Surveillance capitalism is …
Data aggregation is …
Goal: provide a visual example of testing the scoring model.
Project: dVPN at Cosmos.
We are using a simplified “semaphore” signal system (mind, not PSE’s Semaphore project) to highlight the results of the scoring assessment.
We will test the scoring model on broader categories & projects, which will help create a balanced assessment process. This time we covered a rather “negative” example with a high risk of user-data exploitation.
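The semaphore idea can be sketched as a simple aggregation: map a set of pass/fail checks to a red / yellow / green signal. The thresholds and check names below are assumptions for the sketch, not the published scoring model.

```python
# Illustrative "semaphore" aggregation over pass/fail checks.
# Thresholds (0.75, 0.5) are assumptions for this sketch.

def semaphore(checks: dict) -> str:
    passed = sum(checks.values())
    ratio = passed / len(checks)
    if ratio >= 0.75:
        return "green"   # most claims verified
    if ratio >= 0.5:
        return "yellow"  # partial verification, deeper DYOR recommended
    return "red"         # high risk of user-data exploitation

# A "negative" example: only one check passes, so the signal is red.
print(semaphore({
    "active_github_repo": True,
    "public_documentation": False,
    "public_team": False,
    "third_party_audit": False,
}))  # → "red"
```

The appeal of a semaphore over a numeric score is that a non-expert doesn't need to interpret what "6.4/10" means; the signal maps directly to an action (use, investigate further, avoid).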
The scoring model is a rabbit hole of infinite iteration. Most privacy experts have their own views on assessment (covered in the useful links section of this article), so they could always add or invent more features to be attested. This conflicts with the startup way: deliver an MVP → then improve it towards Beta.
But we took a moment to test the scoring model in one-on-ones & applied the findings to the mobile interface, asking ourselves how the scoring would change if we added more & more “steps” (like “licenses” or “# of contributors”).
Moreover, each iteration requires an additional set of explainers & practical examples (like links to a correct third-party audit benchmark, or explanations of a security audit company’s reputation).
We abandoned the idea of shipping a “silver bullet” for privacy because it could take us years of research, experimentation &… black swans. We prefer to focus on a real-deal MVP that the general public can use to decrease the risk of relying on non-private services that claim to be 100% private.
Here MVP means a combination of
basic “privacy hygiene” (10 min assessment)
DYOR with checklists (1+ hour assessment)
practical examples within the 1st category (DeFi)
bottom-up example of “forcing” privacy projects to become more responsible to the public & security audit companies to deliver privacy-centric audits
*And do you remember the ETHRome live demo?* The asymmetrical approach to the Web3Privacy Now platform helped us create desktop interfaces, expand the privacy dataset, test scraping tooling, engage privacy orgs from Railgun_ to Waku & many more. This will be covered in the next article.
Want to collaborate? Join us on the way to unlock privacy for millions: