At his recent NFT drop, Jesse Woolston faced a bit of a challenge.
He had been pushing the boundaries of computer-generated art, producing breathtaking simulations of millions of particles colliding with one another. Initial exports of his masterpieces were a hefty 1.6GB apiece. Unfortunately, SuperRare's upload limit is 50MB, and compressing a file nearly 100x would cause a significant drop in quality on a 4K screen.
What was an artist to do?
In an ideal world, Jesse and other artists would export several versions of the video file. At the extremes, one file would optimize for an 8K screen while another might optimize for a smartwatch. Additionally, there might be several versions in between for various smartphones and consumer displays.
Unfortunately, Jesse was under a time crunch and needed to make a decision. Looking around, there wasn’t a best practice on how to do this.
Ultimately, we suggested including the following:
This is better than nothing, but still an incomplete solution. Would anyone else clearly understand the difference between video_url and video_url_atomicform_optimized? Would we need to include other parameters (e.g. video height, width, file extension, and bitrate) for others to determine which file they could use?
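One way to avoid opaque, ad-hoc field names is to attach explicit, machine-readable parameters to each rendition. The sketch below is illustrative only; the field names (`media`, `width`, `bitrate_kbps`, `role`) and URIs are assumptions, not an existing standard.

```python
import json

# Hypothetical metadata: each rendition declares its own dimensions and
# bitrate so a dApp can pick the right file without guessing from key names.
metadata = {
    "name": "Particle Simulation",
    "media": [
        {"uri": "ipfs://example/master-4k.mp4", "width": 3840, "height": 2160,
         "bitrate_kbps": 45000, "mime_type": "video/mp4", "role": "master"},
        {"uri": "ipfs://example/web-1080p.mp4", "width": 1920, "height": 1080,
         "bitrate_kbps": 8000, "mime_type": "video/mp4", "role": "optimized"},
    ],
}

def best_rendition(media, max_width):
    """Pick the highest-resolution rendition that fits the display."""
    candidates = [m for m in media if m["width"] <= max_width]
    return max(candidates, key=lambda m: m["width"]) if candidates else None

# A 2048px-wide display gets the 1080p file; an 8K display gets the master.
print(best_rendition(metadata["media"], 2048)["uri"])
```

With values like these, a marketplace or wallet could select the best file per device automatically instead of every artist inventing their own naming scheme.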
These questions are just the tip of the iceberg.
While this resolved the immediate challenge, it's one of the many challenges the NFT space will run into over the coming decade if we don't have a conversation about NFT standards.
What about Parallel Alpha, an NFT project planning to use its cards as a multiplayer game? At present, all of their NFT imagery uses English. What if the game takes off in South America and they need translations for other languages?
What about needing to search, sort, and filter by a Creative Commons license? What if I need a CC0 option like I can find on Unsplash to use in an upcoming ad?
Anyone attempting to meet these (and many other needs; see below) would address them in an ad-hoc fashion. Even if someone were thoughtful enough to include them, dApps wouldn't necessarily know to look for and use them on any consumer-facing dashboard.
This is why I believe the time is right to unearth and solve these challenges and opportunities with the broader community.
In the future, most NFTs will be all data and no media - j1mmy.eth
Since NFTs exploded into the culture, the rate of change has been so astounding that most people are struggling to keep up with the present.
However, NFTs, as a web3 primitive akin to index.html in web1, will continue to evolve and push boundaries. This is why j1mmy.eth is correct to say that NFTs of the future will be all data and no media. Perhaps more accurately, they may contain media, but that might no longer be the centerpiece.
Additionally, several leaders in this space are noting a lack of ERC-721 standards around metadata, and how this may create barriers to adoption, interoperability, etc. If NFTs are trapped in hundreds of fragmented islands, this will limit dApp developers and, ultimately, end-user customers. This is the challenge and opportunity in front of us.
Some efforts to spin up an informal working group emerged but lost momentum almost as quickly as they formed. Frankly, everyone is busy, and the initiative is very ambitious and time-consuming.
This is why I'm going to take a different tack.
A decade ago, I saw a similar pattern within an open-source, e-commerce platform regarding credit card security. PCI compliance was a legal requirement for all merchants, and yet thousands of sites were running transactions without a clue how to meet said compliance. Like all public goods, everyone needed the information, but no one person or company had the time or expertise to solve it.
I posted a proposal to write a Drupal PCI Compliance White Paper, got it funded, and it’s been downloaded thousands of times over a 10-year period. It remains cited on many module pages so merchants can make smart choices to reduce their legal burden to accept credit cards.
What I’m intending this time is similar but different. A full-blown metadata standard akin to an HTML 5 spec or schema.org-for-NFTs is a tall order. However, we need to start somewhere.
I intend to write a white paper titled “The Need for NFT Metadata Standards”. It’ll make the case for this initiative as well as outline all the possible fields or categories (25+) that are currently missing/overlooked in today’s NFTs. It will also provide example implementations and proposals on how to include this either on-chain or as part of secondary metadata layered on top.
Make no mistake. This will not be perfect. It’ll occasionally be incomplete or incorrect. But the hope is that this solicits the proper support and feedback to convert this to a true standards working group (or metadataDAO?) to take this to completion.
If you're inclined to join, help, fund, or sponsor, feel free to contact me at email@example.com.
The end goal is a peer-reviewed (no, not you, Charles) white paper released under the Creative Commons (open to CC0, but likely sticking to attribution required if people sponsor/fund the work).
In an ideal world, this will serve as a starting point for what could ultimately be a schema.org for web3 set of recommendations. Below is the initial inventory of topics that I believe need to be addressed in the final white paper.
Here is a non-exhaustive list of things I consider to be missing in today’s most popular NFT projects:
It's worth noting that many of these fields would be optional for the simplest cases. However, if we look forward 10 years to a world with between 1 billion and 1 trillion NFTs, the ability to search, filter, and sort on these values will be critical for everyone from developers to end customers.
Now, more work needs to be done to further justify and explore each of the above categories. Let's explore one as a preliminary example.
Fields in metadata.json would take a form like the following:
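The original example is not reproduced here, so the snippet below is a minimal sketch under assumed field names (`rights`, `copyright`, `license`, `commercial_use`, `derivatives_allowed`); a real standard would need to pin these down precisely, ideally reusing SPDX license identifiers.

```python
import json

# Hypothetical rights block inside metadata.json; all field names are
# illustrative, not part of any existing NFT standard.
metadata = {
    "name": "Example NFT",
    "image": "ipfs://example/image.png",
    "rights": {
        "copyright": "(c) 2021 Example Artist. All rights reserved.",
        "license": "CC-BY-4.0",  # SPDX identifier where one exists
        "license_url": "https://creativecommons.org/licenses/by/4.0/",
        "commercial_use": True,
        "derivatives_allowed": True,
    },
}

print(json.dumps(metadata, indent=2))
```

Because the values are structured rather than free text, marketplaces and smart contracts could act on them programmatically.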
Imagine spending/acquiring millions of dollars of NFTs, but being unable to then build a brand or business around the likeness or IP.
@punk4156 learned this the hard way after being an ardent supporter and collector of CryptoPunks.
Now, he ultimately was smart enough to earn millions of dollars along the way, so it's hard to paint him as a victim per se. However, the entire snafu could have been avoided had there been clear metadata properties to review before the time of sale. With machine-readable values, marketplaces like OpenSea could let customers filter and sort by available legal rights, and smart contracts attempting to use NFTs could reject ones that don't allow derivatives or commercial use.
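A filtering step like that is straightforward once rights are machine-readable. The sketch below assumes a hypothetical `rights` object with boolean flags; the listing data and field names are invented for illustration.

```python
# Hypothetical marketplace listings carrying machine-readable rights flags.
listings = [
    {"token_id": 1, "rights": {"commercial_use": True, "derivatives_allowed": False}},
    {"token_id": 2, "rights": {"commercial_use": False, "derivatives_allowed": False}},
    {"token_id": 3, "rights": {"commercial_use": True, "derivatives_allowed": True}},
]

def filter_by_rights(listings, **required):
    """Return listings whose rights satisfy every required flag."""
    return [
        l for l in listings
        if all(l.get("rights", {}).get(k) == v for k, v in required.items())
    ]

# A storefront (or a smart contract gatekeeper) could surface only tokens
# that permit both commercial use and derivatives.
usable = filter_by_rights(listings, commercial_use=True, derivatives_allowed=True)
print([l["token_id"] for l in usable])  # → [3]
```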
In a sense, it's somewhat absurd this doesn't already exist on the major minting platforms. Flickr popularized the ability to tag all uploaded photos with the appropriate Creative Commons license.
And to keep things easy, the default could be “all rights reserved” and let minters opt-in to more permissive licenses. For those uploading or bulk-generating NFTs on the command line, sane defaults can be in place with the option to override by passing through flags.
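For the command-line case, that default-plus-override pattern might look like the sketch below. This is not an existing minting tool; the flag names are assumptions meant only to show a conservative default ("All Rights Reserved") that a minter can override per batch.

```python
import argparse

# Hypothetical bulk-minting CLI with sane licensing defaults.
parser = argparse.ArgumentParser(description="Bulk-mint NFTs with license metadata")
parser.add_argument("--license", default="All Rights Reserved",
                    help="license applied to every minted token")
parser.add_argument("--copyright", default="",
                    help="optional copyright notice")

defaults = parser.parse_args([])                          # no flags: safe default
overridden = parser.parse_args(["--license", "CC0-1.0"])  # opt in to permissive

print(defaults.license)    # → All Rights Reserved
print(overridden.license)  # → CC0-1.0
```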
Things like terms of service might be overkill for most implementations, but a default copyright field would be trivial to include. Ideally, a UI would provide a dropdown for license selection. Intellectual property gets a bit hairy and might be optional unless someone is trying to be sophisticated.
Given that some NFTs may include code and custom logic, we may also need to introduce code licenses. These might range from the default open-source licenses to dual open/commercial licenses. In extreme situations, there may even be patents cited for particular logic included within or used by a given NFT. This may be an extreme outlier, but citing one or more patents would be trivial to support as an optional value.
I'm barely scratching the surface in 1-2 of the 25 potential categories listed above. Further conversation and stress testing are needed to put together something sensible.
As with the previous white paper, I believe this might take 100-200 hours to do correctly. A full-blown standards proposal with community buy-in might take north of 1,000 hours. However, the outcome could be a critical document that pushes the NFT industry forward.
10 years from now, if we still lack standards, the friction this will generate will slow adoption and innovation. However, if we can start getting some metadata best practices rolling out, we can start to see these benefits compounding right now. Interoperable NFTs are the goal.
Maybe this needs a legit Metadata DAO. Until that point, I plan on taking the first stab at a white paper.
Are you interested? Would you like to help stress test? Fund? Review?
Even if you would just read it, please drop me a line at firstname.lastname@example.org to let me know you’re interested (and what specific aspects you’d love to see resolved).