For most people reading this article, search engine optimization is a no-brainer: its primary purpose is to drive new users to your unknown brand, and if you did a good job on your product or service, you converted them into paying customers. Way to go!
However, if you're a builder in the web3 space, SEO has more than likely been an afterthought once you finished shipping that initial version of your idea. Now the question remains: how do I get people to find me, or pay attention to what I just built?
We're all aware that big brands dominate the headlines, but effective search engine optimization coupled with a killer funnel can turn a hobby side project into a global brand in a matter of 18-24 months.
Now, while this is titled “The SEO Guide to Next.js”, it's written primarily for the web3 audience. Why? For one, I believe SEO will be how dApps reach users on a global scale, and because I like skating to where the puck is going, not where it currently is. So let's dive in…
I'm going to assume you have a fundamental grasp of some of the SEO basics. If not, no worries; let's go over a few of them so we're all on the same page.
Title tags are the titles you see in your browser tab; they tell search engines, and more importantly users, what a particular page is about. The title is also what shows up in the SERPs (search engine result pages - see, I gotcha ;) of engines like Google.
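In Next.js, the usual way to set a title tag is with the built-in next/head component. Here's a minimal sketch - the page and the title text are just placeholders:

// pages/index.js - a minimal sketch of setting the title tag with next/head
import Head from 'next/head';

export default function Home() {
  return (
    <>
      <Head>
        {/* This is what shows up in the browser tab and in the SERPs */}
        <title>The SEO Guide to Next.js</title>
      </Head>
      <main>Hello, web3.</main>
    </>
  );
}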
Maybe you've heard of metadata on the internet, or maybe you haven't. The meta keywords field was the first SEO-related attribute that spammers took advantage of in the early days of the industry, and it's the reason search engines stopped placing much weight on it. It was too easy to manipulate and rank with, so search engines stopped relying on it.
Unbeknownst to many, the Open Graph protocol and JSON-LD are actually fairly new in terms of internet protocol age. Open Graph was introduced in 2010, and Schema.org in 2011, in order to make the web more easily understandable to search engine bots.
<meta property="og:title" content="The SEO Guide to Next.js" />
The structured data landscape is still very young, but with the deluge of IoT devices coming online in the next decade or two, this standardized format will only grow in usage and application.
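To make that concrete, here's a rough sketch of embedding JSON-LD in a Next.js page via next/head. The organization name and URL are placeholders borrowed from later in this article, not a prescribed schema:

// A sketch of embedding JSON-LD structured data with next/head
import Head from 'next/head';

export default function About() {
  // Placeholder Organization schema - swap in your own entity details
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'Organization',
    name: 'Gam.eth',
    url: 'https://gam.eth.link/',
  };

  return (
    <Head>
      {/* JSON-LD is served as a script of type application/ld+json */}
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
    </Head>
  );
}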
Sitemaps provide search engine bots with a roadmap of how to best categorize and index your website and/or app pages. Without one, bots are flying blind, and trust me, you don't want to leave it up to Google to infer what your page means. They've gotten better over the years, but a recently updated sitemap ensures the search index keeps up with the most recent information about your business.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/foo.html</loc>
    <lastmod>2018-06-04</lastmod>
  </url>
</urlset>
The purpose of the robots.txt file is to let bots know which URL paths not to crawl. Wait, I thought the sitemap did that? Well, yes and no: the sitemap provides the roadmap, while robots.txt puts up the roadblocks on the paths you don't want bots to go down. Adding noindex and/or nofollow meta tags to the pages you don't want indexed (see the snippet after the robots.txt example below), in addition to setting the path in the robots.txt file, gives bots a strict description of what to obey - just keep in mind a bot can only read those meta tags on pages it's still allowed to crawl.
User-agent: Googlebot
Disallow: /nogooglebot/
User-agent: *
Allow: /
Sitemap: http://www.example.com/sitemap.xml
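And here's a minimal sketch of what that noindex/nofollow meta tag looks like on a single Next.js page; the dashboard page is hypothetical:

// pages/dashboard.js - a sketch of marking one page as noindex/nofollow
import Head from 'next/head';

export default function Dashboard() {
  return (
    <>
      <Head>
        {/* Tells compliant bots not to index this page or follow its links */}
        <meta name="robots" content="noindex, nofollow" />
      </Head>
      <main>Internal dashboard - not meant for the search index</main>
    </>
  );
}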
next-sitemap offers the ability to generate a robots.txt file for you (via its generateRobotsTxt option), while next-seo lets you set canonicals and alternate language versions, and leverage the Open Graph protocol to populate your Twitter and Facebook metadata. You can even add your own metadata tags to your config.
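We'll install and configure both packages later in the article, but as a quick preview, a page-level sketch of those next-seo options might look something like this (the domains and tag values are placeholders):

// A preview sketch of next-seo's canonical, alternate-language, and custom tag options
import { NextSeo } from 'next-seo';

export default function Page() {
  return (
    <NextSeo
      canonical="https://example.eth.limo/"
      languageAlternates={[
        { hrefLang: 'es', href: 'https://es.example.eth.limo/' },
        { hrefLang: 'fr', href: 'https://fr.example.eth.limo/' },
      ]}
      additionalMetaTags={[
        // your own custom metadata tags
        { name: 'application-name', content: 'Example dApp' },
      ]}
    />
  );
}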
So, can search engine bots crawl and rank React apps? Of course they can; how highly they rank is another matter. The reason lies in the way search engine bots crawl the web. The crawl process is broken down into two distinct phases:
Processing - Parsing the HTML
Rendering - Executing the JavaScript
Here's a diagram if you're more of a visual person.
Notice that at the processing stage the crawler tries to parse the HTML, and anything it can't process efficiently gets pushed off to the render queue. Just to give you an idea of the scale we're talking about, here are some interesting facts about the search behemoth that is Google:
Every query has to travel on average 1,500 miles to a data center and back to return the answer to the user. [1]
A single Google query uses 1,000 computers in 0.2 seconds to retrieve an answer.
In 1999, it took Google one month to crawl and build an index of about 50 million pages. In 2012, the same task was accomplished in less than one minute.
Google cares about delivering the most accurate answer to a query in the fastest possible time, so any impediments get pushed off to the side to be dealt with later (i.e., the render phase). We want to provide a frictionless experience for search engine bots so that they can process our web apps and index them in an orderly manner.
What Are Some Of The Most Common React SEO Issues?
Empty First Pass Content - An app shell model that's empty on first pass and has to execute JavaScript in order to load the page content.
Loading Time and UX - Google introduced Web Vitals to help developers measure and improve loading speed and user experience.
LCP (Largest Contentful Paint) - measures loading performance
FID (First Input Delay) - measures interactivity
CLS (Cumulative Layout Shift) - measures visual stability
Sitemap - React doesn't come with a built-in sitemap generator, though tools like react-router-sitemap can help.
Metadata - React has no built-in head manager; components like react-helmet offer only limited configuration of metadata in the <head>.
URL Structure - Limit your use of fragments (/#/about) in your URLs, because Googlebot won't crawl those links (see the routing sketch below).
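For example, here's a rough sketch (assuming react-router v6) of history-based routing, which produces crawlable paths like /about instead of fragment URLs like /#/about:

// A sketch of history-based routing with react-router v6 - real paths, no fragments
import { BrowserRouter, Routes, Route } from 'react-router-dom';

function Home() { return <h1>Home</h1>; }
function About() { return <h1>About</h1>; }

export default function App() {
  return (
    // BrowserRouter uses the History API, so each page gets a crawlable URL
    <BrowserRouter>
      <Routes>
        <Route path="/" element={<Home />} />
        <Route path="/about" element={<About />} />
      </Routes>
    </BrowserRouter>
  );
}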
Now, loading time isn't necessarily an issue per se - we all know React apps are lightning fast - but if your site isn't indexed, what does it matter how fast it is when users can't find it?
Now we can get to the meat and potatoes of the article, and most likely the reason you decided to stop by.
Next.js comes with a lot of what you need to handle basic SEO on a single-page app right out of the box; however, if you're building anything with more than a handful of pages, it's wise to install a couple of packages to multiply your SEO efforts.
Install next-sitemap
npm i next-sitemap
or you can alternatively run
yarn add next-sitemap
Create a config file named next-sitemap.js under the project root (newer versions of next-sitemap look for next-sitemap.config.js instead), and configure your sitemap options.
/** @type {import('next-sitemap').IConfig} */
module.exports = {
  siteUrl: 'https://example.eth.limo',
  generateRobotsTxt: true, // (optional)
  sitemapSize: 5000, // (optional)
  changefreq: 'daily', // (optional)
  priority: 0.7, // (optional - default is 0.7)
  exclude: ['/404'], // (optional)
  alternateRefs: [
    {
      href: 'https://es.example.eth.limo',
      hreflang: 'es',
    },
    {
      href: 'https://fr.example.eth.limo',
      hreflang: 'fr',
    },
    {
      href: 'https://de.example.eth.limo',
      hreflang: 'de',
    },
  ],
};
Add "postbuild": "next-sitemap" to your package.json under scripts.
"scripts": {
"dev": "next dev",
"build": "next build",
"postbuild": "next-sitemap",
"start": "next start"
},
Install the fabulous next-seo package, created by Gary Meehan.
npm i next-seo
or you can alternatively run
yarn add next-seo
This is what your web page looks like before next-seo is installed.
This is what your web page looks like after next-seo is installed.
In order to activate the basic SEO options, just drop the following <NextSeo /> component into your page; it takes the place of the metadata you would otherwise hand-roll in <Head>:
import { NextSeo } from 'next-seo';

<NextSeo
  title="Gam.eth SEO Marketing Class"
  description="Gam.eth SEO Marketing Class of 2022"
  canonical="https://gam.eth.link/"
  openGraph={{
    url: 'https://gam.eth.link/',
    title: 'Gam.eth SEO Marketing Class',
    description: 'Gam.eth SEO Marketing Class of 2022',
    images: [
      {}, // add your og:image objects here ({ url, width, height, alt })
    ],
    site_name: 'Gam.eth',
  }}
/>
Now, if you only have a one-pager with no other routes, that's all you need to do and you're good to go: Google can now more efficiently crawl your web app and index it accordingly.
USE CASE:
Best for single page apps - when you don’t have any other pages besides your home page, and you don’t foresee ever needing to add new pages in the future.
<DefaultSeo /> gives you a default set of SEO properties present on all pages, while remaining flexible enough to override properties on a page-by-page basis. To accomplish this setup you'll need a few things in place before we get started.
Custom App - create a new file named _app.js in your pages directory. Here's a standard example.
import App from 'next/app';
import { DefaultSeo } from 'next-seo';

export default class MyApp extends App {
  render() {
    const { Component, pageProps } = this.props;
    return (
      <>
        {/* DefaultSeo must come before Component so every page picks up the defaults */}
        <DefaultSeo
          openGraph={{
            type: 'website',
            locale: 'en_IE',
            url: 'https://www.url.ie/',
            site_name: 'SiteName',
          }}
          twitter={{
            handle: '@handle',
            site: '@site',
            cardType: 'summary_large_image',
          }}
        />
        <Component {...pageProps} />
      </>
    );
  }
}
In order to work properly, <DefaultSeo /> should be placed above (before) <Component /> due to the way Next.js handles certain internal behaviors; you can think of it as the stand-in for the <NextSeo /> component on pages that don't set their own.
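To override those defaults on an individual page, you just drop a <NextSeo /> component into that page; any prop you set there wins over the default. A minimal sketch - the page name and copy are placeholders:

// pages/about.js - a sketch of overriding the DefaultSeo values on one page
import { NextSeo } from 'next-seo';

export default function About() {
  return (
    <>
      <NextSeo
        title="About Gam.eth"
        description="Page-specific description that overrides the default"
      />
      <main>About us</main>
    </>
  );
}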
Another way to achieve the same effect is to store all your default values in a config file named next-seo.config.js:
export default {
  openGraph: {
    type: 'website',
    locale: 'en_IE',
    url: 'https://www.url.ie/',
    site_name: 'SiteName',
  },
  twitter: {
    handle: '@handle',
    site: '@site',
    cardType: 'summary_large_image',
  },
};
Then import it into your _app.js file like so:
import SEO from '../next-seo.config';
And then the <DefaultSeo /> component can be used like this:
<DefaultSeo {...SEO} />
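Putting the pieces together, a minimal _app.js using the config file might look something like this sketch (shown here as a function component rather than the class-based example above):

// pages/_app.js - a sketch combining next-seo.config.js with DefaultSeo
import { DefaultSeo } from 'next-seo';
import SEO from '../next-seo.config';

export default function MyApp({ Component, pageProps }) {
  return (
    <>
      {/* Defaults come from next-seo.config.js and apply to every page */}
      <DefaultSeo {...SEO} />
      <Component {...pageProps} />
    </>
  );
}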
Now the same defaults will be applied across all your pages.
next-seo offers so many options that I couldn't possibly include them all and still call this an article; it would be more like a book. You can find all the available config options in the GitHub repo.
Well, that brings us to the end of our Next.js SEO guide. The amount of options and data available to you in the next-seo config is immense: from setting metadata for social networks to adding JSON-LD for entity recognition by search engines, the options are nearly limitless.
As tempting as it is to dive head first into the world of search engine optimization, if you're not confident of the desired outcome you can actually cause more harm than good, in the form of bots indexing incorrect configurations and causing lasting damage to your new web3 project or business.
It's rare for Google to delist sites based on on-page SEO alone, but if you do get delisted, appealing a Google penalty can be a long and arduous journey. In rare cases the advice is to start over on a brand-new domain name, and trust me, that's not a road you want to have to go down.
If you need any help in your SEO or digital marketing efforts please feel free to reach out on Twitter: @gigabit_eth