Search engines treat each of the following URLs as a distinct page, even though they all serve the same content:
- http://site.com/article
- http://www.site.com/article
- https://site.com/article
To tell search engines which version is the original, add a canonical link tag to the page's head:

```html
<link rel="canonical" href="https://MihaiBojin.com/"/>
```

(The `href` is the URL you choose to represent your original content.)

To set canonical URLs in Gatsby, install the following plugins:

```shell
npm install --save gatsby-plugin-react-helmet gatsby-plugin-react-helmet-canonical-urls
```
Then add the plugin to your `gatsby-config.js`:

```js
module.exports = {
  plugins: [
    {
      resolve: `gatsby-plugin-react-helmet-canonical-urls`,
      options: {
        // the site's base URL, e.g., loaded from an environment variable
        siteUrl: SITE_URL,
      },
    },
  ],
};
```
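With `siteUrl` configured, the plugin should automatically inject a canonical link on every page, derived from `siteUrl` plus the page's path. For a page served at `/article`, the rendered head should contain something like this (an illustrative sketch):

```html
<link rel="canonical" href="https://MihaiBojin.com/article" />
```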
Next, install `gatsby-plugin-sitemap` to generate a sitemap:

```shell
npm install gatsby-plugin-sitemap
```

Then configure it in `gatsby-config.js`. When I initially configured this plugin, this step took me a long time to get right; part of that was caused by a bug that has since been fixed.

```js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-sitemap',
      options: {
        output: '/sitemap',
        query: `
          {
            site {
              siteMetadata {
                siteUrl
              }
            }
            allSitePage(
              filter: {
                path: { regex: "/^(?!/404/|/404.html|/dev-404-page/)/" }
              }
            ) {
              nodes {
                path
              }
            }
          }
        `,
        resolvePages: ({ allSitePage: { nodes: allPages } }) => {
          return allPages.map((page) => {
            return { ...page };
          });
        },
        serialize: ({ path }) => {
          return {
            url: path,
            changefreq: 'weekly',
            priority: 0.7,
          };
        },
      },
    },
  ],
};
```
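With this configuration, every page should end up in the generated sitemap as an entry roughly like the one below (an illustrative sketch; the host is prepended from `siteUrl`, and the path is a placeholder):

```xml
<url>
  <loc>https://[SITE_URL]/article</loc>
  <changefreq>weekly</changefreq>
  <priority>0.7</priority>
</url>
```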
Whatever you do, don't omit `siteUrl`; it is required, as explained in the official docs.

You can verify the result locally by running:

```shell
gatsby build && gatsby serve
```

and accessing http://localhost:9000/sitemap/sitemap-index.xml. If everything is okay, publish your sitemap and then submit it to the Google Search Console.

Finally, install `gatsby-plugin-robots-txt`:

```shell
npm install --save gatsby-plugin-robots-txt
```

and add the following to your `gatsby-config.js`:
```js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        host: process.env.SITE_URL,
        sitemap: process.env.SITE_URL + '/sitemap/sitemap-index.xml',
        policy: [
          {
            userAgent: '*',
            allow: '/',
            disallow: ['/404'],
          },
        ],
      },
    },
  ],
};
```
At build time, the plugin will generate a `/robots.txt` file that specifies the options listed above.

Alternatively, you can skip the plugin and create the file yourself at `/static/robots.txt`, with the same contents:

```
User-agent: *
Allow: /
Disallow: /404
Sitemap: https://[SITE_URL]/sitemap/sitemap-index.xml
Host: https://[SITE_URL]
```
Make sure to replace SITE_URL with the appropriate value!
I prefer the plugin approach, since defining `SITE_URL` in a single place is less hassle in the long run.

Note that any file in the `/static` directory will be copied to and included as a static file on your site.

Also note the `Sitemap` directive, which lets crawlers know where they can find your sitemap index. Earlier in this article, we accomplished that using the Google Search Console, which is the better approach. However, there are search engines other than Google, and having this directive means they can all find and process your sitemap.

With the plugins in place, let's wrap everything in a reusable `Seo` component:

```jsx
import * as React from "react";
import PropTypes from "prop-types";
import { Helmet } from "react-helmet";

import { useSiteMetadata } from "../hooks/use-site-metadata";

function Seo({ title, description, tags, canonicalURL, lang }) {
  const { siteMetadata } = useSiteMetadata();
  const pageDescription = description || siteMetadata.description;

  const meta = [];
  const links = [];

  if (tags) {
    // define META keywords
    meta.push({
      name: "keywords",
      content: tags.join(","),
    });
  }

  if (canonicalURL) {
    links.push({
      rel: "canonical",
      href: canonicalURL,
    });
  }

  return (
    <Helmet
      htmlAttributes={{
        lang,
      }}
      // define META title
      title={title}
      titleTemplate={siteMetadata?.title ? `%s | ${siteMetadata.title}` : `%s`}
      link={links}
      meta={[
        // define META description
        {
          name: `description`,
          content: pageDescription,
        },
      ].concat(meta)}
    />
  );
}

Seo.defaultProps = {
  lang: `en`,
  tags: [],
};

Seo.propTypes = {
  title: PropTypes.string.isRequired,
  description: PropTypes.string.isRequired,
  tags: PropTypes.arrayOf(PropTypes.string),
  canonicalURL: PropTypes.string,
  lang: PropTypes.string,
};

export default Seo;
```
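As a sketch of the result: for a page titled "My article", with tags and a canonical URL set, the component should render head tags roughly like these (illustrative values, assuming the site title from the `gatsby-config.js` shown below):

```html
<html lang="en">
<head>
  <title>My article | Mihai Bojin's Digital Garden</title>
  <meta name="description" content="The page's description" />
  <meta name="keywords" content="gatsby,seo" />
  <link rel="canonical" href="https://MihaiBojin.com/article" />
</head>
```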
The component uses `react-helmet` to define the specified tags. It also relies on a custom hook, `useSiteMetadata`; I learned this trick from Scott Spence.

```js
import { useStaticQuery, graphql } from 'gatsby';

export const useSiteMetadata = () => {
  const { site } = useStaticQuery(
    graphql`
      query SiteMetaData {
        site {
          siteMetadata {
            title
            description
            author {
              name
              summary
              href
            }
            siteUrl
          }
        }
      }
    `,
  );

  return { siteMetadata: site.siteMetadata };
};
```
The hook queries the site metadata defined in `gatsby-config.js`:

```js
module.exports = {
  siteMetadata: {
    title: `Mihai Bojin's Digital Garden`,
    description: `A space for sharing important lessons I've learned from over 15 years in the tech industry.`,
    author: {
      name: `Mihai Bojin`,
      summary: `a passionate technical leader with expertise in designing and developing highly resilient backend systems for global companies.`,
      href: `/about`,
    },
    siteUrl: process.env.SITE_URL,
  },
};
```
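Since `siteUrl` is read from the environment, `SITE_URL` has to be set before `gatsby build` runs. A minimal sketch, assuming you keep it in `.env.*` files and use the `dotenv` package (an assumption; any mechanism that sets the variable works):

```js
// At the very top of gatsby-config.js:
// load .env.development or .env.production, depending on NODE_ENV
require("dotenv").config({
  path: `.env.${process.env.NODE_ENV}`,
});
```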
With all the pieces in place, you can include the component in any page, filling in each prop:

```jsx
<Seo title={} description={} tags={} canonicalURL={} />
```
To compute the canonical URL, I use a small helper that prefers a `canonical` value set in the page's frontmatter and falls back to the page's own URL:

```js
function renderCanonicalLink(frontmatter, siteUrl, location) {
  return frontmatter?.canonical || (siteUrl + location.pathname);
}
```
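Putting it all together, here's a sketch of how a blog post template might wire the helper into the `Seo` component (hypothetical: the template name, the frontmatter fields, the GraphQL shape, and the helper's import path all depend on your own setup):

```jsx
import * as React from "react";

import Seo from "../components/seo";
import { useSiteMetadata } from "../hooks/use-site-metadata";
// hypothetical location; assumes the helper above is exported from a module
import { renderCanonicalLink } from "../utils/canonical";

export default function BlogPostTemplate({ data, location }) {
  const { siteMetadata } = useSiteMetadata();
  const { frontmatter } = data.markdownRemark;

  return (
    <>
      <Seo
        title={frontmatter.title}
        description={frontmatter.description}
        tags={frontmatter.tags}
        canonicalURL={renderCanonicalLink(frontmatter, siteMetadata.siteUrl, location)}
      />
      {/* ...the post's content... */}
    </>
  );
}
```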