docs: add sitemap configuration and robots.txt for SEO #3675
base: next
Conversation
✅ Deploy Preview for fakerjs ready!
Codecov Report — ✅ All modified and coverable lines are covered by tests.

@@           Coverage Diff            @@
##             next    #3675   +/-   ##
=======================================
  Coverage   99.97%   99.97%
=======================================
  Files        2995     2995
  Lines      236313   236313
  Branches      941      940        -1
=======================================
  Hits       236256   236256
  Misses         57       57
Pull request overview
This PR adds SEO improvements by configuring automated sitemap generation and adding a robots.txt file to help search engines discover and index the documentation site.
- Added sitemap configuration to VitePress config with the production hostname
- Created robots.txt file with sitemap reference for search engine crawlers
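The config change described above is small; a minimal sketch of what the VitePress side likely looks like (the exact hostname and surrounding options are assumptions, not copied from the diff):

```typescript
// docs/.vitepress/config.ts (sketch, not the actual diff)
import { defineConfig } from 'vitepress'

export default defineConfig({
  // When a hostname is configured, VitePress generates sitemap.xml
  // automatically as part of the production build.
  sitemap: {
    hostname: 'https://fakerjs.dev', // assumed production hostname
  },
})
```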
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| docs/.vitepress/config.ts | Added sitemap configuration with hostname for automated sitemap.xml generation |
| docs/public/robots.txt | Created robots.txt file directing crawlers to the sitemap location |
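Files placed in `docs/public/` are copied verbatim to the site root, so the new file is served at `/robots.txt`. A minimal version matching the description above (the exact contents are an assumption):

```
# docs/public/robots.txt (sketch)
User-agent: *
Allow: /

Sitemap: https://fakerjs.dev/sitemap.xml
```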
How will this work with different subdomains? Like should next.fakerjs.dev/sitemap.xml point to a different domain?
I had the same question in my head and decided to ignore it for the first iteration. I'm not sure the other subdomains really need to be effectively crawled by bots.
Improve SEO with robots.txt and sitemap.xml (autogenerated)
For local evaluation, you need to serve a production build of the docs.
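Since the sitemap is only emitted by a production build, the dev server won't serve it. A hedged sketch of the local workflow (the script names are assumptions; check the repository's package.json for the real ones):

```shell
# Build the docs for production, then serve the generated output locally.
pnpm run docs:build     # assumed script wrapping `vitepress build`
pnpm run docs:preview   # assumed script wrapping `vitepress preview`
# The generated sitemap should then be reachable at /sitemap.xml.
```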
Preview: