Sitemap Generator — XML sitemap from URL list
Paste a list of URLs (one per line), pick a default change-frequency and priority, and the tool builds a valid Sitemaps-Protocol XML file you can upload to your domain root.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2026-05-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2026-05-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
  <url>
    <loc>https://example.com/blog</loc>
    <lastmod>2026-05-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
  <url>
    <loc>https://example.com/contact</loc>
    <lastmod>2026-05-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
</urlset>
A sitemap is an XML file that lists every URL on your site you want search engines to know about. Crawlers can discover most pages by following links, but a sitemap gives them an explicit, complete list, which is especially important for newly published pages, deep pages, and orphan pages that no internal link reaches.
The Sitemaps Protocol (sitemaps.org) is supported by Google, Bing, Yandex, DuckDuckGo and most other engines. Each <url> entry requires a <loc> and optionally takes <lastmod>, <changefreq> and <priority>. Modern crawlers mostly ignore <changefreq> and <priority> in favour of their own heuristics, but they still respect <lastmod> as a hint for re-crawling.
Place the file at the root of your domain so it's accessible at https://example.com/sitemap.xml. You can also place it in a subdirectory and reference that path in robots.txt, but the root is the convention.
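A minimal robots.txt that advertises the sitemap needs only the Sitemap directive, which can appear anywhere in the file (the URL below is a placeholder for your own domain):
User-agent: *
Sitemap: https://example.com/sitemap.xml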
The protocol allows up to 50,000 URLs and 50 MB uncompressed per file. For larger sites, use a sitemap index file that points to multiple sitemaps. This generator produces a single sitemap; for index files, see the Sitemaps Protocol docs.
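For orientation, a minimal index file uses the same schema with <sitemapindex> and <sitemap> in place of <urlset> and <url>; the child sitemap names below are placeholders:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
    <lastmod>2026-05-07</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
    <lastmod>2026-05-07</lastmod>
  </sitemap>
</sitemapindex>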
<changefreq> and <priority> carry little weight: Google has stated that both are largely ignored in favour of behavioural signals. They still appear in the spec, and Bing and Yandex give them slight weight, so it's fine to include them; just don't expect them to dramatically change crawl frequency.
<lastmod> is worth setting, but only when accurate: a precise <lastmod> tells crawlers which pages have changed since their last visit, which speeds up re-indexing. Setting every entry to today on every regeneration defeats the purpose and may get the sitemap deprioritised.
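As a sketch, accurate entries carry per-page dates rather than all matching the generation date (URLs and dates here are placeholders):
<url>
  <loc>https://example.com/blog/launch-post</loc>
  <lastmod>2026-04-12</lastmod>
</url>
<url>
  <loc>https://example.com/pricing</loc>
  <lastmod>2026-01-30</lastmod>
</url>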
There are three ways to submit a sitemap to Google: 1) Reference it in robots.txt with 'Sitemap: https://example.com/sitemap.xml'; Google picks it up automatically. 2) Submit it manually in Search Console under Sitemaps. 3) Ping Google with GET https://www.google.com/ping?sitemap=URL, though Google deprecated this endpoint in 2023 and has since shut it down. The robots.txt approach is the lowest-friction.
Where a hand-built sitemap matters more than the auto-generated one.
When moving from WordPress to a static site (or any platform change), the URL pattern often shifts. A fresh sitemap accelerates Google's discovery of the new URLs and shortens the period when both old and new versions compete for the same keywords.
A brand-new domain has no inbound links and no crawl history, so the sitemap is often the only signal Google has about what's there. Submit it the same day you launch and add the Sitemap line to your robots.txt.
Long-tail content (case studies, deep documentation, archive pages) gets crawled less often. Listing it in the sitemap with an accurate <lastmod> is the closest thing to a 'please re-crawl' button that exists.
Submit a sitemap with the URLs you expect to be indexed. Search Console shows which were submitted vs which actually got indexed, surfacing problems (noindex tag, robots.txt disallow, duplicate content) page by page.
Habits that make sitemaps effective, not bloated.
Don't list URLs that carry noindex, return 4xx/5xx, or point rel='canonical' at a different URL. Including them dilutes the sitemap's value and may cause Search Console to flag errors.
If your site uses a static generator (Hugo, Astro, Next.js), wire sitemap generation into the build. A stale sitemap pointing at deleted pages is worse than no sitemap.
If you publish in multiple languages, either include hreflang annotations in the sitemap (xhtml:link rel='alternate' hreflang) or split into one sitemap per language. Mixing without annotations confuses Google about which version to surface.
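A sketch of the annotated form, assuming an English and a German version of the same page (URLs are placeholders). Each <url> entry lists every alternate, including itself, and each language version needs its own reciprocal entry:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en/about</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/about"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/ueber-uns"/>
  </url>
  <url>
    <loc>https://example.com/de/ueber-uns</loc>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/ueber-uns"/>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/about"/>
  </url>
</urlset>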
Sitemaps can be served gzipped (.xml.gz), and crawlers decompress them automatically. Note that the 50 MB limit applies to the uncompressed file, so gzipping saves bandwidth but doesn't raise the cap; sites over 50,000 URLs or 50 MB must split into multiple sitemaps under an index file.