UtilityStack

Sitemap Generator — XML sitemap from URL list

Paste a list of URLs (one per line), pick a default change-frequency and priority, and the tool builds a valid Sitemaps-Protocol XML file you can upload to your domain root.

sitemap.xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2026-05-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2026-05-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
  <url>
    <loc>https://example.com/blog</loc>
    <lastmod>2026-05-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
  <url>
    <loc>https://example.com/contact</loc>
    <lastmod>2026-05-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.7</priority>
  </url>
</urlset>

Why a sitemap?

A sitemap is an XML file that lists every URL on your site you want search engines to know about. Crawlers can discover most pages by following links, but a sitemap gives them an explicit, machine-readable list, which is especially valuable for newly published pages, deep pages, and orphan pages no internal link reaches.

The Sitemaps Protocol (sitemaps.org) is supported by Google, Bing, Yandex, DuckDuckGo and most other engines. Each <url> entry has an optional <lastmod>, <changefreq> and <priority>. Modern crawlers mostly ignore <changefreq> and <priority> in favour of their own heuristics, but they still respect <lastmod> as a hint for re-crawling.

How to use this tool

  1. Paste one URL per line into the left pane. Use full https:// URLs — relative paths are not valid in a sitemap.
  2. Pick a default change-frequency (weekly is fine for most sites) and priority (0.5-0.8 for content pages, 1.0 for the home page). Toggle 'Include <lastmod>' to add today's date as the last-modified hint.
  3. Click Copy or Download to grab the sitemap.xml. Upload it to your domain root (https://yourdomain.com/sitemap.xml) and reference it from robots.txt or submit via Search Console.
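The steps above boil down to simple string templating. A minimal Python sketch of the same idea (not this tool's actual code; `build_sitemap` and its defaults are illustrative):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls, changefreq="weekly", priority="0.7", lastmod=None):
    """Build a Sitemaps-Protocol XML document from a list of absolute URLs."""
    lastmod = lastmod or date.today().isoformat()
    entries = []
    for url in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"       # escape &, <, > in URLs
            f"    <lastmod>{lastmod}</lastmod>\n"
            f"    <changefreq>{changefreq}</changefreq>\n"
            f"    <priority>{priority}</priority>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
```

Escaping `<loc>` values matters: URLs containing `&` (query strings) are invalid XML unless entity-encoded.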

Frequently asked questions

Where do I put the sitemap?

At the root of your domain so it's accessible at https://example.com/sitemap.xml. You can also place it in a subdirectory and reference that path in robots.txt — but the root is the convention.

How big can a sitemap be?

The protocol allows up to 50,000 URLs and 50 MB uncompressed per file. For larger sites, use a sitemap index file that points to multiple sitemaps. This generator produces a single sitemap; for index files, see the Sitemaps Protocol docs.
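For sites over the limit, the split-and-index approach looks roughly like this sketch, assuming files named sitemap-1.xml, sitemap-2.xml, and so on at the domain root (`chunk` and `build_index` are hypothetical helpers, not part of this tool):

```python
from datetime import date
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # per-file cap from the Sitemaps Protocol

def chunk(urls, size=MAX_URLS):
    """Split a URL list into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def build_index(sitemap_locs):
    """Build a sitemap index file pointing at the individual sitemaps."""
    today = date.today().isoformat()
    entries = "".join(
        f"  <sitemap><loc>{escape(loc)}</loc><lastmod>{today}</lastmod></sitemap>\n"
        for loc in sitemap_locs
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}</sitemapindex>\n"
    )

# 120,000 URLs -> three sitemap files plus one index.
parts = chunk([f"https://example.com/p/{i}" for i in range(120_000)])
index = build_index(
    f"https://example.com/sitemap-{n}.xml" for n in range(1, len(parts) + 1)
)
```

You submit only the index file; crawlers follow it to the individual sitemaps.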

Does Google still use changefreq and priority?

Mostly no. Google has stated that <changefreq> and <priority> are largely ignored in favour of behavioural signals. They still appear in the spec and Bing/Yandex give them slight weight, so it's fine to include them — just don't expect them to dramatically change crawl frequency.

Should I include <lastmod>?

Yes, when accurate. A precise <lastmod> tells crawlers which pages have changed since their last visit, which speeds up re-indexing. The catch: it must be accurate. Setting every entry to today every time defeats the purpose and may get the sitemap deprioritised.
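One way to keep `<lastmod>` honest is to derive it from each page's source-file modification time rather than the build date. A sketch (`lastmod_for` is a hypothetical helper):

```python
import os
import tempfile
from datetime import datetime, timezone

def lastmod_for(path):
    """Derive a W3C-format <lastmod> date from a file's mtime (UTC)."""
    mtime = os.path.getmtime(path)
    return datetime.fromtimestamp(mtime, tz=timezone.utc).strftime("%Y-%m-%d")

# Example: a page's source file drives its <lastmod> entry.
with tempfile.NamedTemporaryFile(suffix=".html", delete=False) as f:
    f.write(b"<h1>About</h1>")
    page = f.name

stamp = lastmod_for(page)
os.unlink(page)
```

In a real build you would map each URL to its source file (or its last git commit date) instead of a temp file.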

How do I submit it to Google?

Three options: 1) Reference it in robots.txt with 'Sitemap: https://example.com/sitemap.xml' — Google picks it up automatically. 2) Submit it manually in Search Console under Sitemaps. 3) Ping Google with GET https://www.google.com/ping?sitemap=URL, though Google deprecated the ping endpoint in 2023 and has since retired it, so don't rely on it. The robots.txt approach is the lowest-friction.
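To verify the robots.txt option is wired up, you can parse the file for Sitemap directives (the key is case-insensitive and the directive can appear anywhere in the file). A sketch with a hypothetical `sitemap_lines` helper:

```python
def sitemap_lines(robots_txt):
    """Extract the URL from every 'Sitemap:' directive in a robots.txt body."""
    found = []
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")  # split at the FIRST colon only
        if key.strip().lower() == "sitemap":
            found.append(value.strip())
    return found

robots = """User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
"""
found = sitemap_lines(robots)
```

Splitting at only the first colon matters, since the URL itself contains `https:`.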

Common use cases

Where a hand-built sitemap matters more than the auto-generated one.

Migrating from one platform to another

When moving from WordPress to a static site (or any platform change), the URL pattern often shifts. A fresh sitemap accelerates Google's discovery of the new URLs and shortens the period when both old and new versions race for the same keywords.

Onboarding a new domain

A brand-new domain has no inbound links and no crawl history — the sitemap is often the only signal Google has about what's there. Submit it the same day you launch and reference it from your robots.txt.

Promoting deep content

Long-tail content (case studies, deep documentation, archive pages) gets crawled less often. Listing them in the sitemap with a moderate priority is the closest thing to a 'please re-crawl' button that exists.

Diagnosing indexation issues

Submit a sitemap with the URLs you expect to be indexed. Search Console shows which were submitted vs which actually got indexed, surfacing problems (noindex tag, robots.txt disallow, duplicate content) page by page.

Tips and shortcuts

Habits that make sitemaps effective, not bloated.

Only include canonical, indexable pages

Don't list URLs that have noindex, return 4xx/5xx, or rel='canonical' to a different URL. Including them dilutes the sitemap's value and may cause Search Console to flag errors.
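A sketch of that filter, assuming you already have crawl results with each page's status code, noindex flag, and canonical URL (the dict shape and `sitemap_eligible` are illustrative, not a real crawler's output):

```python
def sitemap_eligible(pages):
    """Keep only pages that are live, indexable, and self-canonical."""
    return [
        p["url"] for p in pages
        if p["status"] == 200                            # drop 4xx/5xx
        and not p.get("noindex", False)                  # drop noindex pages
        and p.get("canonical", p["url"]) == p["url"]     # drop canonicalised dupes
    ]

pages = [
    {"url": "https://example.com/", "status": 200},
    {"url": "https://example.com/old", "status": 404},
    {"url": "https://example.com/draft", "status": 200, "noindex": True},
    {"url": "https://example.com/a?ref=x", "status": 200,
     "canonical": "https://example.com/a"},
]
eligible = sitemap_eligible(pages)
```

Of the four sample pages, only the home page survives all three checks.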

Regenerate on each deploy

If your site uses a static generator (Hugo, Astro, Next.js), wire sitemap generation into the build. A stale sitemap pointing at deleted pages is worse than no sitemap.

Keep one sitemap per language

If you publish in multiple languages, either include hreflang annotations in the sitemap (xhtml:link rel='alternate' hreflang) or split into one sitemap per language. Mixing without annotations confuses Google about which version to surface.
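The inline-annotation route adds `xhtml:link` alternates inside each `<url>` entry (plus the xhtml namespace declaration on `<urlset>`). A sketch, with a hypothetical `url_entry_with_alternates` helper:

```python
from xml.sax.saxutils import escape, quoteattr

# This namespace declaration must be added to the <urlset> element:
XHTML_NS = 'xmlns:xhtml="http://www.w3.org/1999/xhtml"'

def url_entry_with_alternates(loc, alternates):
    """Build a <url> entry that declares its language alternates inline."""
    links = "".join(
        f'    <xhtml:link rel="alternate" hreflang={quoteattr(lang)} '
        f"href={quoteattr(href)}/>\n"
        for lang, href in alternates.items()
    )
    return f"  <url>\n    <loc>{escape(loc)}</loc>\n{links}  </url>\n"

entry = url_entry_with_alternates(
    "https://example.com/en/pricing",
    {"en": "https://example.com/en/pricing",
     "de": "https://example.com/de/preise"},
)
```

Each language version lists all versions including itself, so the annotations stay reciprocal.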

Compress big sitemaps

Sitemaps can be served gzipped (.xml.gz); crawlers decompress them automatically. Note that the 50 MB limit applies to the uncompressed file, so gzip saves bandwidth but doesn't raise the cap: sites over the 50,000-URL or 50 MB limit must split into multiple sitemaps under an index file.
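Gzipping the file is a one-liner in most languages; a Python sketch that writes and round-trips a sitemap.xml.gz:

```python
import gzip
import os
import tempfile

xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
       '  <url><loc>https://example.com/</loc></url>\n'
       '</urlset>\n')

path = os.path.join(tempfile.mkdtemp(), "sitemap.xml.gz")
with gzip.open(path, "wt", encoding="utf-8") as f:   # write compressed text
    f.write(xml)

# Round-trip check: crawlers see the original XML after decompression.
with gzip.open(path, "rt", encoding="utf-8") as f:
    restored = f.read()
```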
