Sitemap & Robots

Velocity automatically generates sitemap and robots.txt files.

The sitemap is auto-generated by @astrojs/sitemap at:

https://yoursite.com/sitemap-index.xml

The sitemap is configured in astro.config.mjs; the site option is required, and the integration skips sitemap generation without it:

import { defineConfig } from 'astro/config';
import sitemap from '@astrojs/sitemap';

export default defineConfig({
  site: 'https://yoursite.com',
  integrations: [sitemap()],
});

Exclude pages from the sitemap:

sitemap({
  filter: (page) => !page.includes('/private/'),
})

Add external URLs:

sitemap({
  customPages: [
    'https://yoursite.com/custom-page',
  ],
})

Robots.txt is generated by a custom endpoint at src/pages/robots.txt.ts, which Astro's file-based routing serves as /robots.txt:

import type { APIRoute } from 'astro';

export const GET: APIRoute = () => {
  const sitemapUrl = new URL('/sitemap-index.xml', import.meta.env.SITE);

  const robotsTxt = `
User-agent: *
Allow: /
Sitemap: ${sitemapUrl.href}
`.trim();

  return new Response(robotsTxt, {
    headers: {
      'Content-Type': 'text/plain',
    },
  });
};

Block specific paths:

const robotsTxt = `
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /api/
Disallow: /private/
Sitemap: ${sitemapUrl.href}
`.trim();

Configure different rules per bot:

const robotsTxt = `
User-agent: Googlebot
Allow: /

User-agent: *
Allow: /
Disallow: /private/

Sitemap: ${sitemapUrl.href}
`.trim();
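
Rules can also vary by environment. A minimal sketch, assuming you want to block all crawlers outside production builds, using Astro's built-in import.meta.env.PROD flag:

import type { APIRoute } from 'astro';

export const GET: APIRoute = () => {
  const sitemapUrl = new URL('/sitemap-index.xml', import.meta.env.SITE);

  // import.meta.env.PROD is true only for production builds (a Vite/Astro built-in).
  const robotsTxt = import.meta.env.PROD
    ? `
User-agent: *
Allow: /
Sitemap: ${sitemapUrl.href}
`.trim()
    : `
User-agent: *
Disallow: /
`.trim();

  return new Response(robotsTxt, {
    headers: { 'Content-Type': 'text/plain' },
  });
};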
Sitemap best practices:

  • Keep each sitemap file under 50 MB
  • Limit each file to 50,000 URLs (see the entryLimit sketch after this list)
  • Update regularly so fresh content gets recrawled
  • Include only canonical URLs
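
The URL limit is handled for you: @astrojs/sitemap splits output into numbered files linked from sitemap-index.xml, and the per-file cap is configurable via its entryLimit option (45,000 by default):

sitemap({
  // Split output into files of at most 10,000 URLs each.
  entryLimit: 10000,
})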
Robots.txt best practices:

  • Don’t block CSS/JS files
  • Don’t use robots.txt to hide content (use noindex instead; see the sketch after this list)
  • Test with Google Search Console
  • Keep rules simple and specific
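
robots.txt only stops crawling; a disallowed page can still be indexed from external links. A minimal sketch of the noindex alternative, assuming Astro middleware at src/middleware.ts that tags on-demand-rendered /private/ responses with an X-Robots-Tag header (for static output, set the header at your host instead):

import { defineMiddleware } from 'astro:middleware';

export const onRequest = defineMiddleware(async (context, next) => {
  const response = await next();

  // Ask crawlers not to index anything under /private/.
  if (context.url.pathname.startsWith('/private/')) {
    const headers = new Headers(response.headers);
    headers.set('X-Robots-Tag', 'noindex');
    return new Response(response.body, { status: response.status, headers });
  }

  return response;
});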
In Google Search Console:

  1. Submit your sitemap URL
  2. Check for crawl errors
  3. Monitor indexing status

Use Google’s robots.txt Tester:

  1. Go to Search Console
  2. Select your property
  3. Use the robots.txt Tester tool