Event website sitemap

Sitemaps can help some search engines more efficiently index your event website.

Evessio supports XML sitemaps and sitemap links in robots.txt files, both of which are generated automatically. You can view your event's robots.txt file at, for example:

>> http://myevent.com/robots.txt
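A generated robots.txt file typically looks something like the following. This is an illustrative sketch using standard robots.txt directives, not the exact output for any given event:

```
# Hypothetical robots.txt for myevent.com (illustrative only)
User-agent: *
Allow: /

Sitemap: http://myevent.com/sitemap.xml
```

The `Sitemap:` line is what tells crawlers where to find the XML sitemap for the site.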

Sitemap creation

Evessio sitemaps are built in the same way that search engine crawlers index websites: every page and link is visited and catalogued, ensuring that all publicly available content appears in the sitemap.
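The crawl-and-catalogue approach can be sketched roughly as follows. This is a hypothetical illustration, not Evessio's actual implementation; page content is hardcoded in place of real HTTP fetches:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(pages, start="/"):
    """Breadth-first visit of every reachable page, cataloguing each URL once.
    `pages` maps a path to its HTML body (a stand-in for HTTP requests)."""
    seen, queue, catalogued = set(), [start], []
    while queue:
        path = queue.pop(0)
        if path in seen or path not in pages:
            continue
        seen.add(path)
        catalogued.append(path)
        parser = LinkExtractor()
        parser.feed(pages[path])
        queue.extend(parser.links)
    return catalogued

pages = {
    "/": '<a href="/agenda">Agenda</a><a href="/speakers">Speakers</a>',
    "/agenda": '<a href="/">Home</a>',
    "/speakers": "",
}
print(crawl(pages))  # -> ['/', '/agenda', '/speakers']
```

Every page reachable from the start page appears exactly once in the catalogue, which is then serialised into the sitemap XML.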

Generating a sitemap

  • WEBSITE > Pages > SEO
  • On demand
    The first request to the sitemap URL builds it if it does not already exist, e.g. for new events.
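The on-demand behaviour amounts to a check-then-build step on first request. A minimal sketch, in which `build_sitemap` and the in-memory cache are hypothetical stand-ins for the real generation process:

```python
_sitemap_cache = {}

def build_sitemap(event):
    """Stand-in for the full crawl; returns a minimal sitemap document."""
    return (f'<?xml version="1.0"?><urlset>'
            f'<url><loc>http://{event}/</loc></url></urlset>')

def get_sitemap(event):
    """The first request builds the sitemap if none exists;
    later requests serve the stored copy."""
    if event not in _sitemap_cache:
        _sitemap_cache[event] = build_sitemap(event)
    return _sitemap_cache[event]

print(get_sitemap("myevent.com"))  # built on first request, reused afterwards
```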

Updating a sitemap

  • The sitemap is automatically regenerated when you add and publish new pages. This happens in the background, and no action is required to keep the sitemap up to date.
  • You can also manually regenerate the sitemap using the "Generate sitemap" button. Because of the processing time required, this can be done at most once every 15 minutes.
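The 15-minute limit behaves like a simple cooldown on the regenerate action. A sketch with an injected clock for testability; the class and names are hypothetical, not Evessio internals:

```python
COOLDOWN_SECONDS = 15 * 60  # the 15-minute window between manual runs

class SitemapRegenerator:
    """Allows a manual regeneration at most once per cooldown period."""
    def __init__(self, clock):
        self.clock = clock      # callable returning the current time in seconds
        self.last_run = None

    def regenerate(self):
        now = self.clock()
        if self.last_run is not None and now - self.last_run < COOLDOWN_SECONDS:
            return False        # too soon: the button is effectively disabled
        self.last_run = now
        return True             # regeneration kicked off

now = [0]
gen = SitemapRegenerator(clock=lambda: now[0])
print(gen.regenerate())   # True: the first run is always allowed
print(gen.regenerate())   # False: still inside the 15-minute window
now[0] += 15 * 60
print(gen.regenerate())   # True: the cooldown has elapsed
```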

robots.txt

  • This file enables efficient and consistent indexing of event websites by search engines.
  • It is generated automatically by the system and provides instructions to crawlers about the sites hosted on the domain.
  • If multiple event sites are active on a single domain, each event site and its sitemap is added to the robots.txt file.
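Listing several event sites in one robots.txt can be pictured as emitting one `Sitemap:` line per site. A sketch only; the function name, the URL layout, and the event paths are illustrative assumptions, not the platform's actual scheme:

```python
def build_robots_txt(domain, event_paths):
    """Emit a robots.txt listing one Sitemap line per event site on the domain.
    Illustrative only; the real file is generated by the platform."""
    lines = ["User-agent: *", "Allow: /", ""]
    for path in event_paths:
        lines.append(f"Sitemap: http://{domain}{path}/sitemap.xml")
    return "\n".join(lines)

# Hypothetical domain hosting two event sites
print(build_robots_txt("myevent.com", ["/2023", "/2024"]))
```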

For more information about using sitemaps with each search engine, see:

Google: https://support.google.com/webmasters/answer/156184?hl=en&ref_topic=4581190
Bing: https://www.bing.com/webmaster/help/how-to-submit-sitemaps-82a15bd4