An XML sitemap gives search engines a map of the pages on your site that should be crawled, and it helps new or hard-to-reach pages get discovered. Note that a sitemap can only suggest pages to crawl — it cannot block them. To keep pages out of search results, use robots.txt or a noindex meta tag instead.
The XML sitemap is a very important tool that should always be kept up to date, since once you submit it to Google, Google will keep referencing it. That is why an automated setup that updates the sitemap as you update the site is preferred. If you can't automate it, manually create one at http://www.xml-sitemaps.com/
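For reference, here is a minimal sitemap sketch following the sitemaps.org protocol — the URLs, dates, and priorities below are placeholders, so substitute your own:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
    <lastmod>2012-01-10</lastmod>
  </url>
</urlset>
```

Only the loc tag is required for each entry; lastmod, changefreq, and priority are optional hints.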
Top things to think about while working on a sitemap:
1) Make sure the HTML sitemap matches the XML sitemap. This keeps things consistent, so there is no confusion for the spiders about which pages to crawl.
2) If your site has old pages ranking for terms you'd rather the new pages rank for, you can either:
301 redirect the old page to the new one, or keep the old page up, add a link from it to the new page, and tell the bots (in the meta robots tag) to follow but noindex. We actually recommend keeping the old page and linking to the new one — sometimes you can lose the ranking completely if you redirect.
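As a sketch of the two options above, assuming an Apache server and hypothetical page names (old-page.html and new-page.html are placeholders):

```apache
# Option A: 301 (permanent) redirect the old page to the new one, in .htaccess
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

```html
<!-- Option B: keep the old page live; in its <head>, ask bots not to index it
     but still follow its links (including the link to the new page) -->
<meta name="robots" content="noindex, follow">
```

Option B lets the old page keep passing link value to the new one while dropping itself out of the results.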
3) You must reference the XML sitemap in your robots.txt file.
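This is a single Sitemap line in robots.txt, using the full URL of the sitemap (example.com is a placeholder):

```txt
User-agent: *
Sitemap: http://www.example.com/sitemap.xml
```

If you have several sitemaps, add one Sitemap line per file.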