Every week, "Google Webmasters" shows crawl errors for URLs that have not existed for over 4 years. These URLs are from an old site that was replaced 2 years ago; our OpenCart site is now 2 years old. The sitemap in OpenCart has never included these URLs, and Google fetches our sitemap every day.
Therefore we need a system that ensures Google ONLY looks for URLs that are in the sitemap. But simply having a correct sitemap (which we already do) is clearly not enough, as Google keeps finding this ancient data and looking for it. What cure do you have for this problem, and at what price?
"Google web masters" believes that each and every page has hundreds of internal links. Our site is very small (14 info pages, 15 category pages and 44 product pages) and we have NO links within the body text of most of our pages. The only internal links are from the header and left margin navigation.
"Delivery details" page (from header navigation) has less than 50 words on it with NO links, yet google believes it has 332 internal links (as seen below).
What cure do you have for this problem, and at what price?