We recently had a client sign up with us to market her web site and help move it to a new hosting provider. The old host had the files locked down, using a template file outside of the root folder, and was unwilling to export the database. So we scraped the files as best we could using WebReaper, mostly just to grab the text, then set about cleaning them up. We found lots of errors in the code and a few optimization issues, but mostly we just rebuilt the same site using better, cleaner code. The client called us after just a couple of weeks and said that she had never had calls resulting from her web site before, and in the three weeks since we took it over she had gotten three!
So why is this? Well, with a better-built site, search engine bots are more likely to:
- Stay on the site longer.
- Be better able to reach all of the pages on the site.
- Give you higher trust scores and page rank because there will be fewer errors.
- Locate blocks of text to display in their search result descriptions.
- Extract proper keywords due to better keyword densities.
- Find all of your external and internal links including those to social media.
What did we do to her site? Anyone who knows code will instantly recognize a bad site and cringe when they see its source: it is ugly, it is long, it lacks flow, and it is unreadable. Here are a few tips to make your site return better results.
Clean your code. Simply making the code look nicer with indentation, line breaks, comments, etc. will help you spot problems.
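As a small illustration of what consistent indentation does for readability, here is a sketch using Python's standard library. Note the caveat: `minidom` is an XML parser, so this only works if the markup happens to be well-formed XHTML; loose real-world HTML needs a more forgiving tool.

```python
# Pretty-print well-formed markup with consistent indentation.
# Assumption: the input is well-formed XHTML, since minidom is
# an XML parser and will reject sloppy HTML outright.
from xml.dom import minidom

raw = "<ul><li>Home</li><li>Services</li><li>Contact</li></ul>"
pretty = minidom.parseString(raw).toprettyxml(indent="  ")
print(pretty)
```

Once the structure is laid out one element per line like this, a missing or misplaced tag tends to jump out at you instead of hiding in a wall of markup.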
Run a W3C report. The Markup Validation Service at the World Wide Web Consortium (W3C) is essential to help streamline a new or rebuilt web site.
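The W3C service is the authority here, but a quick local pre-check can catch one of the most common validation failures, mismatched tags, before you even submit. A rough sketch using Python's standard library (this checker is our own illustration, not the W3C tool):

```python
# Naive tag-balance pre-check: flags unexpected closing tags and
# tags left open. This is a sketch, not a substitute for the W3C
# validator, which checks far more than nesting.
from html.parser import HTMLParser

# Void elements never take a closing tag, so skip them.
VOID = {"area", "base", "br", "col", "embed", "hr", "img",
        "input", "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []   # currently open tags
        self.errors = []  # human-readable problem reports

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def close(self):
        super().close()
        # Anything still on the stack was never closed.
        for tag in self.stack:
            self.errors.append(f"unclosed <{tag}>")
        self.stack = []

checker = TagBalanceChecker()
checker.feed("<div><p>Hello<div></p>")
checker.close()
print(checker.errors)
```

Run the real W3C report regardless; this just keeps the obvious nesting mistakes from cluttering it.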
Test links. Broken links are bad all around, for bots and visitors alike. We use the entire suite of software from Inspyder, specifically in this case their broken link checker, which is part of InSite.
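Dedicated tools like InSite do this far more thoroughly, but the core idea fits in a short standard-library sketch: pull every `href` out of the page, then request each URL and record the status. The names here (`LinkExtractor`, `check_link`) are our own, purely illustrative.

```python
# Minimal broken-link check: extract hrefs, then probe each URL.
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag fed to it."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_link(url, timeout=10):
    """Return (url, status): an HTTP code or an error string."""
    try:
        req = Request(url, method="HEAD",
                      headers={"User-Agent": "link-check-sketch"})
        with urlopen(req, timeout=timeout) as resp:
            return url, resp.status
    except HTTPError as e:
        return url, e.code          # e.g. 404 for a broken link
    except URLError as e:
        return url, str(e.reason)   # DNS failure, timeout, etc.

extractor = LinkExtractor()
extractor.feed('<p><a href="https://example.com/">Home</a> '
               '<a href="/contact">Contact</a></p>')
print(extractor.links)
```

A real crawler would also resolve relative links like `/contact` against the page URL and follow internal links site-wide, which is exactly the legwork a tool like InSite automates.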
Just those few quick changes will help you improve search rank and load speed, give users a better experience, and save yourself a lot of headaches!