No matter what industry a website is in, its URL structure should be rooted in simplicity. Organize your content so that URLs can be constructed logically, and above all, keep them readable. Let’s say you’re looking for further information on London Bridge. You’re very likely to click a URL like http://en.wikipedia.org/wiki/London_Bridge, right?
You, or anyone else searching, would be far less likely to click a link that looks like www.sampleurlforthis.com/index.php?id_session=394&cuw=23ba8543r3o0sw2327676765.
Simple URLs Make A Huge Difference
A good way to keep URLs simple is sensible punctuation. http://thisisanexample.com/skincare-for-kids is easier to read than http://thisisanexample.com/skincareforkids. Hyphens (-) are the ideal word separator, and preferable even to underscores (_).
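As a minimal sketch, here is how a CMS might turn a page title into a readable, hyphen-separated slug. The `slugify` helper and its exact rules are illustrative assumptions, not part of any particular platform:

```python
import re

def slugify(title):
    """Convert a page title into a readable, hyphen-separated URL slug."""
    slug = title.lower()
    # Replace any run of characters that aren't letters or digits with one hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    # Trim hyphens left over from leading/trailing punctuation
    return slug.strip("-")

print(slugify("Skincare for Kids!"))  # skincare-for-kids
```

A real implementation would also need to handle non-ASCII characters, but the core idea is the same: hyphens between words, nothing else.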
When URLs are overly complicated or carry too many parameters, crawlers run into trouble. Such URLs tend to multiply into unnecessarily large numbers of addresses that all point to similar or identical content on your website. As a result, Googlebot may consume far more bandwidth than necessary, or fail to fully index your site’s content.
What Kind Of Problems Lead To Unnecessarily High URL Numbers?
Additive filtering of set items – Many websites offer different views of the same set of items or search results, letting users filter the set by specific criteria (‘show me coffee shops in Coventry’, for example). Filters that can be combined (‘coffee shops in Coventry with free wifi’, for example) cause the number of URLs to multiply, and a long list of coffee-shop pages that barely differ from one another is essentially redundant.
Laid out, it looks like this:
- Coffee shop at “value rates”: http://www.sample.com/coffee-shop-search-results.jsp?nq=643&S=954
- Coffee shop at “value rates” in Coventry: http://www.sample.com/coffee-shop-search-results.jsp?nq=643&S=954+56738546378
- Coffee shop at “value rates” in Coventry with free wifi: http://www.sample.com/coffee-shop-search-results.jsp?nq=643&S=954+56738546378+275726543
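The growth is combinatorial: every non-empty combination of filters can become its own URL. A short sketch makes the arithmetic concrete (the filter names and the `f` parameter are purely illustrative):

```python
from itertools import combinations

filters = ["value-rates", "coventry", "free-wifi"]

# Every non-empty combination of n filters can become its own URL,
# so n filters yield 2**n - 1 distinct filtered result pages.
urls = [
    "http://www.sample.com/coffee-shop-search-results.jsp?f=" + "+".join(combo)
    for r in range(1, len(filters) + 1)
    for combo in combinations(filters, r)
]
print(len(urls))  # 7 URLs from just 3 filters
```

With 10 combinable filters, that is already 1,023 URLs serving overlapping result sets.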
Broken relative links – These commonly create infinite spaces, usually through repeated path elements. For example: http://www.example.com/index.shtml/discuss/category/food/073120/html/recipes/category/airfryer/021020/html/category/vegan/364296/html/category/glutenfree/465437/html/Asia.htm
Calendar issues – When a calendar is dynamically generated, links to past and future dates alike may be generated without any start or end date restrictions.
Dynamic generation of documents – Small changes caused by advertisements, counters or timestamps can make otherwise identical pages appear different, producing many near-duplicate URLs.
Irrelevant URL parameters – Parameters such as referral tags change the URL without changing the content it serves.
Problematic URL parameters – Session IDs in particular can massively increase duplication, since every visit produces a new URL for the same page.
Sorting parameters – Many eCommerce websites offer several ways to sort the same set of items, each generating its own URL.
How Can URL Issues Best Be Resolved?
Add a nofollow attribute to links to future calendar pages – This applies to websites with an infinite calendar; it discourages crawlers from following endlessly generated date links.
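In practice this is just a `rel` attribute on the calendar’s navigation link; the path below is a made-up example:

```html
<!-- A "next month" link on an infinite calendar; the nofollow hint asks
     crawlers not to follow it into endlessly generated future pages -->
<a href="/calendar?month=2025-01" rel="nofollow">Next month</a>
```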
Address broken relative links – These can easily go unnoticed, especially once the website is up and running, so check for them regularly.
Make use of robots.txt – With this file in place, you can prevent Googlebot from accessing problematic URLs. Dynamically generated URLs, such as those that produce search results, should be blocked in particular, as should calendars that could create infinite spaces.
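A robots.txt covering both cases above might look like this sketch (the paths are illustrative and would need to match your site’s actual URLs):

```
User-agent: Googlebot
# Block dynamically generated search-result URLs
Disallow: /coffee-shop-search-results.jsp
# Block the infinite calendar space
Disallow: /calendar
```

Note that robots.txt stops crawling, not indexing: a blocked URL can still appear in results if other pages link to it.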
Shorten URLs – This may seem obvious, but it’s worth noting: take unnecessary parameters out of the equation.
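Stripping such parameters can be done with Python’s standard `urllib.parse` module. The list of parameter names below is an assumption for illustration; your own analytics and session parameters will differ:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that don't change the content served (illustrative list)
IRRELEVANT = {"id_session", "ref", "utm_source"}

def shorten(url):
    """Drop query parameters that don't affect the page content."""
    parts = urlsplit(url)
    # Keep only the parameters that actually select content
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IRRELEVANT]
    return urlunsplit(
        (parts.scheme, parts.netloc, parts.path, urlencode(kept), parts.fragment)
    )

print(shorten("http://www.sample.com/shop?item=42&id_session=394&ref=twitter"))
# http://www.sample.com/shop?item=42
```

The same logic is often applied server-side as a 301 redirect from the long URL to the short one, so crawlers consolidate on a single address.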
Looking for SEO services in London? Reach out to Market Jar today! We’re a Hammersmith-based digital marketing agency offering affordable services.