SEO Strategies from Local Link Catalogs

The Demise of Link Directories

You often hear that link and article directories are dead, and by and large this is true for general link directory websites, with a few exceptions. For a historical overview and opinions, you can check articles from websites that regularly cover search engine optimization and the industry.

Testing Waters With Niche Link Directories

While it is clear that most websites functioning primarily as link catalogs are in decline, the matter is less settled for niche websites that combine strong authority content with a selective link catalog of vendors and/or service providers. This is why we are currently testing the concept with a Danish website – it has the following characteristics that differentiate it from other websites of its kind:

  • Articles on do-it-yourself window washing and equipment.
  • A local catalog of window washers in Denmark organized by city. For examples, see the city pages for Aarhus and Skanderborg.
  • All articles and listings are intertwined with local references and information. The city pages contain local information with regard to window washing as well as local listings of service providers.
  • All submissions are manually reviewed, so only websites from real service providers are accepted.

Tip: Having such a website is also a great way for web developers and search marketers to start interacting with a wide range of business owners.

How to Apply These Concepts

While the concepts described above work well for authority content and link catalog websites, they can also be applied by many service providers that target multiple cities in an area, e.g. a small state or country.

Building local pages with relevant information that searchers cannot find elsewhere can be the factor that gives the final push up in search rankings. Of course, some niches already have plenty of local competition doing this combined with effective link building, but in those that do not, this can be a very cost-effective SEO strategy that many website owners can even execute themselves.

2016 September Update

With the experiment being somewhat successful in capturing many long-tail searches, a sibling website (easy cleaning) was created, since many small businesses in this field offer several related services – hence the catalog and the business model for lead generation, ads, etc. could be shared.

A variant of the concept was also recently launched with (easy betting) – the goal being to test the concept in one of the more competitive niches in the search engine optimization and marketing world. Essentially, the providers in the catalog are organized by sports and offers, with some unique features as well. In essence, its model is very close to what has been described for the other websites. A later update will let readers know whether this was successful or not 🙂

Panda, Penguin and Penalties

Search Engine Updates

Large search engines like Google roll out updates to their algorithms all the time. Some are small, while others affect rankings a lot. Two of the best-known and “biggest” Google algorithm components are Panda (site and page quality analysis) and Penguin (backlink quality). Common to both is that they make or break many small companies, since search engine rankings can easily account for a large percentage of their sales.
Continue reading Panda, Penguin and Penalties

Running a Software Business

Practical Guide on Steps To Take

When starting out developing and selling software (or mobile and web services, for that matter), it is important to focus on finishing products and getting started on selling them. However, as your software business starts to grow, new problems and opportunities will arise.

This blog post will highlight some of the steps and decisions you can take to ensure your business continues to operate and run smoothly.

Continue reading Running a Software Business

XML Sitemaps Submission and Robots Text File

Sitemaps Autodiscovery With Robots Text File

Ever since the early days of the internet and search engines, the robots.txt file has been how website owners and their webmasters tell search engine crawlers like Googlebot which pages and content should be ignored and left out of search results.

This was the situation for many years until Google created Google Sitemaps. (This was later named XML Sitemaps Protocol as other search engines joined.)
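A minimal sitemap following the XML Sitemaps Protocol looks like the sketch below; the domain and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-09-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/articles/window-washing/</loc>
    <lastmod>2016-08-15</lastmod>
  </url>
</urlset>
```

Only the <loc> element is required; <lastmod>, <changefreq> and <priority> are optional hints that crawlers may or may not honor.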

New functionality called Sitemaps Autodiscovery was added to the robots.txt file, making it possible to point search engines to your XML sitemaps. Once search engine bots have downloaded and read the robots.txt file, they can automatically discover and retrieve the XML sitemap files located on the website.
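In practice, autodiscovery is a matter of adding Sitemap lines to robots.txt, as in this sketch (domain and paths are placeholders):

```text
# Crawl rules for all user agents
User-agent: *
Disallow: /admin/

# Sitemap directives are independent of the User-agent sections
# and must use a full, absolute URL
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-cities.xml
```

Multiple Sitemap lines are allowed, and the directive can appear anywhere in the file.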
Continue reading XML Sitemaps Submission and Robots Text File

Micro-ISV : Independent Software Vendor Company Blog