Saturday, April 22, 2017

Keyword Research

Keyword research enables site owners to choose the right keywords when constructing and optimizing a website or page. It is also used extensively to manage PPC advertising campaigns and to research and identify profitable niche markets.

When performing keyword research for constructing and optimizing a website or page, you are looking to select search terms that will reach your target audience. A common mistake made by some SEOs is to avoid keywords that are very competitive. Searchers tend to use a lot of modifiers when they search, and the more competitive the search term, the more likely they are to add a modifier. Ruling out the possibility of ranking well for these modified searches is not a good idea. Competition can be a factor in deciding how to target a specific search term, but you should never ignore a search term simply because you believe it is too competitive.

There are two basic tools for the site owner:

1. Digital Point Solutions Keyword Suggestion Tool

Digital Point’s online keyword tool compares Overture and WordTracker data side by side. It is free, quick and easy to use, although it lacks WordTracker’s more advanced features. Here is a partial screen shot.

[Screen shot: keyword suggestions for "laptop battery"]

2. Google AdWords Keyword Tool

You will need an AdWords account to use the Google AdWords Keyword Tool, but signing up is easy and well worth it just to use the tool. Although primarily designed for AdWords, it is also ideal for use as a simple keyword suggestion tool. The big advantage, of course, is that it uses the latest Google data and lets you find and select keywords based on that data. You can generate keywords from a URL (i.e. one page), a whole site, a keyword or keywords that you enter, or, for AdWords users, the most relevant terms in your account. The results are shown by relevance but can be ordered by Advertiser Competition or Search Volume on a scale of 1 to 5. You can also download the results as a .csv (Excel) file, which makes it easy to compile master lists.

Here is a partial screen shot of a list from a keyword.

[Screen shot: Google AdWords Keyword Tool suggestions generated from a keyword]

Here is a partial screen shot of a list from a URL.

[Screen shot: Google AdWords Keyword Tool suggestions generated from a URL]
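Once you have downloaded one or more .csv exports from the tool, a few lines of Python can merge them into a single master list. This is only a minimal sketch: the file pattern and the column names ("Keyword", "Search Volume") are assumptions, so check them against the header row of your own export.

    import csv
    import glob

    # Merge several keyword .csv exports into one de-duplicated master list.
    # Column names are assumed -- adjust them to match your export's header row.
    master = {}

    for path in glob.glob("keyword-export-*.csv"):
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                keyword = (row.get("Keyword") or "").strip().lower()
                if not keyword:
                    continue
                try:
                    volume = int((row.get("Search Volume") or "0").replace(",", ""))
                except ValueError:
                    volume = 0
                # Keep the highest volume seen if a keyword appears in several files.
                master[keyword] = max(master.get(keyword, 0), volume)

    with open("master-list.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["Keyword", "Search Volume"])
        for keyword, volume in sorted(master.items()):
            writer.writerow([keyword, volume])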

Most site owners will find the above tools sufficient for their needs, but if you want to investigate other tools there are basically two kinds: keyword analytical tools and subscription-based tools. Here are some examples, each with a link to the product and a link to a review of the product.

Keyword Analytical Tools:

  • Keyword Analyzer Review
  • The Keyword Bible Review
  • The Dowser Review

Subscription-based:

  • WordTracker Review
  • Keyword Discovery Review
  • Keyword Intelligence Review

A word of caution, though, if you try these tools. The major search engines (Google, Yahoo and MSN) do not make their raw data available to anyone, so these products have to obtain their data from somewhere else. For example, WordTracker uses data from the Metacrawler and Dogpile metasearch engines, which represents a very small and not very representative sample of searches. Not only that, but in order to estimate figures like the predicted number of searches for a keyword, an extrapolation has to be made. In WordTracker’s case they assume Metacrawler and Dogpile account for 0.86% of all search engine queries (a dubious statistic in itself) and scale up the numbers in their database accordingly. This has the effect of compounding any errors in the original dataset, and at the very least it means that these derived numbers should not be taken too seriously.
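To see why the scaling compounds errors, here is the extrapolation arithmetic as a small sketch. The sample counts are invented; the 0.86% share is the figure quoted above.

    # Scale-up used to predict total searches from a small metasearch sample.
    SAMPLE_SHARE = 0.0086   # assumed Metacrawler + Dogpile share of all queries

    sample_count = 120      # hypothetical number of searches seen in the sample
    print(round(sample_count / SAMPLE_SHARE))         # about 13,953 predicted searches

    # Any error in the sample is multiplied by the same factor (roughly 116x),
    # so being off by just 10 searches shifts the prediction by more than 1,000.
    print(round((sample_count + 10) / SAMPLE_SHARE))  # about 15,116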

One of the most important sources of keywords, and one that is often overlooked, is your server logs. Regularly mine your server log data to find the search terms people are actually using to reach your site, and use these terms to construct new pages or modify existing ones. You can read more about this process in these two posts: Long Tail Search and Long Tail Search Tool.
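As a rough illustration of the idea, here is a minimal sketch that pulls search phrases out of an Apache combined-format access log by reading the referrer field. The log file name and the list of query-string parameter names are assumptions; adjust them for your own server and for whichever engines pass the search query in the referrer.

    import re
    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    # In the combined log format the referrer is the second-to-last quoted field.
    referrer_re = re.compile(r'"([^"]*)" "[^"]*"$')
    search_params = ("q", "p", "query")   # common search query-string parameters
    phrases = Counter()

    with open("access.log", encoding="utf-8", errors="replace") as log:
        for line in log:
            match = referrer_re.search(line.strip())
            if not match:
                continue
            referrer = urlparse(match.group(1))
            for name in search_params:
                for value in parse_qs(referrer.query).get(name, []):
                    phrases[value.lower()] += 1

    # Print the twenty most common phrases that brought visitors to the site.
    for phrase, count in phrases.most_common(20):
        print(f"{count:5d}  {phrase}")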

March 1st, 2016: Wordtracker have introduced a free keyword suggestion tool that will generate up to 100 related keywords.

May 11, 2016: Wordze is a new subscription-based tool which has some interesting features.


Friday, April 14, 2017

G profile video

https://www.youtube.com/channel/UCKKraAibgkEjOqvxkC2Vw0A

A short case study showing some rankings and traffic.


Sunday, April 2, 2017

Search Engine Friendly Urls

It is important to have search engine friendly URLs if you want your pages spidered and indexed by the search engines, but what does having search engine friendly URLs actually mean? Let’s take a look at what the three major search engines say about URLs:

Google has three things to say on the subject in its Webmaster Guidelines:

1. If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

2. Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

3. Don’t use “&id=” as a parameter in your URLs, as we don’t include these pages in our index.

Yahoo in their Search Indexing FAQ say:

Do you index dynamically generated pages (e.g., asp, .shtml, PHP, “?”, etc.)?

Yahoo! does index dynamic pages, but for page discovery, our crawler mostly follows static links. We recommend you avoid using dynamically generated links except in directories that are not intended to be crawled/indexed (e.g., those should have a /robots.txt exclusion).

MSN’s Guidelines for successful indexing say:

Keep your URLs simple and static. Complicated or frequently changed URLs are difficult to use as link destinations. For example, the URL http://ift.tt/1iUlcEq is easier for MSNBot to crawl and for people to type than a long URL with multiple extensions.

The message is clear: static URLs are better than dynamic ones, but if you have a dynamic site the URLs must be as simple as possible, with only one or two query-string parameters and no session IDs.

A URL that might look like this:

http://ift.tt/2ntCivH

should preferably look like this:

http://ift.tt/2n115vH

How you achieve this depends on whether you are starting out with a new site or have an established site with existing complex URLs.

If it is a new site, then search engine friendly URLs must be built into the design criteria. How this is done depends on the programming language. For example, if you plan to use PHP you might make use of the PATH_INFO variable, or if you use ASP.NET you could modify the Global.asax file.

If you plan to use a content management system (CMS), make sure that it generates search engine friendly URLs out of the box. The Content Management Comparison Tool has a check box for ‘Friendly URLs’ if you are researching CMS tools.

A completely different approach (not approved of by geeks, but worth considering if you are designing your own site as a non-professional) is to create static HTML web pages from a database or spreadsheet, but not in real time. WebMerge, for example, works with any database or spreadsheet that can export in tabular format, such as FileMaker Pro, Microsoft Access, and AppleWorks. Using HTML template pages, WebMerge makes a new HTML page from the data in each record of the exported file. It can also create index pages with links to the other pages, and the generated pages can be hosted without the need for a database.

If it is an existing site, then problematic URLs can be converted to simple URLs in real time. If you are on an Apache server you can use mod_rewrite to rewrite requested URLs on the fly. This requires knowledge of regular expressions, which can be rather daunting if you are not a programmer; fortunately there is an abundance of mod_rewrite expertise at RentACoder if you get stuck. If you are on Internet Information Server (IIS) you can use something like ISAPI_Rewrite, which also requires knowledge of regular expressions.
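To make the idea concrete, here is the kind of mapping such a rewrite rule performs, sketched in Python rather than in mod_rewrite or ISAPI_Rewrite syntax. The URL layout (/products/123/blue-widget standing in for product.php?id=123) is an invented example.

    import re

    # A friendly URL is translated back into the dynamic URL the application
    # actually understands. mod_rewrite or ISAPI_Rewrite apply this kind of
    # pattern on the server, before the page is generated.
    friendly = "/products/123/blue-widget"

    rewritten = re.sub(
        r"^/products/(\d+)/[\w-]+$",   # capture the numeric product id
        r"/product.php?id=\1",         # rebuild the query-string version
        friendly,
    )
    print(rewritten)  # /product.php?id=123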

Whatever your solution, you should try to incorporate your keywords in the URLs, and only ever use hyphens as word separators, never an underscore or a space.
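A small helper along these lines (a sketch, not taken from any particular framework) shows one way to turn a page title or keyword phrase into a lowercase, hyphen-separated URL segment.

    import re

    def slugify(title: str) -> str:
        """Turn a page title into a lowercase, hyphen-separated URL segment."""
        slug = title.lower()
        slug = re.sub(r"[^a-z0-9]+", "-", slug)   # spaces, underscores, punctuation -> hyphen
        return slug.strip("-")

    print(slugify("Cheap Laptop Batteries & Chargers"))   # cheap-laptop-batteries-chargers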


Monday, March 20, 2017

Frames

Much like Gertrude Stein’s painter, web designers have hopes for their framed web designs but are very often disappointed. This is because framed websites do not fit the conceptual model of the web, where every page corresponds to a single URL. Consequently designers must use a variety of tricks to overcome the disadvantages, and if you miss a trick there can be unpleasant results.

Designers intent on using frames can use the NOFRAMES element to provide alternative content. This does not mean the useless alternative content provided by so many designers, such as “This site requires the use of frames” or “Your browser does not support frames”, which is a great way to prevent your website being found on a search engine. The correct use of NOFRAMES is described in the W3C document Frames in HTML documents.

Apart from having to provide alternative content, the other major problem is what happens when a search engine query matches an individual frame on a page. The search engine simply returns the URL for that frame, and if a user clicks the link the page will not be displayed in a frame, because there is no frameset corresponding to that URL. Designers get round this by detecting when a content page is trying to display outside its frameset and redirecting either to the home page or to a framed page that loads the orphan into an alternative frameset. If you really want to know how to do this, you can read a description of the technique using JavaScript in Give orphan pages a home.

Framed sites also have a problem obtaining inbound links, because it is not easy for someone to link to one of the content pages. Either they must link to the home page and give directions to the page they want to point to, or they must bypass the frame arrangement. If it is not easy to link, only the very determined will be prepared to go to the trouble of doing so.

If you want the framed look but don’t want the problems you can achieve it through cascading style sheets. Stu Nicholls has an excellent example on his website CSS Play (and there are lots of other interesting experiments with cascading style sheets on Stu’s site).

The bottom line is this: if your web designer uses frames, seek a better and more experienced designer, and if you find framed sites attractive in spite of the problems, ask yourself why your competitors do not use them.


Sunday, March 5, 2017

Flash

Triumph and dismay are two feelings Flash designers know well. The triumph comes with mastering a rich diversity of features in a difficult technology and producing a visually appealing result. The dismay comes when the SEO wants to remove the Flash components from the website they have just designed!

Initially a simple tool for delivering low-bandwidth animations over the Web, Flash has evolved into a platform for delivering complex applications and rich media content. Flash is now able to deliver far more than animations or short video clips.

Flash has become the delivery mechanism of choice for educational and complex business applications. Universities use Flash to great effect for delivering entire lectures with quizzes and assessments in real-time. In commerce Flash is used for everything from cattle auctions to virtual laboratory experiments.

However, its use on ordinary websites has declined, and there are two reasons for this. Firstly, usability studies consistently show that web surfers dislike Flash intensely, particularly Flash intros. Secondly, Flash is a visual experience and search engine robots are blind, which means the SEO of Flash sites is problematic. Sites designed around Flash, or with Flash intros and Flash navigation, are often developed at the request of clients who do not know any better, and the developers have not sought to educate them.

Take for example the following site, which is built completely in Flash. Although there are several pages of information, because the navigation and the content are all in Flash, the search engines are only aware of one page.

This site cannot even be found for the organization’s name and might just as well not exist. Flash enthusiasts might claim that this is just a poor implementation and that it is possible to optimize Flash sites. It is true that there are a variety of methods used to optimize Flash sites, including placing the Flash inside invisible framesets or using invisible CSS layers to present content to the search engines. Macromedia even provide a Search Engine SDK, but in reality none of these methods is entirely effective. Sometimes you will even see a Flash site duplicated with an HTML version for the search engines, but the bottom line is: why bother with the Flash site at all if users don’t like it?

However, although something like this may (or may not) be effective as a product demo, it does nothing for the search engines. If used, it should be placed on a normally optimized page and not considered a replacement for text. Even then, whether something like this is worth spending the time and money on is a moot point.


Tuesday, February 21, 2017

Sitemaps

Sitemaps can be of two kinds: a page or pages on your site that list the pages of your website, often hierarchically organized, or ‘Google Sitemaps’, a process that allows site owners to submit the URLs of pages they would like to have included in Google’s index. The two kinds of sitemap serve slightly different purposes, both important.

A conventional sitemap is designed to help the human visitor who can’t find what they are looking for, and also to ensure that Googlebot (Google’s web crawler) finds the important pages of your site. A well-executed example of this kind of sitemap is Apple’s sitemap. From the optimization point of view, a page like this presents an opportunity to link to your own pages with appropriate anchor text (see the last paragraph of Internal Links). If you have more than a few pages on your site, a sitemap can only be advantageous.

Google Sitemaps, however, is a solution to a problem Google has with crawling the entire web. Googlebot spends a lot of time and resources fetching pages that have not changed since it last looked at them. Crawling billions of pages only to find that the majority are the same as last time is not very efficient, and Google Sitemaps has been designed to improve the process. The idea is that site owners submit a sitemap to Google, and the next time Googlebot visits their site it knows where to look for changed or new pages.

If site owners use Google Sitemaps it reduces crawling machine time and bandwidth, i.e. it saves money. Site owners also get their new content indexed more quickly and a reduced load on their servers, because Googlebot does not keep fetching unchanged pages. Google have provided a sitemap protocol and an automated process for the whole procedure.

Google Sitemaps does not replace the established Googlebot crawling procedure and should be used to solve specific problems, such as:

  • If you need to reduce the bandwidth taken by Googlebot.
  • If your site has (accessible) pages that are not crawled.
  • If you generate a lot of new pages and want them crawled quickly.
  • If you have two or more pages listed for the same search you can use page priority to list the better one.
Google have an extensive help and explanation of the procedure at About Google Sitemaps.

August 5, 2015

Google has renamed Google Sitemaps to Google Webmaster Tools under the new heading of Webmaster Central.

April 15, 2016

Google, MSN, Yahoo and ASK have recently announced support for sitemap auto-discovery via the robots.txt file and have agreed on a standard sitemaps protocol.

If you add the following line to your robots.txt file, the search engines will auto-discover your sitemap file.

Sitemap: http://ift.tt/XxxkvM
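For reference, here is a minimal sketch of generating a sitemap file that follows the agreed sitemaps protocol. The example.com URLs and the priorities are placeholders; in practice you would build the list from your own pages or database.

    from datetime import date
    from xml.sax.saxutils import escape

    # Placeholder page list -- replace with your own URLs and priorities.
    pages = [
        ("http://www.example.com/", "1.0"),
        ("http://www.example.com/products/", "0.8"),
        ("http://www.example.com/contact/", "0.5"),
    ]

    today = date.today().isoformat()
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, priority in pages:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(url)}</loc>")
        lines.append(f"    <lastmod>{today}</lastmod>")
        lines.append(f"    <priority>{priority}</priority>")
        lines.append("  </url>")
    lines.append("</urlset>")

    # Write sitemap.xml to the web root, then reference it from robots.txt.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")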


Sunday, February 19, 2017

Why Local Search is Important for Your Business

As much talk as there is about Internet Marketing breaking down boundaries and flattening the world, most commerce is still defined by physical space. For instance, consumers are still going to search for businesses in their area. Because of this, the importance of Local search cannot be overstated.

Local search sites are supported by advertising from businesses that wish to be featured when users search for specific products and services in specific locations. For example, if you live in New York City and you search for “lower west side Manhattan bakery”, you are going to see the top Local results for that area. Local search advertising can be highly effective because it allows ads to be targeted very precisely to the search terms and location provided by the user.

Local marketing has existed for a long time. Twenty or thirty years ago it was done by businesses placing adverts in local publications; now consumers search online. And the power of Local search is only getting bigger, as more and more people look for products or businesses on their smartphones (someone searching for a nearby restaurant on an iPhone, for instance). So the idea of Local search hasn’t changed, just the medium. Social media marketing demands that your business change its approach, too.

One out of five searches on Google is related to location, so make sure your business has optimized content on its site in order to rank highly in Local search results.

Here are some benefits of high ranking in Local search.

  • Less competition for your keywords.
  • By narrowing your focus to your specific location, you are no longer competing with others around the world. This gives you a chance to shine and a better chance to rank.
  • The popularity of Local search is increasing.

We mentioned this before, but it bears repeating: much of the future of consumer browsing will be done on smartphones. In the age of smartphone apps, you want to make sure that your site is optimized for Local search. If someone is in their car looking for your services, you need to make sure they don’t drive an extra 10 miles to a competitor when they could have found you.

Targeted traffic means increased conversion rates.

When consumers narrow down their search and find your site, conversion rates naturally become higher. Now, here are the key things to remember about Local search.

Keywords and Location

One of the biggest mistakes people tend to make is not choosing the right keywords. Use the Google Keyword Tool to look up which keywords people are actually using for your type of service and location.

Title Tags

Be sure your most important keywords are at the beginning of your title tag, and that it includes your targeted location keywords.

Header Tags

Header tags carry a lot of weight in SEO, so use them to structure your pages and include your target keywords and location where they read naturally.

Internal Linking/Inbound Linking

Internal linking allows both consumers and search engine robots to navigate through your site’s pages smoothly and logically. Internal links also allow the robots to find your most important pages faster, rather than relying only on the sitemap. Inbound links are other sites pointing to your site as a reference; inbound linking allows the search engines to evaluate how popular and how important your website is.
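As a quick way to check how a page links to the rest of your site, here is a small sketch using only Python’s standard library that lists the internal links found in a saved HTML page. The site address and the file name are placeholders.

    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    SITE = "http://www.example.com/"   # placeholder: your own site's base URL

    class LinkCollector(HTMLParser):
        """Collect href values from anchor tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(urljoin(SITE, value))

    with open("index.html", encoding="utf-8") as f:   # placeholder file name
        parser = LinkCollector()
        parser.feed(f.read())

    internal = [url for url in parser.links
                if urlparse(url).netloc == urlparse(SITE).netloc]
    print(f"{len(internal)} internal links found:")
    for url in internal:
        print(" ", url)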

Sitemap/Robots.txt

Always be sure that you have a proper sitemap and robots.txt file set up on your site. These help the search engine robots understand the structure of your site and any restrictions or directions you have for them.
