Thursday, August 3, 2017

Keyword Density

If you ever find yourself reading an article on the importance of keyword density in SEO, you can safely assume that the author doesn’t know what they are talking about. Keyword density is not a phrase or even a concept used by search engineers working for Google, Yahoo or MSN; it is simply a fiction invented by the lower echelons of the SEO community.

The importance of a keyword or keyphrase on a page for any given query depends on the context in which the term is used, not on keyword density or frequency counts. Search engines use term vector models, which are mathematically complex and beyond the scope of this post. If you want to know more about term vectors, take a look at the Term Vector Fast Track Tutorial.
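A minimal sketch of the idea behind term vectors may still be useful, purely as an illustration (this is a toy model, not the mathematics any search engine actually uses): each document becomes a vector of term counts, and two vectors are compared by cosine similarity.

```python
from collections import Counter
from math import sqrt

def term_vector(text):
    """Build a simple term-frequency vector from whitespace-split text."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine of the angle between two term vectors (0 = no overlap, 1 = same direction)."""
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

doc = term_vector("chess sets and chess clubs")
query = term_vector("chess clubs")
score = cosine_similarity(doc, query)
```

Note that real engines weight terms across the whole collection (e.g. tf-idf variants), which is exactly why a page-level “density” figure tells you so little.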

Even knowing that search engines use term vector models is not going to help us write copy for our web pages, for two main reasons: firstly, we don’t know exactly what model they are using, and secondly, any meaningful analysis would require performing calculations across every page on the web (which is, of course, what the search engines do).

So where does that leave us when it comes to writing copy for our web pages and incorporating our chosen keywords and keyphrases? The answer is simple: write your pages for the user, and do not write what you think the search engines would like to see! Apart from writing good user copy, the only consideration for keywords and keyphrases is to make sure that they appear naturally on the page.

The post Keyword Density appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2woKqCJ
via IFTTT

Monday, June 26, 2017

WordPress Website Design – It’s All Hands on Deck

When you hire Get Found Now as your WordPress design company, it is not a passive endeavor on your part. We need you to actively participate in the project every step of the way because, after all, it’s about you, your online presence and your livelihood.

Honestly, we cannot pull a rabbit out of our hat and make things happen without your involvement, feedback and total commitment.

We are here to help you help yourself, not to carry you if you don’t want to go the extra mile for your own online success. Please don’t tell us that you are putting it all in our hands. Your online success is 100% dependent on your willingness to write quality content on a consistent basis, which we will teach you how to optimize and format for Internet usability and search engine placement.

We Need Your Input and No, You Can’t Hurt Our Feelings

Michael and I are about people, not agendas, and every person we work with becomes part of our inner circle of family and friends, a commitment we take very seriously. Go ahead and ask anyone who has ever worked with us and they’ll tell you: we honestly care. That being said, we have to get to know you in order to give you what you need, and that means giving us your input. It’s all hands on deck with WordPress website design, that is, if you want it done right, or the way YOU want it.

We Love Working with Smart, Enthusiastic, Upbeat and Amazing People

Yes, money is very nice to have in the old bankola, no doubt about it, but so is the good feeling you have about the quality of your life at the end of the day, and your integrity when it comes to building relationships with good people and doing what you do well. We could just take your money and toss up a mediocre WordPress website design like any Joe calling themselves a WordPress blog design expert, but we are not into that, not at all.

We love working with smart, enthusiastic, upbeat and amazing people who want to make things happen for themselves and who TRUST our online game plan. Why? Because it works. Not to be arrogant, but we do have the search engine placement to substantiate everything we say and do, and we can help you achieve the same kind of results. Type WordPress Design Company or WordPress Design Companies into Google.

It is not about building up back links or buying back links (please), or any of the other stupid stuff that online SEO EXPERTS try to sell people. What it is about is providing quality content that REAL people enjoy reading, sharing via social media, commenting on and linking back to ‘organically.’

No SEO Smoke and Mirrors Required in Our Online Game Plan

We are not SEO tricksters. As a matter of fact, we don’t subscribe to search engine trickery one single bit. We don’t like it. There is no smoke and mirrors with us, just WordPress done our way, and it works: quality content that is informative, engaging and enjoyable to read, written on a blogging platform that’s set up for server-to-search-engine communication. Our search engine understanding is based on YEARS of trial, error and success.

We are here to teach you how to achieve outstanding results in the index and convert the amazing content you write into leads. It’s actually quite a simple formula that we have come up with, and the only online formula (a real one) you will need to be competitive and successful online.

The post WordPress Website Design – It’s All Hands on Deck appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2rTX1vY
via IFTTT

Sunday, June 25, 2017

More4Marketing Ltd – SEO Consultant Berkshire, Online Marketing Berkshire, Lead Generation Berkshire

The post More4Marketing Ltd – SEO Consultant Berkshire, Online Marketing Berkshire, Lead Generation Berkshire appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2t7hccZ
via IFTTT

SEO Reading Berkshire

The post SEO Reading Berkshire appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2u55ugi
via IFTTT

SEO Slough Berkshire

The post SEO Slough Berkshire appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2t7j31t
via IFTTT

East Berkshire SEO Services

The post East Berkshire SEO Services appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2u4NpyR
via IFTTT

SEO Special Offer From Business In Berkshire

The post SEO Special Offer From Business In Berkshire appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2t7pUIh
via IFTTT

Saturday, June 24, 2017

Windsor Berkshire SEO Company

The post Windsor Berkshire SEO Company appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2s9JmA6
via IFTTT

SEO Services For Small Businesses In Berkshire County MA – 888-796-7478

The post SEO Services For Small Businesses In Berkshire County MA – 888-796-7478 appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2sEiZWF
via IFTTT

Page 1 Google SEO Done Right02 9542 3211 Berkshire Park

The post Page 1 Google SEO Done Right02 9542 3211 Berkshire Park appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2s9sUQd
via IFTTT

SEO Wokingham Berkshire

The post SEO Wokingham Berkshire appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2sEm8Wb
via IFTTT

SEO Berkshire

The post SEO Berkshire appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2s9bOSw
via IFTTT

Flash vs. Search Engine Optimization

Adobe Flash is a popular way to develop websites. It is not only design agencies and photographers that use it; even reputable institutions such as banks don’t want sites built from plain graphics and HTML alone. That is fine, as long as you don’t overdo it and use Flash only where it is needed. A purely Flash-based site will disappoint you if you try to optimize it for the search engines. The reason is that most search engines were developed under the assumption that the target page is written in HTML. Major search engines such as Google are already able to extract text and links from a Flash file, but because text in Flash has no structure, the search engine cannot tell whether it is a heading, a quote or some other part of the document. This means the search engine is unable to determine the importance of a text section and to sort the results by relevance. In HTML, on the other hand, you can use the H1 tag to mark up the main heading and label the other text sections with the appropriate tags. This problem of assessing importance is the reason why results from a Flash site are almost never shown by the search engines. Only if you explicitly search Google for Flash files do you get the right results.

[Image: flash]

What must be done to minimize the problem?

The fact is that we can no longer simply renounce Flash. We must therefore find a way to satisfy both the visitors and the search engines with the result. There is more than one road to Rome, but not all of them are good or free of risk. The most commonly used method is to create two separate websites, one purely Flash-based and the other in HTML. I think this solution is not optimal; it is better to use a combination of Flash and HTML, that is, an HTML site with Flash embedded in the appropriate places. There is an opinion that it is good to use JavaScript to show a Flash-based site to genuine visitors and a text-based one to the search engines. I find this method more than questionable, since it is an easy way to get deleted from the search engine index.

The post Flash vs. Search Engine Optimization appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2sD6DOw
via IFTTT

Targeted Web Site Traffic Requires Forethought

In the world of search engine optimization there are some black and white truths, but just as many gray areas. It can be enormously difficult for many web site owners to set a course for search engine optimization that will ultimately deliver the results they desire. The reason for this is that most people do not fully understand how search engine optimization works and, worse yet, they don’t fully recognize their target audience. In order to attract targeted web site traffic, a web site owner must first understand who they are speaking to before they craft the message that they will deliver.

[Image: traffictarget]

Just as with any business plan, search engine optimization must be put into place with a comprehensive plan: a method that site owners, and perhaps their search engine optimization specialist, will work through until they reach their goals.

1.    Identify the target audience. This is advertising 101. You need to know your audience in order to craft an appropriate message. Targeted web site traffic is just the same as the audience we target in traditional advertising. We need to identify them before we can reach them.

2.    Craft the message. In the case of increasing web site traffic, the message is online content. This content is keyword rich, utilizing those keywords that will help to elevate the site’s page rankings with the search engines. Additionally, this content must serve to inform, entertain, and reach out to the audience that is reading it. By understanding that content must serve several purposes, web site owners have a better chance of crafting an effective message.

3.    Be consistent. Search engine optimization only works as long as a web site works at it consistently. By continually updating the site with fresh content, search engine optimization goals can be met.

The post Targeted Web Site Traffic Requires Forethought appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2u0RCUe
via IFTTT

Desperately Seeking Page One

The goal of any web site is to attract targeted web site traffic and ultimately increase its customer base. But this is actually a secondary goal, because savvy web site owners understand that their first goal is to achieve high page rankings with the search engines; without that, there is very little chance of achieving the secondary goal.

The search engines are the door through which potential customers must enter in order to find a site. Users rely on the engines to navigate the web; by running a search on a keyword or keyword phrase, search engine users are essentially asking the search engine to deliver the sites most relevant to their search criteria.

It is incumbent upon a web site looking to increase web traffic to increase its page rankings. Being delivered on the first or second page is absolutely critical to a site’s success in being found by targeted web site traffic. It has been shown time and time again that most search engine users rarely go beyond page two of the search engine’s results. And so in order to be seen, to attract the attention of those search engine users who could be potential customers, web sites need to work their search engine optimization plan aggressively to move up the ladder of engine results.

To this end, many web sites find it helpful to work with an SEO specialist who will help the site define its goals and set forth a comprehensive SEO plan. With consistent effort, page one could well be within sight.

The post Desperately Seeking Page One appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2u0UXml
via IFTTT

What is Local Maps Marketing?

Is your Business on the Map?  New customers and clients are looking for your business on Google, Yahoo, and Bing Local Maps.  Above is an example of a Google result for the search term “Los Angeles Attorney” which is searched 165,000 times a month.

Local Maps Marketing is the new online marketing tool that local businesses are using to find new clients.  Why?  Because Local Maps Marketing provides local results.  All of the businesses listed in the map are localized to the person searching.  Having your business listed on the first page of Google, Yahoo, and Bing maps will attract local clientele.

[Image: dotdot]

It doesn’t matter if you’re an attorney, plastic surgeon, dentist, contractor or any other local business… Local Maps Marketing is for you!

Google, Yahoo, and Bing control 98% of online search traffic.  Why pay thousands of dollars a month to be listed online with companies like Yellow Pages and Dex when they control less than 0.5% of search traffic?  If you want new clients, you need to be on the first page of Google, Yahoo, and Bing Local Maps.

Online searches are growing at a massive rate:

  • 74% of Internet searchers use local search to find services. Kelsey Group
  • 54% of people have substituted Yellow Pages for online search. ComScore Networks
  • 73% of all online activity is related to maps or “local search”. Google 5/07

It only takes a few weeks to have your business listed on the front page of the maps and to start receiving inbound phone calls from interested clients.

The post What is Local Maps Marketing? appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2u0h7Fo
via IFTTT

Friday, June 23, 2017

Image Search

Why would you want to optimize for image search and how do you do it? Here are some reasons for doing it:

Image search results are being increasingly used by the search engines in contextual search results to improve usability.

Google is serious about images. In October last year Google added an opt-in to Enhanced Image Search in Webmaster Tools. In conjunction with Image Labeler this allows Google to associate the images included in your site with labels that will improve indexing and search quality of those images.

If you search in Google or Yahoo for |pictures of diamond earrings|, |images of mowers|, |hammer image| or something similar, just above the organic listings you will get a row of relevant images. In Yahoo’s case the images are sometimes from Flickr rather than Yahoo image search; for example, search Yahoo for |funny pictures|.

All three of the major search engines have a separate image search; Google image search, Yahoo image search and MSN image search.

Statistics from Hitwise (http://www.hitwise.com/) show image search growing at 90% year on year and representing nearly 0.5% of all internet visits.

Traffic from image search can be targeted. It may not convert as well as organic search but it’s free!

OK, so how do you do it? The easiest way to explain is by example, and I have created a new image for this page:

[Image: googlebot]

We will be optimizing this image for the term |image search| which currently has 5,660,000 results on Google Image Search.

  • Put the search term in the page url. In this case it’s http://ift.tt/2tBCNbm
  • Put the search term in the page title. In this case it’s <title>Image Search</title>
  • Use the search term in close proximity to the image. In this case the search term appears twice in the sentence immediately following the image.
  • Make sure the page topic corresponds to the search term. In this case the page topic is definitely image search!
  • Make sure the image size is non-standard. In this case it’s 304 x 203 pixels.
  • Save the image in .jpg or .gif format. In this case it’s .jpg.
  • Name the image with the search term. In this case it’s image-search.jpg
  • Use the search term in the alt attribute and the title attribute, and make sure that you have included the width and height declarations. In this case we have <img src="http://ift.tt/2t2faMl" alt="Image search" title="Image Search graphic signed by Googlebot" width="304" height="203" />

I am not suggesting that you do all of the above for every image on your site, but if you choose some key pages and optimize the image(s) on those pages (or create them specially and then optimize them), there will be a tangible benefit.
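Several of the steps above (the file name, the alt text and the dimension attributes) are mechanical enough to script. A minimal sketch, where `slugify` and `img_tag` are hypothetical helper names of my own, not part of any library:

```python
import re

def slugify(term):
    """Lower-case a search term and join the words with hyphens, e.g. for a file name."""
    return re.sub(r"[^a-z0-9]+", "-", term.lower()).strip("-")

def img_tag(term, width, height, ext="jpg"):
    """Build an img tag whose file name, alt and title all carry the search term."""
    filename = f"{slugify(term)}.{ext}"
    return (f'<img src="{filename}" alt="{term}" '
            f'title="{term}" width="{width}" height="{height}" />')

tag = img_tag("Image Search", 304, 203)
```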

The search engines’ image databases are not updated all that frequently, but when the image above ranks I will post an addendum here.

The post Image Search appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2t2fbQp
via IFTTT

Saturday, June 3, 2017

Software Submission

Using submission software is a very bad mistake made by a small minority of site owners. Tempted by sales copy that promises to “Submit Your website to more than 1.2 Million Search Engines, Directories and Link pages” they pay for software that is completely unnecessary and in most cases will have a negative effect.

In practice there are only a handful of search engines that people use, and all of them have robots that spider the web looking for new pages and sites. There is absolutely no need to submit your pages to these search engines; they will find you quite quickly if you have a link from at least one site that is already in their index. Of course, you should have a lot more than one inbound link!

The disadvantage of using such software is that most of the sites it submits to exist only to collect your email address and then make it available to spammers. So when you purchase the software you are in effect paying to be spammed, which is not a good way to spend money.

Here is a section of a page from a site that sells this kind of submission software:

[Image: engines]

Do some people really believe there are 700,000 search engines? Probably not, but it doesn’t seem to stop them buying and using the software. Notice that one of the logos is DMOZ, which is not a search engine but a directory. Automated DMOZ submission would be a very big mistake; all directory submission should be done by hand, as outlined in a previous post, Directory Links.

Some submission software claims to auto submit to guestbooks like this example here:

[Image: permalinks]

This is even more ludicrous than search engine submission software and should be avoided at all costs. Guestbook spamming doesn’t do anything for your site as far as the search engines are concerned, except perhaps label you as a spammer.

Guestbook spamming has become less of a problem as the number of sites with guestbooks has decreased, but it has now been replaced by blog spam (posting comments to blog posts that then provide a link to a spammer’s web site). Here is a section of a page from a site that sells blog link generating software:

[Image: linksback]

Of course it doesn’t work, because current blog software has methods of preventing blog spam such as registration and posting with CAPTCHA (an acronym for “completely automated public Turing test to tell computers and humans apart”). CAPTCHA is used on this blog. Even if the blog spammer breaks through the barriers and manages to post their spam, most blogs apply the nofollow attribute to urls in comments, so it really is a complete waste of effort.
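The nofollow defence described above can be sketched in a few lines. This is a simplified illustration of what blog software does to comment links; real platforms use a proper HTML parser rather than a regex:

```python
import re

def nofollow_comment_links(html):
    """Add rel="nofollow" to every <a ...> tag in a comment that lacks a rel attribute."""
    def add_rel(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave existing rel attributes alone (simplified)
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", add_rel, html)

comment = 'Nice post! <a href="http://example.com/spam">cheap pills</a>'
safe = nofollow_comment_links(comment)
```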

The post Software Submission appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2rCDYZJ
via IFTTT

Tuesday, May 23, 2017

Hidden Text (Revisited)

Just over a year ago I posted on the dangers of hidden text and concluded with the advice “…don’t use hidden text to try to improve your rankings“.

Here is a practical example of what may happen if you do.

Yesterday John Frost who runs the very popular Disney Blog posted that his blog had been delisted from the Google index and sure enough it had:

[Image: web]

Such is the power of popular blogs that within a couple of hours of John’s plea for help there was an explanation and a resolution from none other than Google engineer and spam fighter in chief, Matt Cutts. He explains in a diplomatic and friendly comment that hidden text was responsible for the ban. Specifically, this page code:

<h2 id="banner-description">Informing Disney Fans the World Over with the latest news and updates from all Disney companies, divisions, and related stories. Disney World, Disneyland, Disney Cruises, Disney Animation, Pixar, ESPN, and more are covered in as much detail as I can muster.</h2>

With this in the external CSS file:

#banner-description
{
overflow: hidden;
width: 0;
height: 0;
margin: 0;
padding: 0;
text-indent: -1000em;
}

As it happens, this appears to be a generic Typepad problem: when you set up a Typepad blog you are asked to enter a weblog description, which ends up being hidden by the CSS. However, after Matt had pointed it out and John had removed the text, Matt helpfully submitted a reinclusion request.

Matt has gone off to talk to Six Apart the Typepad developers and The Disney Blog will be back in the index sometime next week.

The moral of the story is still the same – don’t use hidden text to try to improve your rankings.
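As a footnote, the offending pattern is easy to spot mechanically. A rough heuristic sketch, with thresholds that are my own assumptions rather than anything Google has published:

```python
import re

def looks_hidden(css_rule):
    """Return True if a CSS rule body matches common text-hiding patterns."""
    rule = css_rule.lower()
    if re.search(r"text-indent:\s*-\d{3,}", rule):   # huge negative indent, e.g. -1000em
        return True
    if re.search(r"width:\s*0", rule) and re.search(r"height:\s*0", rule):
        return True
    if "display: none" in rule or "display:none" in rule:
        return True
    return False

# The rule body from the Disney Blog example above:
banner = "overflow: hidden; width: 0; height: 0; margin: 0; padding: 0; text-indent: -1000em;"
```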

The post Hidden Text (Revisited) appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2rcIRJw
via IFTTT

Wednesday, May 3, 2017

What are keywords?

Keywords or keyphrases are the search terms that a user types into a search engine text box in order to find information relevant to their search. For example if you search Google for |chess| you will see something like this:
[Image: chess]

This is called the Search Engine Results Page, or SERP, and Google tries to put the most relevant result first, then the next, and so on. What you see is the top ten results out of (in this case) over 24 million pages, ranked in order of relevance.

Google can’t know if the user is looking for something more specific, such as chess sets, chess clubs or the rules of chess, so the results will be a broad range of pages related to chess in some way.

A user seeking a chess club in Chicago is more likely to search for |chess club chicago| in which case they would see something like this:

[Image: chessclub]

Notice how for this more specific search the number of candidate pages has gone down from over 24 million to around 1.8 million. As Google says “Choosing the right search terms is the key to finding the information you need”.

The obvious corollary for site owners is that choosing the right keywords to optimize for is the key to maximising the number of visitors and conversions (the percentage of visitors who take a desired action, like buying a product or subscribing to a newsletter). In general, the higher you are in the SERPs the more visitors you will have, and the more specific the keywords the higher the conversion rate.

Users search in different ways with different words and site owners need to know what these keywords and phrases are for their particular business.

The post What are keywords? appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2p8t21J
via IFTTT

Saturday, April 22, 2017

Keyword Research

Keyword research enables site owners to choose keywords when constructing and optimizing a website (page). Keyword research is also extensively used to manage PPC advertising campaigns or if you are looking to research and identify profitable niche markets.

When performing keyword research for constructing and optimizing a website (or page), you are looking to select search terms that will reach your target audience. One of the common mistakes made by some SEOs is avoiding keywords that are very competitive. Searchers tend to use a lot of modifiers when they search, and the more competitive the search term the more likely they are to use a modifier. Removing the possibility of ranking well for these modified searches is not a good idea. Competition can be a factor in deciding how to target a specific search term, but you should never ignore a search term simply because you believe it is too competitive.

There are two basic tools for the site owner:

1. Digital Point Solutions Keyword Suggestion Tool

[Image: laptopbattery]

Digital Point’s online keyword tool compares Overture and WordTracker data side by side. It is free, quick and easy to use although it lacks WordTracker’s more advanced features. Here is a partial screen shot.

2. Google Adwords Keyword Tool

You will need an AdWords account to use the Google AdWords Keyword Tool, but signing up is easy and well worth it just to use the tool. Although primarily designed for AdWords, it is also ideal for use as a simple keyword suggestion tool. The big advantage is, of course, that it uses the latest Google data and you can find and select keywords based on that data. You can create keywords from a url (i.e. one page), a whole site, a keyword (or keywords) that you enter or, for AdWords users, the most relevant terms in your account. The results are shown by relevance but can be ordered by Advertiser Competition or Search Volume on a scale of 1 to 5. You can also download the results as a .csv file (for Excel), which makes it easy to compile master lists.

Here is a partial screen shot of a list from a keyword.

[Image: googlekeywordtool]

Here is a partial screen shot of a list from a url.

[Image: gkt2]

Most site owners will find the above tools sufficient for their needs but if you want to investigate other tools there are basically two kinds, keyword analytical tools and subscription based tools. Here are some examples with a link to the product and a link to a review of the product.

Keyword Analytical Tools:

  • Keyword Analyzer Review
  • The Keyword Bible Review
  • The Dowser Review

Subscription based:

  • WordTracker Review
  • Keyword Discovery Review
  • Keyword Intelligence Review

A word of caution, though, if you try these tools. The major search engines (Google, Yahoo and MSN) do not make their raw data available to anyone, so these products have to obtain data from somewhere else. For example, WordTracker uses data from the Metacrawler and Dogpile metasearch engines, which represents a very small and not very representative sample of searches. Not only that, but in order to estimate figures like the predicted number of searches for a keyword, an extrapolation has to be made. In WordTracker’s case they assume Metacrawler and Dogpile account for 0.86% of all search engine queries (a dubious statistic in itself) and scale up the numbers in their database accordingly. This has the effect of compounding any errors in the original dataset and, at the very least, means that these derived numbers should not be taken too seriously.
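The scale-up itself is simple arithmetic, which makes the compounding easy to see. The 0.86% share is WordTracker’s assumption as quoted above; the other numbers here are purely illustrative:

```python
def predicted_searches(sample_count, assumed_share):
    """Extrapolate total searches from a sample count and an assumed market share (0-1)."""
    return sample_count / assumed_share

# 100 searches seen in a sample assumed to be 0.86% of all queries:
estimate = predicted_searches(100, 0.0086)

# If the true share were 1.2% instead, the honest figure would be much lower:
honest = predicted_searches(100, 0.012)
```

Any error in the assumed share flows straight through to every predicted figure, which is the compounding problem in a nutshell.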

A most important source of keywords that is often overlooked is your server logs. Regularly mine your server log data to find the search terms people are actually using to find your site, and use these terms to construct new pages or modify existing ones. You can read more about this process in the posts Long Tail Search and Long Tail Search Tool.
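Mining the logs can be sketched as follows, assuming the combined log format where the referrer appears in double quotes and the engine passes the query in a `q` parameter (field positions and parameter names vary with your server configuration):

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

def search_terms(log_lines):
    """Count query terms found in search-engine referrer URLs (assumes a 'q' parameter)."""
    counts = Counter()
    for line in log_lines:
        # Pull every quoted http(s) URL out of the log line (the referrer field).
        for ref in re.findall(r'"(https?://[^"]+)"', line):
            qs = parse_qs(urlparse(ref).query)
            for term in qs.get("q", []):
                counts[term.lower()] += 1
    return counts

logs = [
    '1.2.3.4 - - [23/Jun/2017] "GET /chess HTTP/1.1" 200 "http://www.google.com/search?q=chess+clubs"',
    '5.6.7.8 - - [23/Jun/2017] "GET /chess HTTP/1.1" 200 "http://www.google.com/search?q=chess+clubs"',
]
top = search_terms(logs).most_common(1)
```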

March 1st, 2016 Wordtracker have introduced a free keyword suggestion tool that will generate up to 100 related keywords.

May 11, 2016 Wordze is a new subscription based tool which has some interesting features.

The post Keyword Research appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2pOUSku
via IFTTT

Friday, April 14, 2017

G profile video

https://www.youtube.com/channel/UCKKraAibgkEjOqvxkC2Vw0A

Short case study showing some rankings & traffic.

The post G profile video appeared first on Opmax Online Marketing & More.



from Opmax Online Marketing & More http://ift.tt/2pfUbTT
via IFTTT

Sunday, April 2, 2017

Search Engine Friendly Urls

It is important to have search engine friendly urls if you want your pages spidered and indexed by the search engines but what does having search engine friendly urls actually mean? Let’s take a look at what the three major search engines say about urls:

Google has three things to say on the subject in its Webmaster Guidelines:

1. If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

2. Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

3. Don’t use “&id=” as a parameter in your URLs, as we don’t include these pages in our index.

Yahoo in their Search Indexing FAQ say:

Do you index dynamically generated pages (e.g., asp, .shtml, PHP, “?”, etc.)?

Yahoo! does index dynamic pages, but for page discovery, our crawler mostly follows static links. We recommend you avoid using dynamically generated links except in directories that are not intended to be crawled/indexed (e.g., those should have a /robots.txt exclusion).

MSN’s Guidelines for successful indexing say:

Keep your URLs simple and static. Complicated or frequently changed URLs are difficult to use as link destinations. For example, the URL http://ift.tt/1iUlcEq is easier for MSNBot to crawl and for people to type than a long URL with multiple extensions.

The message is clear: static urls are better than dynamic, but if you have a dynamic site the urls must be as simple as possible, with only one or two query string parameters and no session IDs.

A url that might look like this:

http://ift.tt/2ntCivH

Should preferably look like this:

http://ift.tt/2n115vH

How you achieve this depends on whether you are starting out with a new site or have an established site with existing complex urls.

If it is a new site then search engine friendly urls must be built into the design criteria. How this is done depends on the programming language. For example, if you plan to use PHP you might make use of the PATH_INFO variable, or if you use ASP.NET you could modify the Global.asax file.

If you plan to use a content management system (CMS) then make sure that it generates search engine friendly urls out of the box. The Content Management Comparison Tool has a check box for ‘Friendly URLs’ if you are researching CMS tools.

A completely different approach (not approved of by geeks but worth consideration if you are designing your own site as a non-professional) is to create static HTML web pages from a database or spreadsheets but not in real-time. WebMerge for example works with any database or spreadsheet that can export in tabular format such as FileMaker Pro, Microsoft Access, and AppleWorks. Using HTML template pages WebMerge makes a new HTML page from the data in each record of the exported file. It can also create index pages with links to other pages and generated pages can be hosted without the need for a database.

If it is an existing site then problematic URLs can be converted to simple URLs in real time. If you are on an Apache server you can use mod_rewrite to rewrite requested URLs on the fly. This requires knowledge of regular expressions, which can be rather daunting if you are not a programmer. Fortunately there is an abundance of mod_rewrite expertise at RentACoder if you get stuck. If you are on Internet Information Server (IIS) you can use something like ISAPI_Rewrite, which also requires knowledge of regular expressions.
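As an illustrative sketch only (the file names and parameter names here are invented for the example), a mod_rewrite rule in an .htaccess file might map a simple, keyword-bearing URL onto the underlying dynamic script:

```apache
# Map /products/blue-widget (what users and crawlers see)
# onto /products.php?item=blue-widget (the real dynamic URL).
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ /products.php?item=$1 [L]
```

The same mapping can be expressed in ISAPI_Rewrite on IIS with a very similar regular expression.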

Whatever your solution, try to incorporate your keywords in the URLs, and only ever use hyphens as word separators, never underscores or spaces.


Monday, March 20, 2017

Frames

Much like Gertrude Stein’s painter, web designers have hopes for their framed web designs but are very often disappointed. This is because framed websites do not fit the conceptual model of the web where every page corresponds to a single URL. Consequently designers must use a variety of tricks to overcome the disadvantages and if you miss a trick there can be unpleasant results.

Designers intent on using frames can use the NOFRAMES element to provide alternative content. However, this does not mean the useless alternative content provided by so many designers, such as “This site requires the use of frames” or “Your browser does not support frames”, which is a great way to prevent your website from being found on a search engine. The correct use of NOFRAMES is described in the W3C document Frames in HTML documents.
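A sketch of the idea (the file names are invented for illustration): per the HTML 4.01 Frameset specification, NOFRAMES contains a BODY element with real, useful content and navigation, not an apology:

```html
<frameset cols="25%,75%">
  <frame src="nav.html" name="nav">
  <frame src="content.html" name="content">
  <noframes>
    <body>
      <!-- Real alternative content, not "this site requires frames" -->
      <h1>Our Catalog</h1>
      <p>Browse the <a href="content.html">catalog</a> or
         use the <a href="nav.html">site index</a>.</p>
    </body>
  </noframes>
</frameset>
```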

Apart from having to provide alternative content, the other major problem is what happens when a search engine query matches an individual frame on a page. The search engine simply returns the URL for that frame, and if a user clicks the link the page will not be displayed in a frame because there is no frameset corresponding to that URL. Designers get round this by detecting when a content page is trying to display outside its frameset and redirecting either to the home page or to a framed page that loads the orphan into an alternative frameset. If you really want to know how to do this you can read a description of the technique using JavaScript in Give orphan pages a home.
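A minimal JavaScript sketch of that detect-and-redirect technique. The frameset URL scheme here (“/frameset.html?page=…”) is a hypothetical convention, not a standard; adapt it to however your own frameset page selects its content frame.

```javascript
// Build the URL of a frameset page that will reload a given content
// page into its content frame.
function framesetUrlFor(contentPath) {
  return "/frameset.html?page=" + encodeURIComponent(contentPath);
}

// Decide whether an orphaned content page should redirect, and where to.
// Returns null when the page is already displayed inside a frameset.
function orphanRedirectTarget(isTopWindow, contentPath) {
  return isTopWindow ? framesetUrlFor(contentPath) : null;
}

// In the content page itself (browser only):
//   var target = orphanRedirectTarget(window.top === window.self,
//                                     window.location.pathname);
//   if (target) window.location.replace(target);
```

Keeping the decision in a small pure function makes the browser-specific part a two-line wrapper.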

Framed sites also have a problem obtaining inbound links because it is not easy for someone to link to one of the content pages. Either they must link to the home page and give directions to the page they want to point to, or they must bypass the frame arrangement. If it is not easy to link, only the very determined will be prepared to go to the trouble of doing so.

If you want the framed look but don’t want the problems you can achieve it through cascading style sheets. Stu Nicholls has an excellent example on his website CSS Play (and there are lots of other interesting experiments with cascading style sheets on Stu’s site).
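One common CSS approach (the selectors and sizes here are illustrative, not Stu Nicholls’ exact technique): fix the header in place and let only the content area scroll, which looks like frames but lives at a single URL:

```css
/* Fixed "frame"-style header; only #content scrolls */
#header {
  position: fixed;
  top: 0; left: 0; right: 0;
  height: 80px;
}
#content {
  margin-top: 80px;            /* clear the fixed header */
  height: calc(100vh - 80px);
  overflow-y: auto;            /* scrolls like a content frame */
}
```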

The bottom line is this: if your web designer uses frames, seek a better and more experienced designer, and if you find framed sites attractive in spite of the problems, ask yourself why your competitors do not use them.


Sunday, March 5, 2017

Flash

Triumph and dismay are two feelings Flash designers know well. The triumph comes with mastering a rich diversity of features in a difficult technology and producing a visually appealing result. The dismay comes when the SEO wants to remove the Flash components from the website they have just designed!

Initially a simple tool for delivering low-bandwidth animations over the Web, Flash has evolved into a platform for delivering complex applications and rich media content. Flash is now able to deliver far more than animations or short video clips.

Flash has become the delivery mechanism of choice for educational and complex business applications. Universities use Flash to great effect for delivering entire lectures with quizzes and assessments in real-time. In commerce Flash is used for everything from cattle auctions to virtual laboratory experiments.

However, its use on websites has declined, and there are two reasons for this. Firstly, usability studies consistently show that web surfers dislike Flash intensely, particularly Flash intros. Secondly, Flash is a visual experience and search engine robots are blind, which makes the SEO of Flash sites problematic. Sites designed around Flash, or with Flash intros and Flash navigation, are often developed at the request of clients who do not know any better, and the developers have not sought to educate them.

Take for example the following site, which is built completely in Flash. Although there are several pages of information, because the navigation and the content are all in Flash the search engines are only aware of one page.

This site cannot even be found for the organization’s name and might just as well not exist. Flash enthusiasts might claim that this is just a poor implementation and that it is possible to optimize Flash sites. It is true that there are a variety of methods used to optimize Flash sites, including placing the Flash inside invisible framesets or using invisible layers in CSS to present content to the search engines. Macromedia even have a Search Engine SDK, but in reality none of these methods is entirely effective. Sometimes you will even see a Flash site duplicated with an HTML version for the search engines, but the bottom line is: why bother with the Flash site at all if users don’t like it?

However, although such a Flash element may (or may not) be effective as a product demo, it does nothing for the search engines. If used, it should be placed on a normally optimized page and not considered a replacement for text. Even then, whether something like this is worth spending time and money on is a moot point.


Tuesday, February 21, 2017

Sitemaps

Sitemaps can be of two kinds: a page or pages on your site that list the pages of your website, often hierarchically organized, or ‘Google Sitemaps’, a process that allows site owners to submit the URLs of pages they would like to have included in Google’s index. The two kinds of sitemap serve slightly different purposes, both important.

A conventional sitemap is designed to help the human visitor if they can’t find what they are looking for and also to ensure that Googlebot (Google’s Web Crawler) finds the important pages of your site. A well executed example of this kind of sitemap is Apple’s sitemap. From the optimization point of view a page like this presents an opportunity to link to your own pages with appropriate anchor text (see the last paragraph of Internal Links). If you have more than a few pages on your site then a sitemap can only be advantageous.
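As a sketch (the section names are invented), a conventional sitemap page is essentially a structured list of links, which also gives you control of the anchor text pointing at each page:

```html
<h1>Site Map</h1>
<ul>
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/design/">Website design</a></li>
      <li><a href="/services/seo/">Search engine optimization</a></li>
    </ul>
  </li>
  <li><a href="/about/">About us</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```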

Google Sitemaps however is a solution to a problem that Google has with crawling the entire web. Googlebot spends a lot of time and resources fetching pages that have not changed since it last looked at them. Crawling billions of pages to find that the majority are the same as last time is not very efficient and Google Sitemaps has been designed to improve the process. The idea is that site owners submit a sitemap to Google and next time Googlebot visits their site it knows where to go and look for changed or new pages.

If site owners use Google Sitemaps it reduces Google’s machine time and bandwidth, i.e. it saves Google money. Site owners benefit too: their new content is indexed more quickly and the load on their servers is reduced, because Googlebot no longer fetches unchanged pages. Google have provided a sitemap protocol and an automated process for the whole procedure.

Google Sitemaps does not replace the established Googlebot crawling procedure and should be used to solve specific problems, such as:

  • If you need to reduce the bandwidth taken by Googlebot.
  • If your site has (accessible) pages that are not crawled.
  • If you generate a lot of new pages and want them crawled quickly.
  • If you have two or more pages listed for the same search you can use page priority to list the better one.

Google have an extensive help and explanation of the procedure at About Google Sitemaps.
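For reference, a minimal sitemap file in the sitemaps protocol looks like this (the URL, date, frequency and priority are placeholders for your own values):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2017-02-21</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```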

August 5, 2015

Google has renamed Google Sitemaps to Google Webmaster Tools under the new heading of Webmaster Central.

April 15, 2016

Google, MSN, Yahoo and ASK have recently announced support for sitemap auto-discovery via the robots.txt file and have agreed on a standard sitemaps protocol.

Add the following line to your robots.txt file and the search engines will auto-discover your sitemap file:

Sitemap: http://ift.tt/XxxkvM
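In context, a complete robots.txt might look like this (the paths and sitemap URL are placeholders for your own):

```
User-agent: *
Disallow: /cgi-bin/

Sitemap: http://www.example.com/sitemap.xml
```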


Sunday, February 19, 2017

Why Local Search is Important for Your Business

As much talk as there is about Internet Marketing breaking down boundaries and flattening the world, most commerce is still defined by physical space. For instance, consumers are still going to search for businesses in their area. Because of this, the importance of Local search cannot be overstated.

Local search sites are supported by advertising from businesses that wish to be featured when users search for specific products and services in specific locations. For example, if you live in New York City and you search for “lower west side Manhattan bakery” you are going to get the top Local search results. Local search advertising can be highly effective because it allows ads to be targeted very precisely to the search terms and location provided by the user.

Local marketing has existed for a long time; 20 or 30 years ago businesses did it by putting adverts in Local publications. Now consumers search online. And the power of Local search is only getting bigger, as more and more people search for products or businesses on their smartphones (someone looking for a nearby restaurant on his/her iPhone, for instance). So the idea of Local search hasn’t changed, just the medium, and this shift demands that your business change its approach too.

One out of five searches on Google is related to location, so make sure your business has optimized content on its site in order to rank highly in Local search results.

Here are some benefits of high ranking in Local search.

  • Less competition for your keywords. By narrowing to your specific location you are no longer competing with others around the world, which gives you a chance to shine and a better chance to rank.
  • The popularity of Local search is increasing.

We mentioned this before, but it bears repeating. Much of the future of consumer browsing will be done on smartphones. In the age of smartphone apps, you want to make sure that your site is optimized for Local search. If someone is in his/her car looking for your services, you need to make sure they don’t drive an extra 10 miles to your competitor when they could have found you.

Targeted traffic means increased conversion rates.

By consumers narrowing down their search and finding your site, the conversion rates naturally become higher. Now, here are the key things to remember about Local search.

Keywords and Location

One of the biggest mistakes people tend to make is not choosing the right keywords. Use the Google Keyword Tool to look up what keywords people are using to find your service.

Title Tags

Be sure your most important keywords are at the beginning of your title tag, and that it includes your targeted location keywords.
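For example (the business name is invented; the search phrase is the one used earlier), a local bakery’s title tag might lead with its most important keywords and location:

```html
<title>Lower West Side Manhattan Bakery | Sweet Crumb Bakery, New York</title>
```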

Header Tags

Header tags (H1, H2 and so on) have a lot of impact on SEO because they tell search engines how your page content is structured, so include your important keywords in them where they fit naturally.

Internal Linking/Inbound Linking

Internal linking allows both consumers and search engine robots to navigate through your site’s pages smoothly and logically, and it helps the robots find your most important pages faster rather than relying only on the sitemap. Inbound links are other sites pointing to your site as a reference; they allow the search engines to evaluate how popular and how important your website is.
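As a small illustration (the page name is invented), an internal link should use descriptive anchor text so both users and robots know what the target page is about:

```html
<!-- Descriptive anchor text, not "click here" -->
<p>Read more in our <a href="/local-seo-guide.html">guide to Local search
optimization</a>.</p>
```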

Sitemap/Robots.txt

Always be sure that you have a proper sitemap and robots.txt set up on your site. These let the search engine robots completely understand the structure of your site and any restrictions or directions you have for them.


Blogging for SEO

One of the pillars of Search Engine Optimization is content generation. It’s a simple formula, really. Your site must produce fresh, unique content. This is what the search engines love the most.

Google will rank sites based on the popularity of your content. That’s why your site must create content!

Blogging is probably the best way to create it. A blog gives your site a way to make more of everything: pages, posts, links, keywords, etc. The more content you have, the better your chances of ranking high on Google. This is why so many business sites have blogs!

Here are some tips to blogging for Search Engine Optimization.

Write what you know.

If you’ve never maintained a blog before, you might be unsure of what to do. But don’t worry. The blog is designed to give your readers information you know and they want. That means writing about your business! You’re the expert, so feel free to schedule a series of blog posts that cover all topics you can think of. Be theoretical as well as topical. Write about the nuts and bolts of your business, as well as newsworthy info that matters to your business and your customers.

Keep it fresh and unique. Put it in your own words.

We can’t stress this enough! Google doesn’t simply reward content. It rewards NEW CONTENT. So make sure you aren’t copying another site, word for word. The chances are, you might be blogging on a topic a competitor has already written about. There’s a lot of content on the web, and you might not be the only business in your industry with a blog. That’s not a problem! You can write about similar things, but make sure you aren’t copying what they’ve written. You can get into copyright trouble for that. But more importantly, IT DOESN’T HELP YOU WITH SEO. YOU NEED FRESH, UNIQUE CONTENT. For SEO, just make sure you put everything in your own words. That’s what counts!

Blog early and often.

  • Blog regularly, like every business day.
  • Keep it short. Blog posts should be 150-250 words each. People aren’t willing to spend too much time on a blog. They are scanning, not reading in-depth. So keep it punchy and to the point.
  • Be playful. You want to be informative, first and foremost. You want people to come to your blog for insider information. But your blog must be entertaining and informal. Feel free to tell jokes, and include colorful stories and photos.
  • Be yourself. Popular blogs are the ones that take on the personality of their author(s).
  • Include photos and videos. The more visual the better. People will read your blog if it breaks up the text with great images.
  • Remember those keywords! All your site’s content MUST BE OPTIMIZED. Always, always use your keywords in every single blog post, and make sure they appear naturally in the copy.

Use Social Media.

Last but not least, make sure all of your blogging is sent out to your Twitter feed, your Facebook Page, your LinkedIn profile, etc. This is easier than it seems. Just set up the RSS feed in your blog. That will allow all of your posts to go directly to your social media platforms.
Follow that list and you’ll do great!


Thursday, February 9, 2017

First Impressions

For years usability studies and server log file analysis have tended to indicate that home page designers have just a few seconds to create a favorable initial impression on the user. New evidence, contained in a soon-to-be-published paper, suggests that a few seconds may be a gross overestimate.

Gitte Lindgaard and her colleagues from the Human Oriented Technology Lab (HOTLab) at Carleton University have conducted studies to ascertain how quickly people form an opinion about webpage visual appeal. The paper is to be published in the March-April 2006 issue of Behaviour & Information Technology.

Three studies were conducted in which subjects were presented with brief glimpses of previously classified home pages and asked to rate them for visual appeal. The results were highly correlated with assessments made over much longer periods of time and indicated that visual appeal can be assessed within 50 milliseconds. This is an astonishingly short period of time given that a normal human blink lasts 200–300 milliseconds.

Gitte Lindgaard and her colleagues have given the paper a rather apposite title “Attention web designers: You have 50 milliseconds to make a good first impression!”

In practice this means fast downloading home pages, limiting the graphics and providing information in the simplest way possible. If you are explaining, you are losing.


Sunday, January 22, 2017

Five Questions for Web Designers

If Frank Lloyd Wright were alive today I wonder what he would say about web designers’ mistakes. I get to see thousands of prospective clients and their competitors’ websites over the course of a year and although web design is improving I am still left thinking that 95% of web designers and web design firms just don’t understand the basics.

I have had to become an expert in diplomacy while explaining to prospective clients that the website for which they have paid hard earned money is (to put it politely) not as good as it might have been.

There seem to be five web design and build failures that come up again and again that require discussion with website owners. I rarely if ever get to talk through these points with the designers so I have listed them here as questions.

If you are thinking of having a new site or revamping your existing site you may want to make sure that these questions will be unnecessary before you appoint someone to carry out the work.

Here are the five questions for web designers:

1. Why don’t you learn what goes in the HEAD element?

Just because your client is unlikely to peruse the HEAD element doesn’t mean you should ignore it or fill it with garbage.

2. What’s so difficult about producing search engine friendly urls?

Dynamically generated URLs can cause problems for search engine crawlers and may be ignored. Why not generate search engine friendly, human-readable URLs instead?

3. Why large logos?

Logos that take up 25% of the home page are a waste of valuable real estate. Users want to see what they came for, not pictures of models staring up at the camera.

4. Do you leave blank alt tags for a reason?

Alt attributes (‘alt tags’) really do have a purpose. They are for the many users who use talking browsers, screen readers, text browsers or browsers on small devices.
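A quick sketch (the file names and wording are invented): a descriptive alt attribute versus an empty one:

```html
<!-- Useful to screen readers, text browsers and search engines -->
<img src="edwardian-ring.jpg" alt="Edwardian three stone diamond engagement ring">

<!-- Purely decorative images can use an empty alt so readers skip them -->
<img src="divider.gif" alt="">
```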

5. Why don’t you use web standards like W3C?

Did you know that separating structure from presentation makes it easy for alternative browsing devices and screen readers to interpret the content? Or that using semantic and structured HTML makes for simpler development and easier maintenance? Or that less HTML means smaller file sizes and quicker downloads? Or that a semantically marked up document is easily adapted to alternative browsing devices and print? Or that if you use standards and write valid code you reduce the risk of future web browsers not being able to understand the code you have written?
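To illustrate the point (the content is invented), compare presentational markup with its semantic equivalent:

```html
<!-- Presentational: says nothing about structure -->
<font size="5"><b>Our Services</b></font><br><br>

<!-- Semantic: structure is explicit, styling belongs in a stylesheet -->
<h2>Our Services</h2>
<p>We design and optimize small business websites.</p>
```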


Thursday, January 5, 2017

Information Architecture

Information architecture is simply the practice of structuring information and is most often put to use when designing large websites. It is a term used to describe the organization of a web site and includes aspects such as navigation, functionality, content, information, and usability.

Designers of large and complex websites are particularly concerned with information architecture and the formal processes that it involves. In fact most of the larger web design companies have established information architecture departments. However small business website owners/designers can also benefit from some elements of the process and here is an illustrative example.

Let’s say you are an up-market antique jewelry store owner and are planning a website.

The first thing you would do is determine your website’s goals. As a store owner this is quite straightforward because the goal is simply to sell your inventory of antique jewelry. However you can imagine that in a large organization there will be multiple goals. For example the human resources department will want to advertise vacancies, the PR department will want to communicate with the shareholders, and so on for all the various stakeholders within the company.

The next thing to do is determine who the audience will be (your target market). Given your experience in the store you may decide that these will be couples who are soon to be engaged or married, collectors and people looking for a gift to mark an anniversary. There will be other audiences but this is a simplified example.

Now that you have a clear idea of your site’s goals and who the audience is, you can compile a list of what it should contain. That means simply writing the headings, not the content itself. When you have an exhaustive list of headings you can group and label them.

After you have decided on the content groupings and labels, use them to define the major sections of your site and the names of each section. The sections will become the basis for your site structure.

(Information architects will at this point produce architectural blueprints or visual representations of the site structure. These are simply diagrams showing how elements of the site are grouped and relate to one another. If the site is large they will make several architectural blueprints starting with an overview and working toward diagrams with a finer grain within the site).

Next create the site structure by arranging the sections from broad to narrow. Here is a simplified diagram of how it might look.
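The original diagram is not reproduced here, but from the description that follows the structure would look something like this (the category and product names are the ones used in this example):

```
Home (Antique Jewelry)                              <- broad theme
├── Periods: Edwardian, Victorian, Regency, ...
├── Types: engagement rings, brooches, ...          <- narrower
└── Makers: Tiffany & Co., ...
        └── individual inventory pages              <- specific
```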

The idea here is that given the overall theme of the site (Antique Jewelry) you move from broad (Periods, Types, Makers, etc.) to narrow (Edwardian, Victorian, Regency etc.) to specific. Specific is in this case your inventory. The structure resembles a pyramid with your home page at the apex and your inventory at the base.

So, for example, the ‘Tiffany & Co. three stone Edwardian engagement ring’ will be found easily and naturally by someone looking for Edwardian jewelry, by someone looking for an engagement ring, or by a collector of Tiffany jewelry. Site linkage runs from top to bottom (downwards) with a minimum of linking across the width of the pyramid.

There are many other possible structures, but ‘pyramid theming’ has many advantages, not least that when you add content to your site (which you should do regularly) it is easy to see where it should be placed for maximum benefit.

Once the site structure has been determined you can see what sort of functionality will be required and then the site navigation will just fall logically into place. (At this point Information Architects will construct a model and conduct cognitive walkthroughs. This is simply a review technique where expert evaluators role play the part of a user working within the navigation system on particular ‘tasks’ i.e. they are ‘walking through’ the interface. Any problems with the navigation can be picked up before the site is built).

You may want to use some of these ideas when building your next website or working on a redesign.
