Web site statistics, logs, analysis and promotion:

We are not talking about spying on your visitors. A browser visiting your site only sends a request for something (the web page HTML file, images, etc.), and that request includes only a very limited set of information, as shown below. Most servers keep a record of these requests (a log file), which your host can make available to you, either in raw form or as reports from a statistics program.

For a typical Oregon small business webmaster, analyzing log files can reveal what works and what doesn't on your site. It can also help you guide visitors through it.

Alternatives to log file analysis:

Third party services - these usually work by including a small image on each page, served from another server. They can slow performance, since every page request includes a call to that other server, and they may expose your statistics to a competitor. Not recommended.

Cookies - these can produce more precise session information, but they don't work if your visitor has cookies turned off.

A typical server log entry:

192.168.0.1 - - [26/Oct/2001:19:56:36 -0700] "GET / HTTP/1.1" 200 7172 "http://www.google.com/search?q=web+promotion+oregon&btnG=Google+Search" "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)"

Looking at this information, a log analysis program can derive a variety of clues about visitors. We can get some idea of how many page views occur per day, what the entry page was, what the exit page was, how long visitors spent on a page, what path they took through the site, and where they came from (another web site, a search engine, etc.). All this information is inaccurate, but useful. Log files come in various flavors, depending on the server. In general, for basic web site promotion, a combined or extended log is good. If you have a choice between an access log and a referer log, take the referer. If your host will not furnish logs, find a new host. (Most hosts will not furnish logs for home pages, just for domains.)
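To make the entry above concrete, here is a minimal sketch in Python that splits a combined-format log line into named fields - the raw material any analysis program works from. The regular expression and the parse_line helper are illustrations, not part of any particular analysis package; adjust the pattern to match the exact format your server writes.

    import re

    # Combined Log Format: host, identity, user, timestamp, request,
    # status code, bytes sent, referer, user agent.
    LOG_PATTERN = re.compile(
        r'(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) '
        r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
        r'(?P<status>\d{3}) (?P<size>\S+) '
        r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
    )

    def parse_line(line):
        """Return a dict of fields from one log line, or None if it doesn't match."""
        match = LOG_PATTERN.match(line)
        return match.groupdict() if match else None

    entry = ('192.168.0.1 - - [26/Oct/2001:19:56:36 -0700] "GET / HTTP/1.1" '
             '200 7172 "http://www.google.com/search?q=web+promotion+oregon'
             '&btnG=Google+Search" "Mozilla/4.0 (compatible; MSIE 5.01; '
             'Windows NT 5.0)"')

    fields = parse_line(entry)
    print(fields['host'], fields['status'], fields['referer'])

Once each line is broken into fields like these, counting page views per day, entry pages, and referring sites is just bookkeeping over the parsed records.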

The limitations of server statistics:

1. Number of visitors: not everyone who views a web page gets it directly from the server. The page may be stored (cached) by a search engine, by the visitor's ISP (perhaps someone else there viewed the page recently), or by the visitor's own browser if they had been to the site before. Many webmasters don't like search engines caching pages. Some visitor sources may also assign different IPs to requests from the same user.

2. The user agent (browser, robot, etc.) might be lying. If it is an e-mail harvesting robot, it certainly isn't going to tell you its home page; if it is an honest, hardworking search engine bot, it will probably include an address for more information about it. Good bots always ask for robots.txt before sampling pages (see the sketch after this list).

3. The derived information, such as path through a site, is not necessarily accurate either, as a person may revisit pages, etc.
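That robots.txt habit suggests a rough test you can run on your own logs: tally which hosts requested robots.txt, since those are probably well-behaved robots rather than people. This sketch reuses the hypothetical parse_line helper from above, and the heuristic is an assumption, not a guarantee - a lying user agent can request robots.txt too.

    def likely_bots(lines):
        """Collect (host, user agent) pairs that asked for robots.txt."""
        bots = set()
        for line in lines:
            fields = parse_line(line)  # parse_line from the earlier sketch
            # The request field looks like 'GET /robots.txt HTTP/1.0'.
            if fields and fields['request'].split(' ')[1:2] == ['/robots.txt']:
                bots.add((fields['host'], fields['agent']))
        return bots

    with open('access.log') as log:  # 'access.log' is a placeholder name
        for host, agent in sorted(likely_bots(log)):
            print(host, agent)

Comparing this list against the user agents that crawl your pages shows which robots identify themselves honestly.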

Some information is much better than no information:

For the owner of an Oregon web site, the most useful benefits of log analysis are the following:

1. Is traffic increasing? Are folks staying to look around when they get to your site?

2. What search terms are people using to find your site? Are they the ones you expect? Where else are visitors coming from? (See the sketch after this list.)

3. Are the search engine robots coming around to index your site regularly? If not, you probably need incoming links (or to pay for inclusion, depending on the search engine).

4. Are there broken links to your pages, either internal or external? (You can also run a link checker; the sketch after this list counts requests that drew a "not found" error.)

5. What pages do people seem to like? What page do they seem to leave from? (Perhaps it has a problem, or does not give them a path onward.)
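As a concrete example of items 2 and 4, this sketch pulls search phrases out of referrer URLs and counts requests that drew a 404 (not found) response. It again assumes the parse_line helper from the first sketch, and the 'q' parameter is how Google happened to encode search terms in the referer shown earlier - other engines use other parameter names.

    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    def summarize(lines):
        """Tally referrer search phrases and requests that returned 404."""
        terms, missing = Counter(), Counter()
        for line in lines:
            fields = parse_line(line)  # parse_line from the first sketch
            if not fields:
                continue
            # Pull the 'q' parameter out of the referring URL, if any.
            query = parse_qs(urlparse(fields['referer']).query)
            for phrase in query.get('q', []):
                terms[phrase] += 1
            if fields['status'] == '404':  # a broken link points here
                missing[fields['request']] += 1
        return terms, missing

    with open('access.log') as log:  # 'access.log' is a placeholder name
        terms, missing = summarize(log)
        print(terms.most_common(10))
        print(missing.most_common(10))

The top search phrases tell you whether the terms you optimized for are the ones actually bringing visitors; the top 404 requests tell you which broken links to chase down.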

Helpful information about log files:

Discussion forums and a few web sites can help you identify the spiders (robots) that visit your site. Two examples of free log analysis software are Analog and Webalizer.

