One Brad Williams To Rule Them All

By StrangeWork.com: After over a year of blogging I have finally taken over the number one spot on Google for the search term “Brad Williams”. It wasn’t an easy task, mainly because there is a little-person comedian of the same name who was on TV for a bit. Having a somewhat common name didn’t help either!

I realize everyone hits different Google servers and this could change at any minute, but for now it sticks and I have a picture for proof! BOOYA!

Google SERP for Brad Williams

How To Increase Blog Traffic part 1: Google Webmaster Tools

By StrangeWork.com: I was asked by a couple friends of mine, Brooke and Toni, about how to get more traffic to their blogs. I decided to write a quick tutorial to help everyone learn a couple techniques I use to help rank higher in search engines.

Google Webmaster Tools

Google Webmaster Tools has been around for a while now, but many bloggers do not know how to utilize these tools to increase blog traffic. In this article I will cover two quick and easy techniques I use: sitemaps and robots.txt.



Add Your Blog Sitemap

Step 1: Find Your Blog Sitemap

A blog sitemap is an XML feed of every link available on your blog. By default most blogging platforms, including WordPress, Blogger, and Movable Type, come with an RSS feed preinstalled. An RSS feed is built with XML, so there is no setup required, it’s already there!

Examples for popular blog platforms:

WordPress – http://www.blogname.com/rss
Blogger – http://blogname.blogspot.com/feeds/posts/default
Movable Type – http://www.blogname.com/atom.xml
Feed Burner – http://feeds.feedburner.com/blogname

Just replace blogname with the name of your site. Use the examples above to figure out what your feed URL is. You will need it in the next step.
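The URL patterns above can be expressed as simple templates. Here is a quick sketch in Python (not part of any blogging platform, purely illustrative; "myblog" is a placeholder name):

```python
# Sketch: build the likely feed URL for each platform from a blog name.
# The URL patterns mirror the examples above.
FEED_PATTERNS = {
    "wordpress": "http://www.{name}.com/rss",
    "blogger": "http://{name}.blogspot.com/feeds/posts/default",
    "movabletype": "http://www.{name}.com/atom.xml",
    "feedburner": "http://feeds.feedburner.com/{name}",
}

def feed_url(platform, name):
    """Return the likely feed URL for a blog on the given platform."""
    return FEED_PATTERNS[platform].format(name=name)

print(feed_url("wordpress", "myblog"))  # -> http://www.myblog.com/rss
```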

Step 2: Load your blog sitemap into Google Webmaster Tools

Visit the Google Webmaster Tools Home Page and login to your Google account. If you don’t have a Google account, register one real quick.

Once logged in to the Tools dashboard you will see a textbox at the top allowing you to add your blog:

Google Webmaster Tools Dashboard Screenshot

Type your site URL in and click the “Add Site” button.

Next you will need to verify your blog with Google. Click the “Verify your site” link and follow the instructions given by Google.

Once you have verified your blog, click the “Sitemaps” menu item on the left-hand side and click the “Add A Sitemap” link.

Select “Add General Web Sitemap” when asked to Choose Type. You will then see a text box to enter in your RSS feed URL like this:

Add Sitemap Tool

Enter your feed URL, click the “Add General Web Sitemap” button, and you’re done! Once Google verifies your feed is in the proper format, they will crawl it daily to look for new links.

The benefit of providing Google with a sitemap is that it helps them crawl your blog more effectively. Over time you should notice an increase in search engine ranking, which will result in more traffic.



Add Your Robots.txt File

Step 1: Update/create your robots.txt file

A robots.txt file is a small text file placed on your blog’s web server that tells search engines what they are allowed, and not allowed, to index on your blog. Most blogs come with a robots.txt file preinstalled. Check to see if your blog has one by visiting the following URL on your blog:

http://www.blogname.com/robots.txt

Just replace blogname with the name of your site. We are going to create a new robots.txt file so it doesn’t matter if you do not currently have a robots.txt file on your blog.

Create a text file named robots.txt on your desktop. Enter the following code into your newly created file:

User-agent: *
Disallow:
Sitemap: http://www.blogname.com/sitemap.xml

User-agent: * – tells all search engines they are allowed to crawl your site
Disallow: – indicates which URLs to exclude from being crawled by search engines. Since we are not entering a value to the right, we are telling all search engines to crawl everything on your blog
Sitemap: http://www.blogname.com/sitemap.xml – tells the search engines where your sitemap is located. Use the sitemap link from the previous tip.
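You can sanity-check these rules locally before uploading the file. A minimal sketch using Python's built-in robots.txt parser (the blogname.com URLs are the same placeholders as above):

```python
# Sketch: verify the robots.txt rules above allow crawling everything.
# urllib.robotparser ships with the Python standard library.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow:
Sitemap: http://www.blogname.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# An empty Disallow means every URL is crawlable by every user agent.
print(parser.can_fetch("*", "http://www.blogname.com/any-post"))  # -> True
```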

Save your file and upload it to the root directory of your server. Google looks for a robots.txt file on your blog once a day. You can verify that Google finds your file by visiting the following Webmaster Tools section:

Dashboard > Tools > Analyze robots.txt

That’s it! Now all search engines will know where your sitemap is located from your robots.txt file. The major benefit of this is helping search engines find your new posts and links more easily.

Google Webmaster Tools offer some really great statistics about how Google crawls and indexes your blog. Make sure you poke around in the other sections.

To install a true Google Sitemap be sure to check out the Google Sitemap Generator Plugin for WordPress.

I hope you find these tips helpful! Stay tuned for part 2.

Upgraded Blog to WordPress 2.3

By StrangeWork.com: After reading Jeff’s WordPress upgrade post I decided to take the plunge and upgrade my blog to WordPress 2.3. As always WordPress has made this process extremely easy. Simply follow their online instructions and you shouldn’t have a problem.

One small annoyance is the removal of the Post Preview Pane that used to sit at the bottom of the post form. WordPress 2.3 does include a Preview >> link, but I hate opening new windows just to preview a post. Here is a quick hack to get the Post Preview Pane back:

Open up your post.php file from your wp-admin directory.
Navigate to approximately line 72 and find the following line of code:
include('edit-form-advanced.php');

Directly underneath that line, add the following code:

?>
<div id="preview" class="wrap">
<h2 id="preview-post"><?php _e('Post Preview (updated when post is saved)'); ?></h2>
<iframe src="<?php echo clean_url(apply_filters('preview_post_link', add_query_arg('preview', 'true', get_permalink($post->ID)))); ?>" width="100%" height="600"></iframe>
</div>
<?php

Save your changes to post.php and upload. Easy as that!

WordPress 2.3 has added some really slick new features including native tag support, plugin update notifications, and a few new SEO URL updates. It appears all of my plugins ported over without issue, but be sure to check your plugins before upgrading.

I also have the dashboard bug that Jeff posted here. Apparently it has something to do with using Google Blogsearch rather than Technorati. I’ll see if I can find a hack to switch it back to Technorati, because that is a feature I’ve always enjoyed.

* UPDATE
I found the hack for incoming links.

If you want to get Technorati results back change:
“http://blogsearch.google.com/blogsearch_feeds?hl=en&scoring=d&ie=utf-8&num=10&output=rss&partner=wordpress&q=link:”

on lines 11 and 12 in index-extra.php to
“http://feeds.technorati.com/cosmos/rss/?url=”
and
“http://www.technorati.com/search/”

I recommend everyone take the plunge and get upgraded!

How To: Remove nofollow from WordPress Comments

By StrangeWork.com: I decided to remove the nofollow code from my blog comments. Nofollow is an attribute value that can be added to links to tell search engines not to pass ranking credit through them, a measure introduced to discourage comment spam. I use Akismet to block spam comments, which works about 98% of the time, so why stop good commenters from reaping the SEO benefits of outbound links on my blog?

I used the dofollow WordPress plugin. You can download the plugin here:
http://www.semiologic.com/software/wp-fixes/dofollow/

To install, upload the dofollow plugin folder into your plugins directory and activate the plugin through your WordPress Plugins admin panel. Easy as that! Now readers of my blog who post comments will receive a good inbound link to their site. I wonder if this counts for good karma points?
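What a dofollow plugin does is conceptually simple: it strips the rel="nofollow" attribute from comment links before they are printed. A rough sketch of the idea (the real plugin is PHP; this Python version is purely illustrative):

```python
# Sketch: remove rel="nofollow" from anchor tags, which is conceptually
# what a dofollow plugin does to comment links before output.
import re

def strip_nofollow(html):
    """Drop rel="nofollow" attributes from anchor tags."""
    return re.sub(r'\s+rel=(["\'])nofollow\1', "", html)

comment = '<a href="http://example.com/" rel="nofollow">my site</a>'
print(strip_nofollow(comment))
# -> <a href="http://example.com/">my site</a>
```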

301 Permanent Redirect using ASP

By StrangeWork.com: I’ve been using my search engine optimization skills on a few different sites recently. The Democrats for Monmouth County website is my current challenge. One of the main hurdles was converting the old ugly links over to the new SEO-friendly links without losing placement in the search results.

Link conversion example
Old URL:
http://mcds2007.com/Default.asp?P=393

New URL:
http://mcds2007.com/contact

To keep your existing search engine rank, use a 301 Permanent Redirect to send the spiders to the new link. Simply add the code below to the top of your ASP script:

Response.Status=”301 Moved Permanently”
Response.AddHeader “Location”, “/contact”

Easy as that! Google, Yahoo, and the rest of the search engines will update your site links in their index without penalizing the new link.
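The same two headers do the job in any language. As an illustration (not the site's actual stack), here is the equivalent redirect sketched with Python's standard-library http.server, using the example mapping from above:

```python
# Sketch: serve a 301 Permanent Redirect mapping the old query-string
# URL to the new clean URL, mirroring the ASP example above.
import http.client
import http.server
import threading

REDIRECTS = {"/Default.asp?P=393": "/contact"}  # old URL -> new URL

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 tells search engines to transfer the old URL's rank.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Request the old URL without following redirects to inspect the headers.
conn = http.client.HTTPConnection(*server.server_address)
conn.request("GET", "/Default.asp?P=393")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # -> 301 /contact
server.shutdown()
```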

How To: Create Custom ASP URLs with Querystrings using ISAPI Rewrite

ISAPI Rewrite is a powerful URL manipulation engine based on regular expressions. It acts mostly like Apache’s mod_rewrite, but is designed specifically for Microsoft’s Internet Information Server (IIS). ISAPI_Rewrite is an ISAPI filter written in pure C/C++, so it is extremely fast. ISAPI_Rewrite gives you the freedom to go beyond the standard URL schemes and develop your own.

Below are a few very easy to follow ISAPI Rewrite rules using regular expressions:


EXAMPLE 1

DESCRIPTION:
use one querystring as a subdirectory

ORIGINAL URL:
domain.com/member.asp?username=brad

NEW URL:
domain.com/brad

ISAPI REWRITE RULE:
RewriteRule /([^/]+) /member.asp?username=$1 [I,L]


EXAMPLE 2

DESCRIPTION:
use two querystrings as subdirectories

ORIGINAL URL:
domain.com/member.asp?username=brad&page=2

NEW URL:
domain.com/brad/2

ISAPI REWRITE RULE:
RewriteRule /(?!images|js|css)([^/]+)/([^/]+) /member.asp?username=$1&page=$2 [I,L]

* Notice the (?!images|js|css) section of the rule. This negative lookahead tells the rule to ignore those subdirectories (images, js, css).


EXAMPLE 3

DESCRIPTION:
use one hard coded subdirectory and one querystring as a subdirectory

ORIGINAL URL:
domain.com/member.asp?user_id=1

NEW URL:
domain.com/widget/1

ISAPI REWRITE RULE
RewriteRule /widget/([^/]+) /member.asp?user_id=$1 [I,L]


Search engine spiders, and users, will ONLY see the newly formatted link. The querystrings are still being passed to the server, but they are not visible to anyone surfing the site. This is a HUGE advantage for making a dynamic site SEO friendly. Search engine spiders have always had issues with long complex querystrings, but masking the URL using ISAPI REWRITE has finally closed the gap between dynamic sites and SEO.
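Since ISAPI_Rewrite uses ordinary regular expressions, the three rules above can be tried out in any regex engine before touching the server. Here is a quick simulation in Python (the paths are the examples above; the function name is mine):

```python
# Sketch: simulate the three ISAPI Rewrite rules above with plain regexes.
import re

def rewrite(path):
    """Map a clean URL back to the query-string URL the server sees."""
    # Example 3: hard-coded /widget/ prefix plus one captured segment.
    m = re.fullmatch(r"/widget/([^/]+)", path)
    if m:
        return "/member.asp?user_id=%s" % m.group(1)
    # Example 2: two segments, skipping asset directories via lookahead.
    m = re.fullmatch(r"/(?!images|js|css)([^/]+)/([^/]+)", path)
    if m:
        return "/member.asp?username=%s&page=%s" % m.groups()
    # Example 1: a single segment becomes the username querystring.
    m = re.fullmatch(r"/([^/]+)", path)
    if m:
        return "/member.asp?username=%s" % m.group(1)
    return path  # no rule matched; serve the path as-is

print(rewrite("/brad"))      # -> /member.asp?username=brad
print(rewrite("/brad/2"))    # -> /member.asp?username=brad&page=2
print(rewrite("/widget/1"))  # -> /member.asp?user_id=1
```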

ISAPI Rewrite Homepage

Moving up on Google

I was searching my name on Google last night and noticed my blog had finally crept onto the first page of search results:

Brad Williams on Google

6 spots left! Pastor Brad is going down next!