


Laboratory Lingo

Our blog of tips and insight from the Foundry

Jerrel Tucker & TCA Partners

November 20th, 2014 at 3:04:02 pm

We recently created a website for TCA Partners in Fresno, California. They are a group of brilliant accountants with extensive experience in the healthcare industry. Jerrel Tucker is a partner with the firm and has been great to work with. You can learn more about Jerrel here.

Learn more about TCA Partners and the great group of accountants who work there by visiting their website at http://www.tcapartnersllp.com.

TCA Partners has been great to work with. Developing a site that communicates their ability to strategically align your business and accounting goals was no small challenge. View the site below for more information.

TCA Partners

Direct Mail Campaigns

March 27th, 2013 at 2:00:54 pm

We’ve begun work on a few direct mail campaigns that we’ll be rolling out over the next few months. We’re trying to market creatively to people we think will be interested in services like ours. Keep an eye out in the coming weeks for pictures, videos, and links to the microsites that will accompany the campaigns.

Persistence Pays in Search Marketing

February 16th, 2011 at 4:30:04 pm

I can’t tell you how many times I’ve seen a website go from obscurity in the search engine rankings to the top for its targeted keywords after a consistent, persistent optimization approach. We recently experienced a real struggle with a client’s website that we had been trying to get ranked for months. We didn’t see any movement in the search engines, not even indexing for most of the keywords. We were baffled because the site was set up very well from an onsite SEO perspective, had a domain registration extending far into the future, had been online for months, and had a whole slew of great inbound links. Just today, after running new reports, we found that all the hard work has finally started to pay off, and all at once it seems.

I’ve seen this happen before. It’s not very common to go from zero to hero like this, and I don’t expect it to be permanent by any means (especially if we foolishly quit link building and publishing new content, hoping to retire off this quick boon). However, I’m really optimistic about the long-term success of this site, where just a month ago I thought there might not be any hope.

The moral of the story: persistence pays off. Stick with it, keep building links, keep publishing content, keep innovating onsite and off, and you’ll see results eventually. Up next: conversion optimization. Now that we’re getting traffic, we need to be sure we’re doing everything possible to monetize it. Stay tuned.

Google Webmaster Tools & Top Search Queries

April 19th, 2010 at 10:04:58 am

I’ve spent some time today comparing Google Analytics data with the data now shown in Google Webmaster Tools. I was intrigued after hearing about Google’s announcement on Friday and learning that they were now offering more data in Webmaster Tools, including top search queries complete with click-through data. Today I learned that the data is not really accurate, at least not yet.

In most cases, my sites are getting many more actual click-throughs (as shown in Analytics and server logs) than Webmaster Tools is even registering in impressions. For example, I’m seeing 61 visits in Analytics for a certain keyword that Webmaster Tools shows as getting only 58 impressions. Time to wait for more data to come through.
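If you want to spot these mismatches across a whole keyword list rather than eyeballing reports, a quick script can join the two exports. Here’s a minimal sketch in PHP; the file names and column layouts (keyword in the first column, the count in the second) are my own assumptions, so adjust them to match your actual exports:

```php
<?php
// Minimal sketch: flag keywords where Analytics visits exceed
// Webmaster Tools impressions. Column layouts are assumptions;
// adjust the indexes to match your real CSV exports.

function loadCsv($path) {
    $rows = array();
    $fh = fopen($path, 'r');
    fgetcsv($fh); // skip the header row
    while (($row = fgetcsv($fh)) !== false) {
        $rows[strtolower($row[0])] = (int) $row[1];
    }
    fclose($fh);
    return $rows;
}

$visits      = loadCsv('analytics_keywords.csv');      // keyword, visits
$impressions = loadCsv('webmaster_tools_queries.csv'); // keyword, impressions

foreach ($visits as $keyword => $count) {
    if (isset($impressions[$keyword]) && $count > $impressions[$keyword]) {
        printf("%s: %d visits but only %d impressions?\n",
               $keyword, $count, $impressions[$keyword]);
    }
}
```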

It will certainly be nice once the data is more accurate. It makes me wonder if Google will eventually let you create goals in Analytics that incorporate impression and click-through data for certain organic keywords, essentially adding one more level of visualization for your data. That would be awesome! As they say in that Coke commercial, “scientists, I’m looking at you”!

Reviewing Advanced Web Ranking and Advanced Link Manager Software

March 4th, 2010 at 9:22:15 pm

This is the first time we have done a software review on our blog. We have been experimenting with a variety of software systems and thought we would give this a try. You can download a free trial of both Advanced Web Ranking (rank-tracking software) and Advanced Link Manager (a tool for tracking and managing inbound links) on the Caphyon website.

The first thing I did when I opened the software was click the button for the tour. I was a bit uninspired by the dull nature of the information it pulled up. The tour, although thorough, didn’t give me what I hoped for in terms of direction to get started and make the most of the software in as little time as possible. But let’s be realistic: learning takes time. If I were the product manager in charge of development for this software, I would make a serious effort to make the initial introduction a bit more user friendly. Maybe a video tour (à la Apple’s iWork suite introduction videos that play when you first open the software).

After diving into the software, I began to see things that are hugely helpful. First, creating a new project is very simple. There’s a wizard that walks you through setting up your websites, but it’s more capable than the simple wizards you often see in other software. During keyword setup, for example, you can add keywords and check for keyword suggestions at the same time. This is very helpful if you have not already identified the keywords you want to go after. You can import keywords from a website or a file, or copy and paste them in if you already have them organized somewhere else. That’s awesome; this part of the project setup was very well thought out and will save me a lot of time. I also like that I can set the priority of the keywords. I have an idea about what this might do, but I’m not sure exactly what it’s going to mean once I start running reports. Maybe an additional explanatory paragraph on the right side of the software in the “context help” would make it so I don’t have to think about that. I’m a fan of the Steve Krug mantra “Don’t Make Me Think.”

Moving along.  I’m now in the websites tab in the project setup wizard.  I’m supposed to enter a URL for the website associated with this project.  Again, here it would be nice to have a bit more information before moving along.  Does Advanced Web Ranking consider www.industryforge.com and industryforge.com as the same site, or does it consider them as two separate sites like many search engines would?  An explanation here would be good, even if it’s off to the right, or in an intro video.

Under SMTP settings, there is a box that lets me inherit SMTP settings from the global preferences. I’m sure I can change this later, but it would be nice to be able to set up the global preferences right now too. Either that, or have the software detect that I have no global preferences and add a checkbox that says “use these settings as global preferences.” Then I could enter them once here and never do it again. As it is, I have to check the box and then go to the global preferences once I’m done with the wizard. That’s no big deal, but it would have been simpler to enter them here and apply them globally as I go.

The next step is entering clients’ email addresses. This is very straightforward. I like that I can add as many addresses as I want and am not restricted to addresses on the domain I’m setting up a project for. Many times my clients don’t even have email addresses on their own domain.

I like a lot of features in the software. There are many things that make life very simple, like date-to-date comparisons, the ability to take notes, PDF report exporting, and a few other things. The fact that I can customize reports to export in a variety of formats and on schedules is huge. I’ll be able to develop a system that exports data to CSV format and automatically adds it to my clients’ portal for them to review on our monthly status call. Fantastic!

Now on to Advanced Link Manager.

The first thing I noticed was that there is no way to import settings from one application to the other, or at least it’s not very apparent. I wish I could pull the settings from Advanced Web Ranking over to Advanced Link Manager, even if it’s just to set up SMTP. That would be really nice. It’s probably available but not obvious from the get-go, and since I’ll probably never need it after I’m past the get-go, that’s exactly when it should be obvious.

I think there is a bug in the SMTP testing. I know for a fact that I’ve entered the correct information for my Gmail account, yet I’m not getting any email from AWR or ALM. Not sure what’s up.

The wizard for setting up a new project is also very simple in ALM. I like that they give an example of the domain here (either with www or without). The search engine setup is simple but could really use a select-all button for the checkboxes. Maybe they left that off on purpose to discourage people from selecting them all, so the data gathering doesn’t take so long.

Overall, I really like Advanced Web Ranking and Advanced Link Manager. They are both pretty easy to set up and extremely powerful in helping you with your link popularity. I’m excited to see how these tools will simplify my efforts going forward; as it is now, I cobble together a bunch of disparate tools and a few homegrown resources. I’ll update this post once I have a few months of trial behind me to let you know how it’s gone.

PubSubHubbub Is An Example Of Web 3.0 – You Better Know What It Is

March 4th, 2010 at 5:26:43 pm

I just had to write a quick blurb about the astute post written a few weeks ago by Brent Nef, a brilliant Laboratory Lingo contributor. His article on PubSubHubbub is ahead of the curve on technology trends and how they relate to Google and its tools and services. Recently, Google has made efforts to push (no pun intended) a new protocol that would allow it to index new web content in real time. The protocol, PubSubHubbub (PuSH for short), has quickly become a new and exciting technology that would allow Google to have the web come to it instead of having to go find the web. ReadWriteWeb published an article highlighting some things said by Dylan Casey, a product manager at Google. This is an example of how the web will become programmable someday, with many applications talking to each other and using APIs to exchange data, all in a way that doesn’t require polling.

I see all this as further indication that Google is trying to shift the way it calculates relevance to be based more on site content than on linking. Google wants to deliver the best possible results to searchers by placing the sites with the most relevant content at the top. For now, the best proxy is still counting incoming links as votes for a site. Right now, relevant site content + quality inbound links = top rankings. Someday that equation may be more one-sided, with relevant site content trumping all. SEO will be all about creating great content that is relevant for particular searches rather than creating link bait or doing endless link building.

Some people say this will not change the way Google calculates PageRank and is only relevant to searches that require real-time results. That may be true for now, but PuSH is definitely a way for Google to get one step closer to judging the quality of a website’s content based on something other than inbound links.

10 Easy Tips for Better Search Engine Friendly Web Design

October 20th, 2009 at 11:10:15 am

1. Write Good Page Titles and Headings

Make sure you use keywords that people are searching for in your page’s heading tags (h1 through h6). Also, be sure the h1 tag on each page accurately describes the page’s content. Good headlines are good for your users and good for the search engines. Here’s a link to a previous post that goes into much more detail.

2. Utilize Title Tags to Their Utmost

You want your title tag to contain keywords that are important to your SEO efforts. Make sure those keywords show up near the front of your title, as Google only picks up the first sixty to seventy characters. Don’t just list a bunch of keywords; take a targeted approach that makes sense to your user. Include a clear call to action! Learn why a call to action is important.
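To make the “keywords first, within the length limit” idea concrete, here’s a hedged sketch in PHP; the helper name, separator style, and the seventy-character cutoff are my own illustration, not a rule from Google:

```php
<?php
// Hypothetical helper: put the keyword phrase up front, the brand
// at the end, and trim to roughly what Google displays.

function buildTitle($keywordPhrase, $callToAction, $brand, $maxLength = 70) {
    $title = $keywordPhrase . ' - ' . $callToAction . ' | ' . $brand;
    if (strlen($title) > $maxLength) {
        // Drop the brand first; the keywords matter most.
        $title = $keywordPhrase . ' - ' . $callToAction;
    }
    return substr($title, 0, $maxLength);
}

echo buildTitle('Healthcare Accounting in Fresno',
                'Get a Free Consultation', 'TCA Partners');
// Healthcare Accounting in Fresno - Get a Free Consultation
```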

3. Use Natural Language in Your Content

Don’t try too hard to stuff keywords into your page just to make it more “keyword dense.” The search engines are getting better and better at identifying natural language, and they can spot keyword-fluffed content much more easily than they could in the past. In other words, don’t use the same keyword phrase over and over and over and over and over… you get the idea. Try including semantically related words in your writing.

4. Create a Smart Internal Linking Structure

Use keywords and user-friendly descriptions in links to help your users (and the spiders) navigate from page to page on your site. Position your navigation consistently so it’s simple for users to know where they are and where they can go.

5. Include a Sitemap File for The Search Engines

We at Industry Forge create sitemaps at http://www.domain-name.com/sitemap.xml. This is the standard location, and we’ve been pretty happy doing it this way. You may choose something else, but keep it simple. Also consider creating an HTML version of your sitemap for your users; you might put it at http://www.domain-name.com/site_map/ and include the same content you have in the XML version. Now go submit your sitemap to Google using Webmaster Tools.
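If your page list lives in code or a database, you can generate the XML instead of maintaining it by hand. A minimal sketch in PHP; the $paths array is a hypothetical stand-in for however you track your pages, and the urlset markup follows the sitemaps.org format:

```php
<?php
// Minimal sketch: emit a sitemap.xml from a list of paths.
// The $paths array is a placeholder; load yours from wherever
// your site keeps its page list.

$base  = 'http://www.domain-name.com';
$paths = array('/', '/about/', '/blog/', '/contact/');

header('Content-Type: application/xml');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($paths as $path) {
    echo '  <url><loc>' . htmlspecialchars($base . $path) . "</loc></url>\n";
}
echo '</urlset>';
```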

6. Protect Your Site From 404 Errors!

You can use your new Google Webmaster Tools account to track the 404 errors Googlebot identifies when spidering your site. It’s essential that links coming into your site, and your site’s own internal linking, do not produce 404 errors for your visitors. By creating a redirect strategy to handle old pages, you can avoid this problem and keep visitors happier. We’ll post a full article on a great way to handle this with PHP.
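Until that article is up, here is the shape of the idea as a rough sketch; the redirect map below is a made-up example, not our production code:

```php
<?php
// Minimal sketch: redirect known old URLs with a 301 instead of
// letting them 404. The $redirects map is a made-up example.

$redirects = array(
    '/old-services.html' => '/services/',
    '/contact.php'       => '/contact/',
);

$requested = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (isset($redirects[$requested])) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: ' . $redirects[$requested]);
    exit;
}

// Otherwise fall through to a real 404 so broken links stay visible.
header('HTTP/1.1 404 Not Found');
```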

7. Use Static (Pretty) URLs

This is a great usability feature for your site. A “pretty” URL looks like this: http://www.industryforge.com/blog/dev-techniques/. You can see in the URL what the page (HTML resource) is all about, and you may even have some keywords in the URL. The alternative is a dynamic URL that might look like this: http://www.industryforge.com/blog?articleid=34&category=4&name=value&ugly=yes&user_friendly=no. Most people like the first type, the pretty kind, because it is easy to read. We have a framework that helps us build sites with pretty URLs, and we’ll post a full article on it at some point so you can see how we do it. Until then, roScripts has an article on how to do it with Apache and PHP, or check out SitePoint’s article on how to do it with simple HTML.
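We’ll save the details of our framework for that post, but the usual Apache-and-PHP approach looks something like this sketch: a rewrite rule funnels requests to a single script, which parses the path (the routes here are invented for illustration):

```php
<?php
// Minimal front-controller sketch for pretty URLs. Pair it with an
// Apache .htaccess rule along these lines:
//   RewriteEngine On
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^(.*)$ index.php [L]

$path     = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
$segments = explode('/', $path);           // e.g. "blog/dev-techniques"
$section  = $segments[0] !== '' ? $segments[0] : 'home';

switch ($section) {                        // routes invented for illustration
    case 'blog':
        $category = isset($segments[1]) ? $segments[1] : '';
        echo 'Blog category: ' . htmlspecialchars($category);
        break;
    default:
        echo 'Page: ' . htmlspecialchars($section);
}
```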

8. Design For Accessibility

With the Web 2.0 craze came a huge push by web developers to use XHTML and CSS for page markup and to follow strict compliance guidelines from the W3C. Thank goodness! Now if we could just get Microsoft to jump on board, we could quit worrying about building websites to be “cross browser/cross platform” and just focus on content. Pages built to standard are often much lighter (in data size) and therefore load more quickly. Google gives you a bit of preference for a quickly loading site, plus you’ve only got your user’s attention for so long. Make your pages accessible and your users will see what you expect them to, and the spiders will index your pages more easily.

Don’t forget cell phones and mobile devices. iPhones dominate mobile web traffic. You might consider user agent detection to provide content formatted for individual devices. One of my favorite tech resource sites, quirksmode, has a good article on JavaScript browser detection. You can also detect user agent info on the server before outputting headers to your visitor’s browser, so you can immediately deliver properly formatted content.
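Server-side, that detection can be as simple as a substring check on the user agent string before any output is sent. A crude sketch, with placeholder template names and a deliberately tiny device list:

```php
<?php
// Crude sketch: choose a template from the user agent string before
// any output is sent. A real device list would be much longer.

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

$isMobile = stripos($ua, 'iPhone') !== false
         || stripos($ua, 'Mobile') !== false;

// Hypothetical template paths; swap in your own.
$template = $isMobile ? 'templates/mobile.php' : 'templates/desktop.php';
include $template;
```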

9. Use “Spider Visible” Content

Consider using text instead of graphics for your navigation, page titles, and other page elements. Google cannot process the content of some rich media files or dynamic pages, and some search engines still have a hard time with Flash. A few Flash elements on your site can come in handy; however, Flash-only sites are often difficult to maintain and hard for some search engines to index.

Speaking of dynamic content, don’t build your site with blocks of text produced by JavaScript. Make it all visible by default, hide it with JavaScript when the page loads, and then show it with JavaScript on some event (e.g., onclick, onmouseover). Because most spiders won’t bother parsing your JavaScript, any text produced dynamically via JavaScript will likely be skipped.

10. Don’t Try to Trick The Spiders

We figured we would include at least one don’t in our list of dos. Google warns against shadow domains, doorway pages, spyware, and scumware. Don’t use them! They won’t do anything for the long-term success of your business. They don’t work, and they will get you banned from the search engines (at least from the ones that matter).

Thanks for reading!  For your quick reference, here is a list of external resources used (and linked to) in this article.

Writing Headlines

http://www.nytimes.com/2006/04/09/weekinreview/09lohr.html?ex=1302235200&en=fd2082be97aa034d&ei=5088&partner=rssnyt&emc=rss

http://www.industryforge.com/blog/dev-techniques/creating-user-and-seo-friendly-headlines/

http://www.industryforge.com/hire_us/conversion/

Keywords and Copywriting

http://www.gorank.com/seotools/ontology/index.php?keywords=seo

Site Maintenance & Monitoring

http://www.google.com/webmasters/

Pretty URLs

http://www.roscripts.com/Pretty_URLs_-_a_guide_to_URL_rewriting-168.html

http://www.sitepoint.com/blogs/2009/07/07/pretty-urls-pretty-easy/

Accessibility & User Agent Detection

http://www.w3.org/

http://www.quirksmode.org/js/detect.html

http://www.bushidodesigns.net/blog/?p=72

Industry Forge Launches New Website for Perry Homes

September 8th, 2009 at 1:04:12 pm

After working closely for a number of months with the wonderful people at Perry Homes, located near Salt Lake City, Utah, we are excited to announce the launch of the new PerryHomesUtah.com website. The new site features Perry Homes’ conveniently located communities and a portfolio of gorgeous homes. Please take a look here.

Tech Rambling: Pubsubhubbub

September 1st, 2009 at 6:24:09 pm

Are you easily distracted by technology? Do you troll the Interwebs looking for the latest whatsit or whatchamacallit to try? Have you promised yourself just 5 more minutes on the computer and later find an hour has passed while you research some esoteric feature that might or might not have anything to offer you, but you want to know about it anyway? If so, I probably don’t have anything to offer you, but I do want to be your friend — tell me everything you know.

For the rest of you, I offer you this: my latest find, pubsubhubbub.

Syndication feeds (RSS/Atom) have their limitations as a technology. They are fantastic for aggregating large amounts of content from across the Internet into an easily accessible format; however, RSS and Atom remain tethered to the concept of polling. Requiring the subscriber to continually ask for updates to your content, instead of being notified when new content appears, is poor design and inefficient. For instance, looking through server logs, it is common to see Google Reader accessing RSS feeds several times an hour, even if the blog only gets updated once a week on average.

This seems wasteful for Google (as well as any other RSS aggregator) and wasteful for you, the publisher, as your server expends precious CPU cycles that could be better used serving timely content. Conversely, some feeds get so little traffic that Google Reader might not update them regularly, and several days might pass before your readers are alerted to new content.

The answer to efficient syndication lies in webhooks, or more specifically PubSubHubbub, henceforth called PSHB. PSHB is an effort on the part of some Google employees to provide a protocol where syndication is event driven rather than polled. When a publisher creates or updates content, a “hub” is notified by a POST request. The hub manages a list of “subscribers” (PSHB-speaking clients) for each feed. When the hub is notified of new content, it tells all subscribers to pull the latest version of the feed.
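The publisher’s half of that handshake is tiny. As a rough sketch of the ping described above (the hub below is Google’s public reference hub; the feed URL is a placeholder), a publisher just POSTs hub.mode=publish and hub.url to the hub and expects a 204 back:

```php
<?php
// Minimal sketch of a PSHB publisher ping using cURL. On success the
// hub replies 204 No Content and then fetches the updated feed itself.

$hub  = 'http://pubsubhubbub.appspot.com/'; // Google's reference hub
$feed = 'http://www.example.com/feed/';     // placeholder topic URL

$ch = curl_init($hub);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'hub.mode' => 'publish',
    'hub.url'  => $feed,
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

echo $status == 204 ? "Hub notified.\n" : "Ping failed: HTTP $status\n";
```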

After installing the wp-pubsubhubbub WordPress plugin (super easy to install: just activate and it’s done), I’ve seen marked improvements in how quickly my content appears in Google Reader. Granted, the sample size here is currently 1, so I would love to hear in the comments from anyone else who has implemented this and what their experience has been.

The nice folks here at Industry Forge have invited me to share some of my thoughts and feelings on random nuggets of technology. We (meaning probably just me) need to come up with a good title for these posts; please leave your suggestions in the comments. I think I’ll leave several suggestions there myself to avoid any potential embarrassment from a lack of comments.