13 Things Developers Forget When Launching Public Websites

Posted 7/6/2015 7:14 PM by Jeff Julian

As I sit here getting ready to present this topic to an audience of developers, I am quickly becoming aware that I am not as passionate about development as I once was. 

I sat in a few sessions yesterday, and the debates I was once famous for having now seem sort of meaningless.  Not that they were meaningless to the debaters, change has to happen and debate is required, but they were meaningless to me. 

I also noticed something else.  The folks who are here early, while I have an hour before my lecture, are also passionate about things besides development.  They are interested in growing their teams, building the business, and one high school kid is trying to figure out how he can match his development skills with some other calling.

The point is, when we discover obstacles like the things we forget when releasing websites, they come out of conversations where different groups come together to discuss their needs. 

Each one of the items we will discuss can be run by multiple groups.  It could be the marketing department, the IT team, or the entrepreneur who is wearing many hats.  The reason we have to have the conversation is that we are part of something that is much bigger than any of us.  Forming and nurturing an audience is not something the marketing team, business owner, or the development team will do on their own.  We need everyone at the table to share the needs and concerns, and maybe have a little debate.

So, getting off my soapbox and back to the ground, this article will explain 13 things developers typically forget when launching public-facing websites, based on best practices in SEO and audience building.  These are tools both marketing and IT teams can use, or the savvy business owner can embrace, to help set yourself up for success. 

None of these tools will guarantee success!  I need to stress that right now.  You can follow each one of these practices and implement every tool perfectly and still not have an audience.  Google doesn’t look for the best in breed, they look for the one that entertains and educates the most.  However, leaving these out will slow your progress down.

Google Analytics

It is interesting that I still need to list this one, but one of the first requests I hear from marketing teams is for a location to put their GA code.  As we move further into the digital world, great analytics is a must for tracking return on investment and momentum. 

Developers do not track the success of a website by these same measurements.  No bugs and high performance are the keys to their success, so monitoring logs and using tools like Stackify are the ways they win. 

Marketers on the other hand care about the number of visitors, ability to convert them to customers or audience members, device saturation, and other indicators of successful content.  Google Analytics gives marketers these tools for free and with very minimal integration cost.
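If your CMS does not already have a field for it, the integration is usually just a matter of pasting the tracking snippet Google generates for your property into the head of every page.  Here is a minimal sketch based on the asynchronous analytics.js loader; the UA-XXXXXXX-1 property ID is a placeholder, so copy the exact snippet from your own GA admin screen:

<head>
  <!-- stub the ga() queue so calls made before the library loads are not lost -->
  <script>
    window.ga = window.ga || function(){ (ga.q = ga.q || []).push(arguments); };
    ga.l = +new Date();
    ga('create', 'UA-XXXXXXX-1', 'auto');  /* placeholder tracking ID */
    ga('send', 'pageview');                /* record the page view */
  </script>
  <!-- load the analytics.js library asynchronously so it does not block rendering -->
  <script async src="//www.google-analytics.com/analytics.js"></script>
</head>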

To learn more about Google Analytics, check out this site.

Google Tag Manager

While we are on the topic of “snippets” or tags, Google has another tool developed for IT or marketing folks to solve another common problem: getting custom script onto pages.  Google Tag Manager allows either team to create very simple copy-and-paste tags for their integrated components. 

Whether it is Facebook, Google Analytics, LuckyOrange, ClickDimensions, Marketo, etc., you can use GTM to store these codes.  Tags are placed in containers and attached to different events on your site so GTM knows when to publish them.  Maybe you have some extra GA code you would like to place on an eCommerce page, or a pop-up manager like Opt-in Monster you want to use to drive conversions on your content pages; GTM events allow you to define that activity. 
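As a quick sketch of how that looks in practice, the page pushes an event into the data layer and a trigger in your GTM container decides which tags fire; the event and variable names below (addToCart, productSku) are hypothetical:

<script>
  // make sure the data layer exists before the GTM container snippet uses it
  window.dataLayer = window.dataLayer || [];

  // push a custom event that a GTM trigger can listen for (names are made up)
  dataLayer.push({
    'event': 'addToCart',
    'productSku': 'ABC-123'
  });
</script>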

This is another free solution from Google and just like all their products, you can be assured that it is an enterprise-class hosted service that you can count on.  Find out more about Google Tag Manager here.

Social Meta Tags/Cards

The more we use social for broadcasting our messages and tracking the success of our content, the more we need to embrace the Meta tags and card code developed by these networks to make sharing easier for the audience. 

Whether you are using Facebook’s Open Graph, Twitter’s Cards, or any other network’s code, the idea is the same.  Rather than making the person who shares a URL fill out fields like image, description, link, and title on the network’s submission form, these tools will load your site and look for these codes.  This puts you in control of how you want to present your content on these networks.

If you do not decorate your pages with these tags, the sharing engines of these networks will do their best to find relevant information.  Using tags like the title, H1, Meta description, and prominent images, the tool will show the user its best guess for the share. 

One of the main reasons we need cards is that the headline that works on social media can be different from the one used for SEO.  The reason is that people use a different style of language when searching for content.  Look at the BuzzFeed effect on your Facebook feed.  Remember all those posts that ended in “you will never believe what happened next”?  These posts spark curiosity and drive the reader to want more.  However, have you ever searched for that phrase on Google?  Probably not.

Another reason we need these cards is control of the image.  What looks good within the content or at the top of the page might not meet the social media guidelines for the given network.  This leaves it up to the tool to do the cropping.  Cutting away text or the subject of the image is a common problem with uncontrolled cropping.  Your marketing team should be prepared to offer the values and images for these fields when releasing content, but the developer will most likely need to provision the fields in the CMS to make it possible.

Here is an example of the social tags used by Twitter and Facebook:

<head>
<title>Page Title Around 70 Characters</title>

<!-- Twitter Card - https://dev.twitter.com/cards -->
<meta name="twitter:card" content="summary" />
<meta name="twitter:title" content="Page Title" />
<meta name="twitter:site" content="@Publisher" />
<meta name="twitter:creator" content="@Author" />
<meta name="twitter:image" content="URL for Sharable Image" />
<meta name="twitter:description" content="Sharable Description" />

<!-- Open Graph (Facebook) - https://developers.facebook.com/docs/reference/opengraph -->
<meta property="og:title" content="…" />
<meta property="og:type" content="article, profile, video, etc." />
<meta property="og:url" content="Primary URL" />
<meta property="og:image" content="URL for Representative Image" />
<meta property="og:description" content="Sharable Description" />
</head>

Unique Meta Descriptions and Page Titles

Allowing Google or any other search engine to distinguish between one piece of content and another is the best way to ensure your website is being crawled correctly.  The more landing pages we have with quality content, the higher the likelihood we will have success.  Each of these pages should have a strong, well-thought-out title and a Meta description describing the content. 

The ability to manage these fields via out-of-the-box content pages in most Content Management Systems, like the Sitecore Experience Platform, is pretty standard.  However, when a developer starts to build more dynamic features like search results pages, they tend to leave this requirement out. 

Commonly you will see this:

  • Title: Search Results
  • Meta Description: This is the search results page for ABC Corporation.

Then the developer will build an amazing search page with tons of content and the ability to select facets on one of the site columns to help isolate what the audience is looking for.  Each time the visitor does this, the page may refresh behind the scenes, but typically a query string is attached to the URL to select these facets.  The title and description, however, remain the same.

So now look at all the opportunity lost.  Let’s assume for a second that you list vehicle inventory.  Your page for a 2014 Mazda sedan has the exact same title as the page for a 2005 Lexus SUV.  The search engine will be left to show the searcher a boring title and description, leaving your content little ability to land on keyword searches. 

Now imagine this solution.  The developer gives you a way to add titles and descriptions attached to the different combinations of facets along with the custom content and Meta description.  Now you can have listings that will show examples like these:

  • Facets: 2015, Lexus, Missouri
  • Title: View Our New and Used 2015 Lexus Luxury Vehicles in Kansas City and St. Louis
  • Meta Description: We have several options when it comes to your Relentless Pursuit of Luxury
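Rendered on the page, that facet combination might come out of the CMS looking something like this (a sketch; the exact fields and templates depend on your platform):

<head>
  <!-- title and description generated from the selected facets: 2015, Lexus, Missouri -->
  <title>View Our New and Used 2015 Lexus Luxury Vehicles in Kansas City and St. Louis</title>
  <meta name="description" content="We have several options when it comes to your Relentless Pursuit of Luxury" />
</head>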

This may seem like a daunting task for your marketing team to produce these fields, but if you ask the developers to automate some of these titles just to get out of the gate, along with overrides for your custom titles, you can see success right away.

XML Sitemaps

One of my favorite topics, XML Sitemaps.  In my previous career focus, I was a Microsoft Most Valuable Professional in XML.  At the time there were 8 of us who got together to promote all things XML.  We all had such passion for seeing people embrace the technology during the 2000s. 

Today, most remnants of XML are gone.  Most people have stopped subscribing to RSS feeds and no one shares their OPML feeds as a blog roll.  However, one XML file is still of great importance: the XML Sitemap. 

This file is like a directory of your content for search engines and other bots.  It outlines the content pages you have on your site, the priority you believe each one has, the date it was last modified, and the frequency of modification. 

Without an XML Sitemap, you are leaving your content up to discovery via crawling.  So if you have an article three levels deep in a directory, you are assuming the search engine will discover the link to that article the next time it loads the parent directory’s page, and when that happens is at the mercy of the search engine.  You can also use the URL submission tools most major networks offer, but it is very unclear what priority they give to reviewing those links.

Sitemaps usually require developer intervention to provide.  Most major CMS platforms have third-party modules for creating XML Sitemaps, so the need for custom development is limited.  This can be something a tech-savvy marketer or entrepreneur can install, but you should consult with your development team to ensure they know what was done and how to support it.

Here is an example of an XML Sitemap:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
     <url>
          <loc>https://ajisoftware.com</loc>
          <lastmod>2015-04-15T18:36:17+00:00</lastmod>
          <changefreq>weekly</changefreq>
          <priority>0.75</priority>
     </url>
</urlset>

ROBOTS.txt

This is one of the oldest ways to communicate with search engine bots and crawlers on your website.  The concept originated when webmasters needed a way to tell these tools what to look at and what not to.  If you have an administration, diagnostic, or staging environment that is publicly accessible, you most likely do not want these pages crawled and accessible via the search engine. 

The solution was a very simple text file placed in your root directory called ROBOTS.txt.  The unwritten contract between bot and site is that if you have this file at your root, the bot should abide by the rules listed within. 

I say it is unwritten because following the rules is not mandatory.  Systems that scrape your site’s HTML for contact information and other relevant data typically do not follow the rules. 

Inside the ROBOTS.txt file you can list which agents (those bots and crawlers) are allowed, the directories and/or pages you would like them to load or leave alone, and a list of your XML Sitemaps.  Sitemaps are a more recent addition to the supported fields, but they are by far my favorite way to let others know where your pages are. 

For more information, check out this site for details about the file and format of the content.

Here is a simple example of a ROBOTS.txt file:

User-agent: *
Disallow: /admin
Sitemap: https://xyz.com/sitemap.xml

Responsive or Mobile-Ready Design

This suggestion is more appropriate on this side of April 2015 than before.  While I never like to alienate any users by the device they are using, technology disruptions are moving extremely fast and it is hard to keep up the pace when you don’t control the budget.  Most developers would love nothing better than to work on cool stuff, but are not given that ability due to time and budget constraints. 

That being said, mobile has reached a level of saturation that puts it outside the 80/20 rule.  I have seen an average of 25% of traffic coming from mobile devices on our B2B clients, and it is growing much faster than in years before. 

In 2001, I created a mobile site for purchasing tickets for concerts and other events.  While that was only 14 years ago, at the time I was looked at as a dreamer and I heard from so many people, “that’s nice, but why would anyone do that?” 

Now at the midpoint of 2015, this dream of a 19-year-old kid is the reality.  B2C audiences expect websites and eCommerce transactions to be supported on their devices.  When you don’t give them the experience they expect, they will take their money down the digital superhighway and stop off at the next store. 

If you ever catch yourself putting the words “users” and “expect” together, you should assume some team at Google is working on an algorithm change to address that very issue.  If Google shows a result to a user on a mobile device and they do not get the experience they expect, you look bad and Google looks bad for telling them about you. 

So in April of 2015, Google released an update to their algorithm that negatively affects your Page Ranking in mobile search results if your site is not deemed mobile-ready.  In an impressive move on their part, they told the world before they made the change and offered tools to test your site to ensure you met their standard.  This really leaves you with minimal or no excuse for not having your site ready for mobile consumption. 

One solution for making websites mobile-ready is called Responsive Web Design.  The idea behind it is to use CSS media queries and JavaScript frameworks to change how the content on the page is displayed based on the device’s dimensions and characteristics.  It may just be a “squishing” of the content to fit within limited-width devices like a phone, or it could be a complete repositioning and redisplay of the content. 
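As a rough sketch of the idea (the class name and breakpoint are made up), a media query might collapse a two-column layout into a single stacked column on narrow screens:

<style>
  /* default: two columns side by side at desktop widths */
  .content-column { width: 50%; float: left; }

  /* below 768px, a common phone/small-tablet breakpoint, stack the columns */
  @media (max-width: 768px) {
    .content-column { width: 100%; float: none; }
  }
</style>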

This approach is great for marketing and software teams because the site responds to the device and the content is entered as usual.  The big considerations you will face are image and text sizes, positioning of text, and limiting your use of inline CSS styles.  Inline styles have traditionally been a hack marketing teams use to get their content to look the way they want, and it is important to make sure we use established style patterns, or CSS classes, that are supported by the different media queries.

This approach has been around for a few years, so you should not feel the need to build your own responsive framework.  Here are some frameworks you can use to get down the road quickly when you are starting out.  Each has tons of community and product team support.

Suggested Responsive Frameworks:

  • Bootstrap
  • Foundation

Minified CSS/JavaScript

Seconds matter in web traffic.  If you are sitting at your desk visiting a new website, that first impression means everything.  Having the page load promptly the first time, and even faster as I click around, is the expected result.  Blame Google and Facebook if you want, but I think it is a great thing.  I hate it when I see a website that should load fast but, because of poor design or development, sits there with the “spinning ball of infinity” right in the middle of the page, leaving me bored. 

Those precious seconds add up quickly.  If the average user gives you 8 seconds to make a lasting impression and your page is sitting there doing its own thing, moving weird boxes and content around until everything is in place, you can assume your visitor is not happy. 

Something a developer can do to make this process faster is to “minify” the CSS and JavaScript.  This is a fancy and fun word that means removing any unnecessary information from these files to shrink them and speed up processing.  Things like descriptive variable names, whitespace, carriage returns, and comments are stripped out.  These files are usually kept human readable to make them easier to change, but since they are not compiled when they are released, every extra character ships to the visitor.
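To make that concrete, here is a hypothetical before-and-after; the function is made up, but it shows exactly what a minifier strips away:

<!-- original, human-readable source -->
<script>
  function calculateTotal(price, quantity) {
    // multiply the price by the quantity to get the line total
    return price * quantity;
  }
</script>

<!-- the same function after minification: comments, whitespace, and long parameter names are gone -->
<script>function calculateTotal(n,t){return n*t}</script>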

Many web platforms now have built-in tools that make this possible, but the developer has to make the conscious effort to use them. 

Examples of minified versus original file sizes:

  • jQuery JavaScript file – 84kb minified, 247kb original
  • Bootstrap CSS file – 120kb minified, 144kb original
  • Bootstrap JavaScript file – 36kb minified, 68kb original

Use of HTTPS as Primary

Security has been and will always be a topic of interest for those of us who live in digital marketing and web development.  We absolutely want to keep our customers’ information as private as possible, but we do not always take the appropriate security precautions to do so.  I love the analogy of leaving your car unlocked.  You might live in a place where leaving your car unlocked overnight is just the norm.  Or you may live in a place where leaving your car unlocked for 10 minutes guarantees you will be missing something.  Then there are times where you did everything right and someone broke your window out anyway. 

Having SSL certificates on your site is like having a door lock on your car.  You have the ability to decide when it is locked and when it is unlocked by the internal links you use between pages.  And sometimes, no matter whether it is locked or unlocked, people will break in anyway. 

Google made the decision in August of 2014 to give sites that are always locked a boost in Page Ranking.  Around the same time, they locked their own search by default, which ended the referral information about what search terms were used to reach your site.  The reasoning behind it is good, and it is a no-brainer if you can afford the certificate and your CMS or web host supports a default HTTPS setup. 

Making this decision is costly beyond the financial aspect of purchasing your certificate.  You also pay a performance price for encrypting your data on the server and decrypting it on the other side of the wire.  When you make this jump, it is very important you follow some of the other suggestions in this post, like minified files and sensible image formats and sizes, to be sure you are sending the smallest files possible before the encryption occurs.

After the move to HTTPS as the default, you need to double-check that your XML Sitemap and ROBOTS.txt files are pointing to the SSL-secured pages instead of the unencrypted ones.  Over time the search engines will replace the links in their index with the new pages, and you will potentially see some boost in your Page Ranking.
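One common way to enforce HTTPS as the primary version is a permanent redirect at the web server so every HTTP request lands on its HTTPS twin.  Here is a sketch for IIS with the URL Rewrite module installed (an assumption on my part; Apache, nginx, and most hosts have an equivalent), which would live inside the system.webServer section of web.config:

<rewrite>
  <rules>
    <rule name="Redirect HTTP to HTTPS" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <!-- only rewrite requests that arrived over plain HTTP -->
        <add input="{HTTPS}" pattern="off" />
      </conditions>
      <!-- 301 redirect so search engines update their index to the secure URL -->
      <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>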

Rel=”Canonical” Links

Having duplicate content on your site is bad, but did you know that search engines can sometimes see pages like the following as duplicates?

  • http://xyz.com/about-us
  • https://xyz.com/about-us
  • http://www.xyz.com/about-us
  • http://xyz.com/About-Us
  • http://xyz.com/about-us?source=footer

Kind of annoying, right? 

Well, it is a little more complicated than Google just fixing it for your site.  With the use of subdomains on the rise (to share Page Ranking between web applications and sites), along with some web servers having case-sensitive URL parsers, the search engine has to assume that these pages are unique.  There are settings in Google Search Console to set the standards for your site and make the tweaks that render most of my examples irrelevant, but that does not solve every problem. 

The best solution is to designate one link to rule them all.  Sorry, geek humor.  We need one canonical path so that no matter which URL was used to reach the content on our site, there is an indicator telling the caller: if you are storing the URL for this page, use this specific one.  To do that, webmasters use an HTML tag in the header known as the Canonical tag.  In speech, most people call this “Rel Canonical” because of the link tag’s rel attribute with a value of “canonical”.  Here is an example of what it should look like:

<head>
  <title>Unique Page Title</title>
  <link rel="canonical" href="Your Primary URL" />
</head>

Typically the implementation of this requires developer intervention and it should be a priority item.  If you do not see this tag on your pages, use a tool like Moz Professional to investigate further and to gain some suggestions on how you can solve this with your platform.

Image Formats and Sizes

I have covered the reasons you would want to select a particular image format in a previous post, but I cannot stress the importance enough, so here is another section on it.  We went through an interesting transition that runs counter to what you would expect from innovation: as things got faster, we became less concerned with file size.  I think it is time we refocus on this issue as developers.  Let’s take a journey back in time to see why:

  • 1990s – The early days of the public internet; the web browser was created about mid-decade.  In the beginning, you didn’t have much CSS support and images were used to decorate pages.  Background colors, corners, and flash ran crazy, and everyone was forced to wait until every resource was fully downloaded before the page would render.  No one had a taste for speed yet, and people doing cool things with graphics was exciting.  Most of us had 14.4k or 28.8k modems and had to dial into a service provider like AOL or Juno.  GIFs ruled the day and JPGs only became popular at the end of the decade when Sony made their consumer digital camera (fully loaded with a floppy-disk drive on the side).  Primary development concerns were browser compatibility and monitor width.
  • 2000s – We started to move from dial-up connections to cable or DSL broadband connections.  Soon we had 1.5Mbps downloads available in most of the US and worldwide.  Data plans came toward the end of the decade, but mobile-friendly meant you had a mobile-specific site that everyone wanted to skip.  Since we didn’t care about mobile, we loosened the belt on image sizes toward the end of the decade because of broadband availability.  CSS helped solve the table-driven HTML design approach, and we started to get rid of the background images in our designs.  Without great support for PNGs, we were stuck with 256-color GIF images when we needed transparency.  Primary development concerns were browser compatibility of HTML and CSS, fitting everything into 1024-pixel-wide monitors, and offering web applications instead of desktop client applications.
  • 2010s – Since we are only halfway through, I can’t promise this will describe the whole decade, but mobile has been a priority for most developers for the majority of the 2010s.  At first it was pretty basic, and when the tablet came out, most of our sites looked “decent” on the larger screens.  Broadband became the norm for most mobile devices, so what held us back from using our phones and new tablets lost its grip, and that opened the floodgates for high-resolution images, videos, and other artifacts.  Corporate and residential connections are reaching ludicrous speeds with companies like Google and AT&T trying to own the home.  Podcasts became cool again because they were no longer reliant on early adopters to pull an entire content type along, and people have even moved into video podcasts, which will be a huge disruptor.  Primary development concerns are MVC frameworks on client and server, responsive design frameworks, and rich two-way communication with the cloud.

If you read the history, it looks like we are on the right path, but there is one huge catch.  Data plans have limits or high-speed quotas.  It is easy to use up your data and get stuck on the slow connection early in the month.  If you don’t have the “unlimited” option, you are still concerned with what you can do when you are on cellular. 

So when you open a page toward the end of your data plan or on a slower Wi-Fi network, you still expect to see the amazing experience you are used to on a much faster connection.  Waiting for a poorly saved image that could have been 20kb but is instead 50x larger is not the experience you want to deliver.  Until the cellular networks are disrupted, we need to spend time as developers and marketers optimizing image sizes for the devices and connections our audiences have. 

For example, recently I visited a page with around 300 words of content.  Doing a speed test on the content resulted in a 3mb+ payload for the images in the carousel, JavaScript, and other assets on the page.  Ultimately the end user would quickly consume the information and move to the next page if they felt compelled.  On a mobile device, this responsive page still used all the assets the desktop version used and it took over 30 seconds to pull up the page.  Are those assets really worth the user’s time and data?

Here are my quick rules with image sizes and formats:

  • Only use high-resolution images when necessary.  Save the original, but compress large and small images as far as you can before artifacts show.  Consider linking to a higher-resolution version in your design if one is needed for viewing.
  • If you have an image that needs transparency or only has a few colors in it, go with PNG and select the lowest color count possible.
  • If you need animation for specific entertainment reasons (not the little dude shoveling next to the Under Construction sign), go with GIF.
  • Use JPG for other needs and keep the compression level as high as possible.
  • Consider a mobile-size version and a desktop version of the same image due to the scaling and orientation of the screen (most mobile browsing is done in portrait while desktop is typically landscape); see the sketch after this list.
  • If you just want to be a jerk, use BMPs.
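For that mobile-versus-desktop point, one way to let the browser pick the right size is the srcset and sizes attributes on the img tag; a sketch with made-up file names:

<!-- the browser downloads the smallest candidate that satisfies the displayed size -->
<img src="/images/lexus-showroom-800.jpg"
     srcset="/images/lexus-showroom-400.jpg 400w,
             /images/lexus-showroom-800.jpg 800w,
             /images/lexus-showroom-1600.jpg 1600w"
     sizes="(max-width: 768px) 100vw, 50vw"
     alt="2015 Lexus sedan on the showroom floor" />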

Fully-Qualified URLs

If you have a page where the content is controlled by a query string (the data after the question mark in the URL), then chances are you could solve the problem with a Fully-Qualified URL instead.  Fully-Qualified URLs give you more options when it comes to sharing your content with search engines and users. 

For the search engine, it won’t have to figure out what is different about a set of URLs that all point to the same page.  For users, they are usually pretty readable and add a benefit similar to what a breadcrumb offers on the page. 
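As a hypothetical example (the paths are made up), the same inventory listing could be exposed either way:

Query-string URL:     https://xyz.com/inventory?year=2015&make=lexus&state=mo
Fully-Qualified URL:  https://xyz.com/inventory/2015/lexus/missouri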

When you take advantage of Fully-Qualified URLs, it is important you make sure the page has a unique page title and Meta description for the rendered content as mentioned above.

If you want to see how your URLs are structured, do a search on Google with the term “site:” followed by your domain without the “http://” or “https://”.  An example of this search would be “site:cnn.com”.  This search will bring back all the pages Google has indexed for your site, and you can see underneath each title which URL is used.  If they all look the same and have question marks, you should talk to your team about moving to the Fully-Qualified URL approach.

Google Search Console

Formerly named Google Webmaster Tools, this tool offers developers and marketers the ability to monitor their site and see how it is indexed in Google.  Most other search engines offer tools like this one, so make sure you review them as well if you see them as a potential source of traffic.

These tools give you the ability to see whether you are following the majority of the suggestions I made in this article.  I believe this is the most important tool on the list, for all the reasons listed in this article. 

Here is a list of tools offered for free by the Google Search Console:

  • Sitemap Tester
  • Mobile Usability
  • HTML Improvements
  • Site Links
  • Search Queries
  • Crawl Errors and Statistics
  • ROBOTS.txt Tester
  • Current Security Issues
  • Page Speed Test

Conclusion

This list could go on and on, and I did my best to make sure it did as well (seriously, nearly 5,000 words?).  As a promise, I will make sure to keep it updated as things change.  When I originally wrote this piece, the majority of the work was in the hands of the web developers and designers, but I believe this will be a much larger team effort going forward. 

Check out the SlideShare Presentation for this topic here or in the frame below.