Web Lessons I Have Learnt
...and I like deprecated grammar to boot!
It's been a while since I last posted, but this bears note. Search engine optimization, commonly called SEO, is all about getting search engines to notice you and people to come to your site. The important thing about good SEO is that it does more than simply get eyes on your site; it gets the RIGHT eyes on your site. People typically misunderstand the value of optimizing their site, or they fear it will radically alter the layout, message or other core elements they hold dear.

First, what SEO isn't. I think it's best to get this out of the way early so we can get into helping you do good stuff without a bunch of "but-but-buts." So, SEO isn't cramming a bunch of keywords into the bottom of your page. It also isn't redesigning your entire site so it looks like garbage but Google can read it like a dream. SEO is not putting your site on every link farm in the world and it is not spamming people on social networking sites. SEO is also not spamming people on message boards. SEO is not about fads and fast grabs. It's not about people coming to your site and then bouncing immediately. SEO isn't about a bad web experience, plain and simple.

Now let's look at what SEO is. SEO is about strategic placement of key concepts in your web site to encourage traffic that will be interested in your message, product or service. SEO is about making the best of what you have to offer and making your web presence work for you. SEO is about traffic analysis and evolution. SEO is about marketing in a smart way and encouraging your customers to think of you first. SEO is about becoming an industry leader and a recognized authority.

I don't believe in mincing words or trying to sneak around and do back-room deals to become another SEO douchebag, so I felt it was only right to lay that all on the line first and foremost. Now that we have a picture of what SEO is and is not, we can benefit from looking at ways to improve your site today and give you tools to improve your site more over time.


Keywords

Keywords are important for any search. Regardless of the way the searching is done, it eventually comes down to what the customer is searching for, and keywords are precisely that in crystalline form.

The thing to know about keywords is that they are meant to be a focus. If you are planning to make a page about anything, it would behoove you to understand the essential idea you are trying to convey. Once you have this in mind, write down three to seven keywords. Use these as a guidepost and they will keep you on target. Moreover, if you are on target and your keywords were selected properly, they will appear naturally in the text. This is good. Don't try to overdo it. If a keyword is important it will appear in the main copy a few times; in 400-1,000 words of copy, a keyword should show up fewer than 10 times.


Titles

Every document should have titles. These titles are going to range from the overarching document title down to sub-sub-subtitles. This hierarchy of titles is important in SEO because it tells both your reader and the search spider what is most and least important. If you have good titles, they will work for you. If you choose poor titles, or worse, your hierarchy is haphazard, then they may well work against you. Take care to pick the right title structure for what you are actually trying to say and you will do well.
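
As a sketch, a sensible hierarchy for a page like this one might look as follows; the headings and copy are invented examples, not taken from any real page.

```html
<h1>Web Lessons I Have Learnt</h1>
<h2>Search Engine Optimization</h2>
<h3>Keywords</h3>
<p>Copy about choosing keywords goes here.</p>
<h3>Titles</h3>
<p>Copy about title hierarchy goes here.</p>
```

The point is simply that one h1 states the document's subject, and each level below it narrows the focus, so a spider can tell at a glance what matters most.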

In-bound Links

Any site that is considered an authoritative source on anything is bound to have in-bound links. If people care about what you do or say, they are going to refer to you for citation. The people who created search algorithms know this and they take advantage of it. The more in-bound links your site has, the more likely it is to be an authority. Authoritative sources show up higher in search rankings than derivative sources. Keep this in mind and strive to be an authority. Pick something you do that the rest of the world ought to know about. Push that and become a key player. This will boost your site rankings, and it is generally good business practice to boot.

While we are discussing links, let's discuss directories. There are several directories on the web but, as far as I know, there is only one that is still completely human edited and maintained. That site is the Open Directory Project (http://dmoz.org). Since the Open Directory Project is human maintained, it is given more value by the search engines. It is the equivalent of having a single person, who is a noted authority, personally vouch for your site. This would be like having someone with a doctorate vouch for your research in their field. It's mega bonus points and you should use it to your full advantage. It takes a while to get listed so don't fret if, after you submit your site, it takes months to see a result.

Meta Information

This is where things get a little more technical. Meta information, generally referred to as meta tags, provides spiders with direct information about your site. You can add things like a description and keywords. Both of these will help people find your site more easily. Meta keywords typically aren't given as much weight anymore, since people abused them in the past. Your meta description is the vital one. Google, for one, uses your meta description to tell people about your site in your own words. That's a good thing since it gives you personal control over what people see before they hit your site. Below are the tags that should be included in the head of your page:

<meta name="description" content="your site description goes here" />
<meta name="keywords" content="your keywords go here" />

Document Format

The underlying format of your document will tell spiders a lot about what they are looking at. This is one of those items that can be worked on over time. As long as spiders know you are out there, they will check back on pages from time to time to ensure they know about the latest changes. First, it is best to pull your site out of that table layout you are using. Spiders think tables mean that each piece of data is related to another in a certain way. If your entire format is a table then they will not correctly interpret the content you have on your page and you might lose brownie points.

Commonly, sites are created using page divisions, as God intended. This means that you tell the browser "this is a piece of this page and it stands on its own." Spiders can read this much more easily and the whole site degrades much more gracefully if you do a good job. Graceful degradation makes your users happy, especially users with limitations who are browsing with a special browser. Once your document is formatted properly, you can arrange your divisions into an aesthetically pleasing format using Cascading Style Sheets (CSS). CSS is outside the scope of this discussion, so I am going to let that dog lie.
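
As a sketch, a division-based page might be chunked like this, with all the presentation pushed out to an external stylesheet. The ids and the stylesheet name are invented examples.

```html
<link rel="stylesheet" type="text/css" href="layout.css" />

<div id="header">Site name and main navigation</div>
<div id="content">The main copy of the page</div>
<div id="sidebar">Related links and extras</div>
<div id="footer">Copyright and contact details</div>
```

Each division stands on its own, so a spider (or a text browser) reads the content in order without having to guess at table relationships.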


Sitemaps

I'm not talking about your run-of-the-mill sitemap for your visitors. I am talking about a carefully crafted, standards-compliant XML sitemap. There is a standard for creating a sitemap and, once you have one, it makes indexing your site a breeze. Search spiders commonly grab a sitemap so they can better understand what they should and should not index. Sitemaps also allow you to tell spiders how often certain pages are updated. This lets them index pages that change all the time more often than pages that may not change for months at a time.
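
A minimal sitemap following the sitemaps.org protocol might look like this; the URLs, dates and frequencies are invented examples.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/archive.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

The changefreq element is how you tell spiders which pages churn and which ones sit still for months.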


Robots.txt

The robots.txt file is a simple text file that tells search spiders what to index and what to ignore on your site. It's similar to a sitemap, but the parameters are limited. You can say "index this" and "don't index that." This is great if you have pages that are currently in development, should not be seen by the public or other strangeness like that. It also allows you to have a copy-testing site that gets ignored. Duplicate copy is looked down on by search engines, so anything you can do to avoid indexed duplicate copy is a good thing. I typically have a sparse robots.txt file as most of my site is viable content, but government agencies and Rupert Murdoch seem to like robots.txt quite a bit.
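
As a sketch, a sparse robots.txt along those lines might read as follows; the disallowed paths and the sitemap URL are hypothetical examples.

```text
# Let every spider in, except for work-in-progress areas.
User-agent: *
Disallow: /dev/
Disallow: /copy-test/

# Point spiders at the XML sitemap while we're at it.
Sitemap: http://www.example.com/sitemap.xml
```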


.htaccess

This is probably a book in its own right, but it is something people should be aware of. The .htaccess file allows administrators to control site access and redirect people to new pages and away from missing pages. Correct use of the .htaccess file can limit the number of broken pages that spiders encounter and keep your visitors happy. One of the best features of a good .htaccess file is the ability to redirect users and spiders alike to new pages and send back a message letting them know the redirect is permanent. Spiders like knowing that a page has moved; it makes the whole process of reindexing faster and easier. This kind of redirect is generally called a 301 redirect, after the status code the server returns.
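
On an Apache server, a permanent redirect in .htaccess can be as short as this; the page names are hypothetical examples.

```apacheconf
# Permanent (301) redirect from a retired page to its replacement,
# so spiders reindex the new location instead of flagging a dead link.
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Send visitors who hit a missing page to a friendly error document.
ErrorDocument 404 /not-found.html
```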


Blogging

We live in a time where the blogosphere is king. What this means to the rest of the world is that blogs influence online life. Blogs change rapidly and bloggers tend to stay atop issues that are near and dear to them. Blogs are also a boon to your industry. If you have a blog that reflects knowledge and a profound understanding of your industry, you are more likely to be considered an authority. Blogs also give your site an opportunity to generate new content on a regular basis. Search spiders like new content and index regularly updated sites more frequently. This also provides an opportunity to share information about your industry in a non-business environment and generate in-bound links. You remember what I said about those, right?

Social Networking

This is closely tied to blogging, but it can impact your business and website in many ways. From forums to MySpace to Facebook to Twitter and others like them, people will talk about what they do and don't like. If you are a well-liked provider, the word will get around and people will head to your site. These sites help people find things that others have recommended and they are a great source of in-bound links. Search spiders check these sites often, as content changes minute to minute. Also, if someone recommends your site on a social network, it is taken as a personal recommendation and spiders will take note.

There are so many ways to take advantage of social networking that it should probably be at least one, if not several, college courses. I did choose to list this one last, however. If you have not worked on everything else first, social networks can be your worst enemy. People will say negative things about their experience and spiders will touch your site more often only to pick up your poor SEO. Once this happens, it's downgrade city, so watch out!

In the end, there are many facets to SEO, but most of them can be worked on and improved by anyone who helps build, update or administer your site. With this information I charge you: go forth and make the web a better place.
9th-Jun-2009 03:02 pm - Information and the state of the web
I only post here occasionally and it has crossed my mind that I might almost be wise to just create a separate blog on my web server.  I have these thoughts and then I realize that I don't have time to muck with that when I have good blog content to post, or perhaps it is simply laziness.  Either way, I only post when something strikes me.

Oh, and strike me it did.  Today I was hit with a brick by some guy running a site all about that crazy thing called love.  Lies!!  It was all about the web.  More importantly, it was about how to do various things on the web.  The site is called "The Noodle Incident" (http://www.thenoodleincident.com/) and, though it was not my favorite site to visit, I wasn't terribly bugged about it.  Let me say, I wasn't bugged until I got to the design page.  Finally, I'd had enough.

The navigation was impossible, the navigation copy was mystery meat, and there was no way to navigate backwards. I have to hand it to the guy, his site was clean and easy to read. Applause deserved for that, but while he realizes that machines and all sorts of accessibility interfaces must interact with his site, he forgot that PEOPLE have to interact with the site too. This is a really important thing to remember.

Wonderfully, this brings us to the topic du jour: Information Architecture and User Experience.  These are buzzwords right now, but they are really important buzzwords.  They represent something that people have worried about and fussed over for ages.  The question is always the same, "how do I make this easier for people to do?"  The IxDA community holds the key to this particular castle and I promise you the princess is in there.

So, where did our wily friend go wrong?  Simply put, everywhere.  Honestly, the site is easy to read as I had said before, however getting to that information is a real bear.  If you start on the front page then you are going to do well.  The main page of the site leads off to all of the information, as far as I can tell.  The real problem is navigating from another page back to main or to some other set of content.  If you landed anywhere but the front page of the site, forget about navigating anywhere without hand-editing the URL.  Long and short, accessing the content on this site is a challenge and that is bad.  This might be a good time to note, information is still king and getting to it is the only way to ensure optimal reader retention.  If your readers can't access your content, they are going to assume you have none and leave.  It's as simple as that.

Another big pitfall is his navigation location.  People learn to rely on the location of menus and such when visiting a site.  Optimally, you should have a strict, well-thought-out navigation hierarchy that you adhere to in the most draconian sort of manner.  I'm not kidding.  Lop off the hands of the people that defy you.  You'll feel better come the end of the day, I promise.  You will see immediate benefit as your users learn to trust that your navigation will remain right where they saw it last as they move from page to page through your site.  Key thought here: if your user doesn't notice the architecture of your site, you did a good job.

Finally, the most embarrassing problem with The Noodle Incident, aside from having an uncanny resemblance to a Guns 'n' Roses album title, is that it suffers from Muphry's Law.  Technically, Muphry's Law applies to editing mistakes but, when generalized, it says "whenever you critique something, you are bound to have an error of the same type in your critique."  This relates to something really important: be sure that your content is useful, correct and does not point out flaws in your own site.  Nothing turns a user off faster than going to a site that is supposed to be an authority only to discover that it is incapable of following its own rules.  If you post authoritative content, be sure that you really know what you are doing and double-check that you aren't going to be embarrassed by it later.

The take-away from all of this is that Information Architecture, attention to the User Experience and some careful content creation will lead to a happier, more productive site.  People will enjoy visiting and may even take you seriously.  Focus on navigation, findability and accessibility.  These items, coupled with a site that is easy to read will lead to a better web experience for everyone involved.

13th-Apr-2009 01:13 pm - Browser Wars
It's been a while since I have posted. I know. For those of you that are checking out this blog for the first time, welcome. For those of you who have read my posts before, welcome back. We're not here to talk about the regularity (or lack thereof) that I post with. What we are here to talk about is supporting or not supporting browsers. So first, what inspired me to write this? Well... this:

We Don't Support IE

So, this brings a question to mind -- which browsers should we choose not to support and for what reasons?

This is an easy question to answer.  You support all of them.  Yep, you heard me right.  You support everything.  You are mindful of browser incompatibilities, inequities, disabled users, mobile users and users you had never even thought of before.  You are aware of the fact that browsers come in multiple versions and you make your sites backwards compatible.  Long and short, do not tell your users what to do.

Now, a caveat to all of this must follow.  If you are creating a web site geared toward the bleeding-edge crowd, you can probably inform your users that they should hop on the newest tech to get the full features of the site, but even then, you must never, never, never create a site that displays no useful information to a user that does not fit into the spectrum of your audience.

Now, before people hop on me for claiming that the We Don't Support IE site is encouraging people to make their web sites inaccessible to all IE users, I am not saying this.  What I am saying is ignoring the IE crowd is throwing away, at the very least, 50% of your audience.  More than likely, you are going to be tossing out more like 70% of your audience.  This is a bad idea if you plan on doing anything even remotely commercial with your site.

This discussion could bear a little bit of transparency.  I do web development for a living, and I tend to spend a lot of time focused on user experience.  I mean a lot of time.  That being said, I spend quite a bit of time listening to people explain what they do and don't like about the way that something functions.  Moreover I see a lot of really bad sites.  By this, I mean horrible, awful, not fit for use web sites.  So I am not going to just pound on the Firefox/Mozilla, Opera, Safari/Webkit/Chrome crowd.  I understand that this is the group that would rather see Internet Explorer gone, but let's be realistic, IE is probably going to be around for quite a while yet.  Get used to it.

Like I said, though, I am not going to just pound on one camp.  You Microsoft guys get your lumps too.  See, I code in PHP, but I also code in ASP.Net and C#.  That being said, I know the dirty nastiness that lies under the hood of the MS technologies too.  I have seen sites that were built solely in ASP.Net and whatever code-behind model they chose which catered only to Internet Explorer.

Now, I understand that IE has access to neat little .Net architecture tools that other browsers don't play so well with, but I have seen simple, straightforward websites that looked great when viewed in IE, but heaven forbid you use anything else.  Unforgiving is too gentle a word for what I have seen.  Pages rendered completely unreadable, forms that stretch across the screen and then some.  Serious kinds of ugly.

Just to inject my personal bias so everyone can see where I come from on a user-side standpoint:  I like Firefox.  I use it a lot.  I am comfortable with it.  It makes me feel all warm and fuzzy inside.  I really detest using Internet Explorer.  I find myself limited more often than not with it.  IE has gotten better in the past year or two, but I am still not a fan.  It's just the way it is.

So bearing my bias in mind, I have to say this: what you like, appreciate or prefer to work with does not matter.  The only one that matters is your user, and you should aim to create as close to the same experience as possible for every user who arrives at your web site.  If you have a menu that looks killer in Firefox but can't be created in IE no matter how you try, and it is unusable for more than half of your users, scrap it.  If you are unable to tweak your CSS to make everything feel similar, research, or pick another layout.  It is that simple.

This browser fight is very reminiscent of the '90s, when everyone had "Get Netscape" or "Get Internet Explorer" buttons plastered all over their pages.  The web has grown, so it is time that we do too.  We cannot continue to battle this way or we will only alienate users who might otherwise be loyal customers.  In closing, we only hurt the user more by trying to force our preferences upon them.  Don't do it.

16th-Oct-2008 12:13 pm - Web Scripting and you
If there is one thing that I feel can be best learned from programming for the internet it's modularity.  Programmers preach modularity through encapsulation and design models but ultimately sometimes it's really easy to just throw in a hacky fix and be done with the whole mess.  Welcome to the "I need this fix last week" school of code updating.  Honestly, that kind of thing happens to the best of us.

Being a web developer, specifically one working in an interpreted language, there are two ways things can go: clean, neat and easy to manage, or a horribly mangled mess.  My first couple of full-scale projects on the web were more of the latter and less of the former.  I cobbled things together any way that worked within the time frame I was given.  Ultimately this meant little to me at the time, but for the people that are maintaining the code now...  I am terribly sorry.  Fortunately, I know the poor sap that is currently updating the code, so he has a resource to cut through the thick undergrowth of spaghetti.

Now fast forward a few projects and one ridiculously large CMS later, and I have learned a few things about what not to do.  Lesson 1: don't make a script that does everything.  Lesson 2: you are eventually going to have to look at that code again.  Lesson 3: when the code is not completely obvious (read this as print statements and built-in functions being used in the simplest possible way), comments are always helpful. Lesson 4: even interpreted languages have debuggers, so use one. Lesson 5: make it modular.

Lessons 1-4 are things that everyone hears, ignores and then ultimately pays the price for.  Lesson 5 is something that is preached and never reached...  dig the rhyming scheme.  On the web, if you build something in a nice, chunked out way to begin with, your code will look like that forever more.  I promise.  Once you have built a handy little chunk like an order-processing script that just hums along and processes whatever you send it, you'll never write a hack for this order or that one again.  I promise.  It won't happen because you won't need to.  You have a handy little piece of code that works like... say... an object!  WOW!  Who would have thunk it?

Now I write this not for the programmers that are in engineering teams out there working with a bunch of people that all have a standard that they follow and ultimately know all of this already.  I am writing this for the rogue programmer that has decided they are going to go it alone and do something stupid like write a custom CMS/Project Management System/Time tracker integrated tool... Man, that sounds really familiar.  Anyway, if you are going to tackle a large project all by your lonesome it is of the utmost importance that you make it as easy for yourself as possible.  I really like that I have built an ordering system where all I have to do is insert a new item and it is automagically updated and handled all over the place without any extra coding ever.  I don't even have to do a database insert.  It's all just done for me.  It's really nice.

So some of the basic rules that I follow for no other reason than I have found them to work:

1) A script in a file does one thing.  Even if you think it should do x, y and z, it doesn't.  If you coded it to do x, y and z all at the same time, one of those functions will break on you.  I have seen your code and your future; I know.  Trust me: one script, one purpose.

2) Create your directory structure BEFORE you write ANY code.  I generally include the following directories: page_elements, process, includes and templates.  This does not mean you can't expand, but generally 4 directories and root are the barest minimum.

3) If you think something should be an object, it probably should be.  Gee, I find myself pulling info from the database a row at a time an awful lot.  Should I make a row object?  Yes. You should.

4) One object, one file.  Don't test me, boy; see rule 1.

5) Break the system up into small, bite-sized pieces and create an API for plugins.  It can be rudimentary and even require a little code to plug the piece in, but you will save yourself a ton of work if you can just write the added feature without having to dig into anything else.

6) Figure out a layer structure and live by it.  I don't care the model, just use it and make it work for you.  It doesn't even have to be one of the widely recognized design patterns.  I use a home-grown MVC pattern myself and it works like a champ.

7) NO INLINE CSS! Yes, I have broken this rule from time to time, but eventually I go back and pull it out into a file.

8) NO INLINE JAVASCRIPT! No, I haven't broken this rule.  I understand that you have to put in event handlers where you want the script to fire, but your script should not live in the document.  Plus, who knows, you might want that toggle element display script somewhere other than in the single place you built it originally.

9) Break up your scripts and include them as needed.  Both CSS and Javascript should function properly where they are needed, but they should be excluded when not needed.  I know that some people write these monstrous CSS files with inline server-side scripts to add in the extra pieces when they are needed, but honestly, isn't it easier on you and the server to just include files when they are needed and not load them at all when they are not?

10) Commenting!  You know that crazy function that you wrote which required bit-shifting to make it happen?  Remember how it took you three days to figure out how to do it?  It will take you 6 days to untangle what you did when you look at it again.  If you had to think about something before you wrote it, put in comments.  The person that ultimately follows after you will thank you, and that might just be YOU.
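
Rule 8 can be sketched with a small, stand-alone script.  This is a minimal sketch, assuming hypothetical element ids (toggleButton, extraInfo) and a hypothetical file name; the file would be pulled in with a single external script tag rather than living in the document.

```javascript
// toggle.js -- rule 8 in practice: the script lives in its own file,
// included via <script type="text/javascript" src="toggle.js"></script>.
// The element ids below are invented examples.

// Flip an element between hidden and shown; returns the new value.
function toggleDisplay(el) {
  el.style.display = (el.style.display === "none") ? "" : "none";
  return el.style.display;
}

// Wire the handler up after load instead of writing onclick="..."
// inline in the markup.  Guarded so the file loads anywhere safely.
if (typeof document !== "undefined" && typeof window !== "undefined") {
  window.onload = function () {
    document.getElementById("toggleButton").onclick = function () {
      toggleDisplay(document.getElementById("extraInfo"));
    };
  };
}
```

Because the behavior is in one place, the same toggle can be reused on any page that includes the file, which is exactly the point of rule 8.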

Very well, that is all.  Off with you.  Go about your programming and make the web a little better place.
10th-Jan-2008 01:12 pm - Occam's Razor
I have a particular project that I work on every so often. It's actually kind of a meta-project as I have to maintain a web-based project queue and management system, so it is a project for the sake of projects. Spiffy eh? Anyway, I haven't had this thing break in a while which either means that I did such a nice, robust job of coding the darn thing that it is unbreakable (sure it is) or more likely, nobody has pushed this thing to the breaking point. Given enough time and enough monkeys. All of that aside, every so often, my boss comes up with new things that she would like the system to do, and I have to build them in. Fortunately, I built it in such a way that most everything just kind of "plugs in" not so much that I have an API and whatnot, but rather, I can simply build out a module and then just run an include and use it. Neat, isn't it?

So, today I was told that she really wanted to be able to update team members on a project and then update the status of said users. Now, the way the thing works is you update the list of team members on a project and then edit the project again to set the status. This is a little cumbersome, we've discovered, simply because we don't use the system the way we thought we would. Isn't this always the case? So my boss, specifically, goes and toys with the team members as she is working on a project. This is dandy, except that she has to update the team and then go find the project again, right away. Not so good. What she asked for is a way to update the team and then immediately update the status of any given member of the new team list.

My first reaction, mentally, was 'great, now I have to build out some crazy AJAX to go behind the scenes, update the team list, cobble together the list of the current team, push some dynamic content to the page and then update things on the fly.' This is not my idea of a good day. I could have spent all afternoon working on this. Now, being the planner that I am, I sat back and thought about this. This promptly put me into a slight daze and I took about a 5-minute nap. When I woke up it dawned on me: the requirements I put together in my head were not what my boss asked for, they were what I interpreted. My solution still uses a little JavaScript, but now there are just two buttons: one says 'save,' the other says 'save and exit.' When you click save, everything you did gets saved and you are returned to the page. From there, the page automagically builds and includes all necessary pieces. If you click save and exit, everything you did will be saved and you will be pushed back to the main screen.
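
The two-button approach can be sketched in plain markup; the form action and field names here are hypothetical, and the server-side script simply checks which button submitted the form.

```html
<form action="update_project.php" method="post">
  <!-- team member checkboxes and status fields go here -->
  <input type="submit" name="action" value="Save" />
  <input type="submit" name="action" value="Save and Exit" />
</form>
```

Only the clicked submit button's value is posted, so the server can save either way and then decide whether to redraw the page or bounce back to the main screen. No AJAX required.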

So, the takeaway from all of this is that Occam's razor applies very neatly to web projects. I love neat stuff that flies all over the screen and interacts with the server through dynamic XMLHttpRequest (or ActiveX) calls, but assuming all things are equal, the simplest answer is best. Why kill yourself and stress your server when you don't need to?
4th-Jan-2008 01:34 pm - Inflexible XML data structures
Happy new year! Going into the start of the new year, I have a project that has carried over from the moment I started my current job. I am working on the information architecture and interaction design of a web-based insurance tool. Something that I have run into recently is a document structure that was developed using XML containers. This, in and of itself, is not an issue. XML is a wonderful tool for dividing information up in a useful way. The problem lies in how the system is implemented. This, my friends, is where I ran into trouble with a particular detail in this project. Call it the proverbial bump in the road.

Generally speaking, when dealing with a database like the one I use a lot, MySQL, you can run queries to retrieve data in any way you like. So long as the basic design is reasonably flexible, you can return anything you want. Apparently this isn't so with the XML structure that was used on my current project. People would ask, 'why is this a problem?' At the outset it doesn't seem like it would be. You figure out how you want to return the data and then you simply structure the XML containers appropriately. Great! Now along I come and I say, 'this model stinks. The usability is nonexistent and we want to change the structure.' Now what?

If the design were made to be flexible then it wouldn't be a problem. The query would be changed and the structure would be re-vamped. On the fly no less. Nothing like some good-old on-demand technology. I am all about on-demand flexibility. Obviously if you are talking about running a report for some exceptionally large amount of data then flexibility will have to be considered along with efficiency, but hey! We're talking about web experience here. People view things 10 at a time. We're not talking about 300,000,000 documents. Closer to 300. With current server tech as it is, 300 documents, even with an extremely inefficient algorithm, would take almost no time to sort at all, then you produce the correct XML and ship everything off to the client. Done! Zip-bang!

Now, if you have an inflexible data structure going into the whole system, you can end up with some major issues if someone, like myself, comes along and says 'this sucks. Fix it.' Now what? You start over. That's what. I made the mistake of coding a solution in a somewhat inflexible way and guess what? I had to re-work it. Some of the code was usable, but a lot of it was just lost time and lessons learned. Now the whole system is designed to be reasonably plug-able, though it's still not a spiffy API, and as people request things, I write them, plug in the necessary code and roll on like the champion I feel like!

So, the takeaway from all of this: if you are looking at things from a front-end, client-side perspective, expect to run into this kind of thing. Programmers like to write code that does precisely what it is supposed to. Nothing more. So if you are going to suggest major-overhaul kinds of ideas, prepare yourself. You will meet some friction. If you are a coder and would like to avoid the nastiness associated with people asking you to start over, think about how you can make your life easier at the front end. Since I finished the plug-in system, my life has been much happier and the final timeline has been much shorter on all associated projects. Do yourself a favor. Be flexible. Think flexible. Things change. Will you be ready?
Something that I have learnt over time is how to make your site accessible for people that don't have your perfect 20/20 vision, are working from a limited environment or just generally have old browsing capabilities. Believe it or not, people that visit my web sites still use old computers with old copies of Windows. Personally, I have made the Linux switch everywhere I can. That being said, I spend a certain amount of time surfing the web using Lynx. This is not because I don't have a GUI in Linux. I do, and I use Firefox for my usual needs, but Lynx has a certain special place in my heart. It is in a class of browser that sees the web in much the same way that a screen reader does. For example, all of those really neat iframes that you use for dynamic content? Yeah, those come up as "iframe." Totally unreadable. Totally unreachable. The iframe is an example of web technology that is web-inaccessible. Translate this as bad news.

Let's talk about HTML and XHTML. These are both derivatives of SGML, which is a generalized markup language used to describe data. That's it. SGML doesn't make things "look" a certain way. You won't find it hanging around the local bar trying to pick up girls. It's just data description. Kind of geeky, like that kid that sits in the back of class playing with his... calculator. So, then HTML was developed with its own DTD so that people could describe the information contained in their HTML documents.  Early HTML was easy to interpret if it was coded well, however most people are sloppy with their code.  This led the W3C to create a code validator.  Now, based on the DTD that you use, your document may be standards compliant or it may not.  Personally I really like to stay within the bounds of strict XHTML, though XHTML requires lowercase attribute names, so camelCase handlers like onMouseOver kind of hose the whole compliance thing.

This leads us to the topic at hand: accessibility.  Considering that HTML and XHTML are used for simply describing data, why would a page be more or less accessible?  Well, let's think about that kid that has MS, but still needs to get online research done for a class he's taking, or better yet, consider Stephen Hawking.  Do you think that they would be particularly amused if they had to fill out a form and had to be pixel precise to click on the input field?  I promise you they wouldn't.

Accessibility means that regardless of the browser or disability, your page should still be reasonably functional.  Now, this does not mean that your page should read minds, but there are some tags that should be very familiar. Learn and use the following:

<acronym> - The acronym tag describes what an acronym means.  Hover your mouse pointer over the first instance of HTML, XHTML, SGML and DTD.  The full name will pop up.  In some screen readers the full text will replace the acronym so that the human can interpret what the acronym means.

<label> - Label tags should be used with form inputs whenever possible.  This will ensure that your user will be able to click on either the form element or the associated text and the form element will gain focus.

<noscript> - Ok, if you have ever used Javascript, you are familiar with the <script> tag.  So, in contrast, the <noscript> tag surrounds what should happen if Javascript or other inline scripting language is disabled.
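A rough sketch of all three tags in use (the ids, filenames and text here are invented for illustration):

```html
<p>Documents on this site are written in
  <acronym title="Extensible Hypertext Markup Language">XHTML</acronym>.</p>

<form action="/search" method="get">
  <!-- The "for" attribute matches the input's "id", so clicking the
       text also gives the field focus. -->
  <label for="query">Search terms:</label>
  <input type="text" id="query" name="query" />
</form>

<script type="text/javascript" src="fancy-menu.js"></script>
<noscript>
  <p><a href="/sitemap.html">Site map</a> (for browsers without Javascript)</p>
</noscript>
```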

Aside from these few tags, you should also become familiar with the following attributes:

title - Title can be used on practically any element.  This should be used somewhat sparingly though, as pairing a title with an alt attribute, for instance, could spell disaster for a screen reader which cannot interpret all of the attributes precisely as you might expect.

alt - EVERY IMAGE SHOULD HAVE ALT TEXT!  No ifs, ands or buts.  If the image is included in the page via the <img> tag, use an alt attribute.  Furthermore, use descriptive alternate text.  img_21938740928374.jpg is not useful.   I will find out if you are doing this and I will stalk you in the night.  Alternate text doesn't have to be perfect.  No two people are going to interpret it in precisely the same way, but if the image is of a bird in flight, use alternate text that says something like "bird in flight."  People will understand enough to get by.

accesskey - The accesskey attribute sets the stage for people that are going to use the keyboard to navigate your site, or perhaps have a limited command set to work from.  This attribute enables people to use keyboard shortcuts to access form elements for their ease and convenience.  I recommend using this attribute in your forms in general, though I strongly urge you to add accesskey to any form elements that a disabled user might be looking to use.
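A sketch of those attributes in action (the filenames, ids and accesskey choice are invented):

```html
<!-- Descriptive alt text on every image -->
<img src="heron.jpg" alt="Bird in flight over a lake" />

<!-- title on a form field, accesskey for keyboard users -->
<form action="/contact" method="post">
  <label for="email">E-mail address:</label>
  <input type="text" id="email" name="email"
         title="Your e-mail address" accesskey="e" />
</form>
```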

Moving forward, let's discuss the <noscript> tag.  Whenever you have a Javascript effect on your site, consider whether there would be a problem if there were no Javascript enabled.  As a matter of fact, go to your site, turn off Javascript in the options and try navigating.  If your site topples like a house of cards, you need to re-work your Javascript.  I currently work for a company whose previous web-person didn't account for people with no Javascript.  The corporate site is totally unnavigable without Javascript.  This is bad.  This is really bad.

Ensure that your scripts degrade gracefully.  If you have elements that are hidden from the user until the Javascript function is triggered, what happens if scripts are disabled?  I always run a special function at the end of the page that hides the elements I want hidden.  What this translates to is, if you have scripts disabled, the page displays everything.  The failsafe for my scripts is usability.
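One way to sketch that failsafe (the id and styling are hypothetical): leave everything visible in the markup, and hide the script-driven panels from a script at the end of the page. If scripting is off, the script never runs and the content stays readable.

```html
<div id="extras">Full content here, visible by default.</div>

<!-- Last thing on the page -->
<script type="text/javascript">
  // Hide the panel so the menu script can reveal it on demand.
  // With Javascript disabled this never executes, so nothing is lost.
  document.getElementById('extras').style.display = 'none';
</script>
```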

This brings us back to Lynx.  Don't think that I had forgotten.  I browse my sites using Lynx.  I do this so that I can see what happens when nothing is viewable but text.  Lynx doesn't even give any particular kind of formatting.  Pages are just a long string of words.  If you can open your site like this and still find everything easily you have successfully made your site accessible.  If your page looks like an amorphous mess then you have some work to do.

Now, go forth and make the web a more usable place!
By this I don't mean that you should fill every pixel on the screen with text, information and blinking, distracting graphics. What I really mean is that you should give yourself more time to accomplish what you are looking to do on the web. Sure, your reaction to this is going to be "duh, of course you should spend time thinking about what you are going to do online. All good jobs take time." I say, oh young one, are you actually spending time where it needs to be spent? I suspect you aren't.

First to the hard-core graphic designers in the crowd: Just because you spent extra time looking online through iStockPhoto finding that perfect picture does NOT mean that your design will encourage even one new customer. When it comes down to it, your average web user doesn't care what your design looks like. It should fit the theme of the product being sold and the tone should be appropriate, but they really don't care if you use one perfectly appropriate graphic or another. The nuance is generally totally lost. If you are going to consume days or even weeks coming up with that perfect design, it better slap my dad's dentures right out of his head and make him spend 50% more than he otherwise would. If this isn't the case, the ROI on those last few days or weeks that you spent is going to be painfully low. Think "solution" not "pretty." Spend your time actually solving your customer's problem.

Now on to the webmonkey/codemonkey. Do you really think that your customer cares if your algorithm is 2 nanoseconds faster on the server? They don't. I promise you. If you are shaving tiny fractions of seconds off of a process for the sake of speeding the page load imperceptibly then you are really focusing on the wrong thing. Redirect. Think about what the customer is going to be looking for in the site that you are providing your code for. Does your search algorithm actually produce results? Can I be a bonehead and still get what I need from your code or do I have to specialize in what YOU went to school for in order to make the site work?

So then what should you be spending MORE time on? Well, the customer. The sale. Does the site work? If your AJAXy, cool looking thing that loads all kinds of server-side data brings their browser to its knees, you should strip it. I know, AJAX is the rage. I was guilty too. Why do you think that I am writing this? Seriously, though, AJAX is not the panacea, just like DHTML before it or blink and marquee before that. Spend time looking at how the page works and most importantly spend time looking at how to moderate your "cool factor." Spend your time working on making a site that your customer will enjoy using. To the designer: spend time making the site easy to navigate. Put the information right at your customer's fingertips. Make sure that they will never want to use a competitor's site because yours sparkles so. Programmers: focus on making the site feature-rich in a way that your customer will appreciate. Make searches work properly. Listen to your IA, she knows what the search should do. Make sure the search function matches the customer's needs. If you are going to use AJAX and other client-server interaction tools, make them lean and mean. Make them function well and MAKE SURE THEY DEGRADE WELL! There is nothing that will drive customers away faster than a site that doesn't work for them.

Unless the site is ultimately useful, regardless of how cool the site is, the customer will walk away. The bottom line is stop doing your techno-masturbation and provide what the customer wants: a useful site. If you spend your time on this your customer will spend more money with your company or use the services or whatever your site is supposed to do. Furthermore, your boss may even thank you with a raise or a promotion. Less is less. More is more. Think about it. ; )
4th-Dec-2007 09:04 am - Note to self, scope is important.
Being that this was an issue just last evening, I thought I would share something that I have encountered when writing Javascript scripts.  First of all, let me state that Javascript syntax is extremely forgiving.  You can do all kinds of unorthodox declarations of variables as well as use variables in all kinds of strange ways.  You can take a variable, store a string in it, then a number, then an object and then back again.  Weakly typed would be the phrase.  The one thing that I would like to note, as it was my big issue last evening, is the scope of your variables.  So long as you are careful about defining the scope of any given variable then you are ok; if not, you could have a problem just like I did.  So, let's start with scope and how it works.

Scope: A mouthwash.  Yep.  There you have it.  Brush and floss every day and you'll be a better programmer.

That's useless.

Scope: In computer programming in general, a scope is an enclosing context.

Ok, at least this one is accurate, though it doesn't tell us a whole lot.  Really, the way that you can look at scope is "what parts of my program can access this element?"  I know that "element" is a little vague, but you can have scope that refers to objects, variables, functions, classes and a whole set of "elements."  You see why I chose the word.  What I am concerned with here is the scope of a variable.

So, how do we reconcile this in programming terms?  Well, a variable can be a global, i.e. it can be accessed by anything.  Global variables are bad and should be eliminated with impunity.  I recommend a pistol.  Seriously, though, if you have a global variable that is storing anything that might be more than garbage text, you should probably find some other solution for your data storage needs.

Object scope.  This is much more limited.  Generally in an object you want to kind of tuck the variables away so that people don't fool with your program.  Anything that is tucked into an object and set as private or protected will serve you well, grasshopper.  With any variable that is private or protected in an object you'll need to write a get and a set method/function for it.  Just something that should be said.
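Javascript itself has no private keyword, but you can get the same tuck-it-away effect with a closure. A minimal sketch, with made-up names:

```javascript
// "count" lives in the closure of makeCounter: nothing outside can touch
// it except through the get/set methods we hand back.
function makeCounter() {
  var count = 0; // effectively private

  return {
    getCount: function () { return count; },
    setCount: function (n) { count = n; }
  };
}
```

Callers see only the accessors; `counter.count` is undefined from the outside, which is exactly the point.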

Function scope and finer-grain scope.  Variables that are explicitly defined in a function are available to the entire function from the definition of the variable onward.  Here is the catch: in Javascript, 'var' is function-scoped, not block-scoped.  Coming from languages like C++ or Java you might expect "for(var i=0; i<..." to limit 'i' to the loop, but it doesn't; whether you write "for(var i=0; i<..." or "while(true){var i=0..." the variable 'i' will be available outside of the loop, in other parts of the function.  This is a very important distinction.
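A quick sketch of that behavior:

```javascript
// "var" is function-scoped: "i" survives past the end of the loop.
function afterLoop() {
  for (var i = 0; i < 3; i++) {
    // ...loop body...
  }
  return i; // still in scope here; the loop left it at 3
}
```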

Quick thoughts on "while(true)..."  Don't do it.  Seriously.  Bad idea.  Bad bozo, no cookie!

So, now the problem that I ran into.  Javascript will allow you to declare a variable thusly: "i=5" and it is fine with that.  If there is no prior instance of the variable 'i' then a global will be created.  If there is a prior variable that is accessible, it will overwrite the current value of 'i' as it should.  If there is a variable 'i' that is limited in scope and inaccessible by the current operating function, a global will be created.  This is important to know when you are doing things like writing recursive functions.

Recursion leads us directly to the problem I had last night.  I was working on a recursive function that had a loop with an incremental counter.  I just went with "for(i=0; i<other_var; i++){..."  I seemed to recall that if you declared a variable this way you would still have a nice, tight little for-loop scope to work with and you wouldn't interfere with other functions or instances of the recursion.  I was wrong.  This kind of error is somewhat invisible and it took a while to figure out.  The way I should have done it, and the way I ended up doing it, is "for(var i=0; i<other_var; i++){..."  Please note the added 'var.'  This fixed the scope issue, since each call to the function now gets its own 'i' instead of sharing a global one, and my code ran without a hitch.  Fun!  So, with that in mind, kids, remember to scope your variables properly or you might find yourself in a sticky situation.  Good luck and good web work!
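Here is a boiled-down reconstruction of that bug (the function names and tree shape are made up, but the scoping behavior is the real thing). In my case the shared 'i' was created implicitly by omitting 'var'; it is declared explicitly at the top here so the sharing is visible on the page:

```javascript
var i; // one counter shared by EVERY level of the recursion -- the bug

// Buggy: the recursive call advances the same "i" the caller is using,
// so the outer loop exits early and nodes go uncounted.
function countNodesBuggy(depth) {
  var total = 0;
  for (i = 0; i < 2; i++) {          // no "var" -- uses the shared counter
    if (depth > 0) total += countNodesBuggy(depth - 1);
    total += 1;
  }
  return total;
}

// Fixed: "var i" gives each invocation its own counter.
function countNodesFixed(depth) {
  var total = 0;
  for (var i = 0; i < 2; i++) {
    if (depth > 0) total += countNodesFixed(depth - 1);
    total += 1;
  }
  return total;
}
```

With a two-way branch, the fixed version counts 6 nodes at depth 1; the buggy one comes up short because the inner call leaves the shared 'i' past the loop bound.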