Let’s face it: information rules the Information Superhighway we call the Internet. If it weren’t for worthwhile information, formulated opinions, and reviews, the Internet would not be as popular as it is today. For all intents and purposes in terms of web site design, search engine optimization, and search engine promotion, information is equivalent to content. There is good content, and then there is bad content. Bad content, in my opinion, is content that serves no purpose, has no goal, and doesn’t inform readers about any general or specific topic. Bad content exists to entice search engines into producing results that favor the web site containing it, rather than to entice web surfers to read it. Good content serves a purpose, has a goal, and is worth the time it takes to read. The Internet, web sites, web surfers, web developers, and search engine optimizers and promoters should all be focused on creating good content that benefits everyone.
Creating good content has quickly become a way to provide useful information to web surfers. It has also become a way to optimize and promote web sites. Back when the World Wide Web was first taking off, web developers would often place a series of keywords at the bottom of the pages they created so that search engine spiders would pick up those keywords and index them accordingly. This helped webmasters and web developers promote their content by catering to the unusual searching habits of web surfers. However, this series of keywords was unintelligible and never meant to be read by a person. Keyword stuffing (as it is sometimes called) quickly became a means of flooding search engines with useless information to attain higher rankings. This method of optimization is now considered spam, partly because of the practice of hiding the keywords by making them the same color as the background of the web page. Once it was deemed spam, many web sites stopped using keyword stuffing for fear of being banned by search engines. I admit keyword stuffing is still used today (sometimes quite successfully) but, in my opinion, it was never and still isn’t a good way to design web sites.
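To illustrate, a hidden keyword block of that era might have looked something like this (a hypothetical sketch using period-typical presentational markup, not taken from any particular site):

```html
<!-- Keyword stuffing, circa the late 1990s: the keywords are rendered
     in the same color as the page background, so visitors never see
     them but search engine spiders still index them. Major search
     engines now treat hidden text like this as spam. -->
<body bgcolor="#ffffff">
  <p>...the page's visible content...</p>
  <font color="#ffffff">
    cheap widgets widgets sale buy widgets best widgets online widgets
  </font>
</body>
```

The text is technically on the page, but because it matches the background color it is invisible to human readers, which is exactly why search engines came to classify the technique as deceptive.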
Eventually, webmasters, web developers, SEPs, and SEOs learned that keywords, even while keyword stuffing itself was frowned upon, could be put to use in other ways. The keywords that were previously used to catch web surfers’ wild search queries could now be worked into articles, commentary, and other useful forms of information elsewhere on the web site. When these forms of information are linked to accordingly, search engine results reflect well on the new content. This started the search engine optimization flurry to create meaningful, keyword-rich content. This method also improves the World Wide Web as a whole by making more information available, thus improving web surfers’ experiences. Creating content is essentially a win-win situation for all involved: web sites get hits from people searching for relevant information, and the people searching get what they were looking for instead of a string of keywords at the bottom of a page.
The push to create relevant content does some interesting things. It puts the ML back in HTML. Hypertext Markup Language isn’t a programming language like C, C++, or Java; HTML isn’t code. HTML is a markup language, designed as a standard for adding markup to text so that it can be displayed across numerous devices. Just like this article, web development is starting to be done in a word processor first and then moved into a web developer’s favorite design tool, as opposed to being done entirely in a web site design tool. The content comes first and is then marked up for the web. This also lets HTML off the hook for providing a multimedia web experience, which HTML was never designed to provide. Other standards, languages, and tools (Flash, for example) now deliver that engrossing, colorful web experience. The graphical experience, however, isn’t crawled by search engines. So if those graphical web sites depend on search engines for success, they too will need to create content that search engines can spider, which creates a need for keywords in plain text on pages (see a trend here?). Even heavy-graphics web sites need some meaningful content.
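As a sketch of that content-first workflow, a paragraph drafted in a word processor needs only a few markups to be ready for the web (the topic and element choices here are purely illustrative):

```html
<!-- Text written first, then marked up. The markup only describes
     structure -- a heading, a paragraph, a bit of emphasis -- it does
     not program anything, which is the point of a markup language. -->
<h1>Choosing a Garden Hose</h1>
<p>A good garden hose should be <em>kink-resistant</em> and rated
   for your local water pressure.</p>
```

Because the content stays as plain text inside the markup, a search engine spider can read and index every word of it, unlike text locked inside a Flash movie or an image.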
Creating pages upon pages of content isn’t necessary, but it is recommended to create a few pages of content that a spider can crawl, so that a web site is given more meaning and better placement in search engine results. You won’t be the only one to benefit from this.