Improving the visibility of a website is very important if you have developed a web application for the masses. Your application might be great, but people have to find it before they can like it.
If you are ready to pay, advertisements might be a good idea for you. But why not also incorporate free tweaks to improve visibility even more? And if you are short on funds, these methods will be a great help in boosting your visibility. Apart from the few tips and tricks I am going to discuss today, I have found social networking to be a very effective way to connect with an audience. I will discuss how to use social networking productively in a later post. Today we are going to discuss some very simple tips and tricks which help a lot with SEO.
SEO is a common term now and a field in its own right. Anyone reading this is probably already aware of the term, so I am not going to explain what SEO is, and how it works (the algorithms and all) is out of the scope of this article.

Here are the tips and tricks:-
1. Domain name selection must be done cleverly. It is advisable to select a .com domain, as they are the most popular in the market. Free domains do not get visitors, because visitors tend not to take them seriously. Google's algorithm also takes websites registered for longer periods more seriously; low-quality and spam websites are generally registered for shorter periods. A domain name containing keywords is considered good, but that is not an option if the website is about a company or organization. In any case, company websites do not intend to earn money from advertising; their sole intention is to be visible to potential clients and earn credibility via PageRank. If you are on a shared server, do a blacklist check to be sure you are not on a proxy with a spammer or banned site, as their negative notoriety could affect your own rankings.
2. <title> tags are the first thing a search engine reads, so make sure the title describes the page in the best possible way. It is suggested to keep it within about 80 characters. Do not repeat keywords, and when titles vary across the website the variation should be in the first few words, not the last (see the first example after this list).
3. Meta descriptions are used by search engines when indexing pages, so they should be accurate and descriptive.
4. Content is king. High-quality content rich with evenly distributed keywords is a must. The content should be sticky so that visitors spend more time browsing around; search engines track the time spent on the website, which helps improve ranking. The keyword density should be around 3-7% for primary keywords and 1-2% for secondary keywords (for a 300-word page, that is roughly 9-21 occurrences of a primary keyword). Images containing text should be avoided where possible. Putting a keyword in the first line of content is a good idea. Keyword stemming (i.e. using 'digging' when the keyword is 'dig', or 'dogs' when the keyword is 'dog') is a nice way to keep the content interesting while increasing keyword density, as search engines can recognize stemmed keywords. Short pages help in increasing keyword density. Optimizing for 2-3 keywords per page is enough.
5. Site navigation should be simple for the bots to move around. Do not put links in Flash, as bots cannot access them. Every page should have a link to the homepage, and a great way to achieve this is using breadcrumbs (a breadcrumb sketch follows this list). Make sure internal links to the homepage point to http://www.domain.com instead of http://www.domain.com/index.html; the latter happens if you point the href to index.html. Make sure every page has at least one link to the home page as described.
6. Try to use keywords as the anchor text for links where possible, and make sure the text makes sense in the context of the website.
7. Inbound/incoming/back links obviously help. An inbound link from a reputed (highly ranked) site is considered more important by search engines. The anchor text of inbound links should preferably contain keywords instead of generic text like 'here' or 'Read more'. Links from sites with similar content are considered good, but that is not possible most of the time due to competition. Inbound links from review sites are good, as they have similar content (keywords) and can guide visitors along with the bots. Links from .edu and .gov sites are considered reputable.
8. Design wisely. As said before, navigation should be easy, and using breadcrumbs to link the home page with each page is considered good. Using a theme (a similar page layout throughout) is viewed favourably by search engines. A sitemap is a must. The URLs should be SEO friendly: they should preferably contain keywords and must not contain spaces. An underscore (_) is generally treated as part of the word, so using a hyphen (-) to separate words is advised. Example:- http://www.infosys.com/sustainability/diversity/Pages/cultural-diversity-day-2010.aspx
Interlinking is good in moderation and should not be overdone. Use a link checker to avoid broken links.
9. Create XML and Google sitemaps and put them in the web folder which contains the index page (see the sitemap sketch after this list).
10. Use 301 HTTP redirects for permanent moves and 302 for temporary ones. Avoid meta refresh.
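To make tips 2 and 3 concrete, here is a minimal sketch of a page head with a descriptive title and meta description; the page, wording and keywords are made up purely for illustration:

    <head>
      <!-- Descriptive title: the important keywords come first and are not repeated -->
      <title>Cultural Diversity Day 2010 - Sustainability at Infosys</title>
      <!-- Accurate, descriptive meta description that search engines can use when indexing -->
      <meta name="description" content="How Cultural Diversity Day 2010 was celebrated as part of the company's sustainability and diversity initiatives.">
    </head>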
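For tips 5 and 6, a breadcrumb trail and a keyword-rich link might look like the sketch below. The URLs and labels are placeholders; the points to note are that the home link targets the bare domain rather than /index.html, and that the anchor text uses meaningful keywords instead of 'here' or 'Read more':

    <!-- Breadcrumbs: every page links back to the homepage root -->
    <div class="breadcrumb">
      <a href="http://www.domain.com/">Home</a> &raquo;
      <a href="http://www.domain.com/sustainability/">Sustainability</a> &raquo;
      <span>Cultural Diversity Day 2010</span>
    </div>
    <!-- Keyword-rich anchor text instead of a generic 'click here' -->
    <p>Read more about our <a href="http://www.domain.com/sustainability/diversity/">cultural diversity initiatives</a>.</p>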
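And for tip 9, a bare-bones XML sitemap (with placeholder URLs and dates) would look something like this; it goes in the same web folder as the index page:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.domain.com/</loc>
        <lastmod>2010-06-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.domain.com/sustainability/diversity/cultural-diversity-day-2010.html</loc>
        <lastmod>2010-05-20</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.6</priority>
      </url>
    </urlset>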
Cross-browser CSS coding has always been a priority for any designer. IE has been the most troublesome, with the layout/design being rendered differently in different versions. As designers we always want to check the layout in all the versions, especially if it is a business web application. There are many tools and services, each with their own pros and cons.
I personally prefer something that behaves like a real browser (meaning no screenshots, unlike browsershots.org) and is free (unlike Adobe Browser Lab).
And as IE has been the most troublesome for me, testing the layout in all versions of IE is enough for me. If you want to cover other browsers, Browsershots.org is always there for you.

My personal choice is a standalone browser tool which is free and accurate. It is called IETester and can be downloaded from here. It can be used to check browser compatibility for Internet Explorer versions from 5.5 to 9.
IETester is just my personal choice and a suggestion; I am not endorsing or denouncing any tool or service, and all are great in their own way. If there are better tools or services out there for this purpose, please share the knowledge here.


P.S.: And by the way, save water. An interesting post about very easy methods of water conservation can be found here. Read it.

It is always better to have a mental image of the application one has to design. The development process should be agile, but the templates should be more or less fixed so that we do not have to change them in the near future.
Recently I came across a blog which listed some rules to keep in mind while designing. These rules have been in use since time immemorial and will always hold good, so it is a good idea to keep them in mind while deciding on design templates. On top of that, they boost usability and improve the user experience. I would consider them a must in every design in this competitive world.
They are:
1. Feedback: It has been used in mechanical and industrial design since the industrial age. The law of feedback states that there should be a clear indication that something has occurred, is occurring, or is going to occur. For example, LED lights that indicate an appliance is working, etc.
In a web design context it means using hyperlink states like hover, visited, active and focus properly, and also indicating that a file is being uploaded or downloaded, etc. (see the sketch at the end of this list).
Keeping the feedback law in mind will help you to identify many such situations.
2. Mental Model: It states that it is easier for a user to use and learn something new if they can model it on something they already understand. It is for this reason that operating systems have a real-life office feel, with folders, files and a desktop.
This concept can be used when designing and naming things for a better user experience. Users will be able to learn and use the application more effectively because they can relate to it.
3. Hick's Law: It states that the more options there are, the longer it takes to choose. This was demonstrated in a study where customers were given a chance to select among 20 kinds of jam versus 4 kinds of jam. Customers with 20 options invariably took a lot more time to decide, and some even chose not to select one at all due to indecisiveness.
In a design context it is always good to give fewer options, presented effectively, while still meeting the required functionality. There is no need to make the application heavy with options that are not required. If there have to be a lot of options, it is a good idea to group them in an effective way.
4. 80/20 Rule: It states that 80% of users will use only 20% of the features. Most users come to the application or site for a few pieces of information or a few tasks, so it is always a good idea to identify these through surveys, interviews, analytics, etc. and put more emphasis on them to improve the user experience.
The surveys, etc. can also identify things that can be excluded, in accordance with Hick's law and Occam's razor.
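To make rule 1 concrete, here is a small sketch of the hyperlink states and a simple "something is occurring" message mentioned above; the colours, IDs and wording are just placeholders:

    <style>
      /* Distinct link states give the visitor feedback on what they are doing */
      a:link    { color: #0645ad; }
      a:visited { color: #663399; }
      a:hover   { text-decoration: underline; }
      a:active  { color: #cc0000; }
      a:focus   { outline: 2px solid #0645ad; }
    </style>
    <!-- Tell the user that something is occurring -->
    <p id="upload-status">Uploading your file, please wait...</p>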