Welcome to "SEOCrazy"!!! Subscribe the Blog Feed for Latest SEO Tips and Techniques.
Enjoy your stay... :-)

Tuesday, December 18, 2007

Increase your online business with PPC advertisements

PPC advertising is one of the most effective ways to attract targeted traffic to your site, resulting in increased sales and improved ROI. Most online businesses today use PPC advertising to get better sales and revenue. In this article, you will learn four ways to maximize your sales and profits through PPC advertising:

1. Understand the market trend and your potential clients in depth. Identify the keywords that bring the most profit and use them in your PPC campaign. To make your campaign more targeted, analyze your target market: identify their gender, their buying power, their needs, and the keywords they are most likely to use in search engines. These will guide you in building a PPC campaign that is highly targeted.

2. Meet the needs of your target market. Identify the problems your target market is currently facing and the products or services you can offer to resolve their pressing issues.

3. Optimize the landing page of your PPC campaign. Your landing page is a crucial element in ensuring successful sales, so it is equally important that it is enticing. You can do this by posting strong recommendations or unbiased reviews of your products. You can also post testimonials from satisfied clients to further entice your online visitors to purchase from you.

4. Evaluate the performance of your PPC ads and your landing page. Monitor, track and evaluate each aspect of your PPC campaign. Are you getting plenty of clicks but not making a sale? Then there must be a problem with your landing page.
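
As a back-of-the-envelope illustration, here is a tiny Python sketch with made-up campaign figures showing how the numbers point to where the problem lies; swap in your own tracking data:

# Hypothetical campaign figures - replace with your own tracking data.
impressions = 20000       # times the ad was shown
clicks = 600              # visitors who clicked through to the landing page
sales = 3                 # visitors who actually bought
cost_per_click = 0.40     # average CPC
revenue_per_sale = 50.0   # average order value

ctr = clicks / impressions          # click-through rate
conversion_rate = sales / clicks    # landing page conversion rate
spend = clicks * cost_per_click
revenue = sales * revenue_per_sale

print(f"CTR: {ctr:.1%}")
print(f"Conversion rate: {conversion_rate:.2%}")
print(f"Spend: {spend:.2f}  Revenue: {revenue:.2f}  ROI: {(revenue - spend) / spend:.0%}")
# A healthy CTR combined with a poor conversion rate suggests the ads are fine
# and the landing page is the weak link.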

PPC advertising, when done in the right way, can bring enormous sales and traffic to your site. Thus, you must know how to use this tool properly in order to augment your profits.

Saturday, November 17, 2007

Submit Sitemap to MSN, Google and Yahoo

In November 2006, Google, Yahoo! and MSN joined forces to support a new industry standard for sitemaps: Sitemaps 0.90 (see MSN announcement here). As long as webmasters follow the protocol, they can ensure their sites are fully and consistently indexed across all the major search engines (a real step forward). This article is important for all those with missing or poorly ranked pages.

The official site for the joint venture is at http://www.sitemaps.org/ and contains a lot of info about the new standard and its syntax. What the site singularly fails to do is explain correctly how to submit your sitemap to the big three! The format suggested on the site, namely:

http://search-engine-url/ping?sitemap=your_sitemap_url

does not currently work at any of the three sites! Until it does, we provide instructions for how to (a) create your sitemap and (b) submit it to each of the three search engines...

Creating your Sitemap

Some hosting providers (for example 1and1) provide utilities via their web control panel to create your sitemap, so you should always check with your provider first. If this service is not available, then visit http://www.sitemapspal.com/ and enter your site URL into the generator box. Copy and paste the resulting sitemap into Notepad, then save and upload it to your site with the file name: sitemap.xml
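
If you prefer to roll your own rather than use a generator, the Sitemaps 0.90 format is simple enough to produce with a short script. Here is a minimal Python sketch; the page list and domain are placeholders, so substitute your own URLs:

# Minimal Sitemaps 0.90 generator - the page list below is a placeholder.
from datetime import date
from xml.sax.saxutils import escape

pages = [
    "http://www.yourdomain.com/",
    "http://www.yourdomain.com/about.html",
    "http://www.yourdomain.com/contact.html",
]

entries = []
for url in pages:
    entries.append(
        "  <url>\n"
        f"    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
        "    <changefreq>weekly</changefreq>\n"
        "    <priority>0.8</priority>\n"
        "  </url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)

print(sitemap)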

If you want to validate the XML prior to uploading to the search engines (useful if you have made any manual amendments), pay a visit to the SmartIT XML Sitemap validator where you can put in the URL of your sitemap and check it against the standard.

Submit sitemap to MSN

MSN have yet to implement a formal interface for Sitemap submission (as of April 2007). To monitor the situation, please visit (from time to time) the following pages:

MSN Site Owner Help
Livesearch official blog (where future announcements are likely to be found)

Whilst MSN have yet to implement a front door, there is a recognised back door for submitting your sitemap to the MSN Search index; namely moreover.com! You should use the following syntax directly in your browser URL box:

http://api.moreover.com/ping?u=http://yourdomain.com/yoursitemap.xml

For example, to submit the sitemap for this site, the correct entry in the browser is as follows:

http://api.moreover.com/ping?u=http://www.seo-expert-services.co.uk/rss.xml

Since February 2005, moreover.com have been the official provider of RSS feeds to the myMSN portal (see press release) and reliable evidence suggests that submission to moreover.com will result in MSN spidering your pages within 2-3 weeks.

Update 18/07/07: MSN still do not support direct submission but now suggest on their blog that you add a reference to your Sitemap into your robots.txt file (something now supported by sitemaps.org). For example:

User-agent: *
Sitemap: http://www.yourdomain.com/sitemap.xml
Disallow: /cgi-bin/

This would tell MSN (and all other engines) to crawl your sitemap file but not to crawl your cgi-bin directory. For more info on how to implement a robots.txt file (in the root of your site webserver) please visit: http://www.robotstxt.org/
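
Once the Sitemap line is in place, you can sanity-check it with a couple of lines of Python. This sketch uses the standard library robots.txt parser (Python 3.8 or later is needed for the site_maps() call), and the domain is a placeholder:

# Check that a robots.txt file is readable and lists a sitemap.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.yourdomain.com/robots.txt")   # placeholder domain
rp.read()

print("Sitemaps listed:", rp.site_maps())   # e.g. ['http://www.yourdomain.com/sitemap.xml']
print("/cgi-bin/ allowed for all bots?", rp.can_fetch("*", "/cgi-bin/"))   # False with the example above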

Submit sitemap to Google

Google originally developed the XML schema for sitemaps and have developed a dedicated portal for webmasters, from where you can submit your sitemap:

http://www.google.com/webmasters/

First, you need to tell Google all the sites you own, then verify that you indeed own them. The verification is achieved by adding a meta tag between the head tags on your site homepage. The syntax for the tag is as follows:

<meta content="unique code advised by google" name="verify-v1">

There are full instructions on how to do this on the Google site.
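
Before you click the verify button, it can save a round trip to confirm the tag is actually live on your homepage. A rough Python sketch; the URL is a placeholder and the check assumes the tag is written with double quotes, as in the example above:

# Confirm the Google verification meta tag is present on the homepage.
import urllib.request

url = "http://www.yourdomain.com/"   # placeholder - use your own homepage
html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

if 'name="verify-v1"' in html.lower():
    print("verify-v1 meta tag found - ready to verify with Google")
else:
    print("verify-v1 meta tag NOT found - check your <head> section")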

Submit sitemap to Yahoo

Yahoo follows a similar approach to Google. Again, there is a dedicated service for webmasters (Yahoo! Site Explorer) and a procedure for verifying your ownership of the site. First go to:

http://siteexplorer.search.yahoo.com/

Add a site, then click on the verification button. You can then download a verification key HTML file, which you will need to upload to the root directory of your web server. Then you can return to Site Explorer and tell Yahoo to start authentication. This will take up to 24 hours. At the same time you can also add your sitemap by clicking on the manage button and then adding the sitemap as a feed.

Submit sitemap to Ask

Ask follows a simpler approach than the other three. To submit your sitemap, you simply enter a ping URL, followed by the full URL of where your sitemap is located:

http://submissions.ask.com/ping?sitemap=http%3A//www.yourdomain.com/sitemap.xml

After clicking return, you will get a reassuring message from Ask that they have received your submission. Very neat!
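
Both the moreover.com and Ask ping URLs above can also be hit from a script rather than the browser, which is handy if you regenerate your sitemap regularly. A minimal Python sketch using only the standard library; the domain and sitemap path are placeholders:

# Ping the sitemap submission endpoints mentioned above from a script.
import urllib.parse
import urllib.request

sitemap_url = "http://www.yourdomain.com/sitemap.xml"   # placeholder - use your own sitemap URL

ping_urls = [
    "http://api.moreover.com/ping?u=" + sitemap_url,                               # MSN back door (see above)
    "http://submissions.ask.com/ping?sitemap=" + urllib.parse.quote(sitemap_url),  # Ask
]

for ping in ping_urls:
    try:
        with urllib.request.urlopen(ping) as response:
            print(ping, "->", response.getcode())
    except Exception as exc:
        print(ping, "failed:", exc)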

Friday, October 12, 2007

Some Web Design Mistakes and Golden Rules

1. Keep Everything Obvious

Visitors to a website expect certain conventions, and breaking these is a great way of losing visitors. People expect to find the navigation at the top of a page or on the left hand side. Logos are mostly found at the top left. Much research has been conducted into how people view and use web pages. The good news is that you do not need to know all of this; instead, look at how larger companies such as eBay, Amazon, Google and Microsoft structure their pages and the language they use, then emulate them.

2. Limit Colours

A website using too many colours at a time can be overwhelming to many users and can make a website look cheap and tacky. Any users with colour blindness or contrast perception difficulties may even be unable to use the site.

Limiting a palette to 2 or 3 colours will nearly always lead to a slicker looking design and has the added bonus of simplifying your design choices, reducing design time.

Software like Color Wheel Pro can greatly simplify the creation of a palette by showing which colours sit well together. If you really do not have the eye for design then software like this provides the perfect way of escaping monotone or badly combined colour schemes.

If your site uses blue and yellow together or red and green then it may present problems to anyone suffering with colour blindness. Vischeck.com provide free software that can simulate different types of colour blindness.

3. Be Careful With Fonts

The set of fonts available to all visitors of a website is relatively limited. Add to that the possibility of a user having a visual impairment then the options become even smaller. It is advisable to stick to fonts such as Arial, Verdana, Courier, Times, Geneva and Georgia. They may not be very interesting but your content should be more interesting than your font and if it can't be read, what is the point of having a site?

Black text on a white background is far easier for the majority of people to read than white text on a black background. If you have large amounts of text then a white or pale background is far more user friendly. Always ensure that there is a good contrast between any text and its background. Blue text on a blue background is okay as long as the difference in shade is significant.

Verdana is often cited as being the easiest to read on the screen. Georgia is probably the best option for a serif font.

4. Plan for Change

If you fix the height of your page to 600 pixels will you still be able to add additional menu items without completely redesigning your page?

The ability to add or remove content from a website is fundamental to its ongoing success. Having to rewrite the entire web page or website each time you want to make a small change is a surefire way to kill your interest in your own site and will negatively impact your overall design and usability.

Getting a good idea of how your website is likely to grow will clarify how best to structure your layout. For example, a horizontal navigation is often more restrictive than a side navigation unless you use drop down menus; if your navigation is likely to grow and you hate drop down menus then your design choice has been 99% made for you!

Understanding how to use Cascading Style Sheets (CSS), avoiding unconventional layouts and complicated backgrounds will all help enormously.

5. Be Consistent

Again, don't make your visitors think! About how to use your site at least. If your navigation is at the top on your homepage, it should be at the top on all other pages too. If your links are coloured red, ensure the same convention is used on all sections.

By using CSS correctly you can make most of this happen automatically leaving you free to concentrate on the content.

6. Keep it Relevant

A picture is worth a thousand words, but if the picture you took on holiday is not relevant to your Used Car Sales website then you should really replace it with something which reflects the content or mood of the page; a photo of a car perhaps!

If you can take something off of your web page without it adversely affecting the message, appearance or legality of your website you should do it without hesitation.

Avoid the need to add images, Flash animations or adverts just because you have space. This wastes bandwidth and obscures the intentions of your website. If you absolutely must fill the space, then exercise your imagination to find something as relevant as possible.

Keeping your content focused will ultimately help your search-engine rankings.

7. Become a CSS Expert

Cascading Style Sheets should be any web designer's best friend. CSS makes it possible to separate the appearance and layout of your page from the content. This has huge benefits when it comes to updating and maintaining your site, making your site accessible and making your site easy for search engines to read.

8. Avoid Complexity

Using standard layouts for your web page will save you development time and make your site easier to use. Pushing the boundaries nearly always leads to quirky behaviour, cross-browser problems, confused site visitors and maintenance headaches. Unless you really do like a challenge then avoid complexity wherever possible.

Saturday, September 29, 2007

Commonly used tricks of Dirty Linking

We are all familiar with the concept of search engine positions. The best way to achieve good search engine positions is to acquire links from external web pages. These links can come from websites that like your site and link to it (natural links), from reciprocal link building, from directory submissions and in lots of other ways.

True quality links will carry benefits far beyond that of attaining a coveted position in the search engine results. The links will bring traffic from the Web page linking to your Web page. Therefore, you want to ensure you trade or barter links from quality partners.

Sometimes it's hard to determine who is a quality linking partner, even for the expert. So, how can you tell if your link is on a Web page where its value will not be very good?

The short list below highlights ways of diminishing or nullifying the value of a link to your site from another Web page; a short checking script follows the list.

Meta Tag Masking - this old trick simply used CGI codes to hide the Meta tags from browsers while allowing search engines to actually see the Meta tags.

Robots Meta Instructions - using noindex and nofollow attributes lets the novice link partner see the visible page with their link while telling the search engines to ignore the page and the links found on the page. Nofollow can be used while allowing the page to be indexed, which gives the impression that the search engines will eventually count the link.

Rel=nofollow Attributes - this is not a real attribute based upon HTML standards, but rather it is an attribute approved by the search engines to help identify which links should not be followed. This attribute is often used with blogs to prevent comment and link spam. The link will appear on the Web page and in the search engine's cache, but never be counted.

Dynamic Listing - dynamic listing is the result of having links appear randomly across a series of pages. Each time the link is found on a new page, the search engines consider the freshness of the link. It is entirely possible that the link won't be on the same page upon the next search engine visit. So, a link from a partner displaying rotating, dynamic link listings rarely helps.

Floating List - this can be easily missed when checking link partners. Essentially, your link could be number one today, but as new link partners are added your link is moved down the list. This is harmful because links near the bottom of the list are considered to be of lesser value than the links at the top. With a floating list, it is possible to have your link moved to a new page whose PR value is significantly lower or nonexistent, and the new page may not be visited and indexed for months.

Old Cache - the caching date provided by Google indicates the last time the page was cached. Pages with lower PR values tend to be visited and cached less often than pages that have medium to high PR values. If the cache is more than six months old, it can be surmised that Google has little or no desire to revisit the page.

Denver Pages - while Denver, CO is a nice place to visit, Denver Pages are not a place you want to find your link in a trade. Denver Pages typically have a large amount of links grouped into categories on the same page. Some people call this the mile high list. These types of pages do not have any true value in the search engines and are not topically matched to your site.

Muddy Water Pages - these are dangerous and easy to spot. Your link will be piled in with non-topically matched links with no sense of order. It's as if someone took all the links and threw them in the air to see where they would land. These are worse than the Denver Pages.

Cloaking - cloaking is the process of providing a page to people while providing a different page to search engines. You could be seeing your link on the Web page, but the search engines could possibly never see the link because they are provided with a different copy. Checking Google's cache is the only way to catch this ploy.

Dancing Robots - this can be easily performed with server-side scripting like PHP and is rarely easy to catch. In this situation, people who attempt to view the robots.txt file receive a copy that does not include exclusion instructions for the search engines. However, when the search engines request the robots.txt file, they receive the exclusion instructions. In this situation the link pages will never be indexed and you'll never know why without expert assistance.

Meta Tags and Robots.txt Confusion - which instructions carry the most weight? Don't know the answer? Shame. Search engines do. If they conflict, the page Meta tags are typically considered the rule to follow.

Link the Head - while these links do not count in the search engines and do not show up on the Web page, they do get counted by scripts or programs designed to verify that the links exist. These programs only look for the URL within the source code of the Web page.

Empty Anchors - this is a nasty trick, but can be an honest mistake. The links exist and are counted by the search engines, but unfortunately are neither visible nor clickable on the Web page. So, there is no traffic value from the link.
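
Several of the tricks above (rel=nofollow, links buried in the head, empty anchors) can be spotted by fetching the partner's page and inspecting every anchor that points at your domain. Below is a minimal Python sketch using only the standard library; the partner URL and domain are placeholders, and it is only a starting point, not a full audit:

# Inspect a link partner's page for your link and some of the tricks above.
import urllib.request
from html.parser import HTMLParser

PARTNER_PAGE = "http://www.example-partner.com/links.html"   # placeholder
MY_DOMAIN = "yourdomain.com"                                 # placeholder

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.current = None          # details of the anchor we are currently inside
        self.findings = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "head":
            self.in_head = True
        elif tag == "a" and MY_DOMAIN in (attrs.get("href") or ""):
            self.current = {"rel": (attrs.get("rel") or "").lower(),
                            "in_head": self.in_head, "text": ""}

    def handle_data(self, data):
        if self.current is not None:
            self.current["text"] += data

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False
        elif tag == "a" and self.current is not None:
            self.findings.append(self.current)
            self.current = None

html = urllib.request.urlopen(PARTNER_PAGE).read().decode("utf-8", errors="ignore")
audit = LinkAudit()
audit.feed(html)

if not audit.findings:
    print("No link to", MY_DOMAIN, "found on the page")
for link in audit.findings:
    if "nofollow" in link["rel"]:
        print("Link carries rel=nofollow - it will not be counted")
    if link["in_head"]:
        print("Link sits inside <head> - invisible and worthless")
    if not link["text"].strip():
        print("Empty anchor text - link is neither visible nor clickable")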

Thursday, September 27, 2007

Google celebrates 9th Birthday!!!

Happy 9th Birthday Google!


Wednesday, September 26, 2007

How to Avoid Alienating the Major Search Engines

Each of the major search engines (Google, Yahoo and MSN) has webmaster quality guidelines in place to prevent the unfair manipulation of search engine rankings by unscrupulous website owners. These webmaster guidelines change frequently to ‘weed out’ any new deceptive practices, and websites found engaging in these illicit practices are consequently dropped from the rankings of the major search engine they have offended.

Being banned or dropped from the search engine rankings can have dire effects on your website traffic, online sales generation and site popularity. Especially if your website is classified as a ‘bad neighborhood’ site, you can then kiss your reciprocal linking campaign goodbye, as existing and prospective link partners will not want to be associated with your site for fear of their own rankings dropping.

If you wish to avoid alienating the major search engines then do not engage in the following SE tactics:

1. ‘Cloaking’ or sneaky redirects - displaying different content to the search engines than is shown to your normal website visitors, including hidden text and hidden links. Often this is achieved by delivering content based on the IP address of the user requesting the page; when a user is identified as a search engine spider, a server-side script delivers a different version of the web page to deceive the search engine into giving the website a higher ranking.

2. ‘Doorway’ pages created specifically for the search engines, aimed at spamming a search engine's index for specific keyword phrases and then sending the visitor on to a different page. With doorway pages a user doesn’t arrive at the page they were looking for. Similarly, avoid ‘cookie cutter’ approaches that direct users to affiliate advertising with little or no original content.

3. Don’t create pages that install viruses, Trojans or badware. ‘Badware’ is spyware, malware or deceptive adware that tracks a user’s movements on the internet and reports this information back to unscrupulous marketing groups who then bombard the user with targeted advertising. This type of spyware is often unknowingly downloaded when playing online games or is attached to software or information downloads from a site. They are often difficult to identify and remove from a user’s PC and can affect the PC’s functionality.

4. Avoid using software that sends automated queries to the search engines to submit pages or check rankings. This type of software consumes valuable computing resources of the search engines and you will be penalized for using it.

5. Don’t load web pages with irrelevant words.

6. Don’t link to ‘bad neighborhood’ sites that have:

* Free for all links pages
* Link farms - automated linking schemes with lots of unrelated links
* Known web spammers, or sites that have been dropped or banned by the search engines.

7. Avoid ‘broken links’ or ‘404 errors’; your site will be penalized for them.

8. Don’t display pages with minimal content that is of little value to your site visitors.

9. Do not duplicate content unnecessarily.

10. Do not use pop-ups, pop-unders or exit consoles.

11. Do not use pages that rely significantly on links to content created for another website.

12. Do not use ‘cross linking’ to artificially inflate a site’s popularity. For example, if the owner of multiple sites cross links all of his sites together and all the sites are hosted on the same servers, the search engines will pick this up and the sites will be penalized.

13. Do not misuse a competitor's name or brand names in site content.

14. Sites with numerous, unnecessary virtual host names will be penalized.

15. Do not use techniques that artificially increase the number of links to your web pages, e.g. link farms.

16. Do not display web pages with deceptive or fraudulent content, or pages that provide users with irrelevant page content.

17. Do not use content, domain titles, meta tags or descriptions that violate any laws or regulations, or that infringe the copyrights, trademarks, trade secrets or intellectual property rights of an individual or entity, specifically in terms of publicity, privacy, product design, torts, breach of contract, injury, damage, consumer fraud, or false, misleading, slanderous or threatening content.

Wednesday, September 19, 2007

Google Slips in China

A new report shows Baidu.com Inc. increased its share of the search market in China's top cities, while rival Google Inc. lost ground despite continued heavy investments in its Chinese operations.

Baidu has a 69.5 percent share of the search market in Beijing, Shanghai and Guangzhou, up 7.6 percentage points since last year, according to a report (in Chinese) by Peter Lu, managing partner and chief analyst at China IntelliConsulting Corp. (CIC).

Meanwhile, Google saw its share of the market shrink, falling 1.1 percentage points to 23 percent.

Google's decline was offset somewhat by improved performance among students during the most recent six months. But the company continued to lose its older, more affluent users to Baidu, the report said.

These figures are likely to step up the pressure on Google China President Kai-Fu Lee to deliver results for the company. Google China has grown quickly, hiring hundreds of staff and moving into a new building close to the campus of Beijing's prestigious Tsinghua University. But these investments have yet to deliver the hoped-for domination of China's search market and have raised concerns that the company expanded too fast in the country.

Those concerns came to a head in April, when Google admitted that a software tool released by its Chinese operations used a database developed by rival Sohu.com Inc. for a competing product -- a stunning admission for a company that is widely respected for the strength of its engineering operations and the caliber of its software developers.

The CIC report had some good news for Google. The company's efforts to strengthen its Chinese search capabilities are recognized by users, with 58.5 percent of respondents in a survey saying they'd noticed changes to the search engine. Of those respondents, 65.3 percent said Google's Chinese search engine had improved.

Other search engines also saw their share of the search market decline. Yahoo China, which is run by Alibaba.com Inc., saw its share drop 2.9 percentage points to 2.3 percent of the market, while Sohu.com's Sogou search engine fell 1.4 points to 1.8 percent.

Monday, September 17, 2007

Way of Link Campaigns

There are various reasons to engage in link campaigns. Search engines rely heavily on links to determine a site's rank in the SERPs (search engine results pages), and in Google's algorithms in particular, link campaigns play a vital role in search engine promotion and marketing strategies.

Before getting into it deeply, let's have a look at the types of link campaigns we often handle:

Reciprocal Link: This is where two sites exchange links to each other's site.

One Way Link: where one site links to another site without getting a link back from that site.

Tri-Way Link: where one site links to another site and gets a link back from that site's secondary site.


Reciprocal linking is an easier process than one way linking, but it is also time consuming. During this process, a webmaster looks for related websites and asks them for a link exchange.

If and when you receive some responses, you should add their sites as soon as you can, and send them an email telling them that their link has been added, including the URL where they can see their link.

In-bound (one-way) links are the best type of links if you can get them. It's not all that difficult to do so, but as with anything worthwhile, there will be a certain amount of effort involved.

The easiest way to get In bound links to your site is to do directory submissions. There are many great free directories to submit to, and it's worth the time and effort to get listed in as many as you possibly can.

There are also fee-based directories which you submit to, and as your budget allows, you should try to get listed in all directories that are relevant to your site's topic.

The next way to get one-way links to your site is to do a press release. With sites such as prwire, you can submit a press release to them for free, and that will not only get your site exposure, but it will also get you some in-bound links.

If you have the ability to compose articles or columns, there are many sites that can get your site exposure by you allowing them to re-print your content. While some may argue that this procedure may penalize you in the search engines for duplicate content, as long as you have posted your article to your site first, you shouldn't get hit by a duplicate content penalty.




Wednesday, September 12, 2007

Web Site Cloaking and Search Engines

Cloak: Something which hides, covers or keeps something else secret. (Cambridge Dictionary)
Cloaking: Also known as stealth, a technique used by some Web sites to deliver one page to a search engine for indexing while serving an entirely different page to everyone else. The search engine thinks it is selecting a prime match to its request based on the meta tags that the site administrator has input. However, the search result is misleading because the meta tags do not correspond to what actually exists on the page. Some search engines ... even ban cloaked Web sites. (Webopedia)

A good amount of web space has been dedicated to the merits of cloaking, but no one seems able to agree on the actual definition of the technique. One definition given is: "A technique used to deliver different web pages under different circumstances." This definition is too broad as page redirects for different browsers, for example, would be included under this definition but these pages are not hidden from search engine robots.

The one main characteristic of cloaking, according to Webopedia and Google, is that a different page is delivered to search engines than the one displayed to visitors to the Web site; its only purpose is to hide content from search engines.

Many of those who employ cloaking techniques argue that they are presenting something different to a search engine rather than hiding something from it. The end result remains the same: the page presented to the search engine is different than the one anticipated by the visitor.
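
Besides checking the cache, a crude first test is to request the same page twice with different user-agent strings, once as a browser and once pretending to be a search engine spider, and compare the responses. A rough Python sketch with a placeholder URL; note this only catches user-agent based cloaking, not delivery based on IP address:

# Fetch the same page with two different User-Agent strings and compare the responses.
import urllib.request

url = "http://www.example-partner.com/page.html"   # placeholder

def fetch(user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req).read()

as_browser = fetch("Mozilla/5.0 (Windows; U; Windows NT 5.1)")
as_spider = fetch("Googlebot/2.1 (+http://www.google.com/bot.html)")

if as_browser != as_spider:
    print("Warning: different content served to browsers and to spiders")
else:
    print("Same content served to both user-agents (IP-based cloaking would still not be caught)")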

Wednesday, September 5, 2007

Link Building Quality Parameters

Link building requires deep analysis of search engine algorithms and their response to the quality and relevancy of links. So if you are planning to outsource your link building campaigns, it is extremely important to insist that your service provider conform to the following quality guidelines for your link building campaign.

  1. Only Incoming (non-reciprocal) links.
  2. Links with specified "Keywords" in the Anchor Text.
  3. Sufficient randomization in the anchor text and associated link description text so as to not have a set pattern of linking.
  4. Links from sites whose links page has a specified PR.
  5. Link to your site should not be through a "redirect" script.
  6. No JavaScript links.
  7. No framed sites.
  8. No flash site links.
  9. No robots.txt excluded link pages.
  10. No Robots Tag excluded link pages.
  11. No paid or time-bound links
  12. Links from Pre-indexed Pages only.
  13. Spread your site link across different domains.
  14. No SPAM should be used to solicit links.
  15. No links from Link Farms.
  16. No links from FFA (Free-For-All) link networks.
  17. Don't get links from blogs or pages that carry the link attribute rel="nofollow".
  18. No links from PORN, racially prejudiced sites and other sites containing offensive content.
  19. Full data sheet of links created at the end of each month.
  20. Only relevant established links should be counted in the final report.
  21. The links creation should be spread across a duration of time, so that there is no sudden surge in the incoming links count.
  22. No more than 60 links per page where your link appears.
  23. Links from industry-relevant pages.
  24. Link building pace as per the client's convenience.
  25. No email spam used to solicit links.

Link popularity building helps your website gain a high PageRank, which enables your site to rank higher in the major search engines for highly competitive keywords.

Saturday, September 1, 2007

Link Building: The paradox of Poverty amidst Plenty

An effective landing page is only as good as the visitors it attracts. Hence, the next logical step after developing an effective landing page is to promote it. This involves attracting qualified visitors to it by employing various online marketing efforts, viz. PPC, SEO or email marketing. Another frequently employed online marketing tool, with the twin benefits of economy and effectiveness, is link building.

Link building is the practice of getting other, semantically similar websites to feature your link on their sites. It constitutes finding viable business partners willing to endorse your site. An authoritative link to your site counts as one vote toward your link popularity.

Other site owners will link to a page when they consider the content important. Likewise, search engines will consider your content important because human beings, not software programs, link to your site’s content. These factors result in the page attracting qualified visitors.

There are various attributes that influence the link building initiative. The irony of link building is that while it may appear to be an easy exercise (sourcing links from FFA and link-a-minute sites is easy), the real benefits of link building accrue only when the link is obtained from a quality website that is thematically relevant to the website the link is requested for. Hence the paradox.

To read the complete article, visit Mindscape.

Saturday, August 18, 2007

Google AdWords introducing Radio Ads

Google’s commitment to enhancing the user experience takes another step forward as Google is all set to broadcast Google AdWords ads over radio. As per reports, Radio Ads by Google AdWords will stay true to Google’s tradition of highly targeted, minimally intrusive and cost-effective advertising.

For more info, log on to Kneoteric

Friday, August 3, 2007

Pay Per Clicks Overview

Pay Per Click advertisements are those which appear beside and above the organic search engine results for a specific search term. The advertiser pays the search engine for each visitor received through this medium.

As pay per click is entirely a game of money, it requires careful management.

Examples of Pay Per Click Advertisements

Google Pay Per Click
Yahoo Pay Per Click
MSN Pay Per Click

SEO Important Factors

Search Engine Optimization (SEO) has been defined as a subset of Internet Marketing used to improve the search engine rankings for any website and derive quality visitors and business organically.

There are various factors in SEO, all of which play a part in helping a website achieve high rankings on the major search engines, namely Google, Yahoo and MSN.

A variety of factors and algorithms are deployed by search engines to rank any web page for a specific search term.

Some of these factors can be:

On Page Optimization
On Page Optimization includes making changes within the website's code and structure to make it more search engine friendly and more attractive to the human eye. There are many criteria responsible for better on page optimization, such as meta tags, use of H1 and H2 tags, use of bold and italic tags, and many others.

Off Page Optimization
This is termed one of the most important factors in the search engine optimization industry, as back links are a third party endorsement of your website. Off Page Optimization includes search engine submissions, directory submissions, article submissions, press release submissions and, most importantly, link building (one way and reciprocal).

Technical Optimization
There are various technical factors responsible for achieving improvements in search engine rankings. Some of these factors are server header redirects, .htaccess optimization, custom 404 page optimization and many others.
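
As an example of the kind of check involved, the sketch below (in Python, with a placeholder domain) asks the non-www host what it returns for the homepage; a permanent 301 redirect pointing at the www version is what you would normally want to see for canonical indexing:

# Ask the non-www host what it returns for the homepage (expect a 301 to the www version).
import http.client

conn = http.client.HTTPConnection("yourdomain.com")   # placeholder, non-www host
conn.request("HEAD", "/")
response = conn.getresponse()

print("Status:", response.status)                     # 301 is the permanent redirect you want
print("Location:", response.getheader("Location"))    # e.g. http://www.yourdomain.com/
conn.close()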

Content Optimization
The content of any website needs to be optimized in terms of keywords and presentation to make it search engine and human friendly. In addition, there should be regular updates to the website in the form of new content, which encourages the search engine crawlers to visit your website more frequently.

The aforementioned are only a few of the factors responsible for achieving search engine rankings on the major players.

Friday, July 27, 2007

How Do Search Engines Operate?

Search engines have a short list of critical operations that allows them to provide relevant web results when searchers use their system to find information.

  1. Crawling the Web
    Search engines run automated programs, called "bots" or "spiders", that use the hyperlink structure of the web to "crawl" the pages and documents that make up the World Wide Web. Estimates are that of the approximately 20 billion existing pages, search engines have crawled between 8 and 10 billion.
  2. Indexing Documents
    Once a page has been crawled, its contents can be "indexed" - stored in a giant database of documents that makes up a search engine's "index". This index needs to be tightly managed so that requests which must search and sort billions of documents can be completed in fractions of a second.
  3. Processing Queries
    When a request for information comes into the search engine (hundreds of millions do each day), the engine retrieves from its index all the documents that match the query. A match is determined if the terms or phrase are found on the page in the manner specified by the user. For example, a search for car and driver magazine at Google returns 8.25 million results, but a search for the same phrase in quotes ("car and driver magazine") returns only 166 thousand results. In the first case, commonly called "Findall" mode, Google returned all documents which had the terms "car", "driver", and "magazine" (they ignore the term "and" because it's not useful to narrowing the results), while in the second search, only those pages with the exact phrase "car and driver magazine" were returned. Other advanced operators (Google has a list of 11) can change which results a search engine will consider a match for a given query. (A tiny illustration of the two matching modes follows this list.)
  4. Ranking Results
    Once the search engine has determined which results are a match for the query, the engine's algorithm (a mathematical equation commonly used for sorting) runs calculations on each of the results to determine which is most relevant to the given query. They sort these on the results pages in order from most relevant to least so that users can make a choice about which to select.
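
To make the "Findall" versus exact phrase distinction in step 3 concrete, here is a toy Python sketch over three made-up documents; a real engine of course works from an inverted index over billions of pages, not a Python dictionary:

# Toy illustration of "Findall" matching versus exact phrase matching.
documents = {
    1: "car reviews from driver magazine experts",
    2: "car and driver magazine road tests",
    3: "magazine for the keen driver and car owner",
}

query_terms = ["car", "driver", "magazine"]   # "and" is ignored as a stop word
phrase = "car and driver magazine"

findall_matches = [doc_id for doc_id, text in documents.items()
                   if all(term in text.split() for term in query_terms)]
phrase_matches = [doc_id for doc_id, text in documents.items()
                  if phrase in text]

print("Findall matches:", findall_matches)   # documents 1, 2 and 3
print("Phrase matches:", phrase_matches)     # only document 2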

Although a search engine's operations are not particularly lengthy, systems like Google, Yahoo!, AskJeeves, and MSN are among the most complex, processing-intensive computer systems in the world, managing millions of calculations each second and fielding demands for information from an enormous group of users.

Reference: SEOmoz

Thursday, July 26, 2007

Negative Keywords - An Unethical Way of Optimization

One of the prominent problems faced in the arena of Pay Per Click search engine marketing is how junk traffic can be avoided. The term junk traffic has a direct relationship with the keywords bringing the traffic. Keywords can be negative or positive keywords.


Negative keywords are those which are totally irrelevant to the website you are promoting. Negative keywords are used in keyword advertising to prevent ads from being shown when the search terms contain them. For example, the keyword “FREE” has a lot of traffic, but if you are not giving anything away for free then it is an absolute disaster for your website, as it will attract junk traffic. The conversion of visitors into business leads will be very low and the cost will be tremendously high.
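
The mechanics are simple: before an ad is shown, the search query is checked against the advertiser's negative keyword list. Here is a toy Python sketch of the idea, with made-up queries and negatives:

# Toy filter: suppress the ad when the search query contains a negative keyword.
negative_keywords = {"free", "torrent", "crack"}   # made-up examples

queries = [
    "buy seo software",
    "free seo software download",
    "seo software reviews",
]

for query in queries:
    matched = set(query.lower().split()) & negative_keywords
    if matched:
        print(f"SKIP ad for '{query}' (matched negatives: {matched})")
    else:
        print(f"SHOW ad for '{query}'")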


Still, some sites promote on negative keywords just to attract traffic; this is called negative keyword promotion, or optimizing for negative keywords. Negative keyword promotion has some advantages, for instance when a spelling error occurs while entering a search term. For example, if a site's name is www.oleklezbezone.com, a user can easily make a mistake while typing the site address; as a result an unwanted result is generated on the search engines, and a relevant customer will not be able to get to you. This is where the concept of optimizing for negative keywords comes in: if the optimization has been done on the positive as well as the negative (misspelled) keywords, the problem can be solved.


It's essential that negative keywords are not used in excess, as this can result in a very limited advertising audience. But not using any negative keywords at all may result in a meager number of users interested in your business and services. One should always consider negative keywords while framing a successful keyword list, and keep a close eye on the combination of the two so that a feasible option can be worked out. Thoughtfully chosen negative keywords can help lower wasted clicks and hence deliver better results. A well-targeted keyword list can lead to lower advertising costs and lead-generating clicks.


The point is that we can search for these kinds of errors, which are very repetitive and worth bidding on, and promote the site on those keywords too.


Effective Keyword Research - Tips and Suggestions

Keyword Research is the process of identifying the most important business terms and key phrases. It is one of the initial steps when chalking out an SEO/SEM campaign plan for a website.


An effective keyword research process is considered as the first step to a successful SEO campaign as it governs the revenue as well as the investment, thus being a major factor affecting the ROI.

Keyword Research requires a number of factors be considered including the business model and the demographics of the target audience.

For instance:
If you are a Ford Dealer in London, a good keyword may be “Ford Dealers London” or “Ford Car Dealers London”.

Some of the commonly used tools for keyword research include Overture Inventory, Google Keyword Tool (a.k.a. Google Suggest) and Word Tracker. These tools help you study the search pattern for a set of key phrases and help in opting for the right keywords.

Although it can be tempting to choose the key phrases that attract the most traffic, due care needs to be taken to ensure that the keywords are relevant to the business model and pose a realistically achievable target.

A good strategy might be to divide the SEM/SEO campaign into phases and iteratively perform keyword research before the start of each phase.

Finally, it might be a good idea to hire a keyword research expert or an expert SEO/SEM consultant to help with the keyword research for your website.

Search Engine Optimization Basics

Search Engine Optimization (SEO) has been defined as a subset of Search Engine Marketing (SEM). SEO is a technique to enhance the organic search engine rankings and increase the online visitors for a website by incorporating various changes on and off the website.

SEO is an amalgamation of a number of activities. For a successful SEO campaign, some of the important activities are:

1. Targeted Keyword Research
Keywords form the backbone of any SEO campaign. A successful SEO campaign needs the right keyword selection satisfying the business model and objectives. Keyword research may be performed using various tools like Google Suggest, Overture Keyword Suggestion, Word Tracker amongst others.

2. Competitor Analysis
Competitor analysis includes studying various competitors' websites and analyzing the strategies used to optimize them. This study includes analysis of the target keywords, business model and optimization strategies used by your business competitors.

3. On Page Optimization
On page optimization has the highest tangible value amongst the various SEO activities. The website pages are tweaked and optimized for better search engine rankings. Some of the important on page optimization activities include meta tag revision, use of heading tags (h1/h2), image optimization (including size reduction and the addition of alternative text), file renaming and optimizing the overall website architecture.
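
As a rough illustration of what an on page review looks for, the Python sketch below fetches a page and reports on its title, meta description, heading tags and images missing alt text. It is only a starting point and the URL is a placeholder:

# Rough on-page check: title, meta description, h1/h2 counts, images missing alt text.
import urllib.request
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None
        self.headings = {"h1": 0, "h2": 0}
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content")
        elif tag in self.headings:
            self.headings[tag] += 1
        elif tag == "img" and not (attrs.get("alt") or "").strip():
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

url = "http://www.yourdomain.com/"   # placeholder
audit = OnPageAudit()
audit.feed(urllib.request.urlopen(url).read().decode("utf-8", errors="ignore"))

print("Title:", audit.title.strip() or "(missing)")
print("Meta description:", audit.meta_description or "(missing)")
print("H1 count:", audit.headings["h1"], "| H2 count:", audit.headings["h2"])
print("Images without alt text:", audit.images_missing_alt)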

4. Off Page Optimization
Inbound links from thematically similar websites result in a high Google PageRank, making the site more favorable to search engines.

5. Technical Optimization
There are many technical factors that merit consideration to help boost the search engine rankings for a website. Some of these factors include canonical indexing and custom error pages, amongst others.

The aforementioned are only a few of the activities that lead to a successful SEO campaign. High rankings and traffic are the outcome of a blend of various Search Engine Optimization activities performed continuously over time, and they yield a higher ROI than most other marketing activities.


For More Information click on Kneoteric