Welcome to "SEOCrazy"!!! Subscribe the Blog Feed for Latest SEO Tips and Techniques.
Enjoy your stay... :-)

Thursday, April 24, 2008

Digital Marketing - A New Era Of Marketing

The term digital marketing has emerged recently in the world of professional marketing. While digital marketing includes many of the techniques and practices of Internet marketing, it extends beyond them by including other channels for reaching people that do not require the use of the Internet.

Until now, there was no term to describe everything companies do to reach their prospects and customers across all digital media (computers, mobile phones, podcast players, video games, and dynamic displays such as outdoor billboards and point-of-sale TV screens). Digital marketing is a discipline designed to promote products and services through a digital medium or communication channel, reaching consumers in a personal, highly targeted, and interactive way at a reasonable cost.

With the rise of new media, companies are starting to reorganize their marketing. Internet marketing experts are seeing their responsibilities grow to cover digital marketing as a whole: they must learn to target consumers not only on the Internet but across all digital media, and advertising managers will also have to master this area. As an example of how media consumption is evolving, the average American teenager now spends more time playing video games than watching television. To reach these budding consumers, companies will have to find intelligent ways to be present inside video games.

Monday, April 7, 2008

Spammy Robots List

Here is a list of unwanted robots that crawl your web pages and harvest email addresses in order to spam you with advertisements. They are also referred to as spiders, web wanderers, and web crawlers. You can block them by adding the following rules to your robots.txt file:


Code:

User-agent: aipbot
Disallow: /

User-agent: ia_archiver
Disallow: /

User-agent: Alexibot
Disallow: /

User-agent: Aqua_Products
Disallow: /

User-agent: asterias
Disallow: /

User-agent: b2w/0.1
Disallow: /

User-agent: BackDoorBot/1.0
Disallow: /

User-agent: becomebot
Disallow: /

User-agent: BlowFish/1.0
Disallow: /

User-agent: Bookmark search tool
Disallow: /

User-agent: BotALot
Disallow: /

User-agent: BotRightHere
Disallow: /

User-agent: BuiltBotTough
Disallow: /

User-agent: Bullseye/1.0
Disallow: /

User-agent: BunnySlippers
Disallow: /

User-agent: CheeseBot
Disallow: /

User-agent: CherryPicker
Disallow: /

User-agent: CherryPickerElite/1.0
Disallow: /

User-agent: CherryPickerSE/1.0
Disallow: /

User-agent: Copernic
Disallow: /

User-agent: cosmos
Disallow: /

User-agent: Crescent
Disallow: /

User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
Disallow: /

User-agent: DittoSpyder
Disallow: /

User-agent: EmailCollector
Disallow: /

User-agent: EmailSiphon
Disallow: /

User-agent: EmailWolf
Disallow: /

User-agent: EroCrawler
Disallow: /

User-agent: ExtractorPro
Disallow: /

User-agent: FairAd Client
Disallow: /

User-agent: Fasterfox
Disallow: /

User-agent: Flaming AttackBot
Disallow: /

User-agent: Foobot
Disallow: /

User-agent: Gaisbot
Disallow: /

User-agent: GetRight/4.2
Disallow: /

User-agent: Harvest/1.5
Disallow: /

User-agent: hloader
Disallow: /

User-agent: httplib
Disallow: /

User-agent: HTTrack 3.0
Disallow: /

User-agent: humanlinks
Disallow: /

User-agent: IconSurf
Disallow: /
Disallow: /favicon.ico

User-agent: InfoNaviRobot
Disallow: /

User-agent: Iron33/1.0.2
Disallow: /

User-agent: JennyBot
Disallow: /

User-agent: Kenjin Spider
Disallow: /

User-agent: Keyword Density/0.9
Disallow: /

User-agent: larbin
Disallow: /

User-agent: LexiBot
Disallow: /

User-agent: libWeb/clsHTTP
Disallow: /

User-agent: LinkextractorPro
Disallow: /

User-agent: LinkScan/8.1a Unix
Disallow: /

User-agent: LinkWalker
Disallow: /

User-agent: LNSpiderguy
Disallow: /

User-agent: lwp-trivial
Disallow: /

User-agent: lwp-trivial/1.34
Disallow: /

User-agent: Mata Hari
Disallow: /

User-agent: MIIxpc
Disallow: /

User-agent: MIIxpc/4.2
Disallow: /

User-agent: Mister PiX
Disallow: /

User-agent: moget
Disallow: /

User-agent: moget/2.1
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: NetAnts
Disallow: /

User-agent: NICErsPRO
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Openbot
Disallow: /

User-agent: Openfind
Disallow: /

User-agent: Openfind data gatherer
Disallow: /

User-agent: Oracle Ultra Search
Disallow: /

User-agent: PerMan
Disallow: /

User-agent: ProPowerBot/2.14
Disallow: /

User-agent: ProWebWalker
Disallow: /

User-agent: psbot
Disallow: /

User-agent: Python-urllib
Disallow: /

User-agent: QueryN Metasearch
Disallow: /

User-agent: Radiation Retriever 1.1
Disallow: /

User-agent: RepoMonkey
Disallow: /

User-agent: RepoMonkey Bait & Tackle/v1.01
Disallow: /

User-agent: RMA
Disallow: /

User-agent: searchpreview
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: SpankBot
Disallow: /

User-agent: spanner
Disallow: /

User-agent: SurveyBot
Disallow: /

User-agent: suzuran
Disallow: /

User-agent: Szukacz/1.4
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: Telesoft
Disallow: /

User-agent: The Intraformant
Disallow: /

User-agent: TheNomad
Disallow: /

User-agent: TightTwatBot
Disallow: /

User-agent: toCrawl/UrlDispatcher
Disallow: /

User-agent: True_Robot
Disallow: /

User-agent: True_Robot/1.0
Disallow: /

User-agent: turingos
Disallow: /

User-agent: TurnitinBot
Disallow: /

User-agent: TurnitinBot/1.5
Disallow: /

User-agent: URL Control
Disallow: /

User-agent: URL_Spider_Pro
Disallow: /

User-agent: URLy Warning
Disallow: /

User-agent: VCI
Disallow: /

User-agent: VCI WebViewer VCI WebViewer Win32
Disallow: /

User-agent: Web Image Collector
Disallow: /

User-agent: WebAuto
Disallow: /

User-agent: WebBandit
Disallow: /

User-agent: WebBandit/3.50
Disallow: /

User-agent: WebCapture 2.0
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: WebCopier v.2.2
Disallow: /

User-agent: WebCopier v3.2a
Disallow: /

User-agent: WebEnhancer
Disallow: /

User-agent: WebSauger
Disallow: /

User-agent: Website Quester
Disallow: /

User-agent: Webster Pro
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebZip
Disallow: /

User-agent: WebZip/4.0
Disallow: /

User-agent: WebZIP/4.21
Disallow: /

User-agent: WebZIP/5.0
Disallow: /

User-agent: Wget
Disallow: /

User-agent: wget
Disallow: /

User-agent: Wget/1.5.3
Disallow: /

User-agent: Wget/1.6
Disallow: /

User-agent: WWW-Collector-E
Disallow: /

User-agent: Xenu's
Disallow: /

User-agent: Xenu's Link Sleuth 1.1c
Disallow: /

User-agent: Zeus
Disallow: /

User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow: /

User-agent: Zeus Link Scout
Disallow: /
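
Maintaining a file this long by hand is tedious and error-prone. If it helps, the same robots.txt can be generated from a plain list of agent names. Below is a minimal Python sketch; the short bot list and the output location are only illustrative, so adjust them to your own setup.

Code:

# Build a robots.txt that blocks a list of unwanted user agents.
# The names below are only a short excerpt of the full list above.
bad_bots = [
    "aipbot",
    "ia_archiver",
    "EmailCollector",
    "EmailSiphon",
    "WebZip",
    "Wget",
]

# One record per agent, each disallowing the entire site.
records = [f"User-agent: {bot}\nDisallow: /" for bot in bad_bots]

# robots.txt must sit at the web root, e.g. /var/www/html/robots.txt
# (this path is only an example; adjust it to your server's document root).
with open("robots.txt", "w") as f:
    f.write("\n\n".join(records) + "\n")

Keep in mind that robots.txt relies on voluntary compliance: well-behaved crawlers honor these rules, while an aggressive email harvester may simply ignore them.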

Saturday, April 5, 2008

Robots.txt - Stop Search Engines From Accessing Your Private Files

The robots.txt file is a text file containing directives for search engine crawlers, telling them which pages may or may not be indexed. Every search engine begins its exploration of a website by looking for robots.txt at the root of the site.

Format of robots.txt

The robots.txt file (written in lower case, with the plural "robots") is an ASCII file placed at the root of the site and may contain the following directives:

* User-Agent: specifies the robot to which the following directives apply. The value * means "all search engines".
* Disallow: specifies the pages to exclude from indexing. Each page or path to exclude must be on its own line and must begin with /. The value / alone means "all pages".

A robots.txt file should not contain blank lines within a record; a blank line is used to separate one record from the next.

Examples of robots.txt:

* Exclusion of all pages:

User-Agent: *
Disallow: /

* Exclusion of no pages (equivalent to having no robots.txt file; all pages are visited):

User-Agent: *
Disallow:

* Allowing a single robot only:

User-Agent: RobotName
Disallow:
User-Agent: *
Disallow: /

* Exclusion of a robot:

User-Agent: RobotName
Disallow: /
User-Agent: *
Disallow:

* Exclusion of a single page:

User-Agent: *
Disallow: /directory/path/page.html

* Exclusion of several pages:

User-Agent: *
Disallow: /directory/path/page.html
Disallow: /directory/path/page2.html
Disallow: /directory/path/page3.html

* Exclusion of all pages of a directory and its subfolders:

User-Agent: *
Disallow: /directory/
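
To check how a compliant crawler would interpret rules like these, you can use Python's standard urllib.robotparser module. The sketch below parses the directory-exclusion example above and tests two paths; the domain and file names are only illustrative.

Code:

import urllib.robotparser

# Rules from the last example above: exclude /directory/ and everything under it.
rules = """User-Agent: *
Disallow: /directory/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A page inside the excluded directory is disallowed for every agent...
print(rp.can_fetch("Googlebot", "http://example.com/directory/path/page.html"))  # False

# ...while a page outside that directory stays allowed.
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # True

# To test a live site instead, point the parser at its robots.txt:
# rp.set_url("http://example.com/robots.txt"); rp.read()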