A WordPress widget for publicizing blog posts on Facebook/Twitter is just a click away.



WordPress has introduced a wonderful feature that automatically publishes a blogger’s posts/articles to Facebook & Twitter.

The idea of automatic self-publishing is just amazing. In today’s SEO-centric Internet arena, this widget is going to make a blogger’s life a little easier.

http://en.blog.wordpress.com/2011/08/08/now-publicize-to-facebook-pages

How does it work, huh?


Go to your WordPress blog’s Dashboard, then Settings, then Sharing, and you’ll see Publicize. Once you set it up, Publicize works automatically.

You can customize the message that gets sent out on the Post page.

See http://en.support.wordpress.com/publicize/

I came across the wish list of a WordPress user (Salil Lawande), who is asking the WordPress team to introduce the features mentioned below.

Wishlist:
  • Ability to tag media in the Media Library
  • Albums functionality (something close to what Picasa Web Albums offers)
  • Sorting by month, year, title
  • A graphical/visual approach to managing & deploying media (drag & drop)

Happy Blogging.

Go Blogger Go.

Your 2010 year in blogging…by Team WordPress.com + Stats Helper Monkeys


The stats helper monkeys at WordPress.com mulled over how this blog did in 2010, and here’s a high level summary of its overall blog health:

Healthy blog!

The Blog-Health-o-Meter™ reads Wow.

Crunchy numbers


A helper monkey made this abstract painting, inspired by your stats.

A Boeing 747-400 passenger jet can hold 416 passengers. This blog was viewed about 7,500 times in 2010. That’s about 18 full 747s.

 

In 2010, there were 28 new posts, growing the total archive of this blog to 115 posts. There was 1 picture uploaded, taking up a total of 208 KB.

The busiest day of the year was November 22nd with 111 views. The most popular post that day was Picto-ry of Web evolution…. (tired of writing words) 😛.

Where did they come from?

The top referring sites in 2010 were facebook.com, bigextracash.com, bloglovin.com, student-loan-consilidation.com, and 1harga.com.

Some visitors came searching, mostly for coke a cola, funny monkey, funny monkey pictures, mineral based industries, and mineral based industries in india.

Attractions in 2010

These are the posts and pages that got the most views in 2010.

1. Picto-ry of Web evolution…. (tired of writing words) 😛 (November 2010, 9 comments)

2. Fraud email Gallery: Scam Letters – 542 different examples! (December 2009, 10 comments)

3. Thanda Matlab Coca Cola: But, Thanda nahi hai…… (November 2009, 1 comment)

4. India’s Most Trusted Brands – 2007: Economic Times Brand Equity Survey (November 2009)

5. India’s Most Trusted Brands – 2008: Economic Times Brand Equity Survey (November 2009)

Some of your most popular posts were written before 2010. Your writing has staying power! Consider writing about those topics again.

 

Share these stats with your visitors

Want to share this summary with your readers? Just click the button below.


HTML Commands for Formatting a Note or Script


To use special formatting such as bold and underline in your notes, use the following HTML commands.

To see this: Type this in your note:
bold <b>bold</b>
italics <i>italics</i>
underline <u>underline</u>
strikethrough <s>strikethrough</s>
Big size <big>Big size</big>
Small size <small>Small size</small>
An em-dash—see? An em-dash&mdash;see?
Hyperlink to Facebook Hyperlink to <a href="http://www.facebook.com">Facebook</a>
A Bulleted List:

  • One Item
  • Another Item
A Bulleted List:
<ul>
<li>One Item</li>
<li>Another Item</li>
</ul>
An Ordered List:

  1. First Item
  2. Second Item
An Ordered List:
<ol>
<li> First Item</li>
<li> Second Item</li>
</ol>
The following quote is special:

Because it is indented

The following quote is special:
<blockquote>Because it is indented</blockquote>

Heading 1

Heading 2

Heading 3

<h1>Heading 1</h1>

<h2>Heading 2</h2>

<h3>Heading 3</h3>

Exposing click fraud – The Anatomy of an Online Scam


Internet marketers facing higher advertising fees on search networks are becoming increasingly concerned about a form of online fraud that was thought to have been contained years ago.

The practice, known as “click fraud,” began in the early days of the Internet’s mainstream popularity with programs that automatically surfed Web sites to increase traffic figures. This led companies to develop policing technologies touted as antidotes to the problem. But some marketing executives estimate that up to 20 percent of fees in certain advertising categories continue to be based on nonexistent consumers in today’s search industry.

News context:

What’s new:
Net marketers facing higher ad fees are becoming increasingly worried about an online practice known as “click fraud.”

Bottom line:
The persistence of click fraud has exposed a fundamental weakness in the promising business of Internet search marketing, but most advertisers aren’t sure how to address the problem.

In one recent example of the problem, law enforcement officials say a California man created a software program that he claimed could let spammers bilk Google out of millions of dollars in fraudulent clicks. Authorities said he was arrested while trying to blackmail Google for $150,000 to hand over the program. He was indicted by a California jury in June.

Matt Parrella, chief of the San Jose branch of the U.S. Attorney’s Office in Northern California, said that case was “not unique.” The problem “is certainly not shrinking, and we’re ready to prosecute people,” said Parrella, whose office handled the Google case.

Click fraud is perpetrated in both automated and human ways. The most common method is the use of online robots, or “bots,” programmed to click on advertisers’ links that are displayed on Web sites or listed in search queries. A growing alternative employs low-cost workers who are hired in China, India and other countries to click on text links and other ads. A third form of fraud takes place when employees of companies click on rivals’ ads to deplete their marketing budgets and skew search results.

Although the extent of click fraud is impossible to measure with any certainty, its persistence has exposed a fundamental weakness in the promising business of Internet search marketing. Google’s pending initial public offering has been widely anticipated as a barometer of online advertising and the post-apocalyptic dot-com climate in general.

“It’s hard to tell how big the problem is, but people are looking at it closer and closer as the cost of search advertising goes up,” said John Squire, vice president of business development of Coremetrics, a Web analytics firm. “Click fraud is a fin sticking out of the water: You’re not sure if it’s a great white shark or a dolphin.”

Unlike advertising in traditional media such as billboards and print publications, “cost per click” Internet ads displayed with specific keyword searches have been promoted as a definitive way for companies to gauge their exposure to potential customers. As a result, U.S. sales from advertiser-paid search results are expected to grow 25 percent this year to $3.2 billion, up from $2.5 billion in 2003, according to research firm eMarketer. From 2002 to 2003, the market rose by 175 percent.

As more advertisers have competed for desirable keywords in their industries, the cost for clicks has risen too. On average, advertisers are paying 45 cents per click this year, according to financial analysts, up from 40 cents in 2003 and 30 cents in the second quarter of 2002. In certain sectors, such as travel, legal advice and gaming, the cost can reach several dollars per click.

But marketing executives say click fraud is pervasive among affiliates of search leaders Google, Yahoo-owned Overture Services and FindWhat.com. In a typical affiliation, any Web publisher can become a partner of these large networks by displaying their paid links on a Web page or within its own search results and then share in the profits with every click.

“There’s a fatal flaw in the cost-per-click model because a ton of marketing dollars can be depleted in a fraction of a second,” said Jessie Stricchiola, president of Alchemist Media, a search-engine marketing firm based in Los Angeles that specializes in fraud protection. “Technology is continuing to be developed that can exploit this pricing model at incredibly high volumes.”

Google’s fraud squad
Google declined an interview for this report, citing the mandatory “quiet period” before its initial public offering, which is expected to raise $2.7 billion. But the company said in a statement that it has been “the target of individuals and entities using some of the most advanced spam techniques for years. We have applied what we have learned with search to the click fraud problem and employ a dedicated team and proprietary technology to analyze clicks.”

In recent documents filed with the Securities and Exchange Commission, the company also acknowledged the problem as a threat to its revenue, of which 95 percent is derived from advertising. Google and other search networks provide refunds to advertisers when click fraud has been discovered.

The Anatomy of an Online Click Scam

“If we are unable to stop this fraudulent activity, these refunds may increase,” Google said in its SEC filing. “If we find new evidence of past fraudulent clicks we may have to issue refunds retroactively of amounts previously paid to our Google Network members.”

Google and Overture employ “fraud squads,” or teams of people dedicated to fighting click schemes. But at least two marketing executives say such countermeasures are missing fraudulent clicks that are responsible for between 5 percent and 20 percent of advertising fees paid to all search networks.

Overture spokeswoman Jennifer Stephens refutes that estimate, saying that the numbers likely represent acts of fraud that are ultimately caught. She added that Overture filters most fraudulent clicks with the best antifraud system in the industry, which combines technology and human analysis.

“We take this very seriously; it’s the foundation of what we do,” Stephens said. “If an advertiser has a question about it, we look into all matters.”

Cost-per-click advertising comes in many forms, but it essentially lets marketers gain exposure on a Web site and pay only when people click on their ads. Google and Overture let advertisers bid for placement of paid links, which appear when certain keyword searches are conducted on the networks’ sites or those of third parties that partner with them. Keyword ads can also be distributed according to the content of partners’ sites and displayed on non-search pages. (CNET Networks, which publishes News.com, partners with Google for shared advertising revenue.)

Most advertisers are aware of the click-fraud issue but have not delved into it because of the technical complexities involved. Others are concerned that they could jeopardize their relationships with the powerful search networks if they complain too loudly.

“It is a bigger problem, but folks just don’t want to take the time to track it down because it’s a complex problem,” Coremetrics’ Squire said. Given that some of the largest marketers manage up to 1 million keywords in a campaign, he added, the data can be difficult to crunch.

Danny Sullivan, who runs a quarterly search-industry conference, said many advertisers do not raise their concerns with the ad networks because “they’re afraid that if they complain, it will hurt their free listings.”

Still, more fraud-detection technologies are emerging to help advertisers analyze their campaigns and traffic. Some advertisers and search-engine marketing companies say they are compiling lists of sites that generate a high number of clicks but not sales.

Coremetrics, Urchin and Whosclickingwho.com are just a few that sell technology to examine click rates and sales that result from paid searches. Alchemist Media, which charges flat fees for its consulting services, has detected fraud while acting as an intermediary between search networks and marketers.

In general, Alchemist’s Stricchiola estimates that 10 percent of all search ad clicks could be fraudulent. But she said the rate can reach 20 percent in particular businesses that have been targeted for click fraud.

Roy de Souza, CEO of advertising technology firm Zedo, said his company’s geotracking systems have traced Internet Protocol addresses to detect click operations in China. In describing one common scheme, he said a legitimate site is duplicated under another name, complete with text ads from a search network. A bot would then be trained to click on the ad links that appear on the bogus site, said de Souza, who estimated that click fraud affects 10 percent to 20 percent of today’s search network ads.

Many policing technologies can counter click fraud by analyzing Web traffic logs or surfing behavior. If a page is turned every 1.8 seconds over a period of time, for example, fraud-detecting systems will flag the traffic as suspiciously uniform.
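The uniform-timing heuristic described above can be sketched in a few lines of Python. The function name, the coefficient-of-variation test, and the 0.05 threshold are illustrative assumptions for this example, not any vendor’s actual detection rule:

```python
from statistics import mean, stdev

def is_suspiciously_uniform(timestamps, cv_threshold=0.05):
    """Flag a click stream whose inter-click gaps are too regular.

    A human produces irregular gaps; a bot clicking every ~1.8 seconds
    produces near-identical ones. We flag the stream when the coefficient
    of variation (stdev / mean) of the intervals falls below
    cv_threshold. Threshold and method are illustrative only.
    """
    if len(timestamps) < 3:
        return False  # too few clicks to judge
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    m = mean(intervals)
    if m == 0:
        return True  # simultaneous clicks are suspicious by themselves
    return stdev(intervals) / m < cv_threshold

# A bot clicking every 1.8 seconds vs. an irregular human session:
bot = [i * 1.8 for i in range(10)]
human = [0, 2.2, 9.5, 11.0, 30.4, 33.1, 60.0]
```

Real fraud squads combine many such signals (IP reputation, session behavior, conversion rates) rather than relying on timing alone.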

Covert clicks
Human operations can be more difficult to detect because a wide network of people can click on ads from different computers across many regions, without a steady pattern. According to a report in the India Times, residents are being hired to click paid links from home, with the hope of making between $100 and $200 per month.

In other instances, the source of bogus clicks can be much closer to home.

Joe, the chief executive of an Internet marketing company, enjoys clicking on his rivals’ text ads on Google and Yahoo because his competitor must pay as much as $15 each time he does it. Eventually, such phantom clicks can add up and drain a rival’s budget.

“It’s an entertainment,” said the executive, who asked to keep his name and company anonymous. “Why do you run into a store without dropping a quarter in the meter? You know it’s wrong, but you do it.”

Kevin Lee, chief executive of search marketing firm Did-It, estimates that fraud from such “drive-by” competitive clicks and affiliate scams makes up about 5 percent of the industry’s total sales. Lee concedes that he can only guess at the number, but he does know one thing for sure:

If it gets much higher, he said, “then we should all be getting worried.”

By Stefanie Olsen
Staff Writer, CNET News

How does WordPress count Blog statistics


About the Analytical Counting Math at WordPress

If you try to verify WordPress’s computations using the numbers in these tables, you might get different results.

The logic is explained here.

  • An average is the sum of views divided by the number of days.
  • We exclude days prior to the first recorded view and future days.
  • Today (Sep 24) is excluded from averages because it isn’t over yet.
  • Yearly averages are computed from sums, not an average of monthly averages.
  • Averages are rounded to the nearest integer for display.
  • Gray zeroes are exactly zero. Black zeroes have been rounded down.
  • Percent change is computed from weekly averages before they are rounded.
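The averaging rules above can be sketched in Python. The function name and the daily_views mapping are assumptions made for the example; this is a sketch of the described logic, not WordPress’s actual code:

```python
from datetime import date

def average_daily_views(daily_views, today):
    """Average views per day using the rules listed above: exclude
    today and future days (today isn't over yet), exclude days prior
    to the first recorded view, then divide the view sum by the number
    of remaining days and round for display.

    daily_views maps a datetime.date to that day's view count.
    """
    days = sorted(d for d in daily_views if d < today)
    # Drop leading days prior to the first recorded view.
    while days and daily_views[days[0]] == 0:
        days.pop(0)
    if not days:
        return 0
    total = sum(daily_views[d] for d in days)
    return round(total / len(days))
```

Note that yearly averages, per the rules, would be computed by running this over the whole year’s sums rather than averaging the monthly averages.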

Just a note: It doesn’t count your own visits to your blog.

Generated 2009-09-24 00:38:43 UTC+5

https://sohandhande.wordpress.com

SEO Practices – A List of the Best and Worst Practices for Designing a High-Traffic Website


Here is a checklist of the factors that affect your rankings with Google, Bing, Yahoo! and the other search engines. The list contains positive, negative and neutral factors because all of them exist. Most of the factors in the checklist apply mainly to Google and partially to Bing, Yahoo! and all the other search engines of lesser importance. If you need more information on particular sections of the checklist, you may want to read our SEO tutorial, which gives more detailed explanations of Keywords, Links, Metatags, Visual Extras, etc.

Keywords                                 Description                                                          Effect
1 Keywords in <title> tag This is one of the most important places to have a keyword because what is written inside the <title> tag shows in search results as your page title. The title tag must be short (6 or 7 words at most) and the keyword must be near the beginning. +3
2 Keywords in URL Keywords in URLs help a lot – e.g. – http://domainname.com/seo-services.html, where “SEO services” is the keyword phrase you attempt to rank well for. But if you don’t have the keywords in other parts of the document, don’t rely on having them in the URL. +3
3 Keyword density in document text Another very important factor you need to check. 3-7% for major keywords is best, 1-2% for minor ones. Keyword density of over 10% is suspicious and looks more like keyword stuffing than naturally written text. +3
4 Keywords in anchor text Also very important, especially for the anchor text of inbound links, because if you have the keyword in the anchor text in a link from another site, this is regarded as getting a vote from this site not only about your site in general, but about the keyword in particular. +3
5 Keywords in headings (<H1>, <H2>, etc. tags) One more place where keywords count a lot. But beware that your page has actual text about the particular keyword. +3
6 Keywords in the beginning of a document Also counts, though not as much as anchor text, title tag or headings. However, have in mind that the beginning of a document does not necessarily mean the first paragraph – for instance if you use tables, the first paragraph of text might be in the second half of the table. +2
7 Keywords in <alt> tags Spiders don’t read images but they do read their textual descriptions in the <alt> tag, so if you have images on your page, fill in the <alt> tag with some keywords about them. +2
8 Keywords in metatags Less and less important, especially for Google. Yahoo! and Bing still rely on them, so if you are optimizing for Yahoo! or Bing, fill these tags properly. In any case, filling these tags properly will not hurt, so do it. +1
9 Keyword proximity Keyword proximity measures how close in the text the keywords are. It is best if they are immediately one after the other (e.g. “dog food”), with no other words between them. For instance, if you have “dog” in the first paragraph and “food” in the third paragraph, this also counts but not as much as having the phrase “dog food” without any other words in between. Keyword proximity is applicable for keyword phrases that consist of 2 or more words. +1
10 Keyword phrases In addition to keywords, you can optimize for keyword phrases that consist of several words – e.g. “SEO services”. It is best when the keyword phrases you optimize for are popular ones, so you can get a lot of exact matches of the search string but sometimes it makes sense to optimize for 2 or 3 separate keywords (“SEO” and “services”) than for one phrase that might occasionally get an exact match. +1
11 Secondary keywords Optimizing for secondary keywords can be a golden mine because when everybody else is optimizing for the most popular keywords, there will be less competition (and probably more hits) for pages that are optimized for the minor words. For instance, “real estate new jersey” might have thousand times less hits than “real estate” only but if you are operating in New Jersey, you will get less but considerably better targeted traffic. +1
12 Keyword stemming For English this is not so much of a factor because words that stem from the same root (e.g. dog, dogs, doggy, etc.) are considered related and if you have “dog” on your page, you will get hits for “dogs” and “doggy” as well, but for other languages keywords stemming could be an issue because different words that stem from the same root are considered as not related and you might need to optimize for all of them. +1
13 Synonyms Optimizing for synonyms of the target keywords, in addition to the main keywords. This is good for sites in English, for which search engines are smart enough to use synonyms as well, when ranking sites but for many other languages synonyms are not taken into account, when calculating rankings and relevancy. +1
14 Keyword Mistypes Spelling errors are very frequent and if you know that your target keywords have popular misspellings or alternative spellings (i.e. Christmas and Xmas), you might be tempted to optimize for them. Yes, this might get you some more traffic but having spelling mistakes on your site does not make a good impression, so you’d better don’t do it, or do it only in the metatags. 0
15 Keyword dilution When you are optimizing for an excessive amount of keywords, especially unrelated ones, this will affect the performance of all your keywords and even the major ones will be lost (diluted) in the text. -2
16 Keyword stuffing Any artificially inflated keyword density (10% and over) is keyword stuffing and you risk getting banned from search engines. -3
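As a rough illustration of how keyword density (item 3) is computed, here is a small Python sketch. The tokenization and the function name are assumptions for the example; real crawlers tokenize text far more carefully:

```python
import re

def keyword_density(text, keyword):
    """Percent of the page's words taken up by occurrences of keyword.

    The checklist suggests roughly 3-7% for major keywords; above 10%
    starts to look like keyword stuffing. Plain alphanumeric-run
    tokenization here -- an illustrative heuristic only.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw)
    # Each hit accounts for n words out of the page total.
    return 100.0 * hits * n / len(words)
```

For a multi-word phrase like “dog food”, only exact adjacent matches count, which also illustrates the keyword-proximity idea in item 9.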
Links – internal, inbound, outbound
17 Anchor text of inbound links As discussed in the Keywords section, this is one of the most important factors for good rankings. It is best if you have a keyword in the anchor text but even if you don’t, it is still OK. +3
18 Origin of inbound links Besides the anchor text, it is important if the site that links to you is a reputable one or not. Generally sites with greater Google PR are considered reputable. +3
19 Links from similar sites Having links from similar sites is very, very useful. It indicates that the competition is voting for you and you are popular within your topical community. +3
20 Links from .edu and .gov sites These links are precious because .edu and .gov sites are more reputable than .com, .biz, .info, etc. domains. Additionally, such links are hard to obtain. +3
21 Number of backlinks Generally the more, the better. But the reputation of the sites that link to you is more important than their number. Also important is their anchor text, is there a keyword in it, how old are they, etc. +3
22 Anchor text of internal links This also matters, though not as much as the anchor text of inbound links. +2
23 Around-the-anchor text The text that is immediately before and after the anchor text also matters because it further indicates the relevance of the link – i.e. if the link is artificial or it naturally flows in the text. +2
24 Age of inbound links The older, the better. Getting many new links in a short time suggests buying them. +2
25 Links from directories Great, though it strongly depends on which directories. Being listed in DMOZ, Yahoo Directory and similar directories is a great boost for your ranking but having tons of links from PR0 directories is useless and it can even be regarded as link spamming, if you have hundreds or thousands of such links. +2
26 Number of outgoing links on the page that links to you The fewer, the better for you because this way your link looks more important. +1
27 Named anchors Named anchors (the target place of internal links) are useful for internal navigation but are also useful for SEO because you stress additionally that a particular page, paragraph or text is important. In the code, named anchors look like this: <A href= “#dogs”>Read about dogs</A> and “#dogs” is the named anchor. +1
28 IP address of inbound link Google denies that they discriminate against links that come from the same IP address or C class of addresses, so for Google the IP address can be considered neutral to the weight of inbound links. However, Bing and Yahoo! may discard links from the same IPs or IP classes, so it is always better to get links from different IPs. +1
29 Inbound links from link farms and other suspicious sites This does not affect you in any way, provided that the links are not reciprocal. The idea is that it is beyond your control to define what a link farm links to, so you don’t get penalized when such sites link to you because this is not your fault but in any case you’d better stay away from link farms and similar suspicious sites. 0
30 Many outgoing links Google does not like pages that consist mainly of links, so you’d better keep them under 100 per page. Having many outgoing links does not get you any benefits in terms of ranking and could even make your situation worse. -1
31 Excessive linking, link spamming It is bad for your rankings, when you have many links to/from the same sites (even if it is not a cross- linking scheme or links to bad neighbors) because it suggests link buying or at least spamming. In the best case only some of the links are taken into account for SEO rankings. -1
32 Outbound links to link farms and other suspicious sites Unlike inbound links from link farms and other suspicious sites, outbound links to bad neighbors can drown you. You need periodically to check the status of the sites you link to because sometimes good sites become bad neighbors and vice versa. -3
33 Cross-linking Cross-linking occurs when site A links to site B, site B links to site C and site C links back to site A. This is the simplest example but more complex schemes are possible. Cross-linking looks like disguised reciprocal link trading and is penalized. -3
34 Single pixel links When you have a link that is a pixel or so wide, it is invisible for humans, so nobody will click on it, and it is obvious that this link is an attempt to manipulate search engines. -3
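The cross-linking scheme in item 33 below (site A links to B, B to C, C back to A) is simply a cycle in the link graph, which a depth-first search can surface. This toy sketch, with made-up site names, only illustrates the idea; real engines mine far larger graphs with statistical signals:

```python
def find_cross_link_ring(links, start):
    """Search for a cross-linking ring (A -> B -> C -> ... -> A) of at
    least three sites, as distinct from a simple reciprocal trade.

    links maps a site to the set of sites it links to. Returns one
    such cycle through start as a list, or None.
    """
    path = [start]
    seen = {start}

    def dfs(site):
        for nxt in links.get(site, ()):
            if nxt == start and len(path) >= 3:
                return path + [start]  # closed a ring of 3+ sites
            if nxt not in seen:
                seen.add(nxt)
                path.append(nxt)
                found = dfs(nxt)
                if found:
                    return found
                path.pop()
        return None

    return dfs(start)
```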
Metatags
35 <Description> metatag Metatags are becoming less and less important but if there are metatags that still matter, these are the <description> and <keywords> ones. Use the <Description> metatag to write the description of your site. Besides the fact that metatags still rock on Bing and Yahoo!, the <Description> metatag has one more advantage – it sometimes pops in the description of your site in search results. +1
36 <Keywords> metatag The <Keywords> metatag also matters, though as all metatags it gets almost no attention from Google and some attention from Bing and Yahoo! Keep the metatag reasonably long – 10 to 20 keywords at most. Don’t stuff the <Keywords> tag with keywords that you don’t have on the page, this is bad for your rankings. +1
37 <Language> metatag If your site is language-specific, don’t leave this tag empty. Search engines have more sophisticated ways of determining the language of a page than relying on the <language> metatag, but they still consider it. +1
38 <Refresh> metatag The <Refresh> metatag is one way to redirect visitors from your site to another. Only do it if you have recently migrated your site to a new domain and you need to temporarily redirect visitors. When used for a long time, the <refresh> metatag is regarded as unethical practice and this can hurt your ratings. In any case, redirecting through 301 is much better. -1
Content
39 Unique content Having more content (relevant content, which is different from the content on other sites both in wording and topics) is a real boost for your site’s rankings. +3
40 Frequency of content change Frequent changes are favored. It is great when you constantly add new content but it is not so great when you only make small updates to existing content. +3
41 Keywords font size When a keyword in the document text is in a larger font size in comparison to other on-page text, this makes it more noticeable, so therefore it is more important than the rest of the text. The same applies to headings (<h1>, <h2>, etc.), which generally are in larger font size than the rest of the text. +2
42 Keywords formatting Bold and italic are another way to emphasize important words and phrases. However, use bold, italic and larger font sizes within reason because otherwise you might achieve just the opposite effect. +2
43 Age of document Recent documents (or at least regularly updated ones) are favored. +2
44 File size Generally long pages are not favored, or at least you can achieve better rankings if you have 3 short rather than 1 long page on a given topic, so split long pages into multiple smaller ones. +1
45 Content separation From a marketing point of view, content separation (based on IP, browser type, etc.) might be great, but for SEO it is bad because when you have one URL and differing content, search engines get confused about what the actual content of the page is. -2
46 Poor coding and design Search engines say that they do not want poorly designed and coded sites, though there are hardly sites that are banned because of messy code or ugly images but when the design and/or coding of a site is poor, the site might not be indexable at all, so in this sense poor code and design can harm you a lot. -2
47 Illegal Content Using other people’s copyrighted content without their permission or using content that promotes legal violations can get you kicked out of search engines. -3
48 Invisible text This is a black hat SEO practice and when spiders discover that you have text specially for them but not for humans, don’t be surprised by the penalty. -3
49 Cloaking Cloaking is another illegal technique, which partially involves content separation because spiders see one page (highly-optimized, of course), and everybody else is presented with another version of the same page. -3
50 Doorway pages Creating pages that aim to trick spiders that your site is a highly-relevant one when it is not, is another way to get the kick from search engines. -3
51 Duplicate content When you have the same content on several pages on the site, this will not make your site look larger because the duplicate content penalty kicks in. To a lesser degree duplicate content applies to pages that reside on other sites but obviously these cases are not always banned – i.e. article directories or mirror sites do exist and prosper. -3
Visual Extras and SEO
52 JavaScript If used wisely, it will not hurt. But if your main content is displayed through JavaScript, this makes it more difficult for spiders to follow and if JavaScript code is a mess and spiders can’t follow it, this will definitely hurt your ratings. 0
53 Images in text Having a text-only site is boring, but having many images and no text is an SEO sin. Always provide in the <alt> tag a meaningful description of an image, but don’t stuff it with keywords or irrelevant information. 0
54 Podcasts and videos Podcasts and videos are becoming more and more popular, but as with all non-textual goodies, search engines can’t read them, so if you don’t have a transcript of the podcast or the video, it is as if the podcast or movie is not there, because it will not be indexed by search engines. 0
55 Images instead of text links Using images instead of text links is bad, especially when you don’t fill in the <alt> tag. But even if you fill in the <alt> tag, it is not the same as having a bold, underlined, 16-pt. link, so use images for navigation only if this is really vital for the graphic layout of your site. -1
56 Frames Frames are very, very bad for SEO. Avoid using them unless really necessary. -2
57 Flash Spiders don’t index the content of Flash movies, so if you use Flash on your site, don’t forget to give it an alternative textual description. -2
58 A Flash home page Fortunately this epidemic seems to have come to an end. Having a Flash home page (and sometimes whole sections of your site) with no HTML version is SEO suicide. -3
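The advice in items 53 and 55 about describing images comes down to auditing pages for <img> tags that lack meaningful alt text. Here is a minimal sketch using Python’s standard-library HTML parser; the class and function names are made up for the example:

```python
from html.parser import HTMLParser

class MissingAltAuditor(HTMLParser):
    """Collect <img> tags that lack a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            alt = attr_map.get("alt")
            if not alt or not alt.strip():
                # Record the src of the offending image (or "?" if absent).
                self.missing.append(attr_map.get("src", "?"))

def images_missing_alt(html):
    auditor = MissingAltAuditor()
    auditor.feed(html)
    return auditor.missing
```

Running this over your templates catches both images with no alt attribute at all and images with an empty one.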
Domains, URLs, Web Mastery
59 Keyword-rich URLs and filenames A very important factor, especially for Yahoo! and Bing. +3
60 Site Accessibility Another fundamental issue that is often neglected. If the site (or separate pages) is inaccessible because of broken links, 404 errors, password-protected areas and other similar reasons, then the site simply can’t be indexed. +3
61 Sitemap It is great to have a complete and up-to-date sitemap, spiders love it, no matter if it is a plain old HTML sitemap or the special Google sitemap format. +2
62 Site size Spiders love large sites, so generally it is the bigger, the better. However, big sites become user-unfriendly and difficult to navigate, so sometimes it makes sense to separate a big site into a couple of smaller ones. On the other hand, there are hardly any sites that are penalized because they are 10,000+ pages, so don't split your site into pieces only because it is getting larger and larger. +2
63 Site age Like wine, older sites are respected more. The idea is that an old, established site is more trustworthy (it has been around and is here to stay) than a new site that has just popped up and might soon disappear. +2
64 Site theme It is not only keywords in URLs and on page that matter. The site theme is even more important for good ranking because when the site fits into one theme, this boosts the rankings of all its pages that are related to this theme. +2
65 File Location on Site File location is important and files that are located in the root directory or near it tend to rank better than files that are buried 5 or more levels below. +1
66 Domains versus subdomains, separate domains Having a separate domain is better – i.e. instead of having blablabla.blogspot.com, register a separate blablabla.com domain. +1
67 Top-level domains (TLDs) Not all TLDs are equal. There are TLDs that are better than others. For instance, the most popular TLD – .com – is much better than .ws, .biz, or .info domains but (all equal) nothing beats an old .edu or .org domain. +1
68 Hyphens in URLs Hyphens between the words in an URL increase readability and help with SEO rankings. This applies both to hyphens in domain names and in the rest of the URL. +1
69 URL length Generally doesn't matter, but a very long URL starts to look spammy, so avoid having more than 10 words in the URL (3 or 4 for the domain name itself and 6 or 7 for the rest of the address is acceptable). 0
70 IP address Could matter only for shared hosting or when a site is hosted with a free hosting provider, when the IP or the whole C-class of IP addresses is blacklisted due to spamming or other illegal practices. 0
71 Adsense will boost your ranking Adsense is not related in any way to SEO ranking. Google will definitely not give you a ranking bonus because of hosting Adsense ads. Adsense might boost your income but this has nothing to do with your search rankings. 0
72 Adwords will boost your ranking Similarly to Adsense, Adwords has nothing to do with your search rankings. Adwords will bring more traffic to your site, but this will not affect your rankings in any way. 0
73 Hosting downtime Hosting downtime is directly related to accessibility because if a site is frequently down, it can’t be indexed. But in practice this is a factor only if your hosting provider is really unreliable and has less than 97-98% uptime. -1
74 Dynamic URLs Spiders prefer static URLs, though you will see many dynamic pages on top positions. Long dynamic URLs (over 100 characters) are really bad and in any case you’d better use a tool to rewrite dynamic URLs in something more human- and SEO-friendly. -1
75 Session IDs This is even worse than dynamic URLs. Don’t use session IDs for information that you’d like to be indexed by spiders. -2
76 Bans in robots.txt If indexing of a considerable portion of the site is banned, this is likely to affect the non-banned part as well, because spiders will come less frequently to a "noindex" site. -2
77 Redirects (301 and 302) When not applied properly, redirects can hurt a lot – the target page might not open, or worse – a redirect can be regarded as a black hat technique, when the visitor is immediately taken to a different page. -3
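On an Apache server, for example, both a proper 301 redirect and a rewrite of a dynamic URL into a human- and SEO-friendly one can be sketched in an .htaccess file (the paths and filenames here are hypothetical):

```apache
# Permanent (301) redirect: tells spiders the page has moved for good
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Serve a keyword-rich, static-looking URL from the real dynamic script,
# so visitors and spiders see /articles/dating-advice instead of a query string
RewriteEngine On
RewriteRule ^articles/([a-z0-9-]+)/?$ /article.php?slug=$1 [L]
```

A 301 passes the old page's standing on to the new address; a 302 (temporary) redirect does not, which is why using the right one matters.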

Top 10 SEO Mistakes & Errors – everyone repeats the same ones!


1 Targeting the wrong keywords

This is a mistake many people make and, what is worse, even experienced SEO experts make it. People choose keywords that, in their mind, describe their website, but that average users may simply not search for. For instance, if you have a relationship site, you might discover that "relationship guide" does not work for you, even though it contains the "relationship" keyword, while "dating advice" works like a charm. Choosing the right keywords can make or break your SEO campaign. Even if you are very resourceful, you can't think of all the great keywords on your own, but a good keyword suggestion tool, for instance the Website Keyword Suggestion tool (http://www.webconfs.com/website-keyword-suggestions.php), will help you find keywords that are good for your site.

2 Ignoring the Title tag

Leaving the <title> tag empty is also very common. This is one of the most important places to have a keyword, because not only does it help you in optimization but the text in your <title> tag shows in the search results as your page title.
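A quick sketch of what a filled-in title (together with a meta description, discussed further below) might look like — the wording is just an example:

```html
<head>
  <!-- Shown as the clickable title of your page in search results -->
  <title>Dating Advice – Practical Tips for a First Date</title>
  <!-- Often used as the text snippet shown under the title -->
  <meta name="description" content="Practical, down-to-earth dating advice for the first date and beyond.">
</head>
```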

3 A Flash website without an HTML alternative

Flash might be attractive, but not to search engines and users. If you really insist that your site be Flash-based and you want search engines to love it, provide an HTML version. Here are some more tips for optimizing Flash sites (http://www.webconfs.com/optimizing-flash-sites-article-14.php). Search engines don't like Flash sites for a reason – a spider can't read Flash content and therefore can't index it.

4 JavaScript Menus

Using JavaScript for navigation is not bad as long as you understand that search engines do not read JavaScript and build your web pages accordingly. So if you have JavaScript menus you can't do without, you should consider building a sitemap (or putting the links in a noscript tag) so that all your links will be crawlable.
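The noscript fallback can be sketched like this (the script name and page URLs are made up for the example):

```html
<!-- JavaScript menu for visitors with scripting enabled -->
<script src="menu.js"></script>

<!-- Plain links for spiders (and visitors without JavaScript) -->
<noscript>
  <ul>
    <li><a href="/dating-advice.html">Dating advice</a></li>
    <li><a href="/relationship-guide.html">Relationship guide</a></li>
  </ul>
</noscript>
```

Because the links inside noscript are ordinary HTML anchors, spiders can crawl them even though they never execute the menu script.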

5 Lack of consistency and maintenance

Our friend Rob from Blackwood Productions often encounters clients who believe that once you optimize a site, it is done forever. If you want to be successful, you need to continually optimize your site, keep an eye on the competition and on changes in the ranking algorithms of search engines.

6 Concentrating too much on meta tags

A lot of people seem to think SEO is about getting your meta keywords and description right. In fact, meta tags are becoming (if not already) a thing of the past. You can create your meta keywords and descriptions, but don't expect to rank well because of this alone.

7 Using only Images for Headings

Many people think that an image looks better than text for headings and menus. Yes, an image can make your site look more distinctive, but in terms of SEO, images for headings and menus are a big mistake, because h1, h2, etc. tags and menu links are important SEO items. If you are afraid that your h1, h2, etc. tags look horrible, try modifying them in a stylesheet or consider this approach: http://www.stopdesign.com/articles/replace_text.
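One common variant of the image-replacement approach linked above (the image, sizes and heading text here are made up) keeps a real h1 in the markup for spiders and shows the image only to visitors:

```html
<h1 id="site-heading">Dating Advice</h1>

<style>
  /* The real text stays in the markup, where spiders can read it;
     visitors see the styled background image instead */
  #site-heading {
    background: url(heading.png) no-repeat;
    width: 400px;
    height: 60px;
    text-indent: -9999px;  /* move the text off-screen */
    overflow: hidden;
  }
</style>
```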

8 Ignoring URLs

Many people underestimate how important a good URL is. Dynamic page names are still very frequent, and no keywords in the URL is more a rule than an exception. Yes, it is possible to rank high even without keywords in the URL, but all else being equal, having keywords in the URL (in the domain itself, or in file names that are part of the URL) gives you an additional advantage over your competitors. Keywords in URLs are more important for Bing and Yahoo!, but even with Google their relative weight is high, so there is no excuse for keywordless URLs.
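For instance (both URLs are hypothetical), compare:

```text
http://www.example.com/page.php?id=478        <- tells spiders nothing
http://www.example.com/dating-advice.html     <- keyword-rich and readable
```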

9 Backlink spamming

It is a common delusion that more backlinks are ALWAYS better, and because of this webmasters resort to link farms, forum/newsgroup spam, etc., which ultimately can lead to getting their site banned. In fact, what you need are quality backlinks. Here is some more information on The Importance of Backlinks

10 Lack of keywords in the content

Once you have settled on your keywords, modify your content and put them wherever it makes sense. It is even better to make them bold or highlight them.

Happy SEO, everyone!
