Monday, 31 March 2014

Optimize Images for Search Engines

Optimize your images for search engines:


  • Keywords in the alt text, the text around the image, the image name, and the page title;
  • Preferred image format – JPG;
  • A separate, search-engine-accessible image folder;
  • Image freshness (SEW suggests re-uploading your images to keep them fresh);
  • Image search enabled in Google Webmaster Tools;
  • Reasonable image file size (see the discussion at WW);
  • Limited number of images per page;
  • Popular and reliable photo-sharing hosting (e.g. Flickr is reported to help in Yahoo! image optimization).
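
To make the on-page items above concrete, here is a minimal HTML sketch; the file name, alt text, and caption are hypothetical examples, not required values:

    <!-- Keyword ("red running shoes") in the file name, the alt text,
         and the text around the image. -->
    <img src="/images/red-running-shoes.jpg"
         alt="Red running shoes, side view">
    <p>These red running shoes are lightweight and breathable.</p>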

Optimize your images for social media:
  • (obvious) use really great images to give your readers another reason to spread the word;
  • (to get a Digg thumbnail on your submission) stick to the JPG format and make sure the images are, or can be resized to, 160×120 or 160×160 pixels (unless you have an image that can be resized that way, Digg will not offer the submitter a thumbnail to go with the post).

Optimize your images for people:
  • People pay more attention to a clear image where they can see details, so choose high-quality images.
  • (Clean, clear) faces in an image attract more eye fixations, so don't use abstract images too often.
  • Keep them relevant: images are not the first thing a visitor sees on a web page, but they can compel him to stay. (According to eye-tracking research, web text comes before images: people looking at a printed article see the images first, but on a web page they see the text first. Nevertheless, relevant images make the visitor stay and remember the page through the associations they create.)

Sunday, 30 March 2014

Google Started Showing a “Yellow Ad Icon” in Search Results

Google Ad Icon

For the last 3-4 years Google has been updating its algorithm regularly. Although these updates can sometimes be painful for webmasters, they are also necessary. As competition increases day by day, the World Wide Web is becoming more crowded, so changes are required so that users can get accurate and authentic information through search engines.

Search engines now work from the perspective of the general user and continuously try to offer them the best search results. We have also seen plenty of changes in the search results interface. In the first week of November 2013 I observed a change in the “Ad” icon for inorganic (paid) results.

Yes, Google is now showing a yellow “Ad” icon adjacent to every inorganic listing on the page, instead of using just a background colour as it did earlier. I personally welcome this change; it will certainly help users as well as web marketers identify paid results quickly. Good work, Google!

Google Offers Free DoubleClick Webinars for Digital Marketers

Digital marketers using Google's DoubleClick ad platform are being invited to join a series of webinars that will teach new skills for using DoubleClick.

Google's DoubleClick ad platform is looking to teach digital marketers how to use DoubleClick more effectively to help their businesses generate more clicks and more revenue through a series of free webinars being offered through June. The webinars will range from the fundamentals of DoubleClick Campaign Manager (DCM) for rookies to presentations on platform changes from previous versions and advanced sessions on reporting, using in-stream video and more, according to a March 26 post by Sarah Payne, the DoubleClick Campaign Manager product trainer, on the DoubleClick Advertisers Blog.

Digital marketers with existing DoubleClick accounts can sign up for the events through the training pages on the latest DCM or earlier DoubleClick for Advertisers platforms, wrote Payne. "Whether you're starting to upgrade to DCM or you've already made the move, attend these webinars to better understand and use the product." Sessions covering DCM Fundamentals will be held on April 29 and June 10 through a 2.5-hour webinar that will teach everything from "how third-party ad serving works to the steps needed to set up Floodlight tags and traffic for your first campaign," wrote Payne. The session also touches on remarketing and serves as an introduction to the fundamentals and best practices of DCM.

Sessions on what's changed from the previous DoubleClick for Advertisers 6 (DFA6) platform as users move to the new DCM version will be offered April 2, 16 and 30, as well as on May 14, 28 and June 11 and 25, according to Payne. "Learn what's changed in DoubleClick Campaign Manager. Get a demo of the key differences compared to DFA6. Attend this webinar if you've been using DFA6 and want to know about functionality and workflow differences between DFA and DoubleClick Campaign Manager before you upgrade."

A "campaign trafficking demo" will be offered April 9, May 7 and June 4 to provide a live walk-through of how to create a campaign and assign campaign elements.

Advanced DCM webinars, which are aimed at users familiar with basic reporting and DFA6/DoubleClick Campaign Manager concepts, will include detailed sessions on understanding in-stream video on April 17; using report builder on April 3, May 1 and June 5; and using report attribution on May 15, according to the post.

Webinars will also be offered for digital marketers who are still using DFA6, wrote Payne. The DFA6 webinar offerings include sessions on DFA fundamentals on April 23 and June 4, which will mainly focus on the basics of ad trafficking, as well as sessions on MediaVisor fundamentals on April 22 and June 17.
"Learn the most common uses of MediaVisor, including how to plan and create campaigns, advertisers, and site placements," Payne wrote. "You'll also learn how to send RFPs and IOs, as well as how to traffic placements from MediaVisor to DFA."

Some sessions for both DCM and DFA6 require prerequisite training with the platforms.
Digital marketers who are starting with DoubleClick from scratch can begin with Google's DCM Academy, which provides easy-to-use learning paths that guide users through core online training materials to get up to speed on the platform, wrote Payne.

Webinar participants who complete their training sessions and pass exams on the subject materials can get a printable certificate of completion, demonstrating their knowledge of DoubleClick Campaign Manager, she added.

Saturday, 29 March 2014

Let's Exchange Links

link exchange

- Exchange links with sites that are related by topic. Exchanging links with unrelated sites is ineffective and unpopular.

   - Before exchanging, make sure that your link will be published on a “good” page. This means that the page must have a reasonable PageRank (3-4 or higher is recommended), it must be available for indexing by search engines, the link must be direct, the total number of links on the page must not exceed 50, and so on.

   - Do not create large link directories on your site. The idea of such a directory seems attractive because it gives you an opportunity to exchange links with many sites on various topics. You will have a topic category for each listed site. However, when trying to optimize your site you are looking for link quality rather than quantity, and there are some potential pitfalls. No SEO-aware webmaster will publish a quality link to you if he receives a worthless link from your directory “link farm” in return. Generally, the PageRank of pages from such directories leaves a lot to be desired. In addition, search engines do not like these directories at all. There have even been cases where sites were banned for using such directories.

   - Use a separate page on the site for link exchanges. It must have a reasonable PageRank, it must be indexed by search engines, etc. Do not publish more than 50 links on one page (otherwise search engines may fail to take some of the links into account). This will help you to find other SEO-aware partners for link exchanges.

   - Search engines try to track mutual links. That is why you should, if possible, publish backlinks on a domain/site other than the one you are trying to promote. The best variant is when you promote the resource site1.com and publish backlinks on the resource site2.com.

    - Exchange links with caution. Webmasters who are not quite honest will often remove your links from their resources after a while. Check your backlinks from time to time.

Sunday, 23 March 2014

How To Use Google Analytics & Click Distribution Data To Project Traffic Boosts

Most people are well aware of the power and benefits of installing Google Analytics to find out who is coming to a website. Many are even aware that you can see exactly what keywords are being searched to find that website. Not many out there, however, are aware that you can use that data to skyrocket your traffic numbers.

If you follow the steps in this guide, you will see a significant traffic boost. The beauty of this strategy is that it is often limitless (depending on how big the market is for any given website). This is also a strategy that works no matter how much traffic you are currently getting. The only requirement is that your website is getting some sort of traffic from search engines. It works best if you are getting 35+ visitors per day via search engines. The more traffic you currently get, the easier this technique will be to implement.

Preparing for Increased Traffic

There are dozens of different options for bringing a website live these days. If you are serious about your website/blog and increasing its traffic, you have to prepare for achieving that goal. If you are on a low-grade hosting plan, be aware of the consequences associated with going that route. You do not need anything extreme; a simple plan from a good web hosting provider will suffice. That way, you do not have to worry about your site going down or some sort of security issue arising. The traffic is not going to skyrocket overnight, but it is better to switch hosting servers while your site is getting less traffic rather than more. There is always the risk of a small downtime when you change servers.

A Look Into the Strategy

The idea behind this strategy is that you will analyze your current traffic data to figure out new pages that should be added to your website and/or blog. There is a lot that can be seen by looking into this data. Sure, it is nice to be able to see the keywords that people are searching for and how many visitors you are getting each day, but what good does that do if you are not utilizing the data you are looking at? Depending on the traffic level of your website, there may be dozens (or even thousands) of keyword phrases bringing traffic to your website. Many of these are long-tail keyword phrases.

Taking a look at the list of keywords in question, along with where you currently rank for those keywords, will allow you to target them further. If you are ranking fairly well for many of these keywords by accident, imagine what you could do if you actually tried to rank for them.

The theory behind this strategy is simple. Your job is to find keyword phrases that you already rank for (by accident) and that bring you traffic. The key is to find phrases for which you rank below #1. If you are already ranking #1 for a keyword, there is no point in applying this strategy to it.

Analyzing Your Keywords

The first thing you must do is log in to your Google Analytics account and access your keyword data:

1) Find your way to the dashboard of the website for which you are attempting this strategy.
2) Click on Traffic Sources on the left-hand side of the page.
3) This will expand and give you more options. Click on Keywords.
4) In the bottom right-hand corner, you will see an option that says Show Rows. Choose the option to show 500 rows.

You should now have a large list of keyword phrases from which to choose. From there, you can scan through and see what keywords are bringing traffic to your website. Additionally, you will be able to see how many visitors have come from each specific keyword phrase. You want to find a keyword phrase that looks like it is bringing in a few visitors and also was not something you were targeting on purpose.
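
If you prefer to script this filtering step, here is a minimal Python sketch. It assumes you have exported the keyword report to a hypothetical keywords.csv file with keyword, visits, and avg_position columns; the file name, column names, and thresholds are my assumptions, not part of Google Analytics itself:

    # Minimal sketch: surface accidental rankings worth targeting on purpose.
    import csv

    with open("keywords.csv", newline="") as f:
        for row in csv.DictReader(f):
            visits = int(row["visits"])
            position = float(row["avg_position"])
            # A few visits, but not already ranked #1: an opportunity.
            if visits >= 3 and position > 1:
                print(f"{row['keyword']}: {visits} visits, "
                      f"position {position:.0f}")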

Friday, 14 March 2014

Using Twitter to Index Your Blog Posts

In the past few months I have found an active Twitter account to be a monster for indexing site pages and posts, usually within minutes. I used to pay for some of the link-indexing services out there like Linkalicious, Backlinks Indexer, etc., until I discovered the value of using Twitter to do this for free. So here is how I do it:

Step #1: Manually create a Twitter account, adjust the profile page settings, add a profile picture, change your background, etc. Make it look like a real person.

Step #2: Start following some of the popular Twitter users. These can be niche-specific accounts or celebrities; it really doesn't matter.

Step #3: Start posting some tweets on a daily basis. I usually post generic tweets like "Love this new iPhone" or "Great! It's raining again; the sun must be on vacation." Do this for about a week while continuing to follow more popular users. Basically, you want to artificially make your Twitter account look popular.

Step #4: Add a Twitter plugin to your site that auto-tweets each new post to Twitter every time you publish. I have used several and they all seem to work about the same. You can also use an online service like twitterfeed.com, which monitors your RSS feed and posts updates to Twitter and StatusNet sites as you publish. Instant backlinks! (A do-it-yourself sketch follows after step #5.)

Step #5: Sit back and watch your posts get indexed in minutes.
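
If you would rather script step #4 yourself than use a plugin, here is a minimal Python sketch of the same idea. It assumes the feedparser and tweepy libraries and placeholder credentials; it does not reflect any particular plugin's internals:

    # Minimal sketch: tweet each blog post that has not been tweeted yet.
    # Requires: pip install feedparser tweepy (plus Twitter API credentials).
    import feedparser
    import tweepy

    auth = tweepy.OAuth1UserHandler(
        "CONSUMER_KEY", "CONSUMER_SECRET",
        "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET",
    )
    api = tweepy.API(auth)

    already_tweeted = set()  # persist this between runs in real use

    for entry in feedparser.parse("https://example.com/feed/").entries:
        if entry.link not in already_tweeted:
            api.update_status(f"{entry.title} {entry.link}")
            already_tweeted.add(entry.link)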

For other, off-site backlinks, you could either log in manually and post the links to your account, or roll your own simple Ubot to log in and post the links at certain intervals. I also think there are a few indexing programs out there that send links to Twitter, but I haven't used any of those. I usually use Ubot Studio for this.

This method has worked very well for me lately, but the key seems to be the "popularity" of the Twitter account. I tried this method on a brand-new account with no previous posts or followers and the results were pretty disappointing, to say the least. So I would recommend spending a minute or two each day tweeting something for a week or so to build some followers, then blast away.

Thursday, 13 March 2014

Affiliate Sites & Dynamic URLs

One good tip is that you should prepare a crawler page (or pages) and submit it to the search engines. This page should have no text or content except for links to all the important pages that you wish to be crawled. When the spider reaches this page it will crawl all the links and pull all the desired pages into its index. You can also break up the main crawler page into several smaller pages if it becomes too large. The crawler will not reject smaller pages, whereas larger pages may be bypassed if the crawler finds them too slow to spider.

You do not have to worry that search results will surface this “site map” page and disappoint the visitor. This will not happen: since the “site map” has no searchable content, it will not be included in the results, while all the other pages will. We found that the site wired.com had published hierarchical sets of crawler pages: the first crawler page lists all the category headlines, those links lead to a set of pages with all the story headlines, which in turn lead to the news stories.
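
For illustration, a crawler page is nothing more exotic than a plain page of links. A minimal sketch, with placeholder URLs:

    <!-- Hypothetical crawler/"site map" page: links only, no other content. -->
    <html>
      <head><title>Site Map</title></head>
      <body>
        <a href="/articles/seo-basics.html">SEO Basics</a>
        <a href="/articles/link-building.html">Link Building</a>
        <a href="/articles/image-seo.html">Image SEO</a>
        <!-- ...one link for every page you want crawled -->
      </body>
    </html>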

Another way to avoid such dynamic URLs is to rewrite them using a syntax that is accepted by the crawler and understood by the application server as equivalent to the dynamic URL. The Amazon site shows dynamic URLs in such a syntax. If you are using the Apache web server, you can use Apache rewrite rules to enable this conversion.
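
A minimal mod_rewrite sketch of the idea, assuming a hypothetical product.php script; the URL pattern is an illustration, not Amazon's actual scheme:

    # Hypothetical .htaccess rule: serve the crawler-friendly URL
    # /products/123.html from the dynamic script /product.php?id=123.
    RewriteEngine On
    RewriteRule ^products/([0-9]+)\.html$ /product.php?id=$1 [L]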


Tuesday, 11 March 2014

Image SEO

Images as an asset for organic search results and search engine optimization are often overlooked.  Images can drive traffic through image search as well as inclusion in universal search results.

1) Find the right images -

Finding the right kind of image is incredibly important. Great images can add another dimension to an article or page that can encourage people to share the page and create some great backlinks. Research shows that while text is still the first thing seen on the page, the image is what sells the page.

You can also use Google Images to find images for your site, as long as you search with the proper licensing. (It lets you filter for Creative Commons and other public licenses.) But you have to be very careful when using images: if you don't have permission to reuse an image, companies and sites can take legal action against you.

The general rule of thumb is this: if the image isn't Creative Commons-licensed and you didn't buy or create it, don't post it.

2) Use the keyword(s) in the file name

Just as keywords in post URLs are important for pages, the same is true for images. Using keyword-rich words in your image file name is important for helping search engines determine relevancy. For example, the image above was originally named “iStock_small.jpg”, which doesn't say much about this web page; it has been renamed to “image-optimization.jpg”. Of course, most images that are not simply decorative like the one above are literal and connected to the content of the page, such as a photo of a product. If the above image were used in an article about eye color, then the file name should reflect that.

Google suggests that you place your images in one folder on your site (e.g. mydomain.com/images) rather than in random folders throughout the site. Another suggestion from Google related to the file names or URLs of images is to use common image file types such as JPEG, GIF, PNG, and BMP.

3) Create descriptive alt text

Alt text is another way search engines determine what your image is about. Unlike traditional web content, search engines can't read the text content of an image. (Search spiders are pretty smart, but as far as I know they haven't developed eyes yet.) As a result, search engines rely on captions around the image, alt text, file names and other surrounding text. Adding descriptive text in the alt attribute helps search engines determine what the content of the image is.

If an image is used as navigation, i.e. as a link to another page, be sure to use alt text that is meaningful to the content of the page being linked to.

4) The right anchor text

Anchor text is another important factor in image SEO. If you decide to link to images with text, your anchor text can play a role in how your image is ranked for keywords. Use descriptive anchor text that describes the image.  For example, linking to an image using a generic term like “image” or “photo” or a file name that doesn’t use keywords doesn’t give search engines much meaningful information on what the image is about. Linking to an image with keywords is helpful to search engines as well as people visiting your site.
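
Putting points 2-4 together, a minimal HTML sketch (the alt text and anchor text are hypothetical):

    <!-- Keyword-rich file name and descriptive alt text. -->
    <img src="/images/image-optimization.jpg"
         alt="Diagram of an image optimization workflow">

    <!-- Descriptive anchor text on a link pointing at the image. -->
    <a href="/images/image-optimization.jpg">image optimization diagram</a>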

5) Make sure the image matches the content

The content surrounding the image should be related to everything you've optimized so far: anchor text, image URL, and alt text. When these align, it helps search engines confirm that you're not spamming and that the image is relevant and of higher quality.

Friday, 7 March 2014

SEO Software Review

SEO Software

1) Ranking Monitor - Any SEO specialist is faced with the regular task of checking the positions of his sites in the search engines. You could check these positions manually, but if you have several dozen keywords and 5-7 search engines to monitor, the process becomes a real chore.

   The Ranking Monitor module does everything automatically. You can see information on your site's rankings for any keywords in a variety of search engines. You will also see the dynamics and history of your site's positions, as well as upward and downward trends in your positions for your specified keywords. The same information is also displayed in a visual form.

2) Link Popularity Checker - This program automatically polls all available search engines and creates a complete, duplicate-free list of inbound links to your resource. For each link, you will see important parameters such as the link text and the PageRank of the referring page. If you have studied this article, you will know how important these parameters are. As well as viewing the overall list of inbound links, you can track how the inbound links change over time.

3) Site Indexation Tool - This useful tool will show you all pages indexed by a particular search engine. It is a must-have tool for anybody who is creating a new web resource. The PageRank value will be displayed for each indexed page.

4) Log Analyzer - All information about your visitors is stored in your server's log files. The Log Analyzer module presents this information in convenient, visual reports (a small sketch of the idea follows the list). Displayed information includes:

   - Originating sites;
   - Keywords used;
   - Which countries visitors come from;
   - Much more…
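
Here is the promised sketch: a few lines of Python that count referring sites in an Apache combined-format access log. The log file name is a placeholder, and this is only a taste of what a full log analyzer does:

    # Minimal sketch: count referring sites in an Apache "combined" log,
    # where each line ends with: ... "referrer" "user-agent"
    import re
    from collections import Counter
    from urllib.parse import urlparse

    referrers = Counter()
    with open("access.log") as log:
        for line in log:
            match = re.search(r'"([^"]*)" "[^"]*"$', line.strip())
            if match and match.group(1) not in ("", "-"):
                referrers[urlparse(match.group(1)).netloc] += 1

    for site, hits in referrers.most_common(10):
        print(site, hits)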

5) Page Rank Analyzer - This utility collects a huge amount of competitive information on the list of sites that you specify. For each site it automatically determines parameters such as Google PageRank, the number of inbound links and the presence of each site in the DMOZ and Yahoo directories. It is an ideal tool for analyzing the competition rate of a particular query.

6) HTML Analyzer - This application analyzes the HTML code of a page. It estimates the weight and density of keywords and creates a report on the correct optimization of the site text. It is useful during the creation of your own site and is also a great tool for analyzing your competitors' sites. It allows you to analyze both local HTML pages and online projects.
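
For illustration, keyword density is just occurrences divided by total words. A minimal Python sketch of the calculation (single-word keywords only; the sample text is made up):

    # Minimal sketch: keyword density = keyword occurrences / total words.
    def keyword_density(text: str, keyword: str) -> float:
        words = text.lower().split()
        return words.count(keyword.lower()) / len(words) if words else 0.0

    sample = "online education makes education available to everyone online"
    print(f"{keyword_density(sample, 'education'):.1%}")  # 25.0%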

Thursday, 6 March 2014

Cloaking - Google Guidelines

No Cloaking

Sometimes a webmaster might program the server in such a way that it returns different content to Google than it returns to regular users, often in order to manipulate search engine rankings. This practice is referred to as cloaking, as it conceals the actual website and returns distorted web pages to the search engines crawling the site. It can mislead users about what they'll find when they click on a search result. Google highly disapproves of any such practice and may ban a website found guilty of cloaking.

Google Guidelines


Here are some important tips that will help you stay within Google's guidelines.

Do’s
  1. A website should have a crystal-clear hierarchy and links, and should preferably be easy to navigate.
  2. A site map is needed to help users find their way around your site; if the site map has more than 100 links, it is advisable to break it into several pages to avoid clutter.
  3. Come up with essential, precise keywords and make sure that your website features relevant and informative content.
  4. The Google crawler does not recognize text hidden in images, so when describing important names, keywords or links, stick with plain text.
  5. The TITLE and ALT tags should be descriptive and accurate, and the website should have no broken links or incorrect HTML.
  6. Dynamic pages (URLs containing a ‘?’ character) should be kept to a minimum, as not every search engine spider can crawl them.
  7. The robots.txt file on your web server should be current and should not block the Googlebot crawler. This file tells crawlers which directories can or cannot be crawled (see the sketch after the lists below).
Don'ts
  1. When making a site, do not cheat your users, i.e. the people who will surf your website. Do not provide them with irrelevant content or present them with fraudulent schemes.
  2. Do not employ hidden text or hidden links.
  3. Google frowns upon websites that use cloaking techniques, so it is advisable to avoid them.
  4. Do not send automated queries to Google.
  5. Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with hardly any original content.
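
To illustrate point 7 of the do's, a minimal robots.txt sketch; the directory name is hypothetical:

    # Hypothetical robots.txt: let all crawlers in, except for one directory.
    User-agent: *
    Disallow: /private/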

Step By Step Page Optimization

Onpage Optimization

1) A heading tag that includes your keyword(s) or keyword phrases. A heading tag is bigger and bolder than normal body text, so a search engine places more importance on it because you have emphasized it.

2) Heading sizes range from h1 to h6, with h1 being the largest text. If you learn to use just a little Cascading Style Sheet (CSS) code you can control the size of your headings. You could set an h1-sized heading to be only slightly larger than your normal text if you choose, and the search engine will still see it as an important heading (see the sketch after this list).

3) Next would be an introduction that describes your main theme. This should include several of your top keywords and keyword phrases. Repeat your top 1 or 2 keywords several times and include other keyword search terms too, but make it read in sentences that make sense to your visitors.

4) A second paragraph could be added that gets more specific, using other words related to your theme (in this example, online education).

5) Next you could put a smaller heading.

6) Then you'd list the links to your pages, ideally with a brief description of each link that uses keywords and keyword phrases in the text. You also want to have several pages of quality content to link to. Repeat that procedure for all the links that relate to your theme.

7) Next you might include a closing, keyword-laden paragraph. More is not necessarily better when it comes to keywords, at least after a certain point. Writing "online education" fifty times across your page would probably get you caught for trying to cheat. Ideally, somewhere from 3% to 20% of your page text would be keywords. The percentage changes often and is different at each search engine. The 3-20 rule is a general guideline, and you can go higher if it makes sense and isn't redundant.
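
Here is the sketch promised in point 2: a minimal CSS example that keeps an h1 visually modest while leaving its importance to search engines intact (the pixel sizes are arbitrary):

    /* Body text at 16px; the h1 heading is only slightly larger. */
    body { font-size: 16px; }
    h1   { font-size: 18px; font-weight: bold; }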

Wednesday, 5 March 2014

Create a Custom Web 2.0 Link Wheel

Basically, a linkwheel is a set of SIX Web 2.0 properties interlinked in a “wheel” structure, all promoting a page on the web which is the focus of that particular “wheel”.


Let me explain, or well, let this image show you.

Link Wheel

Your “focus hub” could be ANY page you wish to promote: for example, a page on your website, or another Web 2.0 property.



To expand, you could build a linkwheel like this around a page, and then build a new linkwheel around each of the properties that are in that linkwheel.


For example, in the above image, you would continue by building another linkwheel with Weebly, Squidoo, etc. as the Focus Hub.

Use of the linkwheel strategy can have some serious Google-domination implications. For keywords with lower competition you could easily take over the front page of Google.

Web 2.0 Properties

Alright, so the six above are just a few of the different places you can post your content to.

Basically – you can post your content to ANY site which does NOT make your outgoing links nofollow.


Some of the popular sites, such as Zimbio for example, put a “nofollow” tag on the links you place there, which pretty much makes the link near-useless in terms of ranking benefits.

You can check if links are “nofollow” with Aaron Wall's SEO For Firefox plugin, for the Firefox browser.


Here: http://tools.seobook.com/firefox/seo-for-firefox.html

And it's free!
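
If you prefer to script the check, here is a minimal Python sketch using the requests and BeautifulSoup libraries; both are my assumptions and have nothing to do with the plugin itself:

    # Minimal sketch: list the nofollow links on a page.
    # Requires: pip install requests beautifulsoup4
    import requests
    from bs4 import BeautifulSoup

    page = requests.get("https://example.com/")
    soup = BeautifulSoup(page.text, "html.parser")

    for a in soup.find_all("a", href=True):
        # BeautifulSoup returns rel as a list of values (or None).
        if "nofollow" in (a.get("rel") or []):
            print("nofollow:", a["href"])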

Anyway, below you'll find a list of some of the other Web 2.0 properties you can and should use.


Web 2.0 sites:

http://squidoo.com
http://blogger.com
http://wordpress.com
http://quizilla.com
http://livejournal.com
http://weebly.com
http://blog.com
http://jimdo.com
http://tumblr.com
http://blinkweb.com
http://synthasite.com
http://blogsome.com


Again, those are just a handful for you to start off with.
So best of luck! Create a wheel for your site like the one above, and you should see some better news about your site in Google within a few days...

Link Building is the Key Element Towards Online Success

Search engine optimization is the process of taking steps with your website to make it highly ranked by the search engines for your desired search terms.

The key element in getting a higher page rank is link building. Building high-quality links to your website that appear natural to search engines is the single most important step towards increasing your page rank.
Here are two main reasons:


1) Links get your website indexed by search engines so they begin ranking your pages for target keywords. The higher you rank in search engines, the more visitors you'll get.


2) Links are used by search engines to determine how relevant your web page is to search terms. By building links from relevant websites, you ensure that your web page gets indexed and ranked by search engines accordingly.

Typically a link consists of two parts:

1) URL (Uniform Resource Locator). This is the web address of the site your link points to.

2) Anchor text. This is the visible text of the link.

Having keywords related to your site as the anchor text is one of the key elements of link building and of getting a high page rank.
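
For example, a minimal sketch of a keyword-rich link; the URL and anchor text are hypothetical:

    <!-- The anchor text "blue widgets" tells search engines
         what the target page is about. -->
    <a href="https://example.com/blue-widgets">blue widgets</a>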

Page Rank Based on Popularity


The web search technology offered by Google is often the technology of choice for the world's leading portals and websites. It has also benefited advertisers with its unique advertising program, which does not hamper the web surfing experience of users but still brings revenue to the advertisers.

When you search for a particular keyword or a phrase, most of the search engines return a list of page in order of the number of times the keyword or phrase appears on the website. Google web search technology involves the use of its indigenously designed Page Rank Technology and hypertext-matching analysis which makes several instantaneous calculations undertaken without any human intervention. Google’s structural design also expands simultaneously as the internet expands.