Thursday, 28 August 2014

Steps to increase website traffic


Be regular – update your website to keep traffic coming

Be patient – follow simple steps and keep your website updated
Professional bloggers always keep their blog or website up to date, and the reason is simple: regular updates keep your blog alive in Google's index, so your pages won't slide down the rankings later. That is why most top bloggers recommend updating your blog regularly.

Content Quality

Meaningful, unique articles keep paying off long after you post them. Google's algorithms are built to find unique content and steer visitors to the best results. Copying images or text from other sites will drag your content quality down.

Website index

Getting indexed keeps your pages in Google's search engine results pages. Build a good reputation with the search engines, and submit your post URLs to the major ones so your site gets crawled and indexed.

Social networking sites: direct website traffic from social media websites

Nowadays hundreds of social media sites can be used to share your articles. Social media lets you target exactly the visitors you need, and most businesses use it to find clients worldwide.

Through social media you can impress your clients with an exciting post title. Nowadays thousands of websites, such as Mashable and Thereachest, get huge traffic from social media.

Online forums and Q&A sites

Question-and-answer sites are another good way to drive traffic to your website or blog. There you can ask your readers an important question and give a satisfying answer in reply.
Among the most popular Q&A sites for diverting traffic to your website is Stack Overflow.

Spotlight on keyword

Content quality is king and the keyword is queen. Without this combination you won't get meaningful results in website traffic. WordPress offers very useful keyword tools; with them you can find the perfect keywords for your posts and articles.

- These keywords connect searchers directly with your website or blog.

- A popular keyword sits in the visitor's mind > the visitor searches Google with that keyword > the keyword brings your website up on the Google search results page.

Regular comment

Commenting enhances your website's visibility. By commenting on other blogs' articles you reach people who are genuinely interested in your skills.

- This way you can also build backlinks and increase your PageRank.

Internal links

Internal links weave your pages together: a link on your website refers to another page on the same site. A healthy number of internal links is good for your blog or website.
Benefits include easier crawling by search engines, PageRank flowing between your own pages, and visitors staying on your site longer.

Guest Posting - Website Traffic Improves

Regular, high-quality backlinks come from top sites. Submit your article to another website as a guest post; from there you can draw traffic back to your own site.

Domain Name/ URL selection

An impressive domain name builds a good reputation with visitors and search engines alike, and a meaningful URL creates an implicit connection with your content.

Domain selection is therefore the most important decision and the first step in blogging and website building.

Monday, 25 August 2014

How To Write Articles That Rank In Google's Top 10?

Every blog owner knows that content is king.

If you want your articles to rank on Google, there are certain steps you must take. Contrary to what we’ve been taught by so-called SEO experts, you don’t need to bite your fingers to achieve results.

In fact, a small step you take today can make a huge difference. The web is primarily powered by content, and Google is constantly looking for pages that have the right content for its users. If you provide this content, you will be rewarded with fresh traffic and an increase in PageRank.

The steps required to rank your articles highly on Google top 10 are:

1. Research Your Long Tail Keywords

This is the first approach to take, as it boosts your chances of ranking easily. When you concentrate your effort on long tail keywords (3 – 6 words), you're going to reap the rewards of Google's first page. I've seen new blogs rank highly because their articles focused on long tail keywords. You can't possibly compete with giants in the market like Amazon, Walmart, and several other big shopping sites.

When you take the route of long tail key terms, which these popular websites ignore, you drive targeted traffic that makes for real conversions.

2. Know Your Market Well

You have to determine the actual people you're serving, and your article should be tightly focused on exactly those people. It would be crazy for a writer to target everybody on the internet; I don't think he's going to succeed. Know your market inside and out.

Participate and answer questions on discussion boards. You would be able to discover amazing questions and concerns of your target market. This positions you to write effectively, thereby pleasing Google and reaping the benefits.

3. Write To One Particular Person

Yes, even though thousands of people need to read your content, as a rule of thumb, write to only one person. Address your sentences to one specific reader, because your article is going to be read by one person at a time.

When this is done, your article seems engaging and when that happens, the bounce rate is reduced to the barest minimum.

Before Google ranks your articles, it wants to know the extent to which you've been able to solve problems. How unique and interesting your content is will largely be judged by the amount of time your readers spend on it.

4. Write Headlines That Are Short

Not merely short: make sure your headline is 71 characters or less, because Google displays only the first 71 characters of a title and uses "…" for the rest. Well-optimized content abides by this rule and keeps the headline short and compelling.

If you must write a long headline, make it a sub-heading and use it within the content of the article. It makes a lot of sense because Google reads the entire content.
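As a quick sanity check, the 71-character rule above can be enforced with a few lines of code. The function below is a minimal sketch; the limit constant simply restates the article's figure.

```python
# Sketch: check a headline against the ~71-character display limit
# described above, and preview how Google would truncate it.
GOOGLE_TITLE_LIMIT = 71

def check_headline(headline: str, limit: int = GOOGLE_TITLE_LIMIT) -> str:
    """Return the headline unchanged if it fits, otherwise a truncated
    preview with the trailing ellipsis Google appends."""
    if len(headline) <= limit:
        return headline
    return headline[:limit].rstrip() + "..."

print(check_headline("Short And Sweet SEO Headline"))
```

Run this over your draft titles before publishing to see exactly what searchers would see.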

5. Start With Your Key Phrases

Don't target bare keywords alone: target key phrases, and use them in your opening words when you start writing. Search engine spiders read content the way humans do, from left to right, so when your primary terms are placed early, indexing is faster.

This in turn increases your PageRank on Google without much struggle.

6. Write Quality Content

There is no alternative to writing quality content. It's not all about the keywords and phrases you use, but how unique and original your content is. The articles that make it to the first page of the search engines are those that are not stuffed with key terms.

Avoid spun content; it will only hinder your success and jeopardize your search engine optimization efforts. Take the time to create quality content that people love.

Remember, it's not the search engine that will subscribe to your RSS feed, join your email list, and buy your product; it's people who will. Write for people and Google will follow.

Tuesday, 19 August 2014

Upcoming Trends of Web Marketing in Social Media Platforms

Not so long ago, social media was considered a novelty of the Internet, and any attempt at advertising there was handled by so-called "experts" of social media marketing. Today, things are a bit different. Exponential growth is changing the world as we know it, which in turn is changing the way we think and act. Everything points toward connecting the world through social media, which is a great place to employ different types of marketing.

Tailoring the market to an individual, or to a group with the same interests, has become the key aspect of modern marketing. Since connecting people with the same interests is one of the main characteristics of social media, it lends itself to all the types of modern marketing still to come. The rest of this article elaborates on the impact that social media networks will have on future online marketing.

Web 3.0 Concept
Web 3.0 is a technology we didn't have to wait too long for. Web 2.0 has only just become the standard, and Web 3.0 is already fast approaching; it is estimated to span the years 2013 to 2020. Its development is moving rapidly, and we already know of some solutions that will contribute to speeding up the Web.

Semantic Web
Semantic Web is a term coined by Tim Berners-Lee. It is not a stand-alone type of Web but an extension of the existing one: information is given a clearer, better-defined meaning, which improves cooperation between people and computers.

The Structure of Semantic Web
While the job of HTML (HyperText Markup Language) is presenting data on the Web, the structure of the Semantic Web is divided into three parts:
1. XML (eXtensible Markup Language) - determines the structure of data.
2. RDF (Resource Description Framework) - the central protocol of the Web when it comes to semantic connections. It is based on a W3C (World Wide Web Consortium) standard and defines the semantic relationships between electronic resources.
3. Ontologies - the most important element; ontologies form the pillar of Semantic Web technology.
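To make the RDF layer concrete, here is a small sketch of what a semantic annotation can look like in Turtle syntax. The `example.org` URIs and the author data are made up for illustration; `dc:` (Dublin Core) and `foaf:` are real, widely used vocabularies.

```turtle
@prefix dc:   <http://purl.org/dc/elements/1.1/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .

# Machine-readable statements: this article has a title and a creator,
# and the creator is a person with a name.
<http://example.org/articles/web-3-0-marketing>
    dc:title   "Upcoming Trends of Web Marketing" ;
    dc:creator <http://example.org/people/1> .

<http://example.org/people/1>
    foaf:name "Jane Doe" .
```

Statements like these are what let software, not just people, understand how resources on the Web relate to each other.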

Web 3.0 Marketing
The key factors of Web 3.0 Marketing are new ways of searching for information, ''smarter'' information, and a more approachable Web. Web 3.0 marketing is a convergence of new technologies and fast changes of consumer trends. The Web 3.0 marketing world is home to information that will be available on all devices, from all parts of the globe.

Here are the main components of Web 3.0 marketing:
1. Microblogging - the ability to share your thoughts in a predefined number of characters. People are busy, in a hurry, and want the gist of things, so you need to keep it short to be heard.

2. Virtual worlds - places where users go to interact with each other in a 3D environment. They can host dates, parties, and various events.

3. Personalization - enables users to create personalized web pages. People continuously expect their names to show up on top of various web pages on the Internet. They want personalized emails, as well as suggestions for shopping according to their needs. Since the web is getting smarter, personalization is becoming the standard. 

4. Phone applications - these exist because billions of people in the world use mobile phones, a number much greater than that of PC users. Consumers browse the web and expect to be able to make purchases directly on the phone.

Monday, 31 March 2014

Optimize Images for Search Engines

Optimize your images for search engines:

  • Keywords in the alt text, the text around the image, the image name, and the page title;
  • Preferred image format – jpg;
  • A separate, search-engine-accessible image folder;
  • Image freshness (SEW suggests re-uploading your images to keep them fresh);
  • The image search option enabled in Google Webmaster Tools;
  • A reasonable image file size (see the discussion at WW);
  • A limited number of images per page;
  • A popular and reliable photo-sharing host (e.g. Flickr is reported to help with Yahoo! image optimization).
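A couple of the checklist items above (preferred jpg format, reasonable file size) can be audited automatically. The sketch below assumes a 200 KB ceiling, which is my own illustrative number, not a figure from the checklist:

```python
import os

# Sketch: flag images that break the "preferred format" and
# "reasonable file size" items of the checklist above.
MAX_BYTES = 200 * 1024          # assumed ceiling, adjust to taste
PREFERRED_EXTS = {".jpg", ".jpeg"}

def audit_image(path: str, size_bytes: int) -> list:
    """Return a list of checklist violations for one image."""
    problems = []
    ext = os.path.splitext(path)[1].lower()
    if ext not in PREFERRED_EXTS:
        problems.append(f"{path}: prefer jpg, found '{ext}'")
    if size_bytes > MAX_BYTES:
        problems.append(f"{path}: {size_bytes} bytes exceeds {MAX_BYTES}")
    return problems

print(audit_image("hero-banner.png", 350_000))
```

Looping this over an image folder gives a quick to-do list before the images go live.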

Optimize your images for social media:
  • (Obvious) use really great images to give your readers another reason to spread the word;
  • (To get a thumbnailed Digg submission) stick to jpg format and make sure the images are, or can be resized to, 160×120 or 160×160 pixels (unless you have an image that can be resized that way, Digg will not offer the submitter a thumbnail to go with the post).

Optimize your images for people:
  • People pay more attention to a clear image in which they can see details, so choose high-quality images.
  • Clean, clear faces in an image get more eye fixation, so don't use abstract images too often.
  • Keep images relevant: an image is not the first thing a visitor sees on a web page, but it can compel them to stay. Eye-tracking research shows that on a web page text is seen before images (with a printed article it is the reverse); nevertheless, relevant images make the visitor stay and remember the page through the associations they create.

Sunday, 30 March 2014

Google Started Showing “Yellow Ad icon” in Search Result


Google has been updating its algorithm regularly for the last three or four years. Although the changes can sometimes be painful for webmasters, they are also necessary: as competition increases day by day, the World Wide Web becomes more crowded, so changes are required to ensure users get accurate, authentic information through search engines.

Search engines now work from the general user's point of view and continuously try to offer them the best search results. We have seen lots of changes in the search results interface as well. In the first week of November 2013 I observed a change in the "Ad" icon for inorganic (paid) results.

Yes, Google is now showing a yellow "Ad" icon next to every paid listing on the page, instead of just the colored background it used earlier. I personally welcome this change; it will certainly help users as well as web marketers identify paid results quickly. Good work, Google!

Google Offers Free DoubleClick Webinars for Digital Marketers

Digital marketers using Google's DoubleClick ad platform are being invited to join a series of webinars that will teach new skills for using DoubleClick.

Google's DoubleClick ad platform is looking to teach digital marketers how to use DoubleClick more effectively to help their businesses generate more clicks and more revenue through a series of free webinars being offered through June. The webinars will range from the fundamentals of DoubleClick Campaign Manager (DCM) for rookies, presentations on platform changes from previous versions, and advanced sessions on reporting, using in-stream video and more, according to a March 26 post by Sarah Payne, the DoubleClick Campaign Manager product trainer, on the DoubleClick Advertisers Blog.

Digital marketers with existing DoubleClick accounts can sign up for the events through the training pages on the latest DCM or earlier DoubleClick for Advertisers platforms, wrote Payne. "Whether you're starting to upgrade to DCM or you've already made the move, attend these webinars to better understand and use the product." Sessions covering DCM Fundamentals will be held on April 29 and June 10 through a 2.5 hour webinar that will teach everything from "how third-party ad serving works to the steps needed to set up Floodlight tags and traffic for your first campaign," wrote Payne. The session also touches on remarketing and stresses an introduction to the fundamentals and best practices of DCM.

Sessions on what's changed from the previous DoubleClick for Advertisers 6 (DFA6) platform as users move to the new DCM version will be offered April 2, 16 and 30, as well as on May 14, 28 and June 11 and 25, according to Payne. "Learn what's changed in DoubleClick Campaign Manager. Get a demo of the key differences compared to DFA6. Attend this webinar if you've been using DFA6 and want to know about functionality and workflow differences between DFA and DoubleClick Campaign Manager before you upgrade."

A "campaign trafficking demo" will be offered April 9, May 7 and June 4 to provide a live walk-through of how to create a campaign and assign campaign elements.

Advanced DCM webinars, which are aimed at users familiar with basic reporting and DFA6/DoubleClick Campaign Manager concepts, will include detailed sessions on understanding in-stream video on April 17; using report builder on April 3, May 1 and June 5; and using report attribution on May 15, according to the post.

Webinars will also be offered for digital marketers who are still using DFA6, wrote Payne. The DFA6 webinar offerings include sessions on DFA fundamentals on April 23 and June 4, which will mainly focus on the basics of ad trafficking, as well as sessions on MediaVisor fundamentals on April 22 and June 17.
"Learn the most common uses of MediaVisor, including how to plan and create campaigns, advertisers, and site placements," Payne wrote. "You'll also learn how to send RFPs and IOs, as well as how to traffic placements from MediaVisor to DFA."

Some sessions for both DCM and DFA6 require prerequisite training with the platforms.
Digital marketers who are starting with DoubleClick from scratch can begin with Google's DCM Academy, which provides easy-to-use learning paths that guide them through core online training materials to help get users up to speed on the platform, wrote Payne.

Webinar participants who complete their training sessions and pass exams on the subject materials can get a printable certificate of completion, demonstrating their knowledge of DoubleClick Campaign Manager, she added.

Saturday, 29 March 2014

Let's Exchange Links


- Exchange links with sites that are related by topic. Exchanging links with unrelated sites is ineffective and unpopular.

   - Before exchanging, make sure that your link will be published on a “good” page. This means that the page must have a reasonable PageRank (3-4 or higher is recommended), it must be available for indexing by search engines, the link must be direct, the total number of links on the page must not exceed 50, and so on.

   - Do not create large link directories on your site. The idea of such a directory seems attractive because it gives you an opportunity to exchange links with many sites on various topics. You will have a topic category for each listed site. However, when trying to optimize your site you are looking for link quality rather than quantity and there are some potential pitfalls. No seo aware webmaster will publish a quality link to you if he receives a worthless link from your directory “link farm” in return. Generally, the PageRank of pages from such directories leaves a lot to be desired. In addition, search engines do not like these directories at all. There have even been cases where sites were banned for using such directories.

   - Use a separate page on the site for link exchanges. It must have a reasonable PageRank and it must be indexed by search engines, etc. Do not publish more than 50 links on one page (otherwise search engines may fail to take some of the links into account). This will help you to find other seo aware partners for link exchanges.

   - Search engines try to track reciprocal links. That is why, if possible, you should publish backlinks on a domain/site other than the one you are trying to promote. The best arrangement is to promote one resource while publishing the backlinks to it on a different resource.

    - Exchange links with caution. Webmasters who are not quite honest will often remove your links from their resources after a while. Check your backlinks from time to time.
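The 50-link ceiling recommended twice above is easy to check automatically. This is a minimal sketch using Python's standard-library HTML parser; the sample page markup is made up:

```python
from html.parser import HTMLParser

# Sketch: count the links on a link-exchange page so it stays
# under the recommended 50-link ceiling.
class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def count_links(html: str) -> int:
    parser = LinkCounter()
    parser.feed(html)
    return len(parser.links)

page = '<p><a href="http://example.org/1">one</a> <a href="http://example.org/2">two</a></p>'
n = count_links(page)
print(f"{n} links; {'OK' if n <= 50 else 'too many - split this page'}")
```

The same counter works for checking a prospective partner's page before you agree to an exchange.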

Sunday, 23 March 2014

How To Use Google Analytics & Click Distribution Data To Project Traffic Boosts

Most are well aware of the power and benefits of installing Google Analytics to find out who is coming to a website. Many are even aware that you can see exactly what keywords are being searched for to find that website. There are not many out there however, who are aware of the fact that you can use that data to skyrocket your traffic numbers.

If you follow the steps in this guide, you will see a significant traffic boost. The beauty of this strategy is that it is often limitless (depending on how big the market is for any given website). This is also a strategy that works no matter how much traffic you are currently getting. The only requirement is that your website is getting some sort of traffic from search engines. It works best if you are getting 35+ visitors per day via search engines. The more traffic you currently get, the easier this technique will be to implement.

Preparing for Increased Traffic

There are dozens of different options for bringing a website live these days. If you are serious about your website/blog and increasing the traffic, you have to prepare for achieving that goal. If you are on a low grade hosting plan, be aware of the consequences associated with going that route. You do not need anything extreme, but a simple plan from a good web hosting provider will suffice. That way, you do not have to worry about your site going down or some sort of security issue arising. The traffic is not going to skyrocket over night, but it is better to switch hosting servers when your site is getting less traffic versus more. There is always that risk of a small downtime when you change servers.

A Look Into the Strategy

The idea behind this strategy, is that you will be analyzing your current traffic data, to figure out new pages that should be added to your website and/or blog. There is a lot that can be seen from looking into this data. Sure, it is nice to be able to see the keywords that people are searching for and how many visitors you are getting each day, but what good does that do if you are not utilizing the data you are looking at? Depending on the traffic level of your website, there may be dozens (or even thousands) of keyword phrases that are bringing traffic to your website. Many of these are long-tail keyword phrases.

Taking a look at the list of keywords in question, along with where you currently rank for them, will allow you to target them further. If you are ranking fairly well for many of these keywords by accident, imagine what you could do if you actually tried to rank for them.

The theory behind this strategy is simple. Your job is to find keyword phrases that you already rank for (by accident) and that bring you traffic. The key is to find phrases for which you rank below #1. If you already rank #1 for a keyword, there is no point in applying this strategy to it.

Analyzing Your Keywords

The first thing you must do is login to your Google Analytics account and access your keyword data.
Find your way to the dashboard of the website for which you are attempting this strategy.
Click on Traffic Sources on the left hand side of the page.
This will expand and give you more options. Click on Keywords.
On the bottom right hand corner, you will see an option that says Show Rows. Choose the option to show 500 rows.

You should now have a large list of keyword phrases from which to choose. From there, you can scan through and see what keywords are bringing traffic to your website. Additionally, you will be able to see how many visitors have come from each specific keyword phrase. You want to find a keyword phrase that looks like it is bringing in a few visitors and also was not something you were targeting on purpose.
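The selection step above can be sketched in a few lines. Google Analytics lets you export the keyword report (for example as CSV); the dictionary rows and the 3-visit threshold below are hypothetical stand-ins for that export:

```python
# Sketch: pick out "accidental" ranking keywords from an exported
# keyword report. The sample rows and field names are made up;
# adapt them to your own export.
rows = [
    {"keyword": "best running shoes flat feet", "visits": 14, "avg_position": 6},
    {"keyword": "running shoes", "visits": 2, "avg_position": 38},
    {"keyword": "my brand name", "visits": 120, "avg_position": 1},
]

def find_opportunities(rows, min_visits=3):
    """Keywords already sending traffic but not yet ranked #1."""
    hits = [r for r in rows if r["visits"] >= min_visits and r["avg_position"] > 1]
    return sorted(hits, key=lambda r: r["visits"], reverse=True)

for r in find_opportunities(rows):
    print(r["keyword"], r["visits"], r["avg_position"])
```

Each keyword this surfaces is a candidate for a new, deliberately targeted page or post.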

Friday, 14 March 2014

Using Twitter to Index Your Blog Posts

In the past few months I have found an active twitter account to be a monster for indexing site pages and posts, usually within minutes. I used to pay for some of the link indexing services out there like Linkalicious, Backlinks Indexer, etc., until I discovered the value of using Twitter to do this for free. So here is how I do it:

Step #1: Manually create a Twitter account, adjust the profile page settings, add a profile picture, change your background, etc. Make it look like a real person.

Step #2: Start following some of the popular Twitter users. This can be niche specific or celebrities, it really doesn't matter.

Step #3: Start posting some tweets on a daily basis. I usually post generic tweets like "Love this new iPhone" or "Great! It's raining again; the sun must be on vacation." Do this for about a week while continuing to follow more popular users. Basically, you want to artificially make your Twitter account look popular.

Step #4: Add a Twitter plugin to your site that auto-tweets each new post to Twitter every time you publish. I have used several and they all seem to work about the same. You can also use an online service like Feed Your Blog To Twitter, which monitors your RSS feed and posts updates to Twitter and StatusNet sites as you publish. Instant backlinks!
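The core of what such an auto-tweet plugin does is simple formatting: combine the post title and link, trimming the title so the whole tweet fits the 140-character limit Twitter enforced at the time. A minimal sketch (the sample link is made up):

```python
# Sketch: build a tweet from a post title and its link, trimming the
# title so the message fits Twitter's 140-character limit (circa 2014).
TWEET_LIMIT = 140

def format_tweet(title: str, link: str) -> str:
    room = TWEET_LIMIT - len(link) - 1  # one space before the link
    if len(title) > room:
        title = title[:room - 3].rstrip() + "..."
    return f"{title} {link}"

print(format_tweet("Using Twitter to Index Your Blog Posts", "http://example.org/p/123"))
```

A plugin would call something like this on every publish, then post the result through the Twitter API.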

Step #5: Sit back and watch your posts get indexed in minutes.

For other, off-site backlinks, you could log in manually and post the links into your account, roll your own simple Ubot to log in and post the links at certain intervals, or use one of the indexing programs out there that send links to Twitter, though I haven't used any of those. I usually use Ubot Studio for this.

This method has worked very well for me lately, but the key seems to be the "popularity" of the Twitter account. I tried this method on a brand-new account with no previous posts or followers and the results were pretty disappointing, to say the least. So I would recommend spending a minute or two each day tweeting something for a week or so to build some followers, then blast away.

Thursday, 13 March 2014

Affiliate Sites & Dynamic URLs


One good tip is to prepare a crawler page (or pages) and submit it to the search engines. This page should have no text or content except links to all the important pages you want crawled. When the spider reaches this page it will follow all the links and pull the desired pages into its index. You can also break the main crawler page into several smaller pages if it becomes too large. The crawler will not reject smaller pages, whereas larger pages may be bypassed if the crawler finds them too slow to spider.

You do not have to worry that a search result will surface this "site-map" page and disappoint the visitor. This will not happen: the "site-map" has no searchable content of its own and will not be included in the results, whereas all the other pages will. Large news sites, for example, publish hierarchical sets of crawler pages: the first crawler page lists all the category headlines, those links lead to pages of story headlines, which in turn lead to the news stories themselves.

Another way to handle such dynamic URLs is to rewrite them using a syntax that is accepted by the crawler and understood by the application server as equivalent to the dynamic URL. The Amazon site shows dynamic URLs in such a syntax. If you are using the Apache web server, you can use Apache rewrite rules to enable this conversion.
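With Apache's mod_rewrite the conversion can be declared in a few lines of configuration. This is only a sketch: the path pattern and the `product_id` parameter are hypothetical, so adapt them to your application's real URLs.

```apache
# Sketch (.htaccess): serve a crawler-friendly, static-looking URL
# from the underlying dynamic script.
RewriteEngine On
# /products/123.html  ->  /catalog.php?product_id=123
RewriteRule ^products/([0-9]+)\.html$ catalog.php?product_id=$1 [L]
```

The visitor and the crawler both see the clean `.html` address, while the server quietly runs the dynamic script behind it.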


Tuesday, 11 March 2014

Image SEO

Images are often overlooked as an asset for organic search results and search engine optimization. Images can drive traffic through image search as well as through inclusion in universal search results.

1) Find the right images -

Finding the right kind of image is incredibly important. Great images can add another dimension to an article or page that can encourage people to share the page and create some great backlinks. Research shows that while text is still the first thing seen on the page, the image is what sells the page.

You can also use Google Images to find images for your site, as long as you filter your search by the proper licensing. (It lets you search for Creative Commons and other public licenses.) But you have to be very careful when using images: if you don't have permission to reuse an image, companies and sites can take legal action against you.

The general rule of thumb is this: if the image isn't Creative Commons licensed and you didn't buy or create it, don't post it.

2) Use the keyword(s) in the file name

Just as keywords in post URLs are important for pages, the same is true for images. Using keyword-rich words in your image filename helps search engines determine relevancy. For example, the image above was originally named "iStock_small.jpg", which doesn't add much information about this web page; it has been renamed to "image-optimization.jpg". Of course, most images that are not simply decorative, like the one above, are literal and connected to the content of the page, such as a photo of a product. If the above image were used in an article about eye color, then the file name should reflect that.

Google suggests that you should place your images in one folder on your site, versus placing them in random folders throughout the site. Another suggestion from Google related to file names or URLs of images is to make sure you use common image filetypes such as JPEG, GIF, PNG, and BMP.
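Renaming can be scripted. The helper below is a minimal sketch that turns a keyword phrase into a hyphenated, lowercase file name like the "image-optimization.jpg" example above:

```python
import re

# Sketch: build a keyword-rich image file name from a keyword phrase.
def keyword_filename(keywords: str, ext: str = ".jpg") -> str:
    slug = re.sub(r"[^a-z0-9]+", "-", keywords.lower()).strip("-")
    return slug + ext

print(keyword_filename("Image Optimization"))
```

Pair it with a batch rename over your image folder and every upload gets a descriptive name for free.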

3) Create descriptive alt text

Alt text (the alt attribute) is another way to help search engines determine what your image is about. Unlike traditional web content, search engines can't read the text content of an image. (Search spiders are pretty smart, but as far as I know they haven't developed eyes yet.) As a result, search engines have to rely on captions around the image, alt text, file names, and other surrounding text. Adding descriptive text in the alt attribute helps the search engines determine what the content of the image is.

If an image is used as navigation, i.e. as a link to another page, be sure to use alt text that is meaningful to the content of the page being linked to.
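Pages can be audited for missing alt text with Python's standard-library HTML parser. A minimal sketch (the sample markup is made up):

```python
from html.parser import HTMLParser

# Sketch: list the images on a page that have no alt text,
# so they can be fixed before the page is crawled.
class AltAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):
                self.missing.append(a.get("src", "(no src)"))

def images_missing_alt(html: str) -> list:
    auditor = AltAuditor()
    auditor.feed(html)
    return auditor.missing

print(images_missing_alt('<img src="eye.jpg"><img src="shoe.jpg" alt="blue running shoe">'))
```

Running this across a site's templates catches the decorative and forgotten images alike.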

4) The right anchor text

Anchor text is another important factor in image SEO. If you link to images with text, your anchor text can play a role in how your image ranks for keywords. Use anchor text that describes the image. For example, linking to an image with a generic term like "image" or "photo", or with a file name that doesn't use keywords, doesn't give search engines much meaningful information about what the image is. Linking to an image with keywords is helpful to search engines as well as to people visiting your site.

5) Make sure the image matches the content

The content surrounding the image should be related to everything you've optimized so far: anchor text, image URL, alt text. When these align, it helps search engines confirm that you're not spamming and that the image is relevant and of higher quality.

Friday, 7 March 2014

SEO Software Review


1) Ranking Monitor - Any SEO specialist faces the regular task of checking the positions of his sites in the search engines. You could check these positions manually, but if you have several dozen keywords and 5-7 search engines to monitor, the process becomes a real chore.

   The Ranking Monitor module will do everything automatically. You are able to see information on your site ratings for any keywords and in a variety of search engines. You will also see the dynamics and history of your site positions as well as upward and downward trends in your site position for your specified keywords. The same information is also displayed in a visual form.

2) Link Popularity Checker - This program will automatically poll all available search engines and create a complete, duplicate-free list of inbound links to your resource. For each link, you will see important parameters such as the link text and the PageRank of the referring page. If you have studied this article, you will know how important these parameters are. As well as viewing the overall list of inbound links, you can track how the inbound links change over time.

3) Site Indexation Tool - This useful tool will show you all pages indexed by a particular search engine. It is a must-have tool for anybody who is creating a new web resource. The PageRank value will be displayed for each indexed page.

4) Log Analyzer - All information about your visitors is stored in the log files of your server. The log analyzer module will present this information in convenient and visual reports. Displayed information includes:

   - Originating sites
   - Keywords used
   - Visitors' countries of origin
   - Much more…

5) Page Rank Analyzer - This utility collects a huge amount of competitive information on the list of sites that you specify. For each site it automatically determines parameters such as Google PageRank, the number of inbound links, and the presence of the site in the DMOZ and Yahoo directories. It is an ideal tool for analyzing how competitive a particular query is.

6) HTML Analyzer - This application analyzes the HTML code of a page. It estimates the weight and density of keywords and creates a report on how well the site text is optimized. It is useful when creating your own site and is also a great tool for analyzing your competitors' sites. It can analyze both local HTML pages and online projects.

Thursday, 6 March 2014

Cloaking - Google Guidelines

No Cloaking

Sometimes a webmaster might program the server so that it returns different content to Google than it returns to regular users, often in order to manipulate search engine rankings. This practice is referred to as cloaking because it conceals the actual website and returns distorted web pages to the search engines crawling the site. It can mislead users about what they'll find when they click on a search result. Google strongly disapproves of this practice and may ban any website found guilty of cloaking.

Google Guidelines

Here are some important guidelines to keep in mind.

  1. A website should have a crystal-clear hierarchy and links, and should preferably be easy to navigate.
  2. A site map is required to help users find their way around your site; if the site map has more than 100 links, it is advisable to break it into several pages to avoid clutter.
  3. Come up with essential and precise keywords and make sure that your website features relevant and informative content.
  4. The Google crawler will not recognize text hidden in images, so when describing important names, keywords, or links, stick with plain text.
  5. The TITLE and ALT tags should be descriptive and accurate and the website should have no broken links or incorrect HTML.
  6. Dynamic pages (URLs containing a ‘?’ character) should be kept to a minimum, as not every search engine spider is able to crawl them.
  7. The robots.txt file on your web server should be current and should not block the Googlebot crawler. This file tells crawlers which directories can or cannot be crawled.
  1. When making a site, do not cheat your users, i.e. the people who will browse your website. Do not provide them with irrelevant content or present them with fraudulent schemes.
  2. Do not employ hidden texts or hidden links.
  3. Google frowns upon websites that use cloaking techniques, so it is advisable to avoid them.
  4. Automated queries should not be sent to Google.
  5. Avoid "doorway" pages created just for search engines or other "cookie cutter" approaches such as affiliate programs with hardly any original content.
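To illustrate point 7 of the first list above, here is a minimal sketch of a robots.txt file (the directory name is a hypothetical example) that keeps all crawlers, including Googlebot, out of one directory while leaving the rest of the site crawlable:

```
# Placed at the root of the site, e.g. http://www.example.com/robots.txt
# Allow all crawlers everywhere except the /private/ directory
User-agent: *
Disallow: /private/
```

An empty `Disallow:` line, or no robots.txt file at all, means the whole site may be crawled; the important thing is that this file never blocks Googlebot from the pages you want indexed.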

Step By Step Page Optimization

Onpage Optimization

1) A heading tag that includes a keyword or keyword phrase. A heading tag is bigger and bolder than normal body text, so a search engine places more importance on it because you have emphasized it.

2) Heading sizes range from h1 to h6, with h1 being the largest text. If you learn to use just a little Cascading Style Sheet code you can control the size of your headings. You could set an h1-sized heading to be only slightly larger than your normal text if you choose, and the search engine will still see it as an important heading.
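As a small sketch of that idea (the pixel sizes and heading text are illustrative, not recommendations), a few lines of CSS can shrink an h1 visually while it remains an h1 to search engines:

```html
<style>
  /* Body text and the h1 end up nearly the same size on screen,
     but the h1 tag keeps its semantic weight for search engines. */
  body { font-size: 16px; }
  h1   { font-size: 18px; }
</style>

<h1>Online Education Degrees</h1>
<p>Normal body text about online education goes here.</p>
```

The search engine reads the h1 tag, not the rendered pixel size, which is why this works.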

3) Next would be an introduction that describes your main theme. This would include several of your top keywords and keyword phrases. Repeat your top 1 or 2 keywords several times, and include other keyword search terms too, but make it read in sentences that make sense to your visitors.

4) A second paragraph could be added that gets more specific, using other words related to online education.

5) Next you could put a smaller heading.

6) Then you'd list the links to your pages, ideally with a brief description of each link using keywords and keyword phrases in the text. You also want to have several pages of quality content to link to. Repeat that procedure for all the links that relate to your theme.
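A sketch of such a link list (the page names and URLs here are hypothetical), with keyword-rich anchor text and a brief description after each link:

```html
<h2>Online Education Resources</h2>
<ul>
  <li><a href="/online-degree-programs.html">Online degree programs</a>
      - compare accredited online education degrees.</li>
  <li><a href="/distance-learning-tips.html">Distance learning tips</a>
      - how to succeed in online education courses.</li>
</ul>
```

Each anchor text tells both the visitor and the search engine what the linked page is about, instead of a generic "click here".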

7) Next you might include a closing, keyword-laden paragraph. More is not necessarily better when it comes to keywords, at least after a certain point. Writing "online education" fifty times across your page would probably result in you being caught for trying to cheat. Ideally, somewhere from 3% to 20% of your page text would be keywords. The percentage changes often and is different at each search engine. The 3-20 rule is a general guideline, and you can go higher if it makes sense and isn't redundant.