Thursday 29 December 2011

What is SEO (On Page & Off Page SEO)


 

What is Search Engine Optimization?

 

SEO is the technique of improving a web site's ranking in search engines. SEO thus helps you get traffic from search engines and other web sites. If a website has more traffic, search engines will recrawl it much more quickly than other websites.

Types of SEO:

1. On-page SEO (30%)

2. Off-page SEO (70%)

1. On-page SEO: On-page SEO is the process of optimizing the content of your website. This includes the text, images, and links on your website. Anything uploaded to your site's domain is considered on-page.

·         Keyword research

·         Add a title tag (sketched below)

·         Add meta description tags on every website page

·         Add keywords

·         Add a favicon

·         Register the site with every search engine's webmaster tools (Google, Yahoo, MSN, Alexa)

·         Add robots.txt and humans.txt files

·         Add alt tags to all images, and add titles to all hyperlinks and paragraphs

·         Add an .htaccess file

·         Add an XML sitemap

·         Check page and file compression status

·         Set up analytics tools
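Several of the items above live in the page's <head>. A minimal sketch (the title, description, and file names are placeholders):

<head>
<title>Blue Widgets | Example Store</title>
<meta name="description" content="Hand-made blue widgets, shipped worldwide." />
<meta name="keywords" content="blue widgets, buy widgets" />
<link rel="shortcut icon" href="/favicon.ico" />
</head>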

 

Most important points in on-page SEO

1. Make URLs SEO-friendly with the help of an .htaccess file (see the sketch after this list).
2. Background repeat images should be 1px, and use .GIF files in Flash.
3. The title should be up to 66 characters, and the description should be 150-155 characters for Google and 145-150 characters for Yahoo.
4. The keywords should be up to 250 characters including spaces, with a maximum of 20 keywords used on one website.
5. Keep image and Flash sizes around 50 KB, and page size should not be more than 25 KB.
6. Use a favicon; make it 16x16 px with an .ico plugin.
7. A URL length of less than 1,000 characters is good, and less than 100 is better.
8. Use up to 4 keywords in the page title separated by "|", 7 words at most, and a maximum of 66 characters in one title.
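Point 1 relies on URL rewriting. A minimal sketch, assuming Apache with mod_rewrite enabled (product.php and the URL pattern are hypothetical):

RewriteEngine On
# Map the friendly URL /products/blue-widget to the real dynamic script
RewriteRule ^products/([a-z0-9-]+)$ product.php?slug=$1 [L]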

2. Off-page SEO: Off-page SEO, or off-page search engine optimization, is doing things off-site to improve your site's search engine rankings. The only thing you can do off-site to increase your rankings is build up more links. More links will generally lead to better Google PageRank and better search engine rankings.

Work in off page

·         Search engine submission

·         Directory submission

·         Social bookmarking submission

·         Article submission

·         Blog generation & blog commenting

·         Link wheel

·         Classified ads

·         Press release

·         Social Media Monitoring

Most important points in off-page SEO

1. To get better results, submit to 120-200 high-PR dofollow directories per domain per day.

2. To get better backlink results, submit to 30 bookmarking sites per day for one domain.

 

 

Different ways of SEO:

1. White hat SEO     2. Black hat SEO     3. Gray hat SEO


White hat SEO: Also known as natural, organic, or ethical search engine optimization, white hat SEO is the legitimate use of keyword-focused copy and tags, crawler-friendly site architecture, search engine submissions, and a quality backlink network to improve a site's position.
Black hat SEO: Black hat SEO is the term used for unethical or deceptive optimization techniques. This includes spam, cloaking, or violating search engine rules in any way. If a search engine discovers a site engaging in black hat SEO, it will remove that site from its index.
Gray hat SEO: Gray hat SEO refers to search engine optimization strategies that fall in between black hat SEO and white hat SEO. Gray hat techniques can be legitimate in some cases and illegitimate in others. Such techniques include doorway pages, gateway pages, cloaking, and duplicate content. It is a soft form of black hat SEO, or we can say it is the tricky part of SEO.


Monday 26 December 2011

Spring Cleaning out of Season

Announcement by Google
• Aardvark: Aardvark was a start-up we acquired in 2010. An experiment in a new kind of social search, it helped people answer each other’s questions. While Aardvark will be closing, we’ll continue to work on tools that enable people to connect and discover richer knowledge about the world.

• Desktop: In the last few years, there’s been a huge shift from local to cloud-based storage and computing, as well as the integration of search and gadget functionality into most modern operating systems. People now have instant access to their data, whether online or offline. As this was the goal of Google Desktop, the product will be discontinued on September 14, including all the associated APIs, services, plugins, gadgets and support.
• Fast Flip: Fast Flip was started to help pioneer news content browsing and reading experiences for the web and mobile devices. For the past two years, in collaboration with publishers, the Fast Flip experiment has fueled a new approach to faster, richer content display on the web. This approach will live on in our other display and delivery tools.
• Google Maps API for Flash: The Google Maps API for Flash was launched to provide ActionScript developers a way to integrate Google Maps into their applications. Although we’re deprecating the API, we’ll keep supporting existing Google Maps API Premier customers using the Google Maps API for Flash and we’ll focus our attention on the JavaScript Maps API v3 going forward.
• Google Pack: Due to the rapidly decreasing demand for downloadable software in favor of web apps, we will discontinue Google Pack today. People will still be able to access Google’s and our partners’ software quickly and easily through direct links on the Google Pack website.
• Google Web Security: Google Web Security came to Google as part of the Postini acquisition in 2007, and since then we've integrated much of the web security functionality directly into existing Google products, such as safe browsing in Chrome. Although we will discontinue new sales of Google Web Security, we’ll continue to support our existing customers.
• Image Labeler: We began Google Image Labeler as a fun game to help people explore and label the images on the web. Although it will be discontinued, a wide variety of online games from Google are still available.
• Notebook: Google Notebook enabled people to combine clipped URLs from the web and free-form notes into documents they could share and publish. We’ll be shutting down Google Notebook in the coming months, but we’ll automatically export all notebook data to Google Docs.
• Sidewiki: Over the past few years, we’ve seen extraordinary innovation in terms of making the web collaborative. So we’ve decided to discontinue Sidewiki and focus instead on our broader social initiatives. Sidewiki authors will be given more details about this closure in the weeks ahead, and they’ll have a number of months to download their content.
• Subscribed Links: Subscribed Links enabled developers to create specialized search results that were added to the normal Google search results on relevant queries for subscribed users. Although we'll be discontinuing Subscribed Links, developers will be able to access and download their data until September 15, at which point subscribed links will no longer appear in people's search results.
• Code Search, which was designed to help people search for open source code all over the web, will be shut down along with the Code Search API on January 15, 2012.
• In a few weeks we’ll shut down Google Buzz and the Buzz API, and focus instead on Google+. While people obviously won't be able to create new posts after that, they will be able to view their existing content on their Google Profile, and download it using Google Takeout.
• Jaiku, a product we acquired in 2007 that let users send updates to friends, will shut down on January 15, 2012. We’ll be working to enable users to export their data from Jaiku.
• Several years ago, we gave people the ability to interact socially on iGoogle. With our new focus on Google+, we will remove iGoogle's social features on January 15, 2012. iGoogle itself, and non-social iGoogle applications, will stay as they are.
• The University Research Program for Google Search, which provides API access to our search results for a small number of approved academic researchers, will close on January 15, 2012.
• Google Bookmarks Lists—This is an experimental feature for sharing bookmarks and collaborating with friends, which we’re going to end on December 19, 2011. All bookmarks within Lists will be retained and labeled for easier identification, while the rest of Google Bookmarks will function as usual. As Lists was an English-only feature, non-English languages will be unaffected.
• Google Friend Connect—Friend Connect allows webmasters to add social features to their sites by embedding a few snippets of code. We're retiring the service for all non-Blogger sites on March 1, 2012. We encourage affected sites to create a Google+ page and place a Google+ badge on their site so they can bring their community of followers to Google+ and use new features like Circles and Hangouts to keep in touch.
• Google Gears—In March we said goodbye to the Gears browser extension for creating offline web applications and stopped supporting new browsers. On December 1, 2011, Gears-based Gmail and Calendar offline will stop working across all browsers, and later in December Gears will no longer be available for download. This is part of our effort to help incorporate offline capabilities into HTML5, and we’ve made a lot of progress. For example, you can access Gmail, Calendar and Docs offline in Chrome.
• Google Search Timeline—We’re removing this graph of historical results for a query. Users will be able to restrict any search to particular time periods using the refinement tools on the left-hand side of the search page. Additionally, users who wish to see graphs with historical trends for a web search can use google.com/trends or google.com/insights/search/ for data since 2004. For more historical data, the "ngram viewer" in Google Books offers similar information.
• Google Wave—We announced that we’d stopped development on Google Wave over a year ago. But as of January 31, 2012, Wave will become read-only and you won’t be able to create new ones. On April 30 we will turn it off completely. You’ll be able to continue exporting individual waves using the existing PDF export feature until the Google Wave service is turned off. If you’d like to continue using this technology, there are a number of open-source projects, including Apache Wave and Walkaround.
• Knol—We launched Knol in 2007 to help improve web content by enabling experts to collaborate on in-depth articles. In order to continue this work, we’ve been working with Solvitor and Crowd Favorite to create Annotum, an open-source scholarly authoring and publishing platform based on WordPress. Knol will work as usual until April 30, 2012, and you can download your knols to a file and/or migrate them to WordPress.com. From May 1 through October 1, 2012, knols will no longer be viewable, but can be downloaded and exported. After that time, Knol content will no longer be accessible.
• Renewable Energy Cheaper than Coal - This initiative was developed as an effort to drive down the cost of renewable energy, with an RE<C engineering team focused on researching improvements to solar power technology. We published our results to help others in the field continue to advance the state of power tower technology, and we’ve closed our efforts. We will continue our work to generate cleaner, more efficient energy—including our on-campus efforts, procuring renewable energy for our data centers, making our data centers even more efficient and investing more than $850 million in renewable energy technologies.

Monday 19 December 2011

Google Removes Author: Search From Google News


navjot singh
You can no longer search for articles from specific authors in Google News.

As Barry Schwartz reported this morning on Search Engine Roundtable, using the author: firstname lastname command at Google News brings up no results now, and Google has disabled it on purpose. If you think it has something to do with the rel=author movement, it seems that you’re correct. Here’s what a Google employee named Erik explained in the Google News help forum:



The author: search operator is no longer available. For author-specific Google News content, I would recommend use of the Authorship capabilities in Google News, introduced last month. Integration with Google+ circles means easier following and engagement between authors and readers.




The main problem here, as Barry points out on SER, is that rel=author markup rarely seems to show inside of Google News search results.

Google Adds Author Stats To Webmaster Tools

Google has introduced a new report in Google Webmaster Tools named “Author Stats.”

Author Stats shows you how often your content is showing up on Google search results pages. The report appears under the "Labs" section in Google Webmaster Tools. It shows the impressions and clicks of the stories found in Google, and shows up when you associate your content with your Google Profile.




Google said if you have issues with it, you can email them at authorship-pilot@google.com.

Sunday 11 December 2011

Web page with <strong> elements with jQuery


Use a selector with the jQuery html() and text() functions to change the HTML code or text in all matching elements on a page. Follow these steps to change the text in all the <strong> elements on a page (the tags below restore the markup stripped from the original listing):


<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>My Test Page</title>
<script type="text/javascript" src="jquery.js"></script>
<script type="text/javascript">
$(document).ready(function() {
    // Change the text of every <strong> element on the page.
    $('strong').text('Changed!');
});
</script>
</head>
<body>
<div>
<strong>some name</strong>
<p>Some text</p>
</div>
<div>
<strong>another name</strong>
<p>More text</p>
</div>
<div>
<strong>another name</strong>
<p>Even more text</p>
</div>
<div>
<strong>your name</strong>
<p>Last bit of text</p>
</div>
</body>
</html>


Friday 9 December 2011

Parents and Children Selectors with jQuery


The following code shows two <div> elements, each with the same content inside (tags reconstructed from the original listing):

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>My Test Page</title>
<script type="text/javascript" src="jquery.js"></script>
</head>
<body>
<div>
<strong>your name goes here</strong>
<p>text here</p>
<div>
<strong>another name goes here</strong>
<p>More text here</p>
</div>
</div>
<div>
<strong>more name</strong>
<p>more text</p>
<div>
<strong>another name</strong>
<p>More text</p>
</div>
</div>
</body>
</html>

To select elements based on their parents or children, try these selectors:


first-child: Selects the first child element. The following code selects the first child of the first <div> and changes the text of the selected element:

$('div:first-child').text('Change me.');


last-child: Selects the last child element. The following code selects the last child of the second <div> and changes the text of the selected element:

$('div:last-child').text('Change me.');


child: Selects the child elements of a parent element. This code changes the text of every <strong> element that is a child of a <div> element:

$('div > strong').text('Change me.');


You can find the complete list of selectors at http://api.jquery.com/category/selectors/.





Thursday 8 December 2011

Changing Text Content with jQuery

Sometimes you don't want the actual HTML code in an element; you want only the text. To do so, replace the html() function with the text() function. In the preceding example, you swapped the HTML code. If you want to swap only the text and not the HTML code, use this code (tags reconstructed from the original listing):


<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>My Test Page</title>
<script type="text/javascript" src="jquery.js"></script>
<script type="text/javascript">
$(document).ready(function() {
    // Swap the text (not the HTML) of the two elements.
    var strongText = $('strong').text();
    $('strong').text($('p').text());
    $('p').text(strongText);
});
</script>
</head>
<body>
<strong>This is the text in the STRONG element.</strong>
<p>This is the text in the P element.</p>
</body>
</html>




Tuesday 6 December 2011

HTML Elements, Attributes, and Positions on jQuery

HTML Elements, Attributes, and Positions:
Following are some form and attribute selectors and examples of their use:


:radio: Selects all elements with the type attribute set to radio. The following code displays the number of radio buttons on the page in an alert box:

alert($(':radio').length);


:checkbox: Selects all elements with the type attribute set to checkbox. The following code sets the checked attribute to true for all check boxes:

$(':checkbox').attr({checked: true});


[attribute]: Selects all elements with a specific attribute. The following code displays the number of <img> elements with a height attribute:

alert($('img[height]').length);


[attribute=value]: Selects all elements with a particular attribute set to a specific value. The following code displays the number of elements with a class attribute set to myclass:

alert($('[class=myclass]').length);


[attribute!=value]: Selects all elements with a particular attribute not set to a specific value. The following code displays the number of elements with a class attribute that isn't myclass. Elements with no class attribute are also selected:

alert($('[class!=myclass]').length);



Sunday 4 December 2011

Html elements

A few of the most common HTML elements you should know:





<html>: Tells the Web browser that everything inside the tags should be considered a Web page.

<head>: Contains information that controls how the page is displayed. Elements responsible for JavaScript and CSS code and calls to other files are generally placed between these tags.

<title>: Contains the title of the Web page, displayed on the title bar at the top of the browser.

<body>: Holds all the content of the page.

<style>: Controls the appearance and behavior of elements on your Web page.

<script>: Makes JavaScript and other specified code available, either by calling a file or code placed between these tags. jQuery is included on the page with this tag.

<b>: Boldfaces any text within the tag.

<h1> through <h6>: Create header text.

<div>: Creates a container of content.

<p>: Creates a paragraph.

<a>: Creates a hyperlink.

<img />: Displays an image. Note that this tag doesn't have a matching end tag, so a slash character is used inside the tag to denote the end of the tag.

<form>: Creates a Web form that can send user-submitted information to another Web page or code that can process this information.

<input>: Creates a form element, such as a radio button, text input box, or a Submit button. Used as a child element inside <form>.

<br />: Inserts a line break. No matching end tag is needed.

<table>: Creates a table, along with child tags such as <tr> (table row) and <td> (table cell).
You can find more HTML elements listed at www.w3.org/TR/REC-html40/index/elements.html.
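To see how these elements fit together, here is a minimal page using most of them (the file names and text are placeholders):

<html>
<head>
<title>My Test Page</title>
</head>
<body>
<h1>A header</h1>
<div>
<p>A paragraph with <b>bold text</b> and a <a href="http://www.example.com/">hyperlink</a>.<br />
An image: <img src="photo.jpg" alt="A photo" /></p>
<form action="process.php" method="post">
<input type="text" name="yourname" />
<input type="submit" value="Send" />
</form>
</div>
</body>
</html>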





 

Wednesday 30 November 2011

What Is “(Not Provided)” in Organic Search Traffic Keywords on Google Analytics?

For about two months we've noticed a (Not Provided) item in Google Analytics traffic reports under the "Traffic Sources/Google Organic" section. The basic answer is that the keywords searchers used to find this particular site are not shown because the searchers were logged into their Google account when conducting the search.


Google wants to further protect their users' privacy by encrypting their search results pages (through https://www.google.com).
In November, our site received 1,144 visitors from organic Google search whose search keywords were not shown. This is 19.6% of our total Google organic search traffic.
The numbers are similar across the board for our other clients. The fact that we are running blind for about 20% of our total Google organic search traffic is creating a bit of frustration for our team, mainly because we're unable to show our clients (or our team) with 100% certainty what keywords were used when the site was found via Google organic search (as we could prior to October).
However, since the total amount of organic search traffic has not been affected, we can use the other keywords to get a clear understanding of what keywords people search to find a particular site.
My opinion is that Google needs to find a way to fully show the keywords people use in finding sites. Otherwise, we may have to use other software to monitor our clients' traffic.

Friday 25 November 2011

Importance of Keyword

The reason keyword research is so important is because it's about more than search engine behaviour. Keywords are the most solid evidence you can find when researching the way your users think. Even in-depth user surveys won't be quite as reliable as the data people give out when they're not aware of being watched. While your users might tell you that they come to you for quality service or discount prices, keywords can tell you that they come to you when they're bored, or only when they're after something very specific. Sometimes, keyword research will tell you that your users aren't who you thought they were.

Thursday 17 November 2011

Raising awareness of cross-domain URL selections

A piece of content can often be reached via several URLs, not all of which may be on the same domain. A common example we’ve talked about over the years is having the same content available on more than one URL, an issue known as duplicate content. When we discover a group of pages with duplicate content, Google uses algorithms to select one representative URL for that content. A group of pages may contain URLs from the same site or from different sites. When the representative URL is selected from a group with different sites the selection is called a cross-domain URL selection. To take a simple example, if the group of URLs contains one URL from a.com and one URL from b.com and our algorithms select the URL from b.com, the a.com URL may no longer be shown in our search results and may see a drop in search-referred traffic.

Webmasters can greatly influence our algorithms' selections using one of the currently supported mechanisms to indicate the preferred URL, for example using rel="canonical" elements or 301 redirects. In most cases, the decisions our algorithms make in this regard correctly reflect the webmaster's intent. In some rare cases, however, we've found many webmasters are confused as to why a selection has happened and what they can do if they believe the selection is incorrect.
To be transparent about cross-domain URL selection decisions, we’re launching new Webmaster Tools messages that will attempt to notify webmasters when our algorithms select an external URL instead of one from their website. The details about how these messages work are in our Help Center article about the topic, and in this blog post we’ll discuss the different scenarios in which you may see a cross-domain URL selection and what you can do to fix any selections you believe are incorrect.

Common causes of cross-domain URL selection

There are many scenarios that can lead our algorithms to select URLs across domains.
In most cases, our algorithms select a URL based on signals that the webmaster implemented to influence the decision. For example, a webmaster following our guidelines and best practices for moving websites is effectively signalling that the URLs on their new website are the ones they prefer for Google to select. If you’re moving your website and see these new messages in Webmaster Tools, you can take that as confirmation that our algorithms have noticed.
However, we regularly see webmasters ask questions when our algorithms select a URL they did not want selected. When your website is involved in a cross-domain selection, and you believe the selection is incorrect (i.e. not your intention), there are several strategies to improve the situation. Here are some of the common causes of unexpected cross-domain URL selections that we’ve seen, and how to fix them:
  1. Duplicate content, including multi-regional websites: We regularly see webmasters use substantially the same content in the same language on multiple domains, sometimes inadvertently and sometimes to geotarget the content. For example, it’s common to see a webmaster set up the same English language website on both example.com and example.net, or a German language website hosted on a.de, a.at, and a.ch. Depending on your website and your users, you can use one of the currently-supported canonicalization techniques (see the sketch after this list) to signal to our algorithms which URLs you wish selected. Please see the following articles about this topic:
    • Canonicalization, specifically rel="canonical" elements and 301 redirects
    • Multi-regional and multilingual sites and more about working with multi-regional websites
    • About rel="alternate" hreflang="x"
  2. Configuration mistakes: Certain types of misconfigurations can lead our algorithms to make an incorrect decision. Examples of misconfiguration scenarios include:
    1. Incorrect canonicalization: Incorrect usage of canonicalization techniques pointing to URLs on an external website can lead our algorithms to select the external URLs to show in our search results. We’ve seen this happen with misconfigured content management systems (CMS) or CMS plugins installed by the webmaster. To fix this kind of situation, find how your website is incorrectly indicating the canonical URL preference (e.g. through incorrect usage of a rel="canonical" element or a 301 redirect) and fix that.
    2. Misconfigured servers: Sometimes we see hosting misconfigurations where content from site a.com is returned for URLs on b.com. A similar case occurs when two unrelated web servers return identical soft 404 pages that we may fail to detect as error pages. In both situations we may assume the same content is being returned from two different sites and our algorithms may incorrectly select the a.com URL as the canonical of the b.com URL. You will need to investigate which part of your website’s serving infrastructure is misconfigured. For example, your server may be returning HTTP 200 (success) status codes for error pages, or your server might be confusing requests across different domains hosted on it. Once you find the root cause of the issue, work with your server admins to correct the configuration.
  3. Malicious website attacks: Some attacks on websites introduce code that can cause undesired canonicalization. For example, the malicious code might cause the website to return an HTTP 301 redirect or insert a cross-domain rel="canonical" link element into the HTML or HTTP header, usually pointing to an external URL hosting malicious content. In these cases our algorithms may select the malicious or spammy URL instead of the URL on the compromised website. In this situation, please follow our guidance on cleaning your site and submit a reconsideration request when done. To identify cloaked attacks, you can use the Fetch as Googlebot function in Webmaster Tools to see your page’s content as Googlebot sees it.
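As a quick illustration of the canonicalization techniques mentioned in item 1, a rel="canonical" element placed in the <head> of the duplicate page points search engines at the URL you want selected (the domains reuse the example.com/example.net pairing above; the path is hypothetical):

<!-- In the <head> of the duplicate page on example.net -->
<link rel="canonical" href="http://www.example.com/product.html" />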

Wednesday 16 November 2011

Website optimization techniques

Website content optimization is common sense: who doesn't want a fast site with engaging content? Here are some detailed optimization metrics:


Unique visitors
A server hit is an HTTP request for a single web object. One web page view can require many hits to the server. You can increase your unique audience by providing fast, relevant, engaging web pages. Tracking new unique visitors can help you track audience growth.

Average time on site and length of visit
Average time on site and length of visit (as reported by tools such as ClickTracks) is one of the best measures of user engagement.

Pages per visit
The number of pages consumed during a visit is a broad and simple measure of user engagement. Pages per visit can indicate possible flow states of high engagement.

Bounce rate
The bounce rate is the percentage of users who left your site without browsing to another page. You should examine pages with high bounce rates closely for improvements to their content.

Conversion rates
The ratio of the number of objectives accomplished to unique visitors is your conversion rate; the goal is to boost it.

Primary content consumption
Every site visit has to have an entry point. This is the percentage of times that a page constitutes the first impression of a site.

Path loss
Path loss is the percentage of times that a page was seen within a visitor's navigation path where the visit was terminated without bouncing. Path loss can indicate incomplete information or misguided search marketing.

Keyword or campaign
Which keywords or campaigns are making you the most money or the best search positions?

Cost per conversion
If the cost per conversion for a particular campaign, ad group, or keyword is larger than the average sale value of the included items, that campaign is losing money.

Tuesday 15 November 2011

Ten recent algorithm changes

Ten recent algorithm changes by Google.com

Cross-language information retrieval updates: For queries in languages where limited web content is available (Afrikaans, Malay, Slovak, Swahili, Hindi, Norwegian, Serbian, Catalan, Maltese, Macedonian, Albanian, Slovenian, Welsh, Icelandic), we will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. This feature was available previously in Korean, but only at the bottom of the page. Clicking on the translated titles will take you to pages translated from English into the query language.



Snippets with more page content and less header/menu content: This change helps us choose more relevant text to use in snippets. As we improve our understanding of web page structure, we are now more likely to pick text from the actual page content, and less likely to use text that is part of a header or menu.

Better page titles in search results by de-duplicating boilerplate anchors: We look at a number of signals when generating a page’s title. One signal is the anchor text in links pointing to the page. We found that boilerplate links with duplicated anchor text are not as relevant, so we are putting less emphasis on these. The result is more relevant titles that are specific to the page’s content.
Length-based autocomplete predictions in Russian: This improvement reduces the number of long, sometimes arbitrary query predictions in Russian. We will not make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already our practice in English.

Extending application rich snippets: We recently announced rich snippets for applications. This enables people who are searching for software applications to see details, like cost and user reviews, within their search results. This change extends the coverage of application rich snippets, so they will be available more often.

Retiring a signal in Image search: As the web evolves, we often revisit signals that we launched in the past that no longer appear to have a significant impact. In this case, we decided to retire a signal in Image Search related to images that had references from multiple documents on the web.

Fresher, more recent results: As we announced just over a week ago, we’ve made a significant improvement to how we rank fresh content. This change impacts roughly 35 percent of total searches (around 6-10% of search results to a noticeable degree) and better determines the appropriate level of freshness for a given query.

Refining official page detection: We try hard to give our users the most relevant and authoritative results. With this change, we adjusted how we attempt to determine which pages are official. This will tend to rank official websites even higher in our ranking.

Improvements to date-restricted queries: We changed how we handle result freshness for queries where a user has chosen a specific date range. This helps ensure that users get the results that are most relevant for the date range that they specify.

Prediction fix for IME queries: This change improves how Autocomplete handles IME queries (queries which contain non-Latin characters). Autocomplete was previously storing the intermediate keystrokes needed to type each character, which would sometimes result in gibberish predictions for Hebrew, Russian and Arabic.

Monday 7 November 2011

Google Search Algorithm Updates

If and Else Statement

The if and else statements can be very helpful in controlling the flow and resulting output of your scripts. You can build your site with basically unlimited possibilities, showing different messages based on different content or people.

EXAMPLE OF CODE

<?php
// Classify $ages into an age bracket.
$ages = 45;
if ($ages >= 1 && $ages <= 4) {
    echo "Infant";
} else if ($ages >= 5 && $ages <= 12) {
    echo "Child";
} else if ($ages >= 13 && $ages <= 19) {
    echo "Teen";
} else if ($ages >= 20 && $ages <= 40) {
    echo "Youth";
} else if ($ages >= 41 && $ages <= 60) {
    echo "Adult";
} else {
    echo "Enter correct value";
}
?>

Sunday 30 October 2011

Boost your search engine ranking with fresh content


Fresh content is extremely important for search engine optimization, so boost your search engine ranking with fresh content. Users surfing the web are continuously searching for the latest information. Search engines understand this and therefore place a strong emphasis on content freshness. Frequently updated sites also encourage the spiders: a page that has its content updated daily will find that search engines crawl it more often than other, slower-moving pages. This explains why blogs get frequent bot visits compared to other sites.

Wednesday 26 October 2011

High performance web site tips and tricks




Proper structural markup conveys useful information to whoever is maintaining the site, with headings, paragraphs, and list items. Search engines look for structural markup to establish which information is most important.
These are basic tips for a website redesign (items 2 and 8 are sketched in the configuration after this list):
1. Use a content delivery network.
2. Add an Expires header.
3. Make fewer HTTP requests to reduce object overhead.
4. Put JavaScript at the bottom of the body.
5. Put style sheets at the top, in the head.
6. Avoid CSS expressions, which are CPU-intensive and can be evaluated frequently.
7. Reduce Domain Name System (DNS) lookups to cut the overhead of DNS delay by splitting lookups across two to four unique host names.
8. Configure ETags for sites hosted on multiple servers. FileETag None in Apache removes ETags to avoid improper cache validation.
9. Make Ajax cacheable and small to avoid unnecessary HTTP requests.
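Items 2 and 8 might look like this in an Apache .htaccess file (a sketch, assuming Apache with mod_expires enabled; the types and lifetimes are illustrative):

# Tip 2: far-future Expires headers for static assets
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# Tip 8: remove ETags to avoid improper cache validation across servers
FileETag None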





Optimize your web page speed




Optimize your web page speed. Start by stripping out all inline style. Pare down your markup to pure HTML structure. Next, look at your page to see whether any components could be produced by more efficient means. You can often morph HTML structural elements with CSS to replace table-based components more efficiently.


After your code has been stripped of style, convert that embedded style into rule-based CSS. To enable progressive display, position CSS files in the head and JavaScript files at the end of your body code. Minimize the number of HTTP requests by combining files and converting graphical text to CSS text. Use HTTP compression to save an average of 80% off XHTML, CSS, and JavaScript file sizes.


Sunday 23 October 2011

Free link submission for PR

PageRank is a numeric value that represents how important a page is on the web. Google figures that when one page links to another page, it is effectively casting a vote for the other page. The more votes cast for a page, the more important the page must be. Also, the importance of the page casting the vote determines how important the vote itself is. Google takes this into account when it calculates a page's PageRank.

One-way inbound links from websites with topics related to your website's topic can help you gain a higher PageRank.

The number of outbound links on the website that links to you also determines the worth of the link. A related website with ten outbound links that links to you is far more valuable than a related website with a hundred outbound links that links to you.
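This weighting comes from the formula in the original PageRank paper, where each linking page T splits its vote across its C(T) outbound links (d is a damping factor, usually set around 0.85):

PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

So a link from a related site with ten outbound links passes roughly ten times the weight of a link from an equally ranked site with a hundred outbound links.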

So wanna put your link on this PR2 blog? You are always welcome, but only PageRank 2 and higher. Do you have PR2 or higher? Then submit your link.
If you submit a link with 0 PageRank, we will delete your link from this blog.






Thursday 20 October 2011

Should I Change My URLs for SEO?

Every SEO eventually gets fixated on a tactic. Maybe you read 100 blog posts about how to build the “perfectly” optimized URL, and you keep tweaking and tweaking until you get it just right. Fast-forward 2 months – you’re sitting on 17 layers of 301-redirects, you haven’t done any link-building, you haven’t written any content, you’re eating taco shells with mayonnaise for lunch, and your cat is dead.
Ok, maybe that’s a bit extreme. I do see a lot of questions about the "ideal" URL structure in Q&A, though. Most of them boil down to going from pretty good URLs to slightly more pretty good URLs.

All Change Is Risky

I know it’s not what the motivational speakers want you to hear, but in the real world, change carries risk. Even a perfectly executed site-wide URL change – with pristine 301-redirects – is going to take time for Google to process. During that time, your rankings may bounce. You may get some errors. If your new URL scheme isn’t universally better than the old one, some pages may permanently lose ranking. There’s no good way to A/B test a site-wide SEO change.
More often, it’s just a case of diminishing returns. Going from pretty good to pretty gooder probably isn’t worth the time and effort, let alone the risk. So, when should you change your URLs? I’m going to dive into 5 specific scenarios to help you answer that question…

(1) Dynamic URLs

A dynamic URL creates content from code and data and carries parameters, like this:
www.example.com/product.php?id=12345&color=4&size=3&session=67890
It’s a common SEO misconception that Google can’t read these URLs or gets cut off after 2 or 3 parameters. In 2011, that’s just not true – although there are reasonable limits on URL length. The real problems with dynamic URLs are usually more complex:
  • They don’t contain relevant keywords.
  • They’re more prone to creating duplicate content.
  • They tend to be less user-friendly (lower click-through).
  • They tend to be longer.
So, when are your URLs too dynamic? The example above definitely needs help. It’s long, it has no relevant keywords, the color and size parameters are likely creating tons of near-duplicates, and the session ID is creating virtually unlimited true duplicates. If you don’t want to be mauled by Panda, it’s time for a change.
In other cases, though, it’s not so simple. What if you have a blog post URL like this?
www.example.com/blog.php?topic=how-to-tame-a-panda
It’s technically a “dynamic” URL, so should you change it to something like:
www.example.com/blog/how-to-tame-a-panda
I doubt you’d see much SEO benefit, or that the rewards would outweigh the risks. In a perfect world, the second URL is better, and if I was starting a blog from scratch I’d choose that one, no question. On an established site with 1000s of pages, though, I’d probably sit tight.

(2) Unstructured URLs

Another common worry people have is that their URLs don’t match their site structure. For example, they have a URL like this one:
www.example.com/diamond-studded-ponies
...and they think they should add folders to represent their site architecture, like:
www.example.com/horses/bejeweled/diamond-studded-ponies
There’s a false belief in play here – people often think that URL structure signals site structure. Just because your URL is 3 levels deep doesn’t mean the crawlers will treat the page as being 3 levels deep. If the first URL is 6 steps from the home-page and the second URL is 1 step away, the second URL is going to get a lot more internal link-juice (all else being equal).
You could argue that the second URL carries more meaning for visitors, but, unfortunately, it’s also longer, and the most unique keywords are pushed to the end. In most cases, I’d lean toward the first version.
Of course, the reverse also applies. Just because a URL structure is “flat” and every page is one level deep, that doesn’t mean that you’ve created a flat site architecture. Google still has to crawl your pages through the paths you’ve built. The flatter URL may have some minor advantages, but it’s not going to change the way that link-juice flows through your site.
Structural URLs can also create duplicate content problems. Let’s say that you allow visitors to reach the same page via 3 different paths:
www.example.com/horses/bejeweled/diamond-studded-ponies
www.example.com/tags/ponies/diamond-studded-ponies
www.example.com/tags/shiny/diamond-studded-ponies
Now, you’ve created 2 pieces of duplicate content – Google is going to see 3 pages that look exactly the same. This is more of a crawl issue than a URL issue, and there are ways to control how these URLs get indexed, but an overly structured URL can exacerbate these problems.

(3) Long URLs

How long of a URL is too long? Technically, a URL should be able to be as long as it needs to be. Some browsers and servers may have limits, but those limits are well beyond anything we’d consider sane by SEO or usability standards. For example, IE8 can support a URL of up to 2,083 characters.
Practically speaking, though, long URLs can run into trouble. Very long URLs:
  • Dilute the ranking power of any given URL keyword
  • May hurt usability and click-through rates
  • May get cut off when people copy-and-paste
  • May get cut off by social media applications
  • Are a lot harder to remember
How long is too long is a bit more art than science. One of the key issues, in my mind, is redundancy. Good URLs are like good copy – if there’s something that adds no meaning, you should probably lose it. For example, here’s a URL with a lot of redundancy:
www.example.com/store/products/featured-products/product-tasty-tasty-waffles
If you have a “/store” subfolder, do you also need a “/products” layer? If we know you’re in the store/products layer, does your category have to be tagged as “featured-products” (why not just “featured”)? Is the “featured” layer necessary at all? Does each product have to also be tagged with “product-“? Are the waffles so tasty you need to say it twice?
In reality, I’ve seen much longer and even more redundant URLs, but that example represents some of the most common problems. Again, you have to consider the trade-offs. Fixing a URL like that one will probably have SEO benefits. Stripping “/blog” out of all your blog post URLs might be a nice-to-have, but it isn’t going to make much practical difference.

(4) Keyword Stuffing

Scenarios (3)-(5) have a bit of overlap. Keyword-stuffed URLs also tend to be long and may cannibalize other pages. Typically, though, a keyword-stuffed URL has either a lot of repetition or tries to tackle every variant of the target phrase. For example:
www.example.com/ponies/diamond-studded-ponies-diamond-ponies-pony
It’s pretty rare to see a penalty based solely on keyword-stuffed URLs, but usually, if your URLs are spammy, it’s a telltale sign that your title tags, copy, etc. are spammy. Even if Google doesn’t slap you around a little, it’s just a matter of focus. If you target the same phrase 14 different ways, you may get more coverage, but each phrase will also get less attention. Prioritize and focus – not just with URLs, but all keyword targeting. If you throw everything at the wall to see what sticks, you usually just end up with a dirty wall.

(5) Keyword Cannibalization

This is probably the toughest problem to spot, as it happens over an entire site – you can’t spot it in a single URL (and, practically speaking, it’s not just a URL problem). Keyword cannibalization results when you try to target the same keywords with too many URLs.
There’s no one right answer to this problem, as any site with a strong focus is naturally going to have pages and URLs with overlapping keywords. That’s perfectly reasonable. Where you get into trouble is splitting off pages into a lot of sub-pages just to sweep up every long-tail variant. Once you carry that too far, without the unique content to support it, you’re going to start to dilute your index and make your site look “thin”.
The URLs here are almost always just a symptom of a broader disease. Ultimately, if you’ve gotten too ambitious with your scope, you’re going to need to consolidate those pages, not just change a few URLs. This is even more important post-Panda. It used to be that thin content would only impact that content – at worst, it might get ignored. Now, thin content can jeopardize the rankings of your entire site.

Proceed With Caution

If you do decide a sitewide URL change is worth the risk, plan and execute it carefully. How to implement a sitewide URL change is beyond the scope of this post, but keep in mind a couple of high-level points:
  1. Use proper 301-redirects.
  2. Redirect URL-to-URL, for every page you want to keep (a sketch appears at the end of this post).
  3. Update all on-page links.
  4. Don’t chain redirects, if you can avoid it.
  5. Add a new XML sitemap.
  6. Leave the old sitemap up temporarily.
Point (3) bears repeating. More than once, I’ve seen someone make a sitewide technical SEO change, implement perfect 301 redirects, but then not update all of their navigation. Your crawl paths are still the most important signal to the spiders – make sure you’re 100% internally consistent with the new URLs.
That last point (6) is a bit counterintuitive, but I know a number of SEOs who insist on it. The problem is simple – if crawlers stop seeing the old URLs, they might not crawl them to process the 301-redirects. Eventually, they’ll discover the new URLs, but it might take longer. By leaving the old sitemap up temporarily, you encourage crawlers to process the redirects. If those 301-redirects are working, this won’t create duplicate content. Usually, you can remove the old sitemap after a few weeks.
Even done properly and for the right reasons, measure carefully and expect some rankings bounce over the first couple of weeks. Sometimes, Google just needs time to evaluate the new structure.
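For point 2 in the list above, a URL-to-URL 301 in Apache might look like this (a sketch; the paths reuse the earlier hypothetical example):

# One mapping per page you want to keep
Redirect 301 /store/products/featured-products/product-tasty-tasty-waffles /store/featured/tasty-waffles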

Wednesday 19 October 2011

What Happens In 60 Seconds Online

Date profile in PHP example

PHP (originally called Personal Home Page) is a server-side scripting language originally designed for web development to produce dynamic web pages.
I just wanna share a piece of code for a date profile in PHP, which gives a basic idea of the language. It is very simple and very handy.

EXAMPLE OF CODE




<?php
if (isset($_POST['submit'])) {

    $name = $_POST['name'];

    $year = $_POST['year'];

    $sex = $_POST['sex'];

    $curr_year = 2011;

    $age = $curr_year - $year;

    // Echo the submitted profile, one item per line.
    echo $name . "<br/>" . $year . "<br/>" . $sex . "<br/>" . $age;

    if ($sex == 'male') {
        echo "Hello Sir";
    } else {
        echo "Hello Mam";
    }
}
?>

<!-- The form inputs below were reconstructed to match the PHP above. -->
<form action="" method="post">

NAME*: <input type="text" name="name" />

DATE OF YEAR*: <input type="text" name="year" />

<input type="radio" name="sex" value="male" /> Male <input type="radio" name="sex" value="female" /> Female

<input type="submit" name="submit" value="Submit" />

</form>




Saturday 15 October 2011

The nofollow tricks

You can concentrate PageRank where you need it with the nofollow attribute, which was first created to fight blog comment spam. The nofollow attribute can be added to links that point to pages to which you don't want to refer PageRank, so a page's referred PageRank flows only to the pages you want to promote.
An example of such code (example.com is a placeholder):

<a href="http://www.example.com/" rel="nofollow">A link that passes no PageRank vote</a>






Yahoo traffic data into Bing Webmaster Tools reports

Bing Webmaster Tools will now be showing integrated data from Yahoo within certain areas and reports. Given the combined effort the Search Alliance represents, it makes sense to showcase relevant data from both engines within a webmaster account. Most areas and data within the accounts will not be affected. The short list of areas you will notice changes in are as follows:


On the Traffic Tab: Traffic summary report and Page Traffic reports will be impacted as follows:
  1. Impressions – will go up based on combined data numbers
  2. Clicks – will go up based on combined data numbers
  3. Click Through Rates (CTR) as appropriate from above (change only due to the mathematics involved in the first two items)
Impressions data will rise because we will now be showing you combined impressions for your listings across both search engines. For each query term in the list within the report, your impressions represent the combined number of times your result showed, based on queries initiated by searchers at both Bing and Yahoo. Clicks data will follow the same pattern. CTR data is a factor of the first two items, and will rise or fall based on searcher click activity at each search engine.
The changes affect the numbers shown only. No actual rankings will be affected by the combining of data within Bing Webmaster Tools. As a visual reminder that data is now combined, you will see both the Bing and Yahoo logos directly above the graphs shown on these pages.


At this time the data will be combined, not selectable. The data will also update in any market where Bing is powering Yahoo search.

Friday 14 October 2011

Technique of high-ranking inbound links

How do you get high-ranking inbound links? First of all, you have to go through the process of getting PageRank backlinks. A good technique for high-ranking inbound links starts with great design and good quality content. Create web-based tools to attract quality backlinks. Submit articles to article sites in exchange for a bio linking to your site. Register your RSS feeds and post quality content. Pay for faster reviews to be included in popular directories like Yahoo! and many other web directories. You can also spend some money to buy text links.
Get links from high-PageRank sites to pull your site up and increase its PageRank. Links from sites with lots of quality inlinks are worth more to your rankings. When you promote your site, strive to gather links from higher-PageRank sites to boost your PageRank.



Thursday 13 October 2011

Pay-per-click optimization




Pay-per-click optimization is the process of improving keywords, ad copy, landing pages, and ad groups to boost the performance of your search engine-based ad campaigns. PPC advertising can be overwhelming at first: it has numerous options and complex targeting.
The cycle of PPC optimization starts with goal setting, then choosing good keywords, creating ad groups, writing ad copy, creating landing pages, and making bids. You can try AdWords. When starting an ad campaign, you'll choose keywords, on-page ad location, language, and network settings to aim your campaign at your target searchers.

Ad groups are the sets of keyword phrases that you manage as ad units.

Landing pages are the destinations of ads. This is where the user lands after clicking on an ad. Landing pages can be expanded into a larger group of pages that work together to convert visitors into buyers.

Bids are the maximum costs per click. The bid you submit for a keyword is what you pay for traffic. PPC is a type of auction, like a second-price sealed-bid system with private values. These types of auctions are difficult to bid in successfully because you usually have incomplete information.
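To make the second-price idea concrete (illustrative numbers only): if your maximum bid is $2.00 per click and the next-highest competing bid is $1.50, a second-price-style auction charges you just over $1.50 for the click, not your full $2.00. What you can't see is the other bidders' private maximums, which is what makes bidding hard.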



Wednesday 12 October 2011

CSS optimization tips

Cascading Style Sheet (CSS) optimization transforms your HTML by abstracting inline style into CSS rules. Good CSS involves planning for the CSS layout from the very beginning. To create a solid CSS foundation, use a reset stylesheet: one solution to overly specific selectors and cross-browser compatibility problems is to reset everything to a known baseline.

Use the following technique:


html, body, div, span, applet, object, iframe, a,
address, code, del, em, font, img, q, s, strike,
strong, tr, td, b, u, i, center, ul, li, fieldset, h1,
h2, h3, h4, h5, h6, table, caption, tbody, label {
margin: 0;
padding: 0;
border: 0;
outline: 0;
font-size: 100%;
background: transparent;
}

Wednesday 5 October 2011

Difference between dynamic vs. static pages

The difference between dynamic versus static web design pages is how content is presented to the browser as the visitor navigates to a page on your site. A web design built with static pages has files in HTML or XHTML, and perhaps a style sheet, that are stored on the server. The browser accesses those files and the page displays. Static pages are also called client-side generated pages, meaning they display using just the browser software on your PC.

A web design using a dynamic technique of page creation has contents stored in a database on the website server that are assembled and displayed at the moment a page is accessed. Each page in the custom CMS web design system is dynamically generated each time a person requests that page. Dynamic pages are also called server-side generated pages, and the technical setup issues commands to take a custom CMS web design template and fill in the content for the requested page.

This allows clients to create content in plain text that the CMS system will dynamically assemble to generate the page web design as XHTML with CSS. The conversion of your plain text to XHTML takes place without the site owner learning custom website design using XHTML code.
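As a minimal sketch of the server-side assembly described above (the database name, credentials, table, and columns are all hypothetical):

<?php
// Fetch the requested page's content from a database and assemble the HTML.
// Assumes a "pages" table with slug, title, and body columns exists.
$db = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');
$stmt = $db->prepare('SELECT title, body FROM pages WHERE slug = ?');
$stmt->execute(array(isset($_GET['page']) ? $_GET['page'] : 'home'));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

// Fill the content into a simple template.
echo '<html><head><title>' . htmlspecialchars($row['title']) . '</title></head>';
echo '<body>' . $row['body'] . '</body></html>';
?>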
Search Engine Friendly URLs
Search engines evaluate many aspects of on-page content when ranking
pages and the level of importance as compared to other similar websites
online. A search engine friendly URL refers to the page names that a web
designer assigns to each page they create. For example, on my site I have
an overview of my CMS web design services and the page file name is
cms-web-design-services.html.
The dashes are "invisible" to search engines, so they view that name as
"cms web design services", thus the page name is optimized to highlight
key words that describe the page content. The ranking value of search
engine friendly URLs is nominal, yet each subtle website design seo
strategy that is used in creating pages of your cms web design will begin to
add up, so they should be used.

Designing for Search Engines
The work performed by search engines indexing websites is done by
crawlers called bots. These are not human beings but supercomputers that
visit websites and access the code used to create the pages. The crawlers
are robotic, so they cannot view aesthetics like a gorgeous website design
or stunning photos. All they use to index your content and determine your
site theme is in the web design programming code.
Certainly your text content is included in the code, yet a well designed site
will also include code aimed at search engines and not viewed by human
visitors. The META code identifies key elements of your site and includes
"description" and "keywords". Other tags and strategies of using keywords
in hyperlinks and placing emphasis on key words using bold text are
additional clues that search engines use. Keywords or phrases used in any
hyperlinks are another search engine optimization web design strategy.
These will be reviewed in greater detail in the full SEO section of the
tutorial. For now it's enough to know that search engines only view code,
so providing clues as described will help search engines crawl your site.

Monday 3 October 2011

Best practices for higher search engine ranking




Wanna achieve high search engine rankings? You must first find the right keyphrases to target with your content. Then create content around those keywords or keyphrases, optimized with well-written titles, meta tags, headers, and body text.



Use keywords strategically

SEO is a numbers game. Each web page can effectively target only one or two phrases, so don't pin everything on a single keyphrase ranking number one. You'll get more leads when your keyword reach is higher. Take advantage of the long tail of search query distribution by targeting very specific phrases.


A good theme for your site


The theme of your web page should flow through everything associated with that page. The title tag, the headers, the meta tags (keywords and description), the content, the links, the navigation, and the URL of the page should all work together.


Optimize key content

Most search engines favor the title tag, body text, and headlines when ranking your site. They use the meta description element for search query result pages.


Optimize on-site links

Use search-friendly URLs that include keywords and hide the technology behind your site to improve your rankings. To focus your PageRank, use the nofollow attribute.


Make it linkworthy

You have only one chance to make a first impression. Don't blow it with an unprofessional website. You are much more likely to get links when your site is well designed, with valuable and fresh content. Keep your site focused.

Inbound links

Search engines use external factors such as inbound links, anchor text, surrounding text, and domain history; a site's importance is determined largely by the number and popularity of its inbound links.

Sunday 2 October 2011

Increase traffic from social media

Social networking sites connect individuals from all over the globe. Business people can advertise their products, maintain regular posts, increase the traffic to their websites, and hence make money online with the help of social media.
www.facebook.com
plus.google.com
www.twitter.com


Friday 16 September 2011

Yahoo! Rolls Out Redesign for Search Results Pages

At Yahoo!, we’re always looking for ways to make the Yahoo! Search experience even more organized and streamlined while serving the most relevant content. We have been working to unify the search experiences across web, multimedia, and vertical search results pages with a design that is clean and intuitive. Today, we are pleased to bring the latest design changes to life across various search results pages!

The following changes are available now on the Web, Images, Video, News, Blogs, Finance and Sports search results pages:

  • Clean and Simple – A cleaner, simpler look and feel that helps you find what is most important to you and enables you to take action faster.
  • Automatic Tabs – Easily accessible tabs will automatically appear right below the Search box to give you the specialized content you may be looking for on our other vertical search results pages. The tabs that may appear, based on the search results content, are Web, Images, Video, News, Blogs, Finance and Sports.
  • Left Filters – Filters on the left side of the results will appear, for sorting results by time and related searches.

The query-aware tabs make it easier for you to dive into a specific vertical search experience, such as the Sports search results page example below. The related athletes filter on the left, combined with sports videos results on the right, also enhance this results page.

The improved News search results page will also include filters on the left for news sources, as well as news videos to the right of the search results.

Since last month’s Image Search update, we have made the above changes to the Image search results page, plus we have added:

  • A larger pool of Facebook images to also include public profiles and fan pages
  • A richer “Latest” pictures experience from across even more Yahoo! content
  • Better recommendations at the end of galleries

Below is an example of the Image search results page for the “President Obama” query, which brings you the latest, most relevant pictures to the top, and indicates how long ago each image was published.

Twitter Web Analytics


Twitter is a powerful platform for websites to share their content, and drive traffic and engagement. However, people have struggled to accurately measure the amount of traffic Twitter is sending to their websites, in part because web analytics software hasn’t evolved as quickly as online sharing and social signals.

Today we’re announcing Twitter Web Analytics, a tool that helps website owners understand how much traffic they receive from Twitter and the effectiveness of Twitter integrations on their sites. Twitter Web Analytics was driven by the acquisition of BackType, which we announced in July.

The product provides three key benefits:

  • Understand how much your website content is being shared across the Twitter network
  • See the amount of traffic Twitter sends to your site
  • Measure the effectiveness of your Tweet Button integration
Twitter Web Analytics will be rolled out this week to a small pilot group of partners, and will be made available to all website owners within the next few weeks. We’re also committed to releasing a Twitter Web Analytics API for developers interested in incorporating Twitter data in their products.

Wednesday 14 September 2011

OOP dreams





Object-oriented programming (OOP) focuses on building programs from a set of 'smart' custom data types. It helps save you time and makes it easier to reuse your code and share it with others. I'll quickly run through the syntax associated with OOP in PHP, but you must first understand the most basic concepts behind OOP.


<?php
$mybox = new Box('Rajesh');
echo $mybox->get_what_is_inside(); // prints "Rajesh"
?>









The variable $mybox stores a reference to a special 'smart' box built by new.
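For the snippet to run, a Box class has to be defined somewhere. Only the class name and accessor appear above; the body below is a minimal assumed sketch:

<?php
// Minimal Box class (assumed; only the name and accessor come from the snippet above).
class Box {
    private $contents;

    // Whatever you pass to new Box(...) is stored inside the box.
    public function __construct($contents) {
        $this->contents = $contents;
    }

    public function get_what_is_inside() {
        return $this->contents;
    }
}
?>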

Tuesday 13 September 2011

HTML inside PHP

You have to watch for double quotation marks. Using the echo statement may involve double quotation marks, and because HTML also uses double quotation marks, you can do one of two things to avoid problems:

Escape your HTML double quotation marks with a backslash, as in the following (the markup is a placeholder; the original tags were stripped):

echo "<p class=\"intro\">First Name:</p>";


Use single quotation marks around your HTML. This can help improve the readability of your code if you have many quotes:

echo '<p class="intro">First Name:</p>';


Remember that you still have to follow PHP's rules: end your statements with a semicolon and close all quotes.


Don't try to cram too much HTML into your PHP, as in this example (the stripped tags have been restored with representative markup):

EXAMPLE:

<?php
echo '<html>';
echo '<head>';
echo '<title>My Test Page</title>';
echo '</head>';
echo '<body>';
echo '<p>';
echo 'First Name:';
echo '</p>';
echo '<p>';
echo $_POST['fname'];
echo '</p>';
echo '</body>';
echo '</html>';
?>


All PHP was really needed for here was to provide the value represented by $_POST['fname'] and display it on the screen. The rest of the code was just outputting HTML. In this case, you're better off staying in HTML and breaking out a PHP line only when you need it.
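A sketch of that better approach, staying in HTML and dropping into PHP only for the dynamic value:

<html>
<head>
<title>My Test Page</title>
</head>
<body>
<p>First Name: <?php echo $_POST['fname']; ?></p>
</body>
</html>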