PHP website and Google spidering the site

Hi guys,

Wondering if anyone can help with a query relating to Google spidering a PHP & HTML based website.

Currently I am working on a website which, at present, has a main index.php plus several folders, each containing its own index.php, etc.

Each sub-directory's index.php contains <html> all the page code + the content for that page </html>.

Now I am working on switching this all to use includes and a PHP switch, so visitors only have to load the header and footer code once; the content area simply includes the content for the given page.

The new way will use index.php?page=mysubdirectorypage to include the content, done using the switch and case statements.
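Roughly what I have in mind (the page names and content paths here are just examples, not the real site's structure):

```php
<?php
// Minimal sketch of the switch/case include approach.
// Page names and file paths are placeholders, not the real site's.

// Map the ?page= value to a content file; routing through a switch
// means a visitor can never include an arbitrary file.
function contentFile($page) {
    switch ($page) {
        case 'about':
            return 'content/about.php';
        case 'contact':
            return 'content/contact.php';
        default:
            return 'content/home.php'; // unknown pages fall back to home
    }
}

// Then index.php writes the shared chrome once:
//   include 'header.php';
//   include contentFile(isset($_GET['page']) ? $_GET['page'] : 'home');
//   include 'footer.php';
```

The switch doubles as a whitelist, so nobody can abuse ?page= to pull in files you didn't intend.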

On to my question: although the second method uses far less code and is easier to maintain, will it be better or worse in terms of Google spidering the site and its links?

Any comments welcome, as long as they are relevant of course... ;)
 
I wouldn't think it should make a difference.
I have 7 server-side includes on each page of my site, so it is completely modularized.
 
Well, from a few articles I've read, there is an indication that having all your pages point to the main index helps contribute to how Google ranks the site.

The second method would of course result in far more links to the main index....

Basic PHP, HTML & CSS coding is my thing; how search engines work, though, is really out of my depth...
 
Now I am working on switching this all to use includes and a PHP switch, so visitors only have to load the header and footer code once; the content area simply includes the content for the given page.

Where did you get that idea from? :confused: (I guess that's the way frames work, but certainly not with includes.)

Scripting-language includes are totally transparent to the end user and to robots. The HTML (what you see in "view source") is generated dynamically on the server on each page load. I think you've got the wrong end of the stick completely. :p
 
Where did you get that idea from? :confused: (I guess that's the way frames work, but certainly not with includes.)

Scripting-language includes are totally transparent to the end user and to robots. The HTML (what you see in "view source") is generated dynamically on the server on each page load. I think you've got the wrong end of the stick completely. :p

No, the included section is in their cache, so it does not need to be re-loaded.
 
No, the included section is in their cache, so it does not need to be re-loaded.
Nope. If you mean you're simply doing an <?php include('header.php'); ?>, then the PHP interpreter still has to run through header.php and process it, just like it would for any other request. Doing it this way isn't going to make a jot of difference to how browsers cache your pages; remember, all they see is the outputted HTML source.
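You can see this for yourself with a throwaway file (the temp file here just stands in for header.php; nothing below is your actual site):

```php
<?php
// Demo: an include is resolved entirely on the server, so the browser
// (or a search robot) only ever receives the combined, flat HTML.
$header = tempnam(sys_get_temp_dir(), 'hdr');
file_put_contents($header, "<html><body><h1>Site header</h1>\n");

ob_start();
include $header;                        // PHP executes this on every request...
echo "<p>Page content</p></body></html>\n";
$html = ob_get_clean();                 // ...and only this HTML is sent out

echo $html;   // no trace of the include mechanism survives in the output
unlink($header);
```

What goes down the wire is identical to a single hand-written HTML file, which is why spiders can't tell the difference.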

If you want to use caching for your scripts, you should look at server side caching engines such as PHPAccelerator.
 
All input very welcome, guys; appreciate the comments so far....

If anything it is much easier to manage, but my main area of confusion is which method is better for search engines to pick up on. I'm hoping the second method is better or the same; as long as the first method isn't better for search engines, I'll be happy.

To explain: I'm totally re-writing most of a whole site and reducing code, so I saw the opportunity to use some PHP to make the site easier to maintain at the same time, and any additional benefits are a bonus.

The original code had lots of unneeded HTML: tables within tables within tables, with piles of attributes set in the HTML and not in the stylesheet.

So far the site is visually loading, as far as I can tell, around 20% faster and is now XHTML compliant, whereas the original had several XHTML errors. Much more to be done yet, though.....
 