Quote:
Originally posted by Cyndalie
The search engines see what code is on your page when you upload it to the server, not what you see when you View Source on the page from the web.
|
100% wrong. A search engine sees exactly what you see when you view the source of your page. There's no way for a spider to see the "uploaded" version of the page: the server resolves any includes before the response ever leaves it. A search engine's spider makes the same type of HTTP request as a browser, so it receives exactly the same HTML.
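If you want to prove this to yourself, just make the request the way a spider does. Here's a minimal sketch; the URL and the User-Agent string are placeholders, not anything from the site discussed above:

import urllib.request

URL = "http://www.example.com/somepage.html"  # placeholder: any page you want to check

request = urllib.request.Request(
    URL,
    headers={"User-Agent": "ExampleSpider/1.0"},  # spiders just send a different User-Agent
)
with urllib.request.urlopen(request) as response:
    charset = response.headers.get_content_charset() or "utf-8"
    html = response.read().decode(charset, errors="replace")

# This is exactly the markup a spider indexes; any server-side includes
# were already merged in by the server before the bytes went out.
print(html)

Compare the output to View Source in your browser and you'll see they are identical.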
Quote:
Originally posted by Cyndalie
If your code is all includes, you have no chance of ranking that site unless you can plug in some static content, either hidden in code or visible on the actual page.
|
Do NOT try to hide text. Hidden text is just about the worst offense you can commit in the eyes of a search engine. Search engines want to deliver surfers to a page that is relevant to the term they searched for. As such, they want to index a page based on what the surfer will see, not based on the words the webmaster wants to get traffic from. Let me repeat one more time: DO NOT HIDE TEXT.
Quote:
Originally posted by Cyndalie
Trust me, I know. I've been battling it out with a ColdFusion site that is 100% includes and HTML content plugins, where the only pages I can get ranked are the static promotional pages. What you see from the web has no bearing on what the SE sees. An example is http://SmutDrs.com - every piece of code on that site is from an include. Not one bit of static content. Can you tell? Only the webmaster truly knows.
|
Taking a quick look at your site, I don't think your problem is includes. Most of the pages I looked at seem to have very little actual text on them. The pages that do have text tend to have the most spider-friendly parts at the end of the page, after lots of JavaScript and page-layout junk. Your title and meta tags also appear a bit "stuffed" and are the same on every page.
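You can check that last point in a minute with a rough sketch like this; the URLs below are placeholders, so swap in the pages you actually want to compare:

import re
import urllib.request

# Placeholder URLs; replace with real pages from your site.
URLS = [
    "http://www.example.com/",
    "http://www.example.com/page-a.html",
    "http://www.example.com/page-b.html",
]

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

for url in URLS:
    html = fetch(url)
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
                     html, re.I | re.S)
    print(url)
    print("  title:", title.group(1).strip() if title else "(none)")
    print("  desc:", desc.group(1).strip() if desc else "(none)")

# If every page prints the same long, keyword-packed title and description,
# that's the "stuffed and identical on every page" problem described above.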
Quote:
Originally posted by Cyndalie
SE's cannot spider or index included content yet. They are just now becoming able to spider and index Flash movies, however: Google can do links, and AllTheWeb.com can index links and actual text in the Flash movie.
|
Again, includes are not a problem and never were. A few years back, search engines tended to avoid pages that looked dynamic (i.e. had query strings or dynamic extensions), but that was mainly due to a fear of the spider getting stuck in a recursive loop, not from any inability to read them.
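To see why recursive loops were the worry, here's a toy illustration (not any real engine's crawler) of a dynamic page whose query string hands a naive spider an endless supply of "new" URLs:

# Pretend dynamic page: every "page" links to the next one via a query string.
def render_page(page_number):
    return '<html><body><a href="/list?page=%d">next</a></body></html>' % (page_number + 1)

def naive_crawl(start_page, max_pages=10):
    seen = []
    page = start_page
    while len(seen) < max_pages:      # without this cap the loop never ends
        seen.append("/list?page=%d" % page)
        render_page(page)             # "fetch" the page
        page += 1                     # follow the only link on it
    return seen

print(naive_crawl(1))                 # the URL space is infinite; only the cap stops it

That risk is about URL structure, not about whether the page happens to be built from includes.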
Quote:
Originally posted by Cyndalie
Just don't include the header content; the search engines need the head tags static in order to read the title and metas. Anything else on the page has to be in the static CODE to be read by SE's. SE's cannot spider includes or the content within them yet.
|
There are no special requirements for the header of a page as compared to the body. There is no problem using a dynamic header as long as it's done server-side.
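Here's a minimal sketch of what "done server-side" means. The header fragment and file contents are made up, standing in for a shared include such as an SSI directive or a ColdFusion <cfinclude> template:

# Made-up header fragment, standing in for a shared include file.
HEADER_INC = """<head>
<title>Widgets - Acme Example Co.</title>
<meta name="description" content="A made-up description for this one page.">
</head>"""

BODY = "<body><h1>Widgets</h1><p>Real, visible page copy goes here.</p></body>"

def render_page(header, body):
    # The include is resolved on the server, before anything leaves it.
    return "<html>\n%s\n%s\n</html>" % (header, body)

# A browser and a spider both receive this finished document; neither can
# tell, or cares, that the <head> came from an include.
print(render_page(HEADER_INC, BODY))

The spider only ever sees the assembled result, so a dynamic head section reads exactly like a static one.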
If you want to learn more about search engine optimization, check out:
http://www.searchenginewatch.com
http://www.webmasterworld.com
http://www.searchengineforums.com
One last time, do NOT hide text. It's like begging to be banned.