Quote:
Originally posted by Cyndalie
BUT when you view it from the web, all the content is plugged in, so what you see is NOT necessarily what you have to work with when optimizing dynamic sites. Since engines are robots and not humans calling a page, they can read what the programming code looks like - this used to be applied to cloaking several years ago: show the engine one thing and the user another, but by IP. Here it's because dynamic content reacts when it's CALLED, not necessarily read. This is why, when the code is cached and viewed in a browser, it even looks normal as well.
ColdFusion is a server-side scripting language. The server doesn't know whether the client is a human or a robot. A search engine spider issues essentially the same HTTP request that a human using a browser would issue. There is no special backdoor for robots.
If you want to try it for yourself, open a plain connection to your webserver on port 80 and issue the HTTP commands by hand. Just make sure you hit enter twice after the last line:
GET / HTTP/1.1
Host: www.smutdrs.com
User-Agent: Googlebot/2.1 (+http://www.googlebot.com/bot.html)
You'll see exactly the same thing a spider will see. If you want to play around more with direct connections, read up on the HTTP specifications:
http://www.faqs.org/rfcs/rfc2616.html
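If you'd rather script the same experiment, here is a minimal Python sketch that builds the request above by hand and sends it over a raw socket. The host example.com is a stand-in; substitute your own domain. The response that comes back is the same HTML any browser-based visitor would get.

import socket

HOST = "example.com"  # stand-in host; substitute your own server

# The same raw request a spider sends, built by hand.
# Connection: close is added so the server closes the socket when it is done.
request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {HOST}\r\n"
    "User-Agent: Googlebot/2.1 (+http://www.googlebot.com/bot.html)\r\n"
    "Connection: close\r\n"
    "\r\n"  # the blank line = "hit enter twice"
)

with socket.create_connection((HOST, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.decode("iso-8859-1", errors="replace")[:500])  # headers plus the start of the body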
Quote:
Originally posted by Cyndalie
Most sites consist of both HTML and some dynamic programming language, and the dynamic parts can actually work FOR you. However, if you have a say when working with a programmer, make sure they are not making 100% of your site dynamically generated, or it will become optimization hell. I haven't run across this much, but I look for it before committing to a new site for optimization.
At the same time, if you are the programmer (or you can get the programmer to understand what is important), a fully dynamic site can be an optimization dream. For example, you could adjust the keyword weighting across 100K pages by changing one line in one file.
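As a rough sketch (hypothetical code, not any particular site's setup): if every generated page builds its title and meta tags through one shared function, a single constant controls the keyword weighting for the whole site. KEYWORD_WEIGHT and render_head below are invented names for illustration.

# Hypothetical: one shared function renders the <head> for every
# dynamically generated page, so keyword weighting lives in one place.

KEYWORD_WEIGHT = 3  # invented tuning knob: how many times the keyword appears in the title

def render_head(page_keyword: str) -> str:
    """Build the title and meta keywords tag for a page from its primary keyword."""
    title = " - ".join([page_keyword] * KEYWORD_WEIGHT)  # e.g. "widgets - widgets - widgets"
    return (
        "<title>" + title + "</title>\n"
        '<meta name="keywords" content="' + page_keyword + '">\n'
    )

# Every one of the 100K generated pages calls render_head(), so changing
# KEYWORD_WEIGHT (one line in one file) re-weights the entire site.
print(render_head("blue widgets"))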