I use http://www.auditmypc.com/xml-sitemap.asp to check whether a URL gets crawled or not.
http://www.domain.com/
does NOT get crawled
http://www.domain.com/bg/
does NOT get crawled
http://www.domain.com/bg/medical/
DOES get crawled (unleashes heavy crawling activity)
Also, for /bg/medical/ it matters for some reason whether the URL has the www prefix. If it does not, crawling does not work there either.
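To see exactly what a crawler receives for each of these URLs, the raw status code and Location header can be inspected without following redirects. Below is a minimal PHP sketch of such a check; it assumes the php-curl extension is available, and the hostnames are just the placeholders used above:
<?php
// Print status code plus any Location header for each URL, without following redirects.
$urls = array(
    'http://www.domain.com/',
    'http://www.domain.com/bg/',
    'http://www.domain.com/bg/medical/',
    'http://domain.com/bg/medical/',   // same page without www
);
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // headers only
    curl_setopt($ch, CURLOPT_HEADER, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // show the redirect, do not follow it
    $headers = curl_exec($ch);
    $status  = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    $location = preg_match('/^Location:\s*(.+)$/mi', $headers, $m) ? trim($m[1]) : '-';
    echo "$url -> $status (Location: $location)\n";
}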
The .htaccess file:
RewriteEngine on
RewriteBase /
# force www: 301-redirect domain.com/... to www.domain.com/...
RewriteCond %{http_host} ^domain.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
# anything that is not an existing file or directory goes to the router
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^.*$ site/lib/router.php
The relevant parts of the router.php file:
@list($uri, $queryString) = explode('?', $_SERVER['REQUEST_URI'], 2);
$uri = trim($uri, '/');
$_GET = array();
parse_str($queryString, $_GET);
// empty path: send the visitor to the default language home page
if (empty($uri)) {
    header('Location: /bg/home');
    die();
}
@list($first, $second, $third, $fourth) = explode('/', $uri, 4);
///////////// A VARIETY OF SWITCHES FOLLOW HERE, along these lines:
if (empty($second)) {
    switch ($first) {
        // no break needed: redirect() ends the script with die()
        case 'bg':
            redirect('/' . $first . '/home/');
        case 'en':
            redirect('/' . $first . '/home/');
    }
}
I think it's somewhere in the redirects that things get lost. But I can't figure out why. Any help would be appreciated.
robots.txt:
User-agent: *
Allow: /
and the definition of the redirect function:
function redirect($url) {
header("Location: $url");
die();
}
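For comparison only, not necessarily the fix: header('Location: ...') sends a 302 by default, and the Location value above is a relative path, while the HTTP/1.1 spec of the time (RFC 2616) defines Location as an absolute URI. A variant that makes both explicit could look like the sketch below; the host name is just the placeholder domain used throughout this question.
function redirect_absolute($path, $status = 301) {
    // build an absolute URL; the host here is an assumption (placeholder domain)
    $url = 'http://www.domain.com' . $path;
    // the third argument to header() sets the response code instead of the default 302
    header('Location: ' . $url, true, $status);
    die();
}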
EDIT: in case this also helps:
Response header fields:
HTTP/1.1 302 Moved Temporarily
Date: Thu, 13 Jan 2011 11:02:13 GMT
Content-Length: 0
Location: /bg/home
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: text/html
Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
X-Powered-By: PHP/5.2.11