Preventing direct access to include files

greg:

I have a few ideas, but wondering what you consider to be the best method.

Currently as I have very few include files, I do this in each file:

<?php
// Redirect anyone who requests this include file directly.
if ($_SERVER['SCRIPT_NAME'] == "/incs/the_filename.php") {
    header('Location: /not_found.php');
    exit;
}
?>

So if the file is viewed directly, the user is sent to the "not found" page.

If I deny access in .htaccess, I presume scripts trying to include those files will also be denied?

pr0gr4mm3r:

A deny in .htaccess won't affect your scripts: PHP's include() reads files from the filesystem, not through Apache, so only direct HTTP requests are blocked. I just put my include files in a folder that is not accessible by Apache. That doesn't mean it has to be outside the web root, but I would at least put in an .htaccess file with Deny from all in the file.
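
For example, a minimal sketch (assuming the includes live in a directory such as /incs/):

# /incs/.htaccess: deny all direct HTTP requests to this directory.
# PHP's include() is unaffected, since it reads the files from disk.
Order allow,deny
Deny from all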

decibel.places:

pr0gr4mm3r wrote:
I would at least put in an .htaccess file with Deny from all in the file.

Obviously you meant in the folder/directory, not "in the file".

greg and I know what you mean, but it could confuse n00bs ;)

I'm wondering about the paranoia level here. Presumably you're nervous about someone accessing includes to figure out vulnerabilities etc. But the baddies will only know the filenames if they can view the PHP source of the file that includes them, not the rendered HTML sent to the browser. What is the likelihood of guessing a directory name AND a filename?

Here is how Drupal secures the inc files in .htaccess:

# Protect files and directories from prying eyes.
<FilesMatch "\.(engine|inc|info|install|module|profile|po|sh|.*sql|theme|tpl(\.php)?|xtmpl)$|^(code-style\.pl|Entries.*|Repository|Root|Tag|Template)$">
  Order allow,deny
</FilesMatch>

(With Order allow,deny and no Allow directive inside the block, Apache's default is to deny every request, so no explicit Deny line is needed.)

Try it: http://drupal.org/includes/session.inc

greg:

decibel.places wrote:
I'm wondering about the paranoia level here. Presumably you're nervous about someone accessing includes to figure out vulnerabilities etc.
It's not about security.
All my code is checked and limited to only what people can do through normal usage.
So they could try to tamper with any file or include and would get nowhere, as a session or POST/GET variable won't be set or won't contain allowable data, so they will be redirected.

But some of the includes exist purely for include purposes, such as a file that emails me when a certain error occurs. As this happens very rarely, I keep the email script in an include file to avoid cluttering the main file with code that is redundant 99% of the time.

So while security isn't really an issue, anyone accessing that email file directly would send me an email. It's the same with the other include files: one is a specific menu included on some pages, and accessing it directly would show the menu and nothing else. Again, it's not a security issue, but people have no reason to be there.

Just like your local corner shop doesn't want you going behind the counter. There is no need for you to be there.

decibel.places wrote:
... if they can view the PHP source of the file that includes them
They shouldn't be able to do that. Even if they could, they cannot bypass anything by knowing my variable names or what I do, as everything is checked strictly.
That said, of course I wouldn't want to tempt fate ;)

greg:

Hmm, double protection.

So now I have an .htaccess file in the incs directory like this:

<Files "*">
Order allow,deny
Deny from all
</Files>

That causes an Apache 403, which the main .htaccess file in the site root redirects to the "not found" page.
I think telling people "not found" is better than "permission denied", because the latter confirms the file exists, and people don't need to know these files exist.
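
For reference, that root-level redirect can be done with an ErrorDocument directive, something like this (a sketch; the page path is assumed):

# Site-root .htaccess: serve the "not found" page for any 403 response.
ErrorDocument 403 /not_found.php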

I also still keep the per-file blocking as a backup:

<?php
// Backup check: redirect anyone who requests this include file directly.
if ($_SERVER['SCRIPT_NAME'] == "/incs/the_filename.php") {
    header('Location: /not_found.php');
    exit;
}
?>
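
A common alternative, by the way (just a sketch; the constant name here is made up), is to define a constant in the parent script and have each include bail out when it's missing:

<?php
// In the parent script, before any includes:
// define('IN_SITE', true);

// At the top of each include file:
if (!defined('IN_SITE')) {
    header('Location: /not_found.php');
    exit;
}
?>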

I think those two will be enough, cheers!
Am I correct in thinking this will block robots too?

pr0gr4mm3r:

greg wrote:
Am I correct in thinking this will block robots too?

Bots can't get anywhere the average user can't get to.

decibel.places:

Re: bots

It wouldn't hurt to add a robots.txt file and meta tags for the bots that behave...
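
Something like this, for example (a sketch; the directory name is assumed from greg's setup):

# robots.txt in the site root: asks well-behaved crawlers to skip the includes directory.
User-agent: *
Disallow: /incs/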

greg:

I always have a robots.txt to keep bots in order, but humans and bad bots don't follow it.
