Bugtraq mailing list archives

Re: [linux-security] Things NOT to put in root's crontab


From: guenther () gac edu (Philip Guenther)
Date: Thu, 23 May 1996 15:34:22 -0500


Colin Jenkins <jenkins () DPW COM> writes:
> Hopefully I'm not beating this exercise into the ground.  I think your
> pseudocode does not quite work, and there are some inherent problems in
> the approach -- particularly if we assume that some hacker is trying to
> hose up your system (the reason for all of this in the first place).

If they're just trying to hose the system, they can run /tmp out of
inodes and be done with it.  find isn't the problem in that case.  find
does have some problems that can be cleaned up, though, and they should
be.


> William McVey <wam () fedex com> writes:
> > The race condition in find should be eliminable by using fchdir()
> > and passing the '-exec'ed command a simple filename.  You have to keep

Watch your citing: I said that, not William McVey.


> One major problem with this approach is that it assumes that file names
> are passed to -exec directives with the intent of operating on the file
> itself.  This ignores the fact that many -exec directives operate on the
> *file name*, and it may be critical to pass a full pathname.  This
> requirement of passing a full pathname conflicts with the algorithm's
> purpose.

This is true.  Perhaps a "{}" on the command line should be sub'ed with
the relative name and a "{{}}" with the absolute name.  You would just
have to remember that while decisions can be made based on the absolute
name, you should always use the relative name when actually doing
anything to the file.

This whole situation is a lot cleaner in perl using the File::Find
module, as there you can either use $_ (the relative filename) or
$File::Find::name (the absolute filename) in your "wanted" subroutine.
I just whipped up a new version of File::Find that uses fchdir; however,
it won't work until perl can do fchdir -- currently there's no way to
do it in perl, though I've submitted an RFE, and from looking at the
code for perl's stat operator, it shouldn't be too difficult to add.
Maybe I'll do it when I have the time...


> > open one descriptor for each level descended, which should max out at
> > MAXPATHLEN/2.  That should be within the bounds of modern UNIX systems.

> I think the limiting number here has less to do with path length and more
> to do with NOFILE, the maximum number of file descriptors a process can
> have open.  On many systems this is only 256 (except Solaris, at 1024, I
> believe).  Since the attack creates a race condition by deep nesting of
> directories, this algorithm fails completely if the hacker nests the
> directories deeply enough.

If MAXPATHLEN/2 < NOFILE then this shouldn't be a problem.  Remember,
this is _not_ following symlinks, so you can only chase MAXPATHLEN/2
deep (with one-character directory names, the path alternates slashes
and non-slash characters, so each level costs at least two characters).


> > In pseudocode:
> >
> > cur = open argv[1];

> How do you prevent following symlinks in argv[1]?  Also, you shouldn't
> assume that argv[1] is a directory.

This is pseudocode.  You'll note I don't catch errors either.  In real
code you'd do an fstat and branch on that.

Anyway, the code didn't even do what it was supposed to do: it needed
to do an lstat _also_ and compare the dev/ino combo between the fstat
and lstat to make sure it opened the same file it checked for
symlinkhood on.  Someone else posted the correct algorithm in response
to my brain fart.
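A sketch of that corrected algorithm (the function name and errno
choices are my own, not from any posted code): lstat() the name, refuse
anything that isn't a directory, then open() it, fstat() the descriptor,
and only proceed if the dev/ino pairs match.

```c
/* Sketch of the lstat/open/fstat dev+ino comparison described above.
 * The name safe_open_dir and the errno values are my own invention. */
#include <sys/types.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <errno.h>

int
safe_open_dir(const char *name)
{
    struct stat lsb, fsb;
    int fd;

    if (lstat(name, &lsb) < 0)
        return -1;
    if (!S_ISDIR(lsb.st_mode)) {        /* rejects symlinks, too */
        errno = ENOTDIR;
        return -1;
    }
    if ((fd = open(name, O_RDONLY)) < 0)
        return -1;
    if (fstat(fd, &fsb) < 0 ||
        lsb.st_dev != fsb.st_dev || lsb.st_ino != fsb.st_ino) {
        /* something replaced the object between lstat() and open() */
        close(fd);
        errno = EAGAIN;
        return -1;
    }
    return fd;
}
```

Since the dev/ino comparison happens on the already-open descriptor,
there is no window in which the object can be swapped after the check.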


> The bottom line is that find probably could not be modified this way
> without breaking its functionality for other purposes.  Moreover,
> recursive algorithms must always include checks to prevent recursing
> beyond the capabilities of the system they run on.  This is especially
> true where security is concerned.

That's the MAXPATHLEN/2 < NOFILE check.


> I'd suggest that the best solution to the problem is a program written
> specifically for the purpose of deleting or changing files.  Although I
> like recursion in theory, the error recovery problems inherent in deep
> directory nesting are more easily addressed with an iterative approach.

I don't see how you can cover a tree with any efficiency without saving
a hook (here, a file descriptor) to each level of directory as you go in.
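A minimal sketch of what I mean (not find's actual code; the directory
counting is only there for illustration): descend with chdir() on simple
names and climb back with fchdir() on the saved descriptor, so no
directory is ever re-resolved by pathname.

```c
/* Sketch of covering a tree while saving one descriptor ("hook")
 * per level -- not find's actual code.  Returns the number of
 * subdirectories visited; symlinks are never followed. */
#include <sys/types.h>
#include <sys/stat.h>
#include <dirent.h>
#include <fcntl.h>
#include <unistd.h>
#include <string.h>

int
walk(void)
{
    DIR *dirp;
    struct dirent *dp;
    struct stat sb;
    int here, count = 0;

    if ((here = open(".", O_RDONLY)) < 0)  /* the hook for this level */
        return 0;
    if ((dirp = opendir(".")) == NULL) {
        close(here);
        return 0;
    }
    while ((dp = readdir(dirp)) != NULL) {
        if (strcmp(dp->d_name, ".") == 0 || strcmp(dp->d_name, "..") == 0)
            continue;
        if (lstat(dp->d_name, &sb) < 0 || !S_ISDIR(sb.st_mode))
            continue;                      /* skip symlinks and files */
        if (chdir(dp->d_name) == 0) {      /* one simple name, no path */
            count += 1 + walk();
            fchdir(here);                  /* climb back via the fd */
        }
    }
    closedir(dirp);
    close(here);
    return count;
}
```

Note that the recursion holds one descriptor per level on the way down,
which is exactly why the MAXPATHLEN/2 < NOFILE question above matters.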


Philip Guenther

----------------------------------------------------------------
Philip Guenther                 UNIX Systems and Network Administrator
Internet: guenther () gac edu      Voicenet: (507) 933-7596
Gustavus Adolphus College       St. Peter, MN 56082-1498
I am _not_ a representative sample of the Gustavus Community.  Yeah, right...
Source code never lies (it just misleads).  (Programming by Purloined Letter?)


