I need to enable some access to the httpd logs over ftp so they can be analyzed by another application to get a report. I used to do this on Windows NT before replacing the server with CentOS.
Thanks to help from another thread I have an ftp server enabled on the web server. I thought the easiest thing to do would be to create an id for the application to connect with, then provide a symlink to the logs in that generic user's home directory.
The problem is the logs are owned by root. How can I make them readable by this generic id without completely compromising security? Plus, as the logs rotate this id will still need access.
Any suggestions?
Thanks, James
On Thu, Dec 01, 2005 at 01:07:20PM -0500, James Pifer enlightened us:
Have a cron job running as root that copies the necessary files someplace your special id can get to, and chowns them to that id.
Serve this location via ftp.
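Something like this would do it; a rough sketch, assuming the logs live in /var/log/httpd and the generic account is called 'logftp' (both names are just examples, adjust for your setup):

    #!/bin/sh
    # /etc/cron.daily/copy-httpd-logs
    # Copy the Apache logs somewhere the generic ftp account can read them.
    SRC=/var/log/httpd
    DEST=/home/logftp/logs
    mkdir -p "$DEST"
    cp -p "$SRC"/access_log* "$DEST"/
    chown logftp:logftp "$DEST"/*
    chmod 640 "$DEST"/*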
On Thu, 2005-12-01 at 13:13, Matt Hyclak wrote:
Have a cron job running as root that copies the necessary files someplace your special id can get to, and chowns them to that id.
Serve this location via ftp.
Or, run the analyzer in the same cron job. Analog or webalizer might do what you need. If you have to combine logs from several machines you'll probably want to script ssh commands to pull them together first.
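For example (a sketch only -- the hostnames and paths are made up, and this assumes root ssh logins are allowed between the machines):

    #!/bin/sh
    # Pull the logs from each web server, combine them, and analyze.
    DEST=/var/log/weblogs
    for host in web1 web2; do
        scp -q root@$host:/var/log/httpd/access_log "$DEST/access_log.$host"
    done
    cat "$DEST"/access_log.* > "$DEST"/combined_log
    webalizer -o /var/www/html/stats "$DEST"/combined_log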
James Pifer wrote on Thu, 01 Dec 2005 13:07:20 -0500:
The problem is the logs are owned by root. How can I make them readable by this generic id without completely compromising security?
You can either change the owner or make the logs readable by more than just root. I don't see this as a security problem.
Plus, as the logs rotate this id will still need access.
This can be accommodated with the logrotate configuration, assuming logrotate is what rotates your logs (I don't know your setup). Look in /etc/logrotate.d for the apache file. You can change the owner, chmod, etc. in that file.
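For example, something along these lines (a sketch based on the stock CentOS httpd entry; 'logftp' stands in for whatever your generic account is called):

    /var/log/httpd/*log {
        missingok
        notifempty
        sharedscripts
        # each new log is created group-readable by the generic account
        create 0640 root logftp
        postrotate
            /bin/kill -HUP `cat /var/run/httpd.pid 2>/dev/null` 2>/dev/null || true
        endscript
    }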
Kai
James Pifer jep@obrien-pifer.com wrote:
I need to enable some access to the httpd logs over ftp so they can be analyzed by another application... How can I make them readable by this generic id without completely compromising security? ... Any suggestions?
I know you just set up FTP, but consider using SSH instead.
First off, access to the logs is solved by always running the process as root on the end system. There is no reduced security in doing this.
Secondly, set up one regular user on one system where you want the logs localized for processing. Then have the root user of each system scp the log file to that one system as that one regular user. You'll want to use public key authentication (or a Kerberos realm, if you want to avoid generating and/or copying keys for each system).
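The key setup is only a few commands. A sketch, where 'loguser' and 'loghost' stand in for your collection account and system:

    # On each web server, as root: generate a key once and install the
    # public half on the collection account.
    ssh-keygen -t rsa -N '' -f /root/.ssh/id_rsa_logs
    ssh-copy-id -i /root/.ssh/id_rsa_logs.pub loguser@loghost

    # Then push the log from root's crontab, e.g. nightly:
    scp -i /root/.ssh/id_rsa_logs /var/log/httpd/access_log \
        loguser@loghost:logs/access_log.$(hostname -s)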
If you're after a more formal setup, checking the log files into CVS or another version control or data collection repository might be ideal. For CVS (and several others), you can use the SSH login.
The analyzing software runs on Windows. Its connection options for reading logs are file, http, or ftp. What's worse, I just found that it apparently does not support passive ftp. I'm trying to get vsftpd to do active mode, but either I'm not configuring it right or, more likely, the firewall is messing it up. I used to run the Windows ftp server to provide the logs when everything ran on Windows, and ftp'ing was no problem.
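In case it helps, here's roughly what I'm trying in vsftpd and on the firewall (a sketch of my understanding of the active-mode settings, not a config I've gotten working yet):

    # /etc/vsftpd/vsftpd.conf -- active-mode pieces
    port_enable=YES            # allow active (PORT) data connections
    connect_from_port_20=YES   # originate data connections from port 20
    pasv_enable=NO             # optionally refuse passive mode entirely

    # On the iptables firewall, the FTP connection-tracking helper has
    # to be loaded so the related data connections get through:
    modprobe ip_conntrack_ftp
    iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT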
Anyway, that's where I'm at right now.
James
On 12/1/05, James Pifer jep@obrien-pifer.com wrote:
The analyzing software runs on Windows. Its connection options for reading logs are file, http, or ftp. ... Anyway, that's where I'm at right now.
I've found that the logfile analyzer tools on Linux are FAR superior to the Windows ones. We ended up sending all the log files from our Windows servers to a consolidated drop on a Linux box and running a Linux-based analyzer. It gives better information, more detail, etc. Look at things like Splunk, AWStats, mod_log_sql (which lets you do live "top links" type things), and any of several other tools available.
-- Jim Perrin System Architect - UIT Ft Gordon & US Army Signal Center
I'm definitely not stuck on the tool we have. I'll check out the couple you mention. If you have a recommendation that would be great. We pretty much look for general stats, what pages are being hit and how often, etc.
Thanks, James
Splunk is a reasonably new tool that makes log files searchable, similar to a Google query of your log files. http://www.splunk.com/
mod_log_sql isn't so much an analyzer as a different method of storing log files. Instead of dumping entries into a flat text file and parsing them later, it stores them directly in an SQL database. You can query the database however you want and gather any information you want. Since entries are inserted in real time, you can (assuming enough horsepower on the system) query it in real time to see what people are doing at that instant. One site we have adjusts a "top rated links" list based on what people are doing at the time.
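For instance, a "top pages" query can be as simple as this (a sketch; the database name and account are made up, and the table/column names follow mod_log_sql's defaults as I remember them, so check your own schema):

    # Ten most-requested URIs, straight from the live log table.
    mysql -u reader -p apachelogs -e "
        SELECT request_uri, COUNT(*) AS hits
        FROM access_log
        GROUP BY request_uri
        ORDER BY hits DESC
        LIMIT 10;"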
awstats is excellent for general logfile analysis of web traffic. There are tons on freshmeat/sourceforge that I haven't mentioned or don't know about that are probably excellent as well.
-- Jim Perrin System Architect - UIT Ft Gordon & US Army Signal Center
James Pifer jep@obrien-pifer.com wrote:
The analyzing software runs on Windows.
Setting up cygwin with the SSH service is easy on NT.
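From memory, it's roughly this (a sketch -- check the Cygwin docs for your version):

    # From a Cygwin bash shell on the NT box, after installing the
    # openssh package with Cygwin's setup.exe:
    ssh-host-config -y    # generate host keys, install the sshd service
    cygrunsrv -S sshd     # start the service

    # Then, from the Linux side (the path is just an example):
    scp /var/log/httpd/access_log user@ntbox:/cygdrive/c/logs/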
Its connection options for reading logs are file, http, or ftp.
Then use file. Use another, automated process on your servers to send the files to that system -- or at least to an intermediate system that the analyzing system then pulls from via HTTP or FTP.
What's worse, I just found that it apparently does not support passive ftp. I'm trying to get vsftpd to do active mode, but either I'm not configuring it right or, more likely, the firewall is messing it up.
Another reason to consider SSH.
Have your Internet systems scp the files to an SSH server on your LAN or in your DMZ. Run the SSH server on a port other than 22, and _only_ allow public key authentication (or Kerberos, if you wish to set that up instead of maintaining SSH key rings).
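The relevant sshd_config lines would be something like this (a sketch; the port number is arbitrary):

    # /etc/ssh/sshd_config
    Port 2222                    # anything other than 22
    PubkeyAuthentication yes
    PasswordAuthentication no    # public key only
    PermitRootLogin no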
I used to run the Windows ftp server to provide the logs when everything ran on Windows, and ftp'ing was no problem.
Then keep that system to FTP from, and just install Cygwin with the SSH service. I assume this is on your LAN (which probably means this is more of a firewall issue -- and not the FTP service on the systems outside the firewall).
BTW, if you're running ADS, you can use its Kerberos service for SSH authentication! You only need to open (or proxy/redirect) port 88 for the external systems, although there might be some security considerations in that regard.
I.e., maybe your Windows FTP server is a new DC, with its own domain (separate from your LAN), and that's where you have the Kerberos authentication/trusts (possibly in your DMZ)?
Anyway, that's where I'm at right now.
Golden Rule: Do _not_ let the limitations of an application dictate your end-to-end security. Shortcut the ends if needed, put the less secure/more problematic points on your LAN, but keep your Internet traffic secure, and easier to manage at the same time. ;->
-- Bryan
P.S. This would be so much easier to diagram on a whiteboard.
On Thu, 2005-12-01 at 15:08, James Pifer wrote:
The analyzing software runs on Windows. ... I'm trying to get vsftpd to do active mode, but either I'm not configuring it right or, more likely, the firewall is messing it up.
You could install the Cygwin sshd on the Windows box and scp the files there in a cron job -- or to an intermediate box in the right place with respect to your firewalls. The part no one has mentioned yet is that these files will keep growing unless you coordinate the logrotate step with the transfer or, perhaps more cleanly, have apache pipe the logs to its 'rotatelogs' helper, which creates new timestamped files at specified intervals. You can run rsync against these to pick up whatever is new.
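For example (a sketch; the paths and hostname are placeholders):

    # In httpd.conf: start a new timestamped access log every 24 hours
    # (86400 seconds) via the rotatelogs helper that ships with apache.
    CustomLog "|/usr/sbin/rotatelogs /var/log/httpd/access_log.%Y%m%d 86400" combined

    # On the collecting box: pick up whatever is new over ssh.
    rsync -av root@webserver:/var/log/httpd/access_log.* /var/log/weblogs/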