I manage a web hosting server that we've recently upgraded, in part so we could accommodate a domain that will enable community mapping. In a recent exchange of mails one developer said:
"I could build the package directly on the server machine you have, provided that the potential security risk posed by having compilers installed is not an issue."
and another said:
"What sort of security risk is there in having compilers installed on a working server?
"Obviously we can remove the compilers, however when Mapserver or postgis get updated, we will need to build new packages somewhere. One option: create a second VM for mapchat. We'll put the build environment on it, and only turn it on to make new packages."
I don't have enough experience to assess the security issues. Does anyone have an opinion on this? It would be simple and feasible to allocate another domain as suggested above.
Dave
Dave Stevens wrote:
I don't have enough experience to assess the security issues. Does anyone have an opinion on this? It would be simple and feasible to allocate another domain as suggested above.
Unless you're running an obscure platform, having a compiler on the system shouldn't be a big deal; if you can upload source code, you can upload a precompiled binary.
nate
On 3/6/2010 4:04 PM, nate wrote:
if you can upload source code, you can upload a precompiled binary
True, but most attacks are automated, and try to attack as wide a range of machines as possible.
If I were to write a bit of malware for *ix that needed a custom binary on the target machine, I'd at least consider distributing it as C code, banking on the fact that most *ix systems have a C compiler installed by default these days.
The core assumption here is that it's easier to write C code for an *ix system that will compile on a wide range of OSes than it is to craft a binary that will run on as many systems. One of the biggest problems in the *ix world is a reliance on source-level compatibility. Other OSes -- Windows in particular -- take a different tack, providing ABI-level compatibility over the course of decades. That has pluses and minuses. For a malware writer, it means it's far more reliable to distribute binaries than C code.
That being said, I always find it to be a colossal PITA to work on an *ix system without a C compiler. Again, source vs. ABI-level compatibility. Too often, I need to install something that isn't available as a binary package for that particular system, or I need it to install in a nonstandard way, so I have to build from source.
You might find that this is one of those security risks you're prepared to accept. Just because you identify a risk doesn't mean you have to defend against it. You should always do the cost-benefit calculation before you decide.
On 3/6/2010 4:04 PM, nate wrote:
if you can upload source code, you can upload a precompiled binary
True, but most attacks are automated, and try to attack as wide a range of machines as possible.
If I were to write a bit of malware for *ix that needed a custom binary on the target machine, I'd at least consider distributing it as C code, banking on the fact that most *ix systems have a C compiler installed by default these days.
<snip> Which is why, for the 10 or 11 years that I've used a Linux box as a firewall/router at home, it has had almost *nothing* on it, and that was before I ran Bastille against it. I intended it as a cheap (old hardware; the second one was scrounged) firewall/router, and *nothing* *else*. So, when I built it: no compilers, no languages (other than things like perl and awk and shells), no X... and only one user other than the system users (me).
mark
On Mon, Mar 08, 2010 at 07:34:14AM -0700, Warren Young wrote:
On 3/6/2010 4:04 PM, nate wrote:
if you can upload source code, you can upload a precompiled binary
True, but most attacks are automated, and try to attack as wide a range of machines as possible.
If I were to write a bit of malware for *ix that needed a custom binary on the target machine, I'd at least consider distributing it as C code, banking on the fact that most *ix systems have a C compiler installed by default these days.
It is no longer just the C compiler. Perl, Python, Ruby, PHP, even bash, all have rich libraries and can do more, more quickly, than most people can accomplish with a C program, and with more portability too.
It makes sense to have a good firewall that limits all inbound and outbound paths, as well as a proxy server for outgoing connections and other tools that reduce the system's footprint.
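To make that concrete, here is a rough sketch of outbound filtering with iptables; the proxy address and port are placeholders, and the actual rules would depend on what the server legitimately needs to reach:

    # default-deny for outbound traffic
    iptables -P OUTPUT DROP
    # allow loopback and replies on established connections
    iptables -A OUTPUT -o lo -j ACCEPT
    iptables -A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    # allow DNS lookups and the outgoing proxy (192.0.2.10:3128 is a placeholder)
    iptables -A OUTPUT -p udp --dport 53 -j ACCEPT
    iptables -A OUTPUT -p tcp -d 192.0.2.10 --dport 3128 -j ACCEPT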
Logs and management should involve another box, so that the sysadmin folk have a safe and separate place to do their job from.
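For the logging part, a single line in /etc/syslog.conf (on CentOS 5) is enough to forward a copy of everything to a separate log host; "loghost.example.com" is a placeholder:

    # /etc/syslog.conf: send a copy of all messages to the remote log box (UDP 514)
    *.*     @loghost.example.com

Then restart the daemon with "service syslog restart" so the change takes effect.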
On Sat, Mar 6, 2010 at 6:02 PM, Dave Stevens geek@uniserve.com wrote:
I don't have enough experience to assess the security issues. Does anyone have an opinion on this? It would be simple and feasible to allocate another domain as suggested above.
The compilers themselves aren't really a security risk, but IF someone gets into your system, there's no need to provide them with tools they can use to do their dastardly deeds. I'm a minimalist when it comes to my production systems. Not having extraneous packages on the system means (ostensibly) less patching and fewer applications with potential holes, which in turn means less attack surface, etc.
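If you do go the minimalist route, stripping the toolchain off a CentOS box is quick; this is only a sketch, and the exact package list on your server will vary:

    # see which build tools are actually installed
    rpm -qa | egrep -i 'gcc|make|rpm-build'
    # remove them (yum will show dependent packages before acting)
    yum remove gcc gcc-c++ make rpm-build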
I don't have enough experience to assess the security issues. Does anyone have an opinion on this? It would be simple and feasible to allocate another domain as suggested above.
As was stated by others, the compiler itself isn't any more of a security risk than any other tool. If a hacker can get root, he can just as easily upload binary packages as he can compile source.
That said, I'd still recommend running a second VM as a build environment. That way, if for some reason an update to those custom packages somehow horribly breaks the entire OS (don't laugh, I've seen it happen), it's only the build environment you've trashed and not the production environment.
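One possible workflow for that build VM (the SRPM name and paths here are only examples) is to rebuild the source RPM there and copy nothing but the finished binary package to production:

    # on the build VM: install the toolchain and rebuild the SRPM
    yum install gcc make rpm-build         # plus whatever -devel packages the spec's BuildRequires lists
    rpmbuild --rebuild mapserver-5.6.1-1.src.rpm    # hypothetical SRPM name

    # copy the resulting package to the production host and install it
    # (output path depends on %_topdir; often /usr/src/redhat on CentOS 5)
    scp /usr/src/redhat/RPMS/x86_64/mapserver-5.6.1-1.x86_64.rpm prod:/tmp/
    ssh prod 'rpm -Uvh /tmp/mapserver-5.6.1-1.x86_64.rpm'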
As was stated by others, the compiler itself isn't any more of a security risk than any other tool. If a hacker can get root, he can just as easily upload binary packages as he can compile source.
It is still a wise decision not to have the compiler installed if it can be avoided. Any hacker who is not at a senior or high-end intermediate level of expertise will not have all the different versions of his rootkit and other tools readily available for all the different OS distros and kernels that he'll find on the Internet, so I'd say that most hackers cannot just as easily upload binary packages, because of the wide array of builds they'd need. Admittedly, since CentOS/RHEL is such a big presence, there is a higher likelihood that he'd have the right tools in a binary package at hand, but he'll still have to expend more time and effort, not to mention that the uploads are more likely to be noticed.
Raising the bar, even in little increments, is a basic tenet of systems security. Never dismiss the power of baby steps.
-geoff
---------------------------------
Geoff Galitz
Blankenheim NRW, Germany
http://www.galitz.org/
http://german-way.com/blog/
Geoff Galitz wrote:
Raising the bar, even in little increments, is a basic tenet of systems security. Never dismiss the power of baby steps.
Keep in mind diminishing returns with those baby steps. Of the ~500-600 systems I've worked on over the past 10 years, the only ones that were confirmed to be compromised were ones that were placed directly on the internet (not by me) and weren't kept up to date with patches. I think I worked on 3 such systems.
- keep up to date on patches
- if on the internet, lock ssh down to ssh key auth only, and try to run a tight firewall on other ports (a rough sketch follows below)
- don't allow untrusted local accounts
- run only well-tested programs (especially when it comes to webapps) with a good track record wherever possible
- if at all possible, do not put any server directly on the internet (98% of my systems reside behind load balancers, which is a form of firewall since only ports that are specifically opened are allowed through)
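For the ssh and firewall items, here is a minimal sketch; the management network 192.0.2.0/24 and the open port 80 are placeholders for whatever your server actually needs:

    # /etc/ssh/sshd_config: key authentication only, no direct root logins
    PasswordAuthentication no
    PermitRootLogin no

    # reload sshd after editing (keep an existing session open in case of typos)
    service sshd reload

    # allow ssh only from the management network, open the real service ports, drop the rest
    iptables -A INPUT -i lo -j ACCEPT
    iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    iptables -A INPUT -p tcp -s 192.0.2.0/24 --dport 22 -j ACCEPT
    iptables -A INPUT -p tcp --dport 80 -j ACCEPT
    iptables -P INPUT DROP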
To date I haven't needed things like NIDS/HIDS (too many false positives) or things like SELinux (PITA). After this long, and so many systems, I don't think luck plays a big role at this point. The servers I manage for my employer receive roughly 2 billion web hits per day.
If you can manage those things, the chance of being compromised is practically zero, barring some remote evil organization that has bad guys specifically out to get you.
nate
On Sunday 07 March 2010 03:35:43 pm nate wrote:
The servers I manage for my employer receive roughly 2 billion web hits per day.
2 billion per day? That's over 23,000 hits per second, on average. How many servers do you actually have behind load balancers to deal with this kind of activity? And also, what are you maintaining? Google?
Best, :-) Marko
On Sun, 2010-03-07 at 17:24 +0000, Marko Vojinovic wrote:
On Sunday 07 March 2010 03:35:43 pm nate wrote:
The servers I manage for my employer receive roughly 2 billion web hits per day.
2 billion per day? That's over 23,000 hits per second, on average. How many servers do you actually have behind load balancers to deal with this kind of activity? And also, what are you maintaining? Google?
And do your ssh machines get hammered? I see that daily on some of my clients. The longest attack I have yet seen in the log files was 37 hours straight. All use key auth also, though. These do not have anything such as fail2ban or iptables rate limiting; I have always seen those types of schemes as hindering security, thus adding more problems. I will admit the machine that was attacked for 37 hours was running CentOS 5.2. So someone at CentOS is doing something right, or I was.
John
On Sat, Mar 6, 2010 at 6:02 PM, Dave Stevens geek@uniserve.com wrote:
I manage a web hosting server that we've recently upgraded, in part so we could accommodate a domain that will enable community mapping. In a recent exchange of mails one developer said:
"I could build the package directly on the server machine you have, provided that the potential security risk posed by having compilers installed is not an issue."
and another said:
"What sort of security risk is there in having compilers installed on a working server?
"Obviously we can remove the compilers, however when Mapserver or postgis get updated, we will need to build new packages somewhere. One option: create a second VM for mapchat. We'll put the build environment on it, and only turn it on to make new packages."
I don't have enough experience to assess the security issues. Does anyone have an opinion on this? It would be simple and feasible to allocate another domain as suggested above.
Just playing Devil's advocate here...
It's conceivable that there could be kernel-specific code that would need to be compiled specifically for a particular system. For example, an exploit in a kernel module loader may need to be compiled. If someone had to deliver this exploit to many systems, they could rely upon the ability to compile the code rather than pushing a binary module. The former could very well be hidden in some other vector, but the latter would likely trip signature or other scanners.
I'd generally agree with the others, though, that installing the compilers is not in itself a great security risk, provided the system is sufficiently locked down (e.g., maybe use SELinux in addition to basic Unix permissions to prevent running them from the web accounts, etc.).
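As one concrete example of "locked down", a low-tech option is to restrict who can execute the compiler at all. This is only a sketch (the "builduser" account is a placeholder), and it raises the bar rather than providing any guarantee:

    # let only members of a dedicated group run gcc
    groupadd compilers
    usermod -a -G compilers builduser        # hypothetical build account
    chgrp compilers /usr/bin/gcc /usr/bin/cc
    chmod 0750 /usr/bin/gcc /usr/bin/cc

Note that the permissions will be reset the next time the gcc package is updated, so this needs to be reapplied (or enforced some other way) after updates.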
Kwan Lowe wrote:
On Sat, Mar 6, 2010 at 6:02 PM, Dave Stevens geek@uniserve.com wrote:
I manage a web hosting server that we've recently upgraded, in part so we could accommodate a domain that will enable community mapping. In a recent exchange of mails one developer said:
"I could build the package directly on the server machine you have, provided that the potential security risk posed by having compilers installed is not an issue."
and another said:
"What sort of security risk is there in having compilers installed on a working server?
"Obviously we can remove the compilers, however when Mapserver or postgis get updated, we will need to build new packages somewhere. One option: create a second VM for mapchat. We'll put the build environment on it, and only turn it on to make new packages."
I don't have enough experience to assess the security issues. Does anyone have an opinion on this? It would be simple and feasible to allocate another domain as suggested above.
Just playing Devil's advocate here...
It's conceivable that there could be kernel-specific code that would need to be compiled specifically for a particular system. For example, an exploit in a kernel module loader may need to be compiled. If someone had to deliver this exploit to many systems, they could rely upon the ability to compile the code rather than pushing a binary module. The former could very well be hidden in some other vector, but the latter would likely trip signature or other scanners.
I'd generally agree with the others, though, that installing the compilers is not in itself a great security risk, provided the system is sufficiently locked down (e.g., maybe use SELinux in addition to basic Unix permissions to prevent running them from the web accounts, etc.).
While I typically do have the compilers and kernel headers installed on general purpose servers where I might want to run VMware server or rebuild a source rpm, I would not be very comfortable if I did not have a matching test machine where I could build and test before trying it in production - and then it would be possible to just copy the binary anyway.
On Sunday 07 March 2010 09:54:23 am Les Mikesell and MANY others wrote:
While I typically do have the compilers and kernel headers installed on general purpose servers where I might want to run VMware server or rebuild a source rpm, I would not be very comfortable if I did not have a matching test machine where I could build and test before trying it in production - and then it would be possible to just copy the binary anyway.
All in all, I've had a lot of good advice.
So, we will run a separate VM as a development environment, with authorized access for about 4 people. I already run updates every day, so it seems we should be fairly safe there.
Thanks to all who answered.
Dave
Dave Stevens wrote:
I manage a web hosting server that we've recently upgraded, in part so we could accommodate a domain that will enable community mapping. In a recent exchange of mails one developer said:
"I could build the package directly on the server machine you have, provided that the potential security risk posed by having compilers installed is not an issue."
That's how the "Internet Worm" spread.
As a general principle, machines on the "periphery" or what one might call "firewall machines" should have nothing installed which they don't need in order to perform their primary intended function. That means both hardware and software, IMO.
The less which is there, the fewer potentials for compromise exist.
No services should run which aren't necessary for the functioning of the machine. Don't even install them unless you have to, but don't enable/start them if you install them.
I would install rkhunter and tripwire, and I would peruse their logs.
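On CentOS that audit is quick to do; rkhunter is available from third-party repositories such as EPEL. A sketch, not a full hardening guide (cups below is just an example of a service you might not need):

    # see what starts at boot, and turn off anything not needed
    chkconfig --list | grep ':on'
    chkconfig cups off && service cups stop

    # install rkhunter, update its data files, take a baseline, and run a check
    yum install rkhunter
    rkhunter --update
    rkhunter --propupd
    rkhunter --check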
Mike