I have a perl script which runs from a cron job. How would you limit the amount of RAM that this script is allowed to consume? Is there a ulimit setting that will accomplish this? If so does ulimit have to be run each time the script is run, or is there a way to set it permanently?
Dear Fellows:
I'm behind a proxy server (squid) and I need to update my CentOS, but every time I run yum -y update it tells me it cannot find a file that does exist. The problem is that the machine I'm trying to update has no graphical shell, so I have to do everything from the console.
Fraternal greetings, Alberto García Gómez M:.M:. Network Administrator/Webmaster, IPI "Carlos Marx", Matanzas, Cuba. ----- Original Message ----- From: "Sean Carolan" scarolan@gmail.com To: "CentOS mailing list" centos@centos.org Sent: Monday, July 20, 2009 11:28 AM Subject: [CentOS] Limit RAM used by a perl script
Alberto García Gómez wrote:
Hi
To use yum with a proxy you have to configure 3 files:

/etc/yum.conf: add the line proxy=http://xxx.xxx.xxx.xxx:yyy, where yyy is the port.

/etc/wgetrc: there are 2 lines you have to edit: http_proxy=http://xxxx(...):yyy and ftp_proxy=http://xxx.(...):yyy

/root/.bash_profile: export http_proxy=http://xxx(...):yyyy and export ftp_proxy=http://xxx(...):yyyy
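Filled in with a concrete (purely illustrative) example — assuming the squid proxy sits at 192.0.2.10 on port 3128, substitute your own address and port — the three settings would look like:

```shell
# /etc/yum.conf      -- yum's own proxy setting:
#   proxy=http://192.0.2.10:3128
#
# /etc/wgetrc        -- proxy settings picked up by wget:
#   http_proxy = http://192.0.2.10:3128
#   ftp_proxy = http://192.0.2.10:3128
#
# /root/.bash_profile -- environment variables for root's login shells:
export http_proxy=http://192.0.2.10:3128
export ftp_proxy=http://192.0.2.10:3128
```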
Regards
mg.
Thanks Marcelo. Due to an electrical problem my parent proxy is down for a while, but when it comes back online I'll test it.

Send your answer to my personal mail. Where are you from? Do you speak Spanish?
----- Original Message ----- From: "Marcelo M. Garcia" marcelo.maia.garcia@googlemail.com To: "CentOS mailing list" centos@centos.org Sent: Monday, July 20, 2009 12:38 PM Subject: Re: [CentOS] YUM Proxy
Alberto García Gómez wrote:
Yum respects the standard environment variables for proxy settings, so from the command line you can:
http_proxy=http://my_proxy:3128 ftp_proxy=http://my_proxy:3128 yum -y update

where my_proxy is the name or address of your squid server. Or you can edit the yum configuration to add a proxy setting.
On Mon, Jul 20, 2009 at 4:28 PM, Sean Carolan scarolan@gmail.com wrote:
If you run it as a regular user, then maybe you can check out /etc/security/limits.conf
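As a sketch of what such an entry could look like (the user name cronuser is hypothetical, and the value of the `as` item is in kilobytes):

```
# /etc/security/limits.conf
# <domain>   <type>   <item>   <value>
cronuser     hard     as       524288    # cap address space at 512 MB
```

pam_limits applies these limits when a session is opened, so the job would have to run as that user rather than root for this to take effect.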
Currently the script runs as the root user. I may be able to change this, but wanted to find out whether there was some other way first.
Would it be possible to use a "ulimit" command within the perl script itself to accomplish this?
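ulimit is a shell builtin rather than a perl function, so one common workaround (a sketch, not something confirmed in this thread; the script path is a placeholder) is to set the limit in a small wrapper that cron invokes:

```shell
#!/bin/sh
# Cap the virtual memory of everything started from this shell;
# ulimit -v takes kilobytes, so 524288 = 512 MB.
ulimit -v 524288
# A cron wrapper would now exec the real script, which inherits the limit:
#   exec /usr/local/bin/myscript.pl "$@"
# Shown here by printing the limit a child process would inherit:
ulimit -v
```

Lowering a limit never needs privileges, so this works for root and regular users alike.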
On 07/20/2009 05:28 PM, Sean Carolan wrote:
First, install the perl module BSD::Resource
yum install perl-BSD-Resource
Then use it in your program like:
#!/usr/bin/perl
use BSD::Resource;

# set both the soft and the hard limit; the values are in bytes
setrlimit(RLIMIT_VMEM, 1_000_000, 1_000_000);

# rest of the program is limited to 1 MByte now
Thanks, Paul. I knew I'd find an answer if I posted my question here.
From: Sean Carolan scarolan@gmail.com
While having hard limits makes it safer, wouldn't it be better to control the memory usage of the script instead of setting limits that would trigger an "out of memory"...?
JD
From: Sean Carolan scarolan@gmail.com
How would you control the memory usage of the script if it's run by the root user?
By control I meant to design the script to use a specific amount of RAM, instead of letting it vampirise all available memory...
JD
On 07/21/2009 04:22 PM, John Doe wrote:
But what if the program's memory use depends on lots of factors which are not easily predictable? And you want to avoid bringing the whole system to its knees with swapping, and killing arbitrary other programs, while one program consumes all of RAM and swap. In that case it's easier to limit the memory of that program to e.g. 1 GByte of RAM, within which normal input can usually be processed without any trouble. Then, when someone feeds the program some bad data which uses exponentially more memory, it stops gracefully, giving a clear error message that this input results in too much memory use.

Lots of scenarios for a valid use of such a limit exist.
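That failure mode is easy to see from the shell (a sketch; the 200 MB cap and the ~1 GB allocation are arbitrary illustration values):

```shell
# Run perl under a 200 MB virtual-memory cap (ulimit -v takes KB) and
# ask it to build a ~1 GB string; the allocation is refused instead of
# driving the whole machine into swap.
( ulimit -v 204800
  perl -e 'my $x = "x" x 1_000_000_000; print "allocated\n"' ) 2>/dev/null \
  || echo "stopped by the memory limit"
```

The parentheses run the limited command in a subshell, so the cap does not stick to your interactive shell.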
I'm using the perl-BSD-Resource module, with the script confined to 512MB of RAM. So far it's working fine. I'm not terribly worried about the script failing; it's much more important that the server stay up and running, since it is also a production mail server. If the script crashes, we can deal with that separately.