Hi Friends,
Same question has been asked on the Squid mailing list but so far no reply on the mailing list so posting it here also.
We are trying to cache files from apple.com such as .dmg, .pkg and .ipa so that local clients can fetch the data from the cache. The problem we are facing is that every client is limited to 25 MB downloads during work hours, except for one particular client. When this exception client downloads files from apple.com, they are fetched from the site and stored in the cache, but because of the 25 MB restriction the other clients cannot access the cached files larger than 25 MB. If we remove the download restriction for a client, the software downloads fine.
Is there any way we can allow any client to access objects/files already in the cache without removing the download restriction? We are using Squid 2.6 on CentOS 5 64-bit.
Cache configuration:

cache_mem 128 MB
maximum_object_size_in_memory 1024 KB
cache_dir ufs /var/spool/squid 400000 16 256
maximum_object_size 4096 MB
refresh_pattern -i \.(deb|rpm|exe|zip|tar|tgz|ram|rar|bin|ppt|doc|tiff|dmg|pkg|ipa)$ 10080 90% 43200 override-expire ignore-no-cache ignore-private
reply_body_max_size 0 allow sp-download-grant
reply_body_max_size 0 allow sp-download-grant-dst
acl WorkingHours4 time D 09:00-18:00
acl WorkingHours1 time D 10:30-12:59
acl WorkingHours2 time D 15:00-18:30
acl WorkingHours3 time D 13:00-14:59
acl google dstdomain .video.google.com
acl youtube dstdomain .youtube.com
reply_body_max_size 500000000 allow WorkingHours1 google
reply_body_max_size 500000000 allow WorkingHours2 google
reply_body_max_size 500000000 allow WorkingHours3 google
reply_body_max_size 5000000 allow WorkingHours2 youtube
reply_body_max_size 5000000 allow WorkingHours1 youtube
reply_body_max_size 50000000 allow WorkingHours3 youtube
http_access allow google indus
http_access allow youtube indus
reply_body_max_size 26000000 allow WorkingHours1 all
reply_body_max_size 26000000 allow WorkingHours2 all
reply_body_max_size 50000000 allow WorkingHours3 all
http_access allow allowindus WorkingHours4
http_access allow indus
Do let me know if you need any further information.
Thanks & Regards
Ankush
Any update on this? We are stuck and need help.
Thanks & Regards
Ankush
On Sat, Oct 6, 2012 at 4:56 AM, ankush grover ankushcentos@gmail.com wrote:
Can you add a second Squid instance, configured so that the one near the clients caches but does not apply restrictions, while the upstream parent applies the restrictions but does not need to cache?
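Roughly something like this, as an untested sketch (the parent port and address are just placeholders, and the size/ACL lines on the parent would be the ones moved over from your current config):

On the client-facing (child) instance - caches, no size limits:

http_port 3128
cache_dir ufs /var/spool/squid 400000 16 256
maximum_object_size 4096 MB
cache_peer 127.0.0.1 parent 3129 0 no-query default
never_direct allow all            # send every cache miss through the parent

On the upstream (parent) instance - applies the limits, no caching:

http_port 3129
cache deny all                    # do not store anything here
acl WorkingHours4 time D 09:00-18:00
reply_body_max_size 26000000 allow WorkingHours4 all

Cache hits are then served directly by the child without the parent (and its size limits) ever being consulted, while misses are fetched via the parent and get limited there.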
From: ankush grover ankushcentos@gmail.com
Not sure if I understand your goal but did you have a look at delay_pools? You would not restrict by size but by bandwidth...
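Something along these lines, if your Squid was built with delay pool support (the client subnet and the 64 KB/s rate below are just made-up examples, not a recommendation):

acl localnet src 192.168.1.0/24        # example client subnet (placeholder)
delay_pools 1
delay_class 1 1                        # class 1 = one aggregate bucket for the pool
delay_parameters 1 64000/64000         # restore/max bytes per second for that bucket
delay_access 1 allow localnet
delay_access 1 deny all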
JD
Thanks Les. I will test your suggestion. The only thing I need to enable is sending the original source IP to the parent proxy rather than the child Squid's IP; otherwise all the clients connected to the child proxy would get the unlimited download limit.
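Something like the following is what I am thinking of, though I have not tested it, and as far as I know follow_x_forwarded_for and acl_uses_indirect_client need Squid 2.7/3.x or a patch on 2.6, so treat it only as a sketch (the child proxy address is a placeholder):

On the child:

forwarded_for on                        # put the real client IP in X-Forwarded-For

On the parent:

acl childproxy src 192.168.1.10         # child proxy address (placeholder)
follow_x_forwarded_for allow childproxy # trust X-Forwarded-For only from the child
follow_x_forwarded_for deny all
acl_uses_indirect_client on             # evaluate ACLs against the original client IP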
John,
Delay pools will not work in my case. Thanks anyway.
Thanks & Regards
Ankush