On Wed, May 31, 2017 at 7:05 PM, Ari LiVigni <alivigni@redhat.com> wrote:
As you pointed out, Jenkins pipeline is a great fit for this scenario for a number of reasons:
 - It keeps all the stages in one pipeline with little duplication
 - Great integration with OpenShift

That said, I understand the reluctance to switch from JJB to Pipeline.
There is also a way you can start to combine both while still using JJB:
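A rough sketch of what that combination looks like, assuming a jenkins-job-builder version with the pipeline project type (names are placeholders) - the job definition stays in JJB YAML, only the steps move into pipeline DSL:

- job:
    name: example-pipeline
    project-type: pipeline
    sandbox: true
    dsl: |
      node {
          stage('Hello') {
              // the job is defined and managed by JJB,
              // but the steps run as a Jenkins pipeline
              echo 'Hello from a JJB-defined pipeline'
          }
      }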

Very helpful - I've decided to go with this, so I came up with this schema:

http://pastebin.centos.org/98936/

Basically I have 3 jobs instead of 1 (rough sketches below):

1. Main - new, it is a pipeline and kicks off the other jobs
2. Build - existing, it does what the current job does
3. Deploy - new, parameterised job so that it does not have to be created N times (where N is number of repos we build)
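
A rough sketch of the main job (job names, the repo URL and parameter names below are placeholders, not the real ones):

- job:
    name: some-repo-main
    project-type: pipeline
    sandbox: true
    dsl: |
      node {
          // placeholder repo; the tag we deploy is the commit hash,
          // assuming the build job tags the image the same way
          git url: 'https://github.com/example/some-repo.git'
          def tag = sh(returnStdout: true, script: 'git rev-parse HEAD').trim()

          stage('Build') {
              // 2. the existing build job
              build job: 'some-repo-build'
          }
          stage('Deploy') {
              // 3. the shared parameterised deploy job
              build job: 'deploy', parameters: [
                  string(name: 'REPO', value: 'some-repo'),
                  string(name: 'IMAGE_TAG', value: tag)
              ]
          }
      }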

The only thing I am not sure about is the deploy job - with the shared parameters it might be a bit confusing when things fail "randomly", but I still like it more than adding 30-40 new jobs which do exactly the same thing with just 2 changing variables.
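
And the shared deploy job, roughly - parameter names and the deploy script below are placeholders, the real builder would call the saas tool:

- job:
    name: deploy
    parameters:
      - string:
          name: REPO
          default: ''
          description: which repo/template to deploy
      - string:
          name: IMAGE_TAG
          default: ''
          description: git commit hash used as the image tag
    builders:
      - shell: |
          # placeholder for whatever actually performs the deploy,
          # e.g. the saas wrapper script
          ./deploy.sh "${REPO}" "${IMAGE_TAG}"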

Does it make sense like this? Would you say it follows "best practices"? :)




Multijob is a good way to chain jobs into a pipeline: it allows certain jobs to run in parallel and then gate on a final step.
I do believe this can fit what you are looking for. If you need to run multiple similar scenarios of the same steps with subtle config differences, then matrix jobs are great as well.
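
A rough JJB sketch of the multijob layout, with placeholder job names; with the SUCCESSFUL condition on the first phase, the deploy phase should only be reached when the build phase passed:

- job:
    name: example-multijob
    project-type: multijob
    builders:
      - multijob:
          name: build phase
          condition: SUCCESSFUL
          projects:
            - name: some-repo-build
      - multijob:
          name: deploy phase
          condition: SUCCESSFUL
          projects:
            - name: some-repo-deploy
              current-parameters: true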

I did a blog post on how we used this. We currently use it for Openshift-ansible commits and PRs.


I hope this helps.

On Wed, May 31, 2017 at 10:44 AM, Václav Pavlín <vasek@redhat.com> wrote:
Hi all,

I am very new to Jenkins, so please bear with me :)



The script basically copies the content of the git-cloned directory and runs a build script. If that run is successful, it deploys the resulting Docker image to OpenShift.

We want to deploy images by tag - which corresponds to the git commit hash - so the tag gets templated in by oc process.

The problem is that there are templates which have more parameters, and we have taken care of this with a small script in this repository: https://github.com/openshiftio/saas/

So my goal is to use that saas tool to do the deploy instead of plain oc process > oc apply.
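
For reference, the "plain" variant as a JJB shell builder looks roughly like this (the template path is a placeholder, and it assumes the job already has the repo checked out):

- job:
    name: example-plain-deploy
    builders:
      - shell: |
          # template the commit hash in as the image tag, then apply;
          # the saas script would replace this pipe
          oc process -f openshift/template.yaml \
              -p IMAGE_TAG="$(git rev-parse HEAD)" \
              | oc apply -f -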

As we have multiple templates which have different scripts, I'd like to avoid code duplication - I don't want to copy & paste the oc process/oc apply commands or the script calls all over the file.

My thinking to solve this was to create some kind of hierarchy - I found something about dependencies and something called multijob, where a job would trigger a build phase (build a docker image) and a deploy phase (oc apply). The key part is that if any of the phases fails, the whole job fails. Also, the deploy phase must not be triggered when the build phase fails.

What would be the correct approach with JJB to solve this? Is this wrong thinking? Am I missing something from the Jenkins point of view?

I know this would be ideally solved by pipelines, but I don't think that is an option at the moment.

Looking forward to comments and suggestions,
Vašek

--
Red Hat Developer Tools team
Brno, Czech Republic
Phone: +420 739 666 824


_______________________________________________
Ci-users mailing list
Ci-users@centos.org
https://lists.centos.org/mailman/listinfo/ci-users




--
-== @ri ==-



--
Red Hat Developer Tools team
Brno, Czech Republic
Phone: +420 739 666 824