Pipelines, Jobs and Stages: What goes where

Comments

  • Ian Bridson

    The main difference between Stages and Jobs is that Stages run sequentially within the pipeline, whereas Jobs within a Stage run concurrently. So if you want to run tasks in parallel, define them as Jobs within a single Stage; if you want to know the build succeeded before unit testing starts, use a Build stage followed by a Test stage.


    So in your example the pipeline would have a Build stage that fetches material from your source repository and has a job that builds the code. The build output is written to \output under the pipeline working directory. If this succeeds, the next stage, Test, runs; it has a job that runs NUnit, for instance, against the DLLs in \output. The results XML is then shown under the Test tab on that stage.


    Save \output as an artifact named TestedCode, and then the next pipeline, which deploys to a QA environment, can fetch that artifact from the upstream pipeline's Test stage (TestedCode). Use resources to assign the appropriate agent to run the build, test and deploy jobs. A rough config sketch is shown below.
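
    A rough cruise-config.xml sketch of the above, for illustration only. Pipeline, stage, job, resource and command names (BuildAndTest, DeployQA, msbuild, nunit-console, deploy.bat, the "windows" resource) are made-up placeholders, and exact element names and ordering vary between Go versions:

        <pipeline name="BuildAndTest">
          <materials>
            <git url="https://example.com/your-repo.git" />
          </materials>
          <stage name="Build">
            <jobs>
              <job name="compile">
                <resources>
                  <resource>windows</resource>
                </resources>
                <tasks>
                  <!-- Build the code; output lands in \output under the agent's working directory -->
                  <exec command="msbuild">
                    <arg>MySolution.sln</arg>
                    <arg>/p:OutDir=output\</arg>
                  </exec>
                </tasks>
              </job>
            </jobs>
          </stage>
          <stage name="Test">
            <jobs>
              <job name="unit-test">
                <resources>
                  <!-- Same resource as the Build job, so the same kind of agent picks it up -->
                  <resource>windows</resource>
                </resources>
                <tasks>
                  <!-- Run NUnit against the DLLs in \output -->
                  <exec command="nunit-console">
                    <arg>output\MyTests.dll</arg>
                  </exec>
                </tasks>
                <artifacts>
                  <!-- Publish the tested binaries and the NUnit results XML -->
                  <artifact src="output" dest="TestedCode" />
                  <test src="TestResult.xml" dest="test-reports" />
                </artifacts>
              </job>
            </jobs>
          </stage>
        </pipeline>

        <pipeline name="DeployQA">
          <materials>
            <!-- Depend on the upstream pipeline's Test stage -->
            <pipeline pipelineName="BuildAndTest" stageName="Test" />
          </materials>
          <stage name="Deploy">
            <jobs>
              <job name="deploy">
                <tasks>
                  <!-- Fetch the TestedCode artifact published by the upstream Test stage -->
                  <fetchartifact pipeline="BuildAndTest" stage="Test" job="unit-test" srcdir="TestedCode" dest="output" />
                  <exec command="deploy.bat" />
                </tasks>
              </job>
            </jobs>
          </stage>
        </pipeline>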

  • Scott Baldwin

    A note if you are using enterprise edition:


    Since your jobs run in parallel, you can get into a situation where a different agent runs a later stage (and therefore its jobs) than the one that ran your initial Build stage. That means the Go agent's working directory in the later stage won't have the "\output" directory that Ian mentioned. The exceptions are if you assign the same resource to the jobs in your pipeline, or if you publish an artifact from the earlier stage and fetch it in the later stage.


    If you have a relatively small number of unit tests, you're probably better off doing your build and unit tests in the same job. If you split them across stages and you haven't assigned the same resource to the jobs (build and unit test), you'll have to publish and fetch the build artifacts, which adds a bit of time to your overall pipeline execution. A sketch of the single-job approach is shown below.
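
    As a rough illustration of that last point, here is a sketch of a single job that builds and then runs the unit tests on the same agent, so nothing has to be published or fetched in between (command and file names are placeholders):

        <stage name="BuildAndUnitTest">
          <jobs>
            <job name="build-and-test">
              <tasks>
                <!-- Build into \output in this agent's working directory -->
                <exec command="msbuild">
                  <arg>MySolution.sln</arg>
                  <arg>/p:OutDir=output\</arg>
                </exec>
                <!-- Runs on the same agent straight after the build, so the DLLs are still on disk;
                     runif="passed" (the default) skips the tests if the build failed -->
                <exec command="nunit-console">
                  <arg>output\MyTests.dll</arg>
                  <runif status="passed" />
                </exec>
              </tasks>
            </job>
          </jobs>
        </stage>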
