Wednesday, November 6, 2019

How to run a system-level test suite from a different repo in a pipeline?

I have a repository on Bitbucket containing a system-level test suite that exercises the most crucial system functions at the API level. The services that make up this system each live in their own repository and are deployed to staging through Bitbucket Pipelines. Whenever one of these services is deployed to staging, I would like to run the tests from the test-suite repository as a step in that service's pipeline, after the deployment.

So basically I want to add a step after the deployment to staging in which the test suite is loaded and executed as part of the pipeline.
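A minimal sketch of what such a pipeline could look like in a service repository, assuming a deploy step already exists; the branch name, deploy script, and test command here are placeholders, not your actual configuration:

```yaml
# bitbucket-pipelines.yml in a service repository (sketch; names are placeholders)
pipelines:
  branches:
    master:
      - step:
          name: Deploy to staging
          deployment: staging
          script:
            - ./deploy-staging.sh        # hypothetical deploy script
      - step:
          name: Run system level API tests
          script:
            - echo "run the test suite from the other repository here"
```

Steps run sequentially, and if any command in the test step's script exits non-zero, that step and the pipeline are marked failed and later steps do not run.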

It doesn't have to automatically roll back to the previous version of the service, but it should fail the pipeline so the problem can be fixed manually.

I put the tests in a Docker image and pushed it to ECR. The tests run when I use docker run IMAGE, but when they fail, the failure is not reflected in the pipeline.
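One way to surface the container's failure, assuming the step uses the docker service and the image is already in ECR (the image name and paths below are placeholders): docker run exits with the container's exit code, and in a Pipelines script the commands share one shell session, so the exit code can be captured, the reports copied out, and the step failed explicitly afterwards. A sketch:

```yaml
- step:
    name: Run system level API tests
    services:
      - docker
    script:
      # docker run returns the test container's exit code; capture it instead of aborting immediately
      - docker run --name systests <account>.dkr.ecr.<region>.amazonaws.com/system-tests:latest || TEST_STATUS=$?
      # copy the reports out of the stopped container so they survive as pipeline artifacts
      - docker cp systests:/app/target/surefire-reports ./reports
      # now fail the step if the tests failed
      - exit ${TEST_STATUS:-0}
    artifacts:
      - reports/**
```

The report path inside the container (/app/target/surefire-reports) is an assumption; it depends on how the test image is built.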

I also tried to define a step like this:

step:
  name: Running system level api tests
  image:
    name: <image name of test suite in AWS ECR>
    aws:
      access-key: $AWS_ACCESS_KEY_ID
      secret-key: $AWS_SECRET_ACCESS_KEY
  script:
    - mvn test
  services:
    - docker
  artifacts:
    - <reporting folders here>

But this doesn't work either: it doesn't find any of the files.
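An alternative that avoids using the test-suite image as the build image at all: clone the test-suite repository inside the step and run Maven from the clone. This is a sketch; the workspace/repository names and the base image are assumptions, and it presumes an SSH key with read access to the test-suite repository has been configured in the pipeline settings:

```yaml
- step:
    name: Run system level API tests
    image: maven:3-jdk-11            # any image with git and Maven; placeholder
    script:
      # clone the test-suite repo (requires a configured SSH key with read access)
      - git clone git@bitbucket.org:<workspace>/system-tests.git
      - cd system-tests
      - mvn test                     # non-zero exit fails the step
    artifacts:
      - system-tests/target/surefire-reports/**
```

With this layout the working directory actually contains the test-suite sources, and the Surefire reports can be collected as artifacts directly.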

How can I make sure the tests run as part of the pipeline? I would like the pipeline to fail when the tests fail, and I would like the artifacts generated inside the Docker image to be available through the Bitbucket pipeline.
