I have a Spark application that does data transformation jobs, with some ScalaTest cases that I can run locally. Is it possible to use spark-submit to run those tests as part of a pipeline? I only see --class as a spark-submit argument. The idea is to run the tests first and, only if they all pass, run the application. Or, if this is not the right approach, how would you achieve it? Many thanks.
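One possible sketch of the "run tests first, then the job" idea, not a confirmed answer: ScalaTest exposes a programmatic entry point, org.scalatest.tools.Runner.run, which returns true only when every selected test passes, so a small wrapper main could gate the real job on that result and be the single class passed to spark-submit via --class. The names TestThenRunMain and ExampleTransformationSpec below are hypothetical placeholders, and this assumes the ScalaTest library and the test classes are packaged into the jar given to spark-submit (test classes are not included in an assembly by default).

```scala
import org.scalatest.funsuite.AnyFunSuite
import org.scalatest.tools.Runner

// A minimal, self-contained stand-in for your real ScalaTest cases.
class ExampleTransformationSpec extends AnyFunSuite {
  test("uppercasing works") {
    assert("spark".toUpperCase == "SPARK")
  }
}

// Hypothetical entry point to pass to spark-submit via --class.
// It runs the test suite(s) first and only proceeds if they all pass.
object TestThenRunMain {
  def main(args: Array[String]): Unit = {
    // Runner.run returns true only when every selected test passed.
    val allPassed = Runner.run(Array(
      "-o",                              // standard-output reporter
      "-s", "ExampleTransformationSpec"  // fully qualified suite class to execute
    ))

    if (allPassed) {
      println("All tests passed; start the real transformation job here.")
      // e.g. call your application's main(args) at this point
    } else {
      sys.error("ScalaTest suites failed; not starting the Spark job.")
    }
  }
}
```

With a wrapper like this, the pipeline step would stay a single spark-submit call, e.g. `spark-submit --class TestThenRunMain your-assembly.jar`, and a test failure would make the driver exit with a non-zero status so the pipeline can stop there.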