Wednesday, 29 April 2015

Test plan for reporting system

I have a software suite that consists of multiple integrated software packages. They all run off of a single centralised SQL database.

We are in the stage where we are writing test plans and have allocated a single test plan for each independent module of the software. The only one left to write is the test plan for the reporting module. This particular module does nothing but run reports on the data in the SQL database (which will be written by the other modules).

Every testing iteration is preceded by developer, regression, and integration testing, which should rule out any issues with the database data not being correctly maintained.

My dilemma is how to approach the black box test plan for the reporting module. The way I see it there are three options:

  • Append the reporting test cases to the test plans for the modules that influence them (downside: the modules work together to produce the reports; reports cannot be divided up by module like that)
  • Write a test plan for reporting with specified prerequisites that are essentially a list of tasks to perform in the other modules, followed by test cases verifying that the reporting module responds correctly to those tasks (downside: very complicated and long-winded)
  • Write a test plan for reporting with set datasets on a dedicated controlled SQL database (downside: lack of flexibility)
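To make the third option concrete, here is a minimal sketch of testing a report against a fixed dataset on a dedicated, controlled database. The `sales` table, the per-region revenue report, and all names are hypothetical stand-ins for the real suite's schema and reports; SQLite is used only so the sketch is self-contained.

```python
import sqlite3

def seed_fixture(conn):
    # Fixed, version-controlled dataset: the expected report values
    # are known in advance because the data never varies.
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("North", 100.0), ("North", 50.0), ("South", 75.0)],
    )

def revenue_by_region(conn):
    # Stand-in for the reporting module's query, reduced to one aggregate.
    return dict(conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"
    ))

conn = sqlite3.connect(":memory:")
seed_fixture(conn)
report = revenue_by_region(conn)

# The test case is a simple comparison against precomputed expectations.
assert report == {"North": 150.0, "South": 75.0}
print("report matches fixture expectations")
```

The inflexibility noted above shows up here: any schema change made by another module's development forces the fixture and all expected values to be regenerated by hand.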

It looks to me like the second option is the best. It's the most long-winded, but that alone is hardly a reason to discount it.
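The second option can be sketched as a table of test cases, each pairing a list of prerequisite tasks with the report value expected after those tasks. Everything here is hypothetical: the direct inserts stand in for steps that, in the real plan, would read "perform task X in module Y", and the table and report are placeholders for the real schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")

def perform_task(conn, customer, total):
    # In the real test plan this step would be an instruction such as
    # "create an order for <customer> in the orders module"; here it
    # is simulated as a direct insert.
    conn.execute("INSERT INTO orders VALUES (?, ?)", (customer, total))

def order_count_report(conn):
    # Stand-in for a report produced by the reporting module.
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

test_cases = [
    # (prerequisite tasks, expected report value after those tasks)
    ([("Acme", 10.0)], 1),
    ([("Acme", 10.0), ("Globex", 5.0)], 3),  # cumulative: 1 + 2 more
]

for tasks, expected in test_cases:
    for customer, total in tasks:
        perform_task(conn, customer, total)
    assert order_count_report(conn) == expected
print("all reporting test cases passed")
```

The long-windedness is visible even in this toy version: each test case must spell out every prerequisite action, and later cases depend on the cumulative state left by earlier ones.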

Does anyone with experience testing a purely reporting module have insight into the best or industry-standard ways to do this?

Thanks in advance!
