I need to know a testing technique for the scenario below.
Input.txt --> Processing module --> Output in database tables
Here is how my input.txt looks (it will be a huge file, say 10k lines):

```
timestamp1,organisation1,data1,localtimestamp1,place1
timestamp2,organisation1,data2,localtimestamp2,place1
timestamp3,organisation1,data3,localtimestamp3,place1
```
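One cheap pre-check, before worrying about the database side at all, is to confirm that every packet in the file is well-formed. Below is a minimal sketch of that idea; the five-field layout is taken from the sample above, and `validate_input` is a hypothetical helper name, not part of any existing module:

```python
# Sanity-check input.txt before feeding it to the processing module:
# every line must have exactly five non-empty fields
# (timestamp, organisation, data, localtimestamp, place).
import csv

EXPECTED_FIELDS = 5

def validate_input(path):
    bad_lines = []
    with open(path, newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            if len(row) != EXPECTED_FIELDS or any(not field.strip() for field in row):
                bad_lines.append(lineno)
    return bad_lines

print(validate_input("input.txt"))  # an empty list means all packets are well-formed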
I feed this file to the processing module, and the output is written into database tables in a certain form (e.g., table1 to table10, with each table containing more than 6 columns).
e.g., table1:

```
column1          column2        ...
Processeddata1   place1
Processeddata2   place2
```
So this is how I get the output in the DB tables.
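For the export step in the approach below, a sketch of dumping each table to a .csv file could look like the following. sqlite3 is used purely for illustration (any DB-API driver such as psycopg2 or pyodbc exposes the same cursor interface), and the `output.db` file name and `table1`..`table10` names are assumptions based on the description above:

```python
# Dump each output table to a .csv file so it can be diffed later.
import csv
import sqlite3

def export_table(conn, table, out_path):
    # ORDER BY the first column so the row order is deterministic
    cur = conn.execute(f"SELECT * FROM {table} ORDER BY 1")
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur)                                 # data rows

conn = sqlite3.connect("output.db")
for i in range(1, 11):
    export_table(conn, f"table{i}", f"table{i}.csv")
```

Ordering the rows deterministically matters here: without it, the later STD/Actual comparison would fail on row order rather than on content.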
My question is: how do I test the output, given that the input is in the form of packets in a .txt file and the processed data is spread across a huge number of database rows?
Here is what I have tried:

1) Process the input normally (the processed data is now in the DB tables).
2) Export all the processed-data tables to .csv files (see the export sketch above).
3) Validate these .csv files manually (only the first time).
4) Keep these .csv files as standard reference files (STD).
5) For every release, run the process and export the output data from the tables to .csv files (Actual).
6) Compare these Actual .csv files with the stored STD .csv files, as in the sketch after this list.
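A minimal sketch of step 6, assuming the reference and fresh exports live in hypothetical `std/` and `actual/` directories:

```python
# Compare each Actual export against its STD reference and report
# the first differing line per table.
import itertools

def compare_csv(std_path, actual_path):
    with open(std_path) as std, open(actual_path) as actual:
        for lineno, (s, a) in enumerate(
                itertools.zip_longest(std, actual), start=1):
            if s != a:
                return f"line {lineno}: STD={s!r} Actual={a!r}"
    return None  # files are identical

for i in range(1, 11):
    diff = compare_csv(f"std/table{i}.csv", f"actual/table{i}.csv")
    print(f"table{i}: {'OK' if diff is None else diff}")
```

A plain `diff -r std actual` or Python's `filecmp` module would do the same job; the essential point is a deterministic golden-file comparison.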
If you have any other way of testing, please suggest it.