This is more of a best/common practices question.
We are using spaCy in our production system. While testing, we often have to download the full spaCy models (parser + word vectors), which can be very slow (~30 minutes) and frustrating. A better strategy might be to create a custom lightweight spaCy model just for testing, e.g. one with only a 1,000-word vocabulary and a smaller parsing model.
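For concreteness, here is the kind of thing we have in mind, as a rough sketch (assuming a recent spaCy version): `spacy.blank()` builds an empty pipeline with no download at all, and `Vocab.prune_vectors` can shrink a full model's vector table. The output path and the 1,000-vector figure are only illustrative.

```python
import spacy

# A lightweight pipeline for unit tests: spacy.blank() builds an empty
# English pipeline from scratch -- no model download required, so it
# loads in milliseconds instead of minutes.
nlp = spacy.blank("en")

# Tokenization works out of the box; components such as the parser
# would need to be added and trained (or sourced) separately.
doc = nlp("This is a test sentence.")
print([t.text for t in doc])

# For tests that do need vectors, one option (untested on our side)
# is to load the full model once, prune to the N most frequent
# vectors, and save the small model for reuse in CI:
#   nlp_full = spacy.load("en_core_web_md")
#   nlp_full.vocab.prune_vectors(1000)
#   nlp_full.to_disk("models/en_test_small")  # illustrative path
```

The blank pipeline covers tests that only exercise tokenization; the pruned-model route would cover tests that depend on word similarity without shipping the full vector table.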
Are there suggested strategies or best practices for testing against a large model that could be applied to this scenario?