
I am currently working on a project that uses the Document AI custom classifier, and I have a question about the test dataset size limit. As I understand it, the current limit for the test dataset is 2,000 documents. For my project I would like to use more test samples than this: I have 20k+ documents in total, which is within the training dataset limits. Could you advise on the best way to achieve this? Is there a way to bypass or increase the 2,000-document limit for the test dataset, and if so, what steps do I need to follow? Additionally, are there any considerations or potential implications I should be aware of when working with a larger test dataset? I would greatly appreciate your guidance; expertise with the GCP Document AI service would be invaluable in helping me address this requirement.

Attempting to train with a test dataset beyond the 2,000-document limit results in an error stating that the test dataset exceeds the maximum size.
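One practical workaround, assuming the 2,000-document test limit cannot be raised, is to cap the test split at the limit and assign the remaining documents to the training set before importing. The sketch below uses a hypothetical helper (`split_documents` is not part of any Google library) to illustrate the split logic locally:

```python
import random

TEST_LIMIT = 2000  # documented maximum test-dataset size for the custom classifier

def split_documents(doc_ids, test_fraction=0.1, seed=42):
    """Shuffle document IDs and split them into (train, test),
    capping the test set at TEST_LIMIT documents."""
    rng = random.Random(seed)
    ids = list(doc_ids)
    rng.shuffle(ids)
    n_test = min(int(len(ids) * test_fraction), TEST_LIMIT)
    return ids[n_test:], ids[:n_test]

# With 25,000 documents, a 10% test split (2,500) is capped at 2,000,
# leaving 23,000 documents for training.
train, test = split_documents(range(25000))
```

The resulting ID lists can then be used to assign each document's dataset split (train vs. test) when importing into the Document AI dataset, keeping the test side within the enforced limit.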

asked May 16, 2024 at 2:10
