Assuming I have only 1000 samples (or even fewer) for training and testing combined, are there specific model parameters or modules that can be reduced to make the model easier to fit, so that inference at test time still works well?
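For concreteness, here is a minimal sketch of the kind of capacity reduction I have in mind. All config names and values below are hypothetical, not taken from this repo:

```python
from dataclasses import dataclass

# Hypothetical config -- the real field names will differ per codebase.
# The idea: shrink width/depth and raise regularization so ~1000 samples
# can be fit without severe overfitting.
@dataclass
class ModelConfig:
    hidden_dim: int = 768   # assumed default for the full model
    num_layers: int = 12
    num_heads: int = 12
    dropout: float = 0.1

# Reduced configuration for a small-data regime (assumed values, untested):
small_config = ModelConfig(
    hidden_dim=256,   # narrower embeddings
    num_layers=4,     # shallower network
    num_heads=4,
    dropout=0.3,      # stronger regularization against overfitting
)
```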
If I have at most two A100 GPUs available, is it feasible to restrict the training data to a single category (e.g., only chairs) to shrink the dataset and still complete training?
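Something like the following is what I mean by restricting to one category. The metadata layout (a JSON list of records with a `"category"` field) is an assumption on my part; the actual dataset format may differ:

```python
import json
from pathlib import Path

# Assumed layout: metadata is a JSON list of dicts, each with a "category" key.
def filter_split(metadata_path: str, category: str = "chair") -> list[dict]:
    records = json.loads(Path(metadata_path).read_text())
    return [r for r in records if r.get("category") == category]

# Keep only chairs, then cap the subset at 1000 samples.
subset = filter_split("train_metadata.json")[:1000]
print(f"kept {len(subset)} samples")
```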
Specifically, how can training be done with a small dataset, a smaller-scale model, and limited hardware? Is this feasible in principle?
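On the limited-hardware side, I assume standard PyTorch memory-saving tricks apply here (mixed precision plus gradient accumulation). A rough sketch, where `model`, `loader`, and `optimizer` are placeholders and the model is assumed to return a scalar loss:

```python
import torch

scaler = torch.cuda.amp.GradScaler()
accum_steps = 8  # simulate a larger effective batch by accumulating gradients

def train_epoch(model, loader, optimizer, device="cuda"):
    model.train()
    optimizer.zero_grad()
    for step, (x, y) in enumerate(loader):
        x, y = x.to(device), y.to(device)
        with torch.cuda.amp.autocast():  # mixed precision reduces memory use
            loss = model(x, y) / accum_steps  # assumes model returns a scalar loss
        scaler.scale(loss).backward()
        if (step + 1) % accum_steps == 0:
            scaler.step(optimizer)
            scaler.update()
            optimizer.zero_grad()
```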