Hi team,
I am looking to train on my own data and wanted to get a basic explanation of the relevance of these parameters:
sample_num = 2048
batch_size = 12
num_epochs = 256
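For context, here is my rough mental model of how those three interact; this is an assumption on my part rather than something I have verified in the training code:

# Assumed reading of the three parameters above (not verified in the code):
# each training step feeds the network batch_size point sets of sample_num
# points each, so one optimizer step consumes sample_num * batch_size points.
sample_num = 2048   # points per sampled block
batch_size = 12     # blocks stacked per training step
num_epochs = 256    # number of passes over the training data

points_per_step = sample_num * batch_size
print(points_per_step)  # 24576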
In my settings file I have:
num_class = 2
label_weights = []
for c in range(num_class):
    label_weights.append(1.0)
learning_rate_base = 0.001
decay_steps = 20000
decay_rate = 0.7
learning_rate_min = 1e-6
step_val = 500
weight_decay = 0.0
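If it helps frame my question: based purely on the parameter names (an assumption, not checked against the training script), I am reading the four learning-rate values as a standard exponential decay with a floor, along these lines:

def learning_rate(step,
                  learning_rate_base=0.001,
                  decay_steps=20000,
                  decay_rate=0.7,
                  learning_rate_min=1e-6):
    """Exponential decay with a floor -- my assumed reading of the settings."""
    lr = learning_rate_base * decay_rate ** (step / decay_steps)
    return max(lr, learning_rate_min)

# e.g. after 100k steps: 0.001 * 0.7**5 ~= 1.7e-4
print(learning_rate(100000))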
I am trying to train on aerial data with a density of 8 points per square meter, and I have a training dataset of roughly 800 million points.
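For scale, assuming the density figure applies uniformly, that point count works out to about 100 square kilometers of coverage:

points = 800_000_000        # stated dataset size
density = 8                 # points per square meter
area_km2 = points / density / 1e6
print(area_km2)             # 100.0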
I am seeing some overfitting very early on with these values:
num_class = 2
sample_num = 12288
batch_size = 6
num_epochs = 8096
label_weights = []
for c in range(num_class):
    label_weights.append(1.0)
learning_rate_base = 0.001
decay_steps = 20000
decay_rate = 0.7
learning_rate_min = 1e-6
step_val = 500
weight_decay = 0.0
jitter = 0.0
jitter_val = 0.0
rotation_range = [0, math.pi/32., 0, 'u']
rotation_range_val = [0, 0, 0, 'u']
rotation_order = 'rxyz'
scaling_range = [0.0, 0.0, 0.0, 'g']
scaling_range_val = [0, 0, 0, 'u']
sample_num_variance = 1 // 8
sample_num_clip = 1 // 4
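One thing I noticed while writing this up: in Python, `1 // 8` is integer division, so `sample_num_variance` and `sample_num_clip` both evaluate to 0, which would silently disable whatever randomization they control. Here is a sketch of what I assume they do (scale factors for randomly perturbing `sample_num` during training); the perturbation mechanism is my guess, not taken from the code:

import random

sample_num = 12288
sample_num_variance = 1 / 8   # note: 1 // 8 == 0, which disables this
sample_num_clip = 1 / 4

def perturbed_sample_num():
    """My assumed reading: jitter sample_num by a clipped Gaussian offset."""
    offset = int(random.gauss(0, sample_num * sample_num_variance))
    clip = int(sample_num * sample_num_clip)
    offset = max(-clip, min(clip, offset))
    return sample_num + offset

print(perturbed_sample_num())  # somewhere in [9216, 15360]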
Any advice?
Here is a snapshot of the loss on the validation set; it is very lumpy.
Any updates on this?