
Sourcery Starbot ⭐ refactored jevenzh/deepplantphenomics #1

Open · wants to merge 1 commit into base: master
Conversation

SourceryAI

Thanks for starring sourcery-ai/sourcery ✨ 🌟 ✨

Here's your pull request refactoring your most popular Python repo.

If you want Sourcery to refactor all your Python repos and incoming pull requests, install our bot.

Review changes via command line

To manually merge these changes, make sure you're on the master branch, then run:

git fetch https://github.com/sourcery-ai-bot/deepplantphenomics master
git merge --ff-only FETCH_HEAD
git reset HEAD^

Comment on lines -142 to +146
-        return next(layer for layer in self.__layers if isinstance(layer, layers.convLayer) or isinstance(layer, layers.fullyConnectedLayer))
+        return next(
+            layer
+            for layer in self.__layers
+            if isinstance(layer, (layers.convLayer, layers.fullyConnectedLayer))
+        )

Function DPPModel.__first_layer refactored with the following changes:
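The change relies on `isinstance` accepting a tuple of types, which is equivalent to chaining the checks with `or`. A minimal sketch with stand-in layer classes (the real `layers.convLayer` and `layers.fullyConnectedLayer` are not needed to see the idiom):

```python
class ConvLayer: pass            # stand-in for layers.convLayer
class FullyConnectedLayer: pass  # stand-in for layers.fullyConnectedLayer
class PoolLayer: pass            # some other layer type

layer_list = [PoolLayer(), ConvLayer(), FullyConnectedLayer()]

# Chained form, as in the original code:
first_old = next(l for l in layer_list
                 if isinstance(l, ConvLayer) or isinstance(l, FullyConnectedLayer))

# Tuple form, as in the refactoring; isinstance matches any type in the tuple:
first_new = next(l for l in layer_list
                 if isinstance(l, (ConvLayer, FullyConnectedLayer)))

assert first_old is first_new  # both pick the first ConvLayer
```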

Comment on lines -35 to +53
-            self.weights = tf.get_variable(self.name + '_weights',
-                                           shape=self.filter_dimension,
-                                           initializer=tf.contrib.layers.xavier_initializer_conv2d())
+            self.weights = tf.get_variable(
+                f'{self.name}_weights',
+                shape=self.filter_dimension,
+                initializer=tf.contrib.layers.xavier_initializer_conv2d(),
+            )
         else:
-            self.weights = tf.get_variable(self.name + '_weights',
-                                           shape=self.filter_dimension,
-                                           initializer=tf.truncated_normal_initializer(stddev=5e-2),
-                                           dtype=tf.float32)
-
-        self.biases = tf.get_variable(self.name + '_bias',
-                                      [self.output_size[-1]],
-                                      initializer=tf.constant_initializer(0.1),
-                                      dtype=tf.float32)
+            self.weights = tf.get_variable(
+                f'{self.name}_weights',
+                shape=self.filter_dimension,
+                initializer=tf.truncated_normal_initializer(stddev=5e-2),
+                dtype=tf.float32,
+            )
+
+        self.biases = tf.get_variable(
+            f'{self.name}_bias',
+            [self.output_size[-1]],
+            initializer=tf.constant_initializer(0.1),
+            dtype=tf.float32,
+        )

Function convLayer.__init__ refactored with the following changes:
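The f-string (Python 3.6+) produces exactly the same variable names as the `+` concatenation it replaces; only the spelling changes. A quick sketch (`'conv1'` is an arbitrary example name, not from the repo):

```python
name = 'conv1'

old_style = name + '_weights'      # concatenation, as in the original code
new_style = f'{name}_weights'      # f-string, as in the refactoring

assert old_style == new_style == 'conv1_weights'
```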

Comment on lines -134 to +160
-            self.weights = tf.get_variable(self.name + '_weights', shape=[vec_size, output_size],
-                                           initializer=tf.contrib.layers.xavier_initializer())
+            self.weights = tf.get_variable(
+                f'{self.name}_weights',
+                shape=[vec_size, output_size],
+                initializer=tf.contrib.layers.xavier_initializer(),
+            )
         else:
-            self.weights = tf.get_variable(self.name + '_weights',
-                                           shape=[vec_size, output_size],
-                                           initializer=tf.truncated_normal_initializer(stddev=math.sqrt(2.0/self.output_size)),
-                                           dtype=tf.float32)
-
-        self.biases = tf.get_variable(self.name + '_bias',
-                                      [self.output_size],
-                                      initializer=tf.constant_initializer(0.1),
-                                      dtype=tf.float32)
+            self.weights = tf.get_variable(
+                f'{self.name}_weights',
+                shape=[vec_size, output_size],
+                initializer=tf.truncated_normal_initializer(
+                    stddev=math.sqrt(2.0 / self.output_size)
+                ),
+                dtype=tf.float32,
+            )
+
+        self.biases = tf.get_variable(
+            f'{self.name}_bias',
+            [self.output_size],
+            initializer=tf.constant_initializer(0.1),
+            dtype=tf.float32,
+        )

Function fullyConnectedLayer.__init__ refactored with the following changes:

Comment on lines -205 to +220
-        if deterministic:
-            return x
-        else:
-            return tf.nn.dropout(x, self.p)
+        return x if deterministic else tf.nn.dropout(x, self.p)

Function dropoutLayer.forward_pass refactored with the following changes:
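The conditional expression behaves identically to the four-line if/else. A minimal sketch, with a hypothetical `transform` standing in for `tf.nn.dropout` (TensorFlow itself isn't needed to show the equivalence):

```python
def transform(x):
    # Hypothetical stand-in for tf.nn.dropout; any function works here.
    return [v * 2 for v in x]

def forward_old(x, deterministic):
    # Original form: explicit if/else with two return statements.
    if deterministic:
        return x
    else:
        return transform(x)

def forward_new(x, deterministic):
    # Refactored form: one conditional expression, same semantics.
    return x if deterministic else transform(x)

assert forward_old([1, 2], True) == forward_new([1, 2], True) == [1, 2]
assert forward_old([1, 2], False) == forward_new([1, 2], False) == [2, 4]
```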

-        dense = tf.reshape(values, (batch_size, num_outputs))
-
-        return dense
+        return tf.reshape(values, (batch_size, num_outputs))

Function label_string_to_tensor refactored with the following changes:
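The temporary `dense` added nothing, so the refactoring returns the expression directly. A sketch with a hypothetical pure-Python `reshape` standing in for `tf.reshape`:

```python
def reshape(values, shape):
    # Hypothetical stand-in for tf.reshape: split a flat list into rows.
    rows, cols = shape
    return [values[i * cols:(i + 1) * cols] for i in range(rows)]

def to_tensor_old(values, batch_size, num_outputs):
    # Original form: bind to a temporary, then return it.
    dense = reshape(values, (batch_size, num_outputs))
    return dense

def to_tensor_new(values, batch_size, num_outputs):
    # Refactored form: return the expression directly.
    return reshape(values, (batch_size, num_outputs))

data = [1, 2, 3, 4, 5, 6]
assert to_tensor_old(data, 2, 3) == to_tensor_new(data, 2, 3) == [[1, 2, 3], [4, 5, 6]]
```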

Comment on lines -101 to +99
-        unique = set([label.strip() for label in labels])
+        unique = {label.strip() for label in labels}

Function string_labels_to_sequential refactored with the following changes:
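A set comprehension yields the same set as `set(...)` wrapped around a list comprehension, but skips building the throwaway intermediate list. A sketch with example labels (not from the repo):

```python
labels = [' plant ', 'plant', 'weed ', 'weed']

old = set([label.strip() for label in labels])  # builds a temporary list first
new = {label.strip() for label in labels}       # set comprehension, no intermediate list

assert old == new == {'plant', 'weed'}
```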

-        y = self.model.forward_pass_with_file_inputs(x)
-
-        return y
+        return self.model.forward_pass_with_file_inputs(x)

Function arabidopsisStrainClassifier.forward_pass refactored with the following changes:

Comment on lines 37 to 38
-        labels = [mapping[index] for index in indices]
-
-        return labels
+        return [mapping[index] for index in indices]

Function tools.classify_arabidopsis_strain refactored with the following changes:
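Same inlining idiom: the comprehension is returned directly rather than bound to a temporary. A sketch with a hypothetical index-to-strain mapping (the names are illustrative, not the repo's actual label set):

```python
def indices_to_labels(indices, mapping):
    # Refactored form: return the list comprehension directly,
    # without the temporary `labels` variable.
    return [mapping[index] for index in indices]

mapping = {0: 'Col-0', 1: 'Ler-1'}  # hypothetical class-index -> strain name
assert indices_to_labels([1, 0, 1], mapping) == ['Ler-1', 'Col-0', 'Ler-1']
```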
