Python theano.tensor.true_div() Examples
The following are 5 code examples of theano.tensor.true_div(). You can vote up the ones you like or vote down the ones you don't like, and go to the original project or source file by following the links above each example. You may also want to check out all available functions/classes of the module theano.tensor, or try the search function.
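T.true_div is Theano's elementwise true (floating-point) division, the operation behind the `/` operator on tensors. Its semantics mirror NumPy's np.true_divide, so a minimal NumPy sketch illustrates the behavior (example arrays chosen here for illustration):

```python
import numpy as np

# True division always yields floats, even for integer inputs,
# unlike floor division (//), which truncates toward negative infinity.
a = np.array([1, 2, 7])
b = np.array([2, 2, 2])

true_quotient = np.true_divide(a, b)   # elementwise float division
floor_quotient = a // b                # floor division, for comparison
```

Here `true_quotient` is `[0.5, 1.0, 3.5]` while `floor_quotient` is `[0, 1, 3]`.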
Example #1
Source File: update_function.py From recnet with MIT License | 6 votes |
```python
def fit(self, weights, o_error, tpo):
    gradients = T.grad(o_error, weights)
    updates = []
    for c, v, w, g in zip(self.t_cache, self.t_velocity, weights, gradients):
        new_velocity = T.sub(T.mul(tpo["momentum_rate"], v),
                             T.mul(tpo["learn_rate"], g))
        new_cache = T.add(T.mul(tpo["decay_rate"], c),
                          T.mul(T.sub(1, tpo["decay_rate"]), T.sqr(g)))
        new_weights = T.sub(T.add(w, new_velocity),
                            T.true_div(T.mul(g, tpo["learn_rate"]),
                                       T.sqrt(T.add(new_cache, 0.1**8))))
        updates.append((w, new_weights))
        updates.append((v, new_velocity))
        updates.append((c, new_cache))
    return updates

###### Nesterov momentum ########################################
```
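The update in Example #1 combines classical momentum with an RMSProp-style running cache of squared gradients, and T.true_div supplies the adaptive division by the cache's square root. A standalone NumPy sketch of one such update step, with variable names and default hyperparameters chosen here for illustration:

```python
import numpy as np

def rmsprop_momentum_step(w, v, c, g, learn_rate=0.01,
                          momentum_rate=0.9, decay_rate=0.9, eps=1e-8):
    """One parameter update mirroring the symbolic graph in Example #1."""
    v_new = momentum_rate * v - learn_rate * g           # momentum term
    c_new = decay_rate * c + (1 - decay_rate) * g ** 2   # running mean of g^2
    # true division of the scaled gradient by the RMS of the cache
    w_new = w + v_new - learn_rate * g / np.sqrt(c_new + eps)
    return w_new, v_new, c_new
```

Starting from zeroed state with gradient 1.0, one step gives velocity -0.01, cache 0.1, and a weight a little below -0.04.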
Example #2
Source File: inception.py From kaggle-right-whale with MIT License | 5 votes |
```python
def avg_pool(input_layer, **kwargs):
    # hack to work around https://github.com/Theano/Theano/issues/3776
    norm = nn.layers.ExpressionLayer(input_layer, lambda X: T.ones_like(X))
    norm = nn.layers.Pool2DLayer(norm, mode='average_inc_pad', **kwargs)
    l = nn.layers.Pool2DLayer(input_layer, mode='average_inc_pad', **kwargs)
    l = nn.layers.ElemwiseMergeLayer([l, norm], T.true_div)
    return l
```
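The workaround in Example #2 pools an all-ones tensor with the same "average including padding" mode, then uses T.true_div to divide the pooled input by that coverage mask, which recovers an "average excluding padding" result in border windows. A 1-D NumPy sketch of the idea (helper names chosen here for illustration):

```python
import numpy as np

def avg_pool_inc_pad_1d(x, size):
    # zero-pad so every window is full, then average including the padding
    pad = (-len(x)) % size
    xp = np.pad(x.astype(float), (0, pad))
    return xp.reshape(-1, size).mean(axis=1)

def avg_pool_exc_pad_1d(x, size):
    pooled = avg_pool_inc_pad_1d(x, size)               # averages diluted by padding zeros
    norm = avg_pool_inc_pad_1d(np.ones_like(x), size)   # fraction of real samples per window
    return pooled / norm                                 # true division undoes the dilution
```

For `x = [2, 4, 6]` with window size 2, the padded average gives `[3, 3]`, but dividing by the coverage `[1, 0.5]` restores the correct `[3, 6]`.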
Example #3
Source File: helpers.py From deep-prior with GNU General Public License v3.0 | 5 votes |
```python
def SlopeLin(slope):
    """
    Linear unit with different slopes
    :param slope: slope of negative quadrant
    :return: x if x > 0 else x/slope
    """
    import theano.tensor as T

    def inner(x):
        return T.switch(T.gt(x, 0), x, T.true_div(x, slope))
    return inner
```
Example #4
Source File: helpers.py From deep-prior with GNU General Public License v3.0 | 5 votes |
```python
def SlopeLin2(x, slope):
    """
    Linear unit with different slopes
    :param slope: slope of negative quadrant
    :return: x if x > 0 else x/slope
    """
    import theano.tensor as T
    return T.switch(T.gt(x, 0), x, T.true_div(x, slope))
```
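SlopeLin (Examples #3 and #4) is a leaky-style activation where negative inputs are divided by the slope rather than multiplied by it, so a slope of 4 attenuates negatives by a factor of 4. A NumPy sketch of the same function:

```python
import numpy as np

def slope_lin(x, slope):
    # x where x > 0, otherwise x / slope (note: division, not multiplication)
    return np.where(x > 0, x, x / slope)
```

For example, `slope_lin(np.array([-2.0, 3.0]), 4.0)` yields `[-0.5, 3.0]`.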
Example #5
Source File: layers.py From Neural-Photo-Editor with MIT License | 5 votes |
```python
def get_output_for(self, inputs, deterministic=False, **kwargs):
    alpha, beta = inputs
    # return 2*T.true_div(alpha, T.add(alpha, beta) + 1e-8) - 1
    return 2 * (alpha / (alpha + beta + 1e-8)) - 1

# Convenience Function to produce a residual pre-activation MDCL block
```
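Example #5 computes alpha / (alpha + beta), the mean of a Beta(alpha, beta) distribution, and rescales it from [0, 1] to [-1, 1]; the commented-out line shows the equivalent explicit T.true_div call. A NumPy sketch (function name chosen here for illustration):

```python
import numpy as np

def beta_mean_to_pm1(alpha, beta, eps=1e-8):
    # alpha / (alpha + beta) is the Beta-distribution mean, in [0, 1];
    # scale it to [-1, 1]. eps guards against division by zero.
    return 2 * alpha / (alpha + beta + eps) - 1
```

With alpha == beta the output is (up to eps) 0, the midpoint of the target range; alpha = 3, beta = 1 gives 0.5.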