Python cntk.greater_equal() Examples

The following are 5 code examples of cntk.greater_equal(), taken from open-source projects. The project and source file for each example are noted in its header. You may also want to check out the other available functions and classes of the cntk module.
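Before the project examples, here is a minimal standalone sketch of the operator itself. It assumes CNTK 2.x and NumPy; the variable names and sample values are illustrative only.

import numpy as np
import cntk as C

x = C.input_variable(3)
y = C.input_variable(3)
z = C.greater_equal(x, y)   # elementwise x >= y, yielding 1.0 where true and 0.0 where false

x_val = np.array([[1., 2., 3.]], dtype=np.float32)   # batch of one sample
y_val = np.array([[2., 2., 1.]], dtype=np.float32)
print(z.eval({x: x_val, y: y_val}))                  # -> [[0. 1. 1.]]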
Example #1
Source File: cntk_backend.py    From GraphicDesignPatternByPython with MIT License
def greater_equal(x, y):
    return C.greater_equal(x, y) 
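This wrapper comes from a vendored copy of Keras's CNTK backend (cntk_backend.py), so it is normally reached through the Keras backend API rather than called directly. A hedged usage sketch, assuming an older Keras 2.x installation configured with the CNTK backend (values are illustrative):

from keras import backend as K

a = K.constant([1., 2., 3.])
b = K.constant([2., 2., 1.])
mask = K.greater_equal(a, b)   # dispatches to C.greater_equal under the CNTK backend
print(K.eval(mask))            # 0/1 floats marking where a >= b elementwise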
Example #2
Source File: cntk_backend.py    From DeepLearning_Wavelet-LSTM with MIT License
def greater_equal(x, y):
    return C.greater_equal(x, y) 
Example #3
Source File: FasterRCNN.py    From cntk-hotel-pictures-classificator with MIT License
def create_detection_losses(cls_score, label_targets, rois, bbox_pred, bbox_targets, bbox_inside_weights):
    # classification loss
    cls_loss = cross_entropy_with_softmax(cls_score, label_targets, axis=1)

    p_cls_loss = placeholder()
    p_rois = placeholder()
    # The terms that are accounted for in the cls loss are those that correspond to an actual roi proposal --> do not count no-op (all-zero) rois
    roi_indicator = reduce_sum(p_rois, axis=1)
    cls_num_terms = reduce_sum(cntk.greater_equal(roi_indicator, 0.0))
    cls_normalization_factor = 1.0 / cls_num_terms
    normalized_cls_loss = reduce_sum(p_cls_loss) * cls_normalization_factor

    reduced_cls_loss = cntk.as_block(normalized_cls_loss,
                                     [(p_cls_loss, cls_loss), (p_rois, rois)],
                                     'Normalize', 'norm_cls_loss')

    # regression loss
    p_bbox_pred = placeholder()
    p_bbox_targets = placeholder()
    p_bbox_inside_weights = placeholder()
    bbox_loss = SmoothL1Loss(cfg["CNTK"].SIGMA_DET_L1, p_bbox_pred, p_bbox_targets, p_bbox_inside_weights, 1.0)
    # The bbox loss is normalized by the batch size
    bbox_normalization_factor = 1.0 / cfg["TRAIN"].BATCH_SIZE
    normalized_bbox_loss = reduce_sum(bbox_loss) * bbox_normalization_factor

    reduced_bbox_loss = cntk.as_block(normalized_bbox_loss,
                                     [(p_bbox_pred, bbox_pred), (p_bbox_targets, bbox_targets), (p_bbox_inside_weights, bbox_inside_weights)],
                                     'SmoothL1Loss', 'norm_bbox_loss')

    detection_losses = plus(reduced_cls_loss, reduced_bbox_loss, name="detection_losses")

    return detection_losses 
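The call to cntk.greater_equal above builds the normalization count for the classification loss. A minimal, self-contained sketch of that counting pattern follows; the shapes and sample values are hypothetical and not taken from the FasterRCNN pipeline.

import numpy as np
import cntk as C

# Hypothetical batch of 3 ROIs, each (x1, y1, x2, y2); the last row is an all-zero no-op ROI.
rois = C.input_variable((3, 4))

roi_indicator = C.reduce_sum(rois, axis=1)         # per-ROI coordinate sum
valid_mask = C.greater_equal(roi_indicator, 0.0)   # 1.0 where the sum is >= 0, else 0.0
num_terms = C.reduce_sum(valid_mask)               # scalar count used as the normalization factor

sample = np.array([[[10., 10., 40., 40.],
                    [20., 20., 80., 80.],
                    [ 0.,  0.,  0.,  0.]]], dtype=np.float32)
print(num_terms.eval({rois: sample}))

Note that with the 0.0 threshold an all-zero row also evaluates to 1 (0 >= 0 is true), so the count in this sketch covers every row of the tensor; excluding zero-padded rows outright would require a strict comparison such as cntk.greater.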
Example #4
Source File: cntk_backend.py    From deepQuest with BSD 3-Clause "New" or "Revised" License
def greater_equal(x, y):
    return C.greater_equal(x, y) 
Example #5
Source File: cntk_backend.py    From keras-lambda with MIT License
def greater_equal(x, y):
    return C.greater_equal(x, y)