How to use the saliency.VisualizeImageGrayscale function in saliency

To help you get started, we’ve selected a few saliency examples based on popular ways the library is used in public projects.


Example from PreferredAI/tutorials: image-classification/face-emotion/src/visualize.py
def vis_guided_backprop(model, gradient_saliency, neuron_selector, img_path, label):
  # load_image, scale_values, and FLAGS are defined elsewhere in visualize.py.
  im = load_image(img_path)

  # SmoothGrad mask: gradients averaged over noisy copies of the 48x48
  # single-channel input, for the class selected via neuron_selector.
  smoothgrad_mask_3d = gradient_saliency.GetSmoothedMask(im.reshape(48, 48, 1),
                                                         feed_dict={neuron_selector: label,
                                                                    model.is_training: False})

  # Collapse the 3D mask to a 2D grayscale map normalized to [0, 1],
  # then upscale 4x for easier viewing.
  smoothgrad_mask_grayscale = saliency.VisualizeImageGrayscale(smoothgrad_mask_3d)
  smoothgrad_mask_grayscale = cv2.resize(smoothgrad_mask_grayscale, None, fx=4.0, fy=4.0,
                                         interpolation=cv2.INTER_AREA)

  cv2.imwrite(os.path.join(FLAGS.output_dir, "saliency_map_{}".format(img_path.split('/')[-1])),
              scale_values(smoothgrad_mask_grayscale))
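
For a self-contained illustration of what VisualizeImageGrayscale returns, the short sketch below feeds it a random 3D array in place of a real gradient mask. It assumes the TF1-era saliency package used in the example above, where VisualizeImageGrayscale is exposed at the top level with an optional percentile argument; the array shapes here are illustrative.

import numpy as np
import saliency

# Stand-in for a real saliency mask; VisualizeImageGrayscale only needs a
# 3D (height, width, channels) array of attribution values.
mask_3d = np.random.randn(48, 48, 1)

# Sums absolute values across channels, then normalizes to [0, 1],
# clipping the upper end at the given percentile.
mask_gray = saliency.VisualizeImageGrayscale(mask_3d, percentile=99)
print(mask_gray.shape)                     # (48, 48)
print(mask_gray.min(), mask_gray.max())    # both within [0.0, 1.0]

The example above takes this normalized 2D map, resizes it, and writes it to disk as the final saliency visualization.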