Channel Algebra
"Adding" and "subtracting" elements within the neural network.
If you want to follow along on Google Colab, check out this notebook.

This time, we'll see what happens when we tweak our custom losses to optimize the sums and differences of different channel outputs within InceptionV3.
import matplotlib.pyplot as plt
import torchvision.models as models
from torch_dreams.dreamer import dreamer  # import path may vary across torch-dreams versions
model = models.inception_v3(pretrained=True)
dreamy_boi = dreamer(model)
We'll use channels from 2 layers this time, but feel free to work with any other layers.
layers_to_use = [
    model.Mixed_6c.branch7x7_1.conv,
    model.Mixed_6b.branch7x7dbl_2
]
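If you'd like to pick other layers, you can list the model's submodules to see what's available. Here's a minimal sketch for exploration (it assumes the standard torchvision InceptionV3 module names) that prints every convolutional layer along with its number of output channels:
import torch.nn as nn

# walk the model and print each conv layer's name and output channel count,
# which helps when picking other layers/channels to optimize
for name, module in model.named_modules():
    if isinstance(module, nn.Conv2d):
        print(name, module.out_channels)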
First, let's define and run two custom optimizations that each optimize a single channel, just to see what they look like:
def custom_func_1(layer_outputs):
    # mean activation of channel 74 in the output of the first layer
    loss = layer_outputs[0][74].mean()
    return loss

def custom_func_2(layer_outputs):
    # mean activation of channel 88 in the output of the second layer
    loss = layer_outputs[1][88].mean()
    return loss
config = {
    "image_path": "noise.jpg",   # starting image (here just noise)
    "layers": layers_to_use,     # the layers whose outputs we hook into
    "octave_scale": 1.1,         # scale factor between successive octaves
    "num_octaves": 20,           # number of image scales to optimize over
    "iterations": 100,           # gradient ascent steps per octave
    "lr": 0.04,                  # learning rate
    "max_rotation": 0.7,         # maximum random rotation applied during optimization
}
Now let's run the optimizations and see what the channels look like:
config["custom_func"] = custom_func_1
out_1 = dreamy_boi.deep_dream(config)
plt.imshow(out_1)
plt.show()

config["custom_func"] = custom_func_2
out_2 = dreamy_boi.deep_dream(config)
plt.imshow(out_2)
plt.show()
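If you'd like to keep these results around for comparison, you can also save them to disk. A quick sketch, assuming the outputs returned by deep_dream are image arrays that matplotlib can write (the same arrays we passed to imshow above):
# save the single-channel visualizations for later comparison
plt.imsave("channel_74.jpg", out_1)
plt.imsave("channel_88.jpg", out_2)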

Now let's see what happens when we "add" them.
Look closely at custom_func_combined: it sums the mean activations of both channels and returns the result as the loss, so both channels get optimized simultaneously.
def custom_func_combined(layer_outputs):
    loss = layer_outputs[0][74].mean() + layer_outputs[1][88].mean()
    return loss
config["custom_func"] = custom_func_combined
out_blend = dreamy_boi.deep_dream(config)
plt.imshow(out_blend)
plt.show()
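Nothing forces both terms to contribute equally, either. Weighting each channel's mean activation controls how strongly it shows up in the result; a small sketch (the weights below are arbitrary, purely for illustration):
def custom_func_weighted(layer_outputs):
    # weighted "addition": channel 74 dominates, channel 88 contributes less
    loss = 0.7 * layer_outputs[0][74].mean() + 0.3 * layer_outputs[1][88].mean()
    return loss

config["custom_func"] = custom_func_weighted
out_weighted = dreamy_boi.deep_dream(config)
plt.imshow(out_weighted)
plt.show()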

Optimizing the difference is just as easy:
def custom_func_diff(layer_outputs):
    loss = layer_outputs[0][74].mean() - layer_outputs[1][88].mean()
    return loss
config["custom_func"] = custom_func_diff
out_diff = dreamy_boi.deep_dream(config)
plt.imshow(out_diff)
plt.show()
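Since maximizing the first term while minimizing the second pushes the image toward features that excite channel 74 and away from those that excite channel 88, the same idea extends to any signed combination of channels. A sketch of that generalization (the extra channel index and the weights are made up for illustration):
# map (layer index, channel index) -> weight; negative weights suppress a channel
channel_weights = {
    (0, 74): 1.0,
    (1, 88): -0.5,
    (1, 12): 0.25,
}

def custom_func_algebra(layer_outputs):
    loss = sum(w * layer_outputs[i][c].mean() for (i, c), w in channel_weights.items())
    return loss

config["custom_func"] = custom_func_algebra
out_algebra = dreamy_boi.deep_dream(config)
plt.imshow(out_algebra)
plt.show()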
