Channel Algebra

"Adding" and "subtracting" elements within the neural network.

If you want to follow along on Google Colab, check out this notebook.

This time, we'll see what happens if we tweak our custom losses to optimize the sums and differences of different channel outputs within InceptionV3.

import torchvision.models as models
from torch_dreams.dreamer import dreamer

model = models.inception_v3(pretrained=True)
dreamy_boi = dreamer(model)

We'll use channels from 2 layers this time, but feel free to work with any other layers.

First, let's define 2 custom objectives that optimize 2 channels individually, just to see what they look like:
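A minimal sketch of what those two per-channel objectives could look like, assuming the torch-dreams convention where a `custom_func` receives the list of hooked layer outputs and returns a scalar loss. The layer and channel indices here are hypothetical placeholders, not the notebook's actual choices:

```python
import torch

def make_channel_loss(layer_index, channel_index):
    """Build a loss targeting one channel of one hooked layer."""
    def custom_func(layer_outputs):
        # Mean activation of the chosen channel, negated so that
        # minimizing the loss maximizes the channel's activation
        return -layer_outputs[layer_index][channel_index].mean()
    return custom_func

# Hypothetical channel picks for the two layers we hooked
custom_func_A = make_channel_loss(layer_index=0, channel_index=7)
custom_func_B = make_channel_loss(layer_index=1, channel_index=3)

# Quick sanity check on dummy activations shaped [channels, H, W]
dummy_outputs = [torch.rand(16, 8, 8), torch.rand(16, 8, 8)]
loss_A = custom_func_A(dummy_outputs)
loss_B = custom_func_B(dummy_outputs)
```

Each function would then be passed to the optimization as its `custom_func`, producing one visualization per channel.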

Now let's run the optimizations and see what the channels look like:

Channel A
Channel B

Now let's see what happens when we "add" them

Look closely at custom_func_combined: it adds the outputs of both layers and then returns the loss, so both channel outputs get optimized simultaneously.
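A sketch of that combined objective under the same assumed `custom_func` convention, with the channel indices again being hypothetical placeholders:

```python
import torch

def custom_func_combined(layer_outputs):
    # Add the mean activations of the two channels, then negate so
    # that minimizing the loss maximizes the combined activation
    loss = layer_outputs[0][7].mean() + layer_outputs[1][3].mean()
    return -loss

# Sanity check on dummy activations shaped [channels, H, W]
dummy_outputs = [torch.rand(16, 8, 8), torch.rand(16, 8, 8)]
combined_loss = custom_func_combined(dummy_outputs)
```

Because the two terms share one scalar loss, every optimization step pushes both channel activations up at once, which is what produces the blended features in the image below.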

Channel A + Channel B

Optimizing the difference is also just as easy:
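The difference objective can be sketched the same way; flipping the sign on the second term means the optimization maximizes channel A's activation while suppressing channel B's. Indices remain hypothetical placeholders:

```python
import torch

def custom_func_difference(layer_outputs):
    # Channel A minus channel B, negated for minimization:
    # A's activation is pushed up while B's is pushed down
    loss = layer_outputs[0][7].mean() - layer_outputs[1][3].mean()
    return -loss

# On dummy activations where channel A is all ones and channel B
# is all zeros, the loss is -(1.0 - 0.0) = -1.0
dummy_outputs = [torch.ones(16, 8, 8), torch.zeros(16, 8, 8)]
difference_loss = custom_func_difference(dummy_outputs)
```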

Channel A - Channel B
