My network has an output layer with a ReLU activation function, but I want the output to be something like "ReLU + 1": every output value should be greater than 1, while the function keeps the same shape as ReLU.
How should I change my torch.nn network?
Plain ReLU already outputs non-negative values; if you want them shifted above 1, just add 1 to its output. As for the shape, activation functions are applied element-wise and never change the tensor's shape. Is there anything else that's unclear?
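For concreteness, a minimal sketch of how you might wrap this as a module in a torch.nn network (the layer sizes here are illustrative assumptions, not from your model):

```python
import torch
import torch.nn as nn

class ReluPlusOne(nn.Module):
    """Element-wise max(0, x) + 1, so every output value is >= 1."""
    def forward(self, x):
        return torch.relu(x) + 1

# Hypothetical network using the shifted activation as its output layer.
model = nn.Sequential(
    nn.Linear(10, 5),   # in/out features chosen only for illustration
    ReluPlusOne(),
)

x = torch.randn(2, 10)
out = model(x)
print(out.shape)         # torch.Size([2, 5]) -- same shape as plain ReLU
print((out >= 1).all())  # tensor(True)
```

Equivalently, you can skip the custom module and write `torch.relu(x) + 1` directly in your network's `forward` method.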