tensorflow - Keras: combining two losses with adjustable weights


[Image: weighted loss formula, loss = alpha * loss1 + (1 - alpha) * loss2]

So here is a detailed description. I have a Keras functional model with two layers producing outputs x1 and x2:

x1 = Dense(1, activation='relu')(prev_inp1)
x2 = Dense(2, activation='relu')(prev_inp2)

I need to use these x1 and x2, merge/add them, and come up with the weighted loss function in the attached image, then propagate the same loss into both branches. Alpha is flexible and varies with iterations.

It seems that propagating the "same loss" into both branches will not take effect unless alpha depends on both branches. If alpha does not vary depending on both branches, then the alpha-weighted part of the loss is just a constant factor for one branch.

So, in that case, compile the model with the two losses kept separate and pass the weights to the compile method:

model.compile(optimizer='someoptimizer', loss=[loss1, loss2], loss_weights=[alpha, 1 - alpha])

Compile again whenever you need alpha to change.
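The recompile-to-change-alpha approach can be sketched as follows. This is a minimal example, assuming a toy two-output model with MSE losses and the Adam optimizer (the actual layers, losses, and optimizer are placeholders):

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Toy two-output functional model (assumed architecture, for illustration only)
inp = layers.Input(shape=(4,))
x1 = layers.Dense(1, activation='relu', name='out1')(inp)
x2 = layers.Dense(2, activation='relu', name='out2')(inp)
model = Model(inp, [x1, x2])

def compile_with_alpha(model, alpha):
    # loss_weights scales each output's loss term; total loss is
    # alpha * loss1 + (1 - alpha) * loss2. Recompiling is how the
    # weighting is changed between training phases.
    model.compile(optimizer='adam',
                  loss=['mse', 'mse'],
                  loss_weights=[alpha, 1.0 - alpha])

compile_with_alpha(model, 0.7)
# ... model.fit(...) for some epochs ...
compile_with_alpha(model, 0.5)  # change alpha, then continue training
```

Note that recompiling keeps the trained weights; only the optimizer state and the loss configuration are rebuilt.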


But if alpha is indeed dependent on both branches, then you need to concatenate the results and calculate alpha's value inside the loss:

singleout = Concatenate()([x1, x2])

And use a custom loss function:

def weightedloss(ytrue, ypred):
    # The outputs were concatenated along the last axis:
    # column 0 is the x1 branch, columns 1: are the x2 branch
    x1true = ytrue[:, :1]
    x2true = ytrue[:, 1:]
    x1pred = ypred[:, :1]
    x2pred = ypred[:, 1:]

    # calculate alpha somehow with Keras backend functions

    return (alpha * someloss(x1true, x1pred)) + ((1 - alpha) * someloss(x2true, x2pred))

Then compile:

model.compile(loss=weightedloss, optimizer=....) 
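A runnable sketch of the concatenated-output variant might look like this. The choice of alpha here is purely illustrative (an assumption, not prescribed by the question): it weights each branch by its share of the total squared error, and `someloss` is swapped for MSE:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model, backend as K

# Toy model whose two branch outputs are concatenated into one tensor
inp = layers.Input(shape=(4,))
x1 = layers.Dense(1, activation='relu')(inp)
x2 = layers.Dense(2, activation='relu')(inp)
singleout = layers.Concatenate()([x1, x2])  # shape (batch, 3)
model = Model(inp, singleout)

def weightedloss(ytrue, ypred):
    # Column 0 belongs to the x1 branch, columns 1: to the x2 branch
    x1true, x2true = ytrue[:, :1], ytrue[:, 1:]
    x1pred, x2pred = ypred[:, :1], ypred[:, 1:]

    # Per-branch MSE (stand-in for "someloss")
    e1 = K.mean(K.square(x1true - x1pred))
    e2 = K.mean(K.square(x2true - x2pred))

    # Illustrative alpha that depends on both branches: the fraction of
    # total error contributed by branch 1 (epsilon avoids division by zero)
    alpha = e1 / (e1 + e2 + K.epsilon())

    return alpha * e1 + (1.0 - alpha) * e2

model.compile(optimizer='adam', loss=weightedloss)
```

Because alpha is computed from the tensors inside the loss, it is re-evaluated on every batch and no recompilation is needed.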
