Strange


Total Posts: 1591 
Joined: Jun 2004 


Please don't laugh. I am trying to convert an "if/then" statement into a parametric form that would allow for a smoothed transition between states. It's probably something fairly simple, but I am completely blanking (seems to happen more and more lately). Here is the gist of the problem in its current state:
A, B are continuous variables; Z is a threshold; X is a scale value
so I have the following statement in the code:
Q = (X if A > 0 and B > Z else 1) * (X if A < 0 and B < Z else 1)
Obviously, the above is not a very optimal way of doing it, since Q changes from 1 to X discretely, and I want to avoid that type of behavior. Ideally, it would be some sort of continuous 2D function that can be tweaked from the completely discrete state above to a more continuous form via several variables. A simple version is to make X a linear function of B and keep the conditions above, but that does not get rid of the discontinuity with respect to A.
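For concreteness, the discrete rule can be sketched as follows (the default values for Z and X here are placeholders of mine, not from the problem):

```python
def q_discrete(A, B, Z=0.5, X=2.0):
    # Q jumps from 1 to X whenever (A > 0 and B > Z),
    # and likewise for the mirrored case (A < 0 and B < Z).
    up = X if (A > 0 and B > Z) else 1.0
    down = X if (A < 0 and B < Z) else 1.0
    return up * down
```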

"In Russia, every CDS ends in bullet payment" 


gaj


Total Posts: 53 
Joined: Apr 2018 


The last part of the formula for Q is cut off. Anyway the first part can be written as
(A > 0) * (B > Z) * (X - 1) + 1
Then you can replace the binary variables with sigmoid:
sigmoid(A) * sigmoid(B - Z) * (X - 1) + 1
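A quick sketch of that substitution in Python. The steepness parameter k and the mirrored branch for the A < 0 case are my additions (gaj only wrote out the first branch): as k grows, the sigmoids sharpen and Q approaches the original discrete rule.

```python
import math

def sigmoid(t, k=1.0):
    # Logistic function; larger k sharpens the transition toward a step.
    return 1.0 / (1.0 + math.exp(-k * t))

def q_smooth(A, B, Z=0.5, X=2.0, k=10.0):
    # Smoothed branch: sigmoid(A) * sigmoid(B - Z) * (X - 1) + 1,
    # plus the mirrored branch for A < 0 and B < Z.
    up = sigmoid(A, k) * sigmoid(B - Z, k) * (X - 1.0) + 1.0
    down = sigmoid(-A, k) * sigmoid(Z - B, k) * (X - 1.0) + 1.0
    return up * down
```

With k large, q_smooth recovers the discrete behavior; with k small, Q varies continuously in both A and B.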



Strange


Total Posts: 1591 
Joined: Jun 2004 


Thanks!
Corrected it - it's essentially the same statement in reverse (if both A and B are less than 0 and Z respectively).
The sigmoid version is interesting (indeed it removes the discontinuity with respect to both B and A), and I am guessing I can extend it to have a continuous decay instead of the constant X:
sigmoid(A) * sigmoid(B - Z) * (decay(B) - 1) + 1
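One possible reading of that variant, as a sketch: the functional form of decay below (exponential relaxation from X toward 1) is purely my assumption for illustration; any smooth decreasing function would do.

```python
import math

def sigmoid(t, k=1.0):
    # Logistic function; larger k sharpens the transition.
    return 1.0 / (1.0 + math.exp(-k * t))

def decay(B, X=2.0, rate=1.0):
    # Hypothetical decay: starts at X for B = 0 and relaxes toward 1
    # as B grows, replacing the fixed scale X.
    return 1.0 + (X - 1.0) * math.exp(-rate * max(B, 0.0))

def q_decayed(A, B, Z=0.5, k=10.0):
    # One branch of the variant: the sigmoid gate is unchanged, but the
    # scale decay(B) falls off continuously instead of sitting at X.
    return sigmoid(A, k) * sigmoid(B - Z, k) * (decay(B) - 1.0) + 1.0
```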
"In Russia, every CDS ends in bullet payment" 


TonyC

Nuclear Energy Trader

Total Posts: 1316 
Joined: May 2004 


the "sigmoid" function used in the backpropagation algorithm of neural networks is a very smooth transition between 0 and 1 ... and good explanations are in every introductory neural network / machine learning text.
and if you think it's weird that I'm replying at a quarter of midnight on a Saturday in New York, it's even weirder that I'm actually replying at a quarter of 5 a.m. on a Sunday in Vernazza

flaneur/boulevardier/remittance man/energy trader 


TonyC

Nuclear Energy Trader

Total Posts: 1316 
Joined: May 2004 


oh wait, you already mentioned sigmoid functions, which means my prior reply is superfluous and that I am an idiot ...
... or at least I'm an idiot at 5 in the morning 
flaneur/boulevardier/remittance man/energy trader 

nikol


Total Posts: 821 
Joined: Jun 2005 


TonyC: that's a widespread phenomenon - stupidity with positive feedback. )) 


