Thanks for the contribution! Here, I'd like to delve into a new issue: how can we achieve a 1-Lipschitz continuous neural network $f$ while also enforcing the initial condition $f(0) = 0$? Using a penalty method is only a soft constraint, which might lead to unexpected smoothing. My idea is to append a new neural network $g(x)$ at the end, trained to fit the indicator function of whether $x$ is $0$, since, theoretically, $g(x)$ is also 1-Lipschitz continuous. The new output $\tilde{f}(x) = g(x) \times f(x)$ would then ensure overall 1-Lipschitz continuity while vanishing at $x = 0$.
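For concreteness, here is a minimal PyTorch sketch of that gated construction. The spectral-normalized layers (to keep each layer's Lipschitz constant at most 1), the sigmoid gate, and the names `LipschitzMLP` / `GatedLipschitzModel` are my own illustrative assumptions, not something from the original discussion, and the sketch does not by itself guarantee that the product $g(x) \times f(x)$ stays 1-Lipschitz; that depends on how $f$ and $g$ are bounded and trained.

```python
import torch
import torch.nn as nn


class LipschitzMLP(nn.Module):
    """MLP with spectral-normalized linear layers and 1-Lipschitz ReLU
    activations, so the whole map is (approximately) 1-Lipschitz."""

    def __init__(self, dims):
        super().__init__()
        layers = []
        for i in range(len(dims) - 1):
            layers.append(nn.utils.spectral_norm(nn.Linear(dims[i], dims[i + 1])))
            if i < len(dims) - 2:
                layers.append(nn.ReLU())
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)


class GatedLipschitzModel(nn.Module):
    """Sketch of the proposed output f_new(x) = g(x) * f(x):
    g is a separate network meant to be trained toward 0 at x = 0
    and toward 1 elsewhere (a soft version of the indicator)."""

    def __init__(self, in_dim, hidden=64, out_dim=1):
        super().__init__()
        self.f = LipschitzMLP([in_dim, hidden, hidden, out_dim])
        self.g = LipschitzMLP([in_dim, hidden, 1])

    def forward(self, x):
        gate = torch.sigmoid(self.g(x))  # scalar gate in (0, 1)
        return gate * self.f(x)


# quick shape check
model = GatedLipschitzModel(in_dim=2)
x = torch.randn(8, 2)
print(model(x).shape)  # torch.Size([8, 1])
```

A usage note: the gate here only approaches $0$ at the origin to the extent that $g$ is trained to do so; an exact hard constraint would still require an architectural choice such as subtracting the model's value at zero, which is outside the scope of this sketch.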