The most well behaved of our data, ducknest, misbehaves with the most simple model
#3
Comments
I think that initial values are not passed out to 'mcds.exe' at present, so far as I can tell by looking at the command file associated with the above code when I run it (note that command files are not currently being deleted after runs):
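For anyone wanting to check, a minimal sketch of inspecting those leftover files from R follows; the `"cmd"` filename pattern is a guess on my part, not the name mrds/Distance actually uses, so adjust it to whatever appears in your working directory:

```r
# Hypothetical: list any MCDS command files left behind after a run.
# The "cmd" filename pattern is an assumption; substitute whatever
# mrds/Distance actually writes in your working directory.
cmd_files <- list.files(getwd(), pattern = "cmd", full.names = TRUE)
if (length(cmd_files) > 0) {
  # Print the first command file and scan it for an initial-values line,
  # to confirm whether starting values were in fact handed to MCDS.exe.
  cat(readLines(cmd_files[1]), sep = "\n")
}
```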
Also, when I look at the command file I don't know why this happens. Anyway, I'll leave @LHMarshall to confirm/deny!
Running the ducknest R dataset (with exact distances; the data set that generated this issue) through DistWin, requesting a HN key with 0 cosine adjustments, produces the expected behaviour, with a reasonable estimate of sigma.

This is a dramatically different result from requesting the same model fit to the same data, but using R to call the (same) MCDS.exe optimiser. DistWin project below.
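For concreteness, the R-side fit being described is along these lines; a sketch assuming a Distance version with MCDS.exe support exposed through an `optimizer` argument, and assuming `ducknest` is available from the dsdata package:

```r
library(Distance)

# Assumption: ducknest ships with the dsdata package; load your copy
# from wherever it actually lives.
data("ducknest", package = "dsdata")

# Half-normal key with zero adjustments, fitted by calling out to the
# MCDS.exe engine rather than the R optimiser (optimizer = "MCDS" is
# assumed to be the switch in this Distance version).
fit_mcds <- ds(ducknest, key = "hn", adjustment = NULL, optimizer = "MCDS")
summary(fit_mcds)  # this is the fit with the absurd sigma estimate
```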
@erex @lenthomas The discrepancy is down to monotonicity constraints. In the Distance for Windows analysis, monotonicity is strictly decreasing by default; if you change it to none you get the odd result as above.

So I tried setting the Distance analysis to strictly monotonic, BUT Distance decides that, because it is a key-function-only model, it is fine to override that and set monotonicity back to NONE... hence the following doesn't work either (see the sketch below).
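Presumably the call that "doesn't work" is of this form; a hedged reconstruction continuing the sketch above, not the original snippet:

```r
# Ask for strict monotonicity explicitly; for a key-function-only model
# ds() overrides this back to "none", so the odd MCDS fit persists.
fit_strict <- ds(ducknest, key = "hn", adjustment = NULL,
                 monotonicity = "strict", optimizer = "MCDS")
```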
However, as it is Distance that decides that and not mrds, if you bypass Distance and use mrds directly, then you can replicate the results from Distance for Windows.
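A sketch of that direct mrds route; the truncation `width` below is a placeholder, and the data columns are assumed to match what `ddf()` expects:

```r
library(mrds)

# Bypass Distance::ds and call mrds::ddf directly: ddf() honours the
# monotonicity request in meta.data even for a key-only model.
fit_mrds <- ddf(dsmodel = ~cds(key = "hn", formula = ~1),
                data = ducknest, method = "ds",
                meta.data = list(width = 2.4,  # placeholder truncation
                                 mono = TRUE, mono.strict = TRUE))
summary(fit_mrds)
```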
Thoughts on what should be done about this?
This is indeed strange. I don't think it needs to be addressed in the upcoming release, because the Distance fit will get selected. But it is strange that MCDS.exe fits such a bad function here with monotonicity off.
@erex looking at the project you just sent me jogged my memory... this anomaly had already been investigated, as detailed above, and the decision was not to do anything for this upcoming release.
OK Laura. Thanks for sending me to this issue. Should this issue be kept for future reference?
I have left it open here. I am unsure where the issue is to be resolved, however, so I'm not sure whether we should open another bug report somewhere else.
I noted a discrepancy in log-likelihoods (mrds vs MCDS) for a half-normal with no adjustments on the ducknest data, with MCDS producing an unreasonable result. I ran the model in question on these nice data, turning on the debug argument, with these results: convergence was presumably achieved, but I don't understand the change in the initial value of sigma from -0.187 to 4.785. From that (erroneous) initial value, the MCDS optimiser gets stuck around a local minimum and produces an absurd estimate of sigma of 4.785 (on the log scale), resulting in P_a = 0.9999999.
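For anyone reproducing this, debug output of the sort described can be requested through the `debug_level` argument of `ds()`; a sketch, noting that whether the MCDS pathway uses this same switch or a separate flag may depend on the package version:

```r
# debug_level runs 0-3 in Distance::ds; higher values print optimisation
# detail such as the starting and fitted parameter values quoted above.
fit_debug <- ds(ducknest, key = "hn", adjustment = NULL,
                optimizer = "MCDS", debug_level = 3)
```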
Why should MCDS go so badly wrong on a simple model with lovely data?
As a comparison, I checked what happens when a single adjustment term for the same key is fitted to the same data. The initial values are not adjusted to such an extreme for this model, and the resulting estimated parameters are reasonable.
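The comparison fit, sketched under the same assumptions as the earlier snippets, adds one cosine adjustment of order 2 to the half-normal key:

```r
# Same key and data, but with a single cosine adjustment term (order 2);
# here the starting values stay sensible and the fit is reasonable.
fit_hn_cos <- ds(ducknest, key = "hn", adjustment = "cos", order = 2,
                 optimizer = "MCDS")
summary(fit_hn_cos)
```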