Yes, this seems to avoid the infinite loop.
How likely is a random curve parameter to produce an invalid curve? Is there any plausible risk of ECM exiting with an error because it randomly chose 10 bad parameters in a row?
For param=1, since d = sigma^2/2^64 and we want d different from 0 and 1 (mod n), I would say there are at most 3 bad values of sigma (mod n). For n=4 this gives a failure probability of about 5% over 10 tries ((3/4)^10 ≈ 5.6%). Anyway, this is better than an infinite loop. Maybe we could store the sigma values already tried (mod n) and only try "new" values; since there are at most 3 bad residues, trying at most 4 distinct residues would guarantee finding a valid value.
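For concreteness, here is a rough sketch of that check (not the actual GMP-ECM code; the function name param1_sigma_is_valid is made up, and I assume n is odd so that 2^64 is invertible mod n):

```c
#include <stdint.h>
#include <gmp.h>

/* Sketch only, not GMP-ECM code.  For param=1, sigma is valid when
   d = sigma^2 / 2^64 (mod n) is neither 0 nor 1 (mod n).
   Assumes n is odd, so 2^64 is invertible mod n. */
static int
param1_sigma_is_valid (uint64_t sigma, const mpz_t n)
{
  mpz_t d, inv;
  int ok;

  mpz_init (d);
  mpz_init_set_ui (inv, 1);
  mpz_mul_2exp (inv, inv, 64);              /* inv = 2^64 */
  mpz_invert (inv, inv, n);                 /* inv = (2^64)^-1 mod n */

  mpz_set_ui (d, (unsigned long) (sigma >> 32));
  mpz_mul_2exp (d, d, 32);
  mpz_add_ui (d, d, (unsigned long) (sigma & 0xffffffffu)); /* d = sigma */
  mpz_mul (d, d, d);                        /* d = sigma^2 */
  mpz_mul (d, d, inv);
  mpz_mod (d, d, n);                        /* d = sigma^2 / 2^64 mod n */

  ok = mpz_cmp_ui (d, 0) != 0 && mpz_cmp_ui (d, 1) != 0;

  mpz_clear (d);
  mpz_clear (inv);
  return ok;
}
```

The retry loop could then remember the residues already tried and only count distinct ones, as suggested above.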
I did a quick test: with n = 4 the factor 2 is always found instantly, which is not a problem; with n = 5 it fails to find a valid sigma within 10 tries in about 1 in 150 runs.
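That rate is consistent with counting bad residues: for n = 5 we have 2^64 ≡ 1 (mod 5), so sigma is bad exactly when sigma^2 ≡ 0 or 1 (mod 5), i.e. sigma ≡ 0, 1 or 4 (mod 5), giving (3/5)^10 ≈ 1/165 for 10 bad tries in a row. A throwaway simulation along these lines (not the actual ECM test) reproduces roughly that rate:

```c
#include <stdio.h>
#include <stdlib.h>

/* Throwaway estimate of how often 10 random sigma in a row are bad for
   n = 5.  Since 2^64 = 1 (mod 5), d = sigma^2 (mod 5), and sigma is bad
   when d is 0 or 1, i.e. sigma is 0, 1 or 4 (mod 5). */
int
main (void)
{
  const unsigned long runs = 1000000;
  unsigned long failures = 0;

  srand (42);
  for (unsigned long r = 0; r < runs; r++)
    {
      int all_bad = 1;
      for (int t = 0; t < 10 && all_bad; t++)
        {
          unsigned long s = (unsigned long) rand () % 5;
          unsigned long d = (s * s) % 5;    /* d = sigma^2 / 2^64 mod 5 */
          if (d != 0 && d != 1)
            all_bad = 0;
        }
      failures += all_bad;
    }
  /* Expect roughly runs * (3/5)^10, i.e. about 1 failure in 165 runs. */
  printf ("%lu failures out of %lu runs\n", failures, runs);
  return 0;
}
```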
I know nothing about the new parametrizations. Would it be possible, in case of an invalid sigma value, to try sigma_new = ceil(sqrt(2^64*(d+1))), so that the new value of d is 1 greater than the old one? Then we could reliably step over the bad values d = 0 and 1 (mod n) and find a good sigma in at most 3 tries.
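Just to make the proposal concrete, computing ceil(sqrt(2^64*(d+1))) with GMP would look something like this (a sketch only; the helper name is made up, and I don't know where it would hook into the existing parametrization code):

```c
#include <gmp.h>

/* Sketch of the proposal above: from the current d, compute
   sigma_new = ceil(sqrt(2^64 * (d + 1))), so that the next value of d
   is one larger than the previous one. */
static void
next_param1_sigma (mpz_t sigma_new, const mpz_t d)
{
  mpz_t t;

  mpz_init (t);
  mpz_add_ui (t, d, 1);                     /* t = d + 1 */
  mpz_mul_2exp (t, t, 64);                  /* t = 2^64 * (d + 1) */

  mpz_sqrt (sigma_new, t);                  /* floor(sqrt(t)) */
  if (!mpz_perfect_square_p (t))
    mpz_add_ui (sigma_new, sigma_new, 1);   /* round up to ceil(sqrt(t)) */

  mpz_clear (t);
}
```

Whether the resulting d really steps by exactly 1 depends on how the division by 2^64 is done in the parametrization code, which I don't know.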