cado-nfs / records

Commit ba2c0178
authored May 27, 2020 by ZIMMERMANN Paul

remove (for now) the "Simulating the filtering output" sections

parent 5f643c34

3 changed files
dlp240/README.md
...
...
@@ -179,23 +179,6 @@ print (cost_in_core_hours, cost_in_core_years)
With this experiment, we get 20.9 core.sec per special-q, and therefore
we obtain about 2430 core.years for the total sieving time.
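The conversion from a measured per-special-q cost to a total in core.years can be sketched as follows; the special-q count used in the example call is a placeholder for illustration, not the actual DLP-240 value:

```python
def sieving_cost_core_years(cost_per_sq_core_sec, num_special_q):
    """Total sieving cost in core.years, given the measured
    per-special-q cost in core.seconds."""
    total_core_sec = cost_per_sq_core_sec * num_special_q
    # 1 core.year = 365 * 24 * 3600 core.seconds
    return total_core_sec / (365 * 24 * 3600)

# hypothetical example: 20.9 core.sec per special-q, 1e9 special-q
print(sieving_cost_core_years(20.9, 1e9))
```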
## Simulating the filtering output
This feature is still experimental, and we do not claim full
reproducibility for this part of our work; it was useful only to help
us choose the parameters, not for the proper computation.
To do this simulation, you need to build the `fake_rels` binary (just run
`make fake_rels`, in principle), and then use the script
[`scripts/estimate_matsize.sh`](https://gitlab.inria.fr/cado-nfs/cado-nfs/-/blob/8a72ccdde/scripts/estimate_matsize.sh)
available in the cado-nfs repository. An example command line can be:
```
mkdir $DATA/dlp240-estimate-matsize
NCHUNKS=6 NBSAMPLE=64 CADO_BUILD=$CADO_BUILD A=31 lim0=536870912 lim1=268435456 lpb0=35 lpb1=35 qmin=150000000 qmax=300000000 mfb0=70 mfb1=70 dlp=true threads=32 parallel=true allow_compsq=true qfac_min=8192 qfac_max=100000000 sqside=0 wdir=$DATA/dlp240-estimate-matsize shrink_factor=5 target_density=250 ./scripts/estimate_matsize.sh dlp240.poly
```
TODO: document what this predicts. Update numbers below as well.
## Estimating linear algebra time (coarsely)
After the fact, we know the matrix size for DLP-240 (about 36M, density
...
...
rsa240/README.md
...
...
@@ -8,7 +8,6 @@ Several chapters are covered.
* [Searching for a polynomial pair](#searching-for-a-polynomial-pair)
* [Estimating the number of (unique) relations](#estimating-the-number-of-unique-relations)
* [Estimating the cost of sieving](#estimating-the-cost-of-sieving)
* [Simulating the filtering output](#simulating-the-filtering-output)
* [Estimating linear algebra time (coarsely)](#estimating-linear-algebra-time-coarsely)
* [Validating the claimed sieving results](#validating-the-claimed-sieving-results)
* [Reproducing the filtering results](#reproducing-the-filtering-results)
...
...
@@ -312,36 +311,6 @@ print (cost_in_core_hours, cost_in_core_years)
With this experiment, we get 67.4 core.sec per special-q, and therefore
we obtain about 510 core.years for this sub-range.
## Simulating the filtering output
This feature is still experimental, and we do not claim full
reproducibility for this part of our work; it was useful only to help
us choose the parameters, not for the proper computation.
In fact, our simulator could not take into account sieving parameters
that vary with the size of the special-q. Therefore the experiment was
always run with 2+3 large primes (2 on the rational side, and 3 on the
algebraic side), leading to an underestimate of the
matrix size (since we only used 2 large primes in the q-range [2.1e9,7.4e9]).
Nevertheless, this was a good way to check that the linear
algebra step was feasible with our resources.
To do this simulation, we need to build the `fake_rels` binary (just run
`make fake_rels`, in principle), and then use the script
[`scripts/estimate_matsize.sh`](https://gitlab.inria.fr/cado-nfs/cado-nfs/-/blob/8a72ccdde/scripts/estimate_matsize.sh)
available in the cado-nfs repository. An example command line can be:
```
mkdir $DATA/rsa240-estimate-matsize
NCHUNKS=6 NBSAMPLE=64 CADO_BUILD=$CADO_BUILD A=32 lim0=1800000000 lim1=2100000000 lpb0=36 lpb1=37 qmin=800000000 qmax=7400000000 mfb0=72 mfb1=111 dlp=false threads=32 parallel=true allow_compsq=false wdir=$DATA/rsa240-estimate-matsize shrink_factor=5 target_density=200 ./scripts/estimate_matsize.sh rsa240.poly
```
Note however that this script takes several hours to run, and is somewhat
fragile. Approximately 300GB of disk space are needed to run this script.
Your mileage may vary. The script should output matrix dimensions
that, once multiplied by 5 (shrink_factor),
give the expected matrix size for the
experiment. The experiment above thus predicts a matrix size of 320M rows
and columns.
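The scaling step can be sketched as below; the shrunk dimension of 64M is inferred from the quoted 320M prediction and the shrink factor of 5, not taken from an actual run of the script:

```python
def predicted_matrix_size(shrunk_dim, shrink_factor):
    """Scale a dimension measured on the shrunk matrix back up to a
    prediction for the full matrix."""
    return shrunk_dim * shrink_factor

# shrink_factor=5 as in the command line above; a shrunk dimension of
# 64M rows/columns corresponds to the quoted 320M prediction
print(predicted_matrix_size(64_000_000, 5))
```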
## Estimating linear algebra time (coarsely)
After the fact, we know the matrix size for RSA-240 (about 282M, density
...
...
rsa250/README.md
...
...
@@ -214,16 +214,6 @@ print (cost_in_core_hours, cost_in_core_years)
With this experiment, we get 116.5 core.sec per special-q, and therefore
we obtain about 1300 core.years for this sub-range.
## Simulating the filtering output
Just use the script
[`cado-nfs/scripts/estimate_matsize.sh`](https://gitlab.inria.fr/cado-nfs/cado-nfs/-/blob/master/scripts/estimate_matsize.sh)
available in the cado-nfs repository.
This feature is still experimental, and we do not claim full
reproducibility for this part of our work; it was useful only to help
us choose the parameters, not for the proper computation.
## Estimating linear algebra time (coarsely)
## Validating the claimed sieving results
...
...