Commit 13df288d authored by Emmanuel Thomé
replace the big files with files of size
`13*8*32768=3407872` bytes.)
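As a quick sanity check of the size arithmetic (a trivial shell one-liner, not part of the original procedure):

```shell
# 13 limbs of 8 bytes each, times 32768 entries: the expected file size in bytes.
echo $((13*8*32768))
# prints 3407872
```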
After having successfully followed the steps above, a file named
`K.sols0-1.0.txt` will be in `$DATA`. This file represents a nullspace
vector for the matrix (M|RHS) (where M has dimension 36190697x36190693).
## Back-substituting the linear algebra result in collected relations
The linear algebra result can be understood as the "core" of the
solutions. It is important to back-substitute this data into all
relations that were collected during the relation collection phase, in
order to have a database of known logarithms that can be used for
individual logarithm computations.
This is achieved by the `reconstructlog-dl` program.
```shell
$CADO_BUILD/filter/reconstructlog-dl \
    -ell 62310183390859392032917522304053295217410187325839402877409394441644833400594105427518019785136254373754932384219229310527432768985126965285945608842159143181423474202650807208215234033437849707623496592852091515256274797185686079514642651 \
    -mt 28 -log $DATA/K.sols0-1.0.txt \
    -out $DATA/dlp240.reconstructlog.dlog \
    -renumber $DATA/dlp240.renumber.gz -poly dlp240.poly \
    -purged $DATA/purged7.gz -ideals $DATA/p240.ideals7.gz \
    -relsdel $DATA/relsdel7.gz -nrels 2380725637
```
As written, this command line takes an annoyingly large amount of time
(several weeks). It is possible to reduce this time by precomputing
uncompressed files `$DATA/purged7_withsm.txt` and
`$DATA/relsdel7_withsm.txt` that have the Schirokauer maps already
computed. These can be computed well ahead of time with the `sm_append`
program, which also works with MPI and scales very well. An example
command line is:

```shell
$MPI/bin/mpiexec [[your favorite mpiexec args]] $CADO_BUILD/filter/sm_append \
    -ell 62310183390859392032917522304053295217410187325839402877409394441644833400594105427518019785136254373754932384219229310527432768985126965285945608842159143181423474202650807208215234033437849707623496592852091515256274797185686079514642651 \
    -poly $HERE/p240.poly -b 4096 \
    -in "/grvingt/zimmerma/dlp240/filter/purged7.gz" \
    -out "${HERE}/purged7.withsm.txt"
```
We did that in 8 hours on 16 grvingt nodes. Note that the files
`$DATA/purged7_withsm.txt` and `$DATA/relsdel7_withsm.txt` are quite
big: 158G and 2.3TB, respectively.
Once this precomputation is done, the two big files can be used as
drop-in replacements for the corresponding files in the
`reconstructlog-dl` command line, and the program completes in about two
days.
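For concreteness, here is a sketch of the modified invocation: it is the same `reconstructlog-dl` command line as above, with only the `-purged` and `-relsdel` arguments swapped for the precomputed `_withsm` files (this exact substitution is an illustration, not a command taken from the original logs):

```shell
# Same flags as the reconstructlog-dl command above, but reading the
# precomputed Schirokauer-map files instead of purged7.gz / relsdel7.gz.
$CADO_BUILD/filter/reconstructlog-dl \
    -ell 62310183390859392032917522304053295217410187325839402877409394441644833400594105427518019785136254373754932384219229310527432768985126965285945608842159143181423474202650807208215234033437849707623496592852091515256274797185686079514642651 \
    -mt 28 -log $DATA/K.sols0-1.0.txt \
    -out $DATA/dlp240.reconstructlog.dlog \
    -renumber $DATA/dlp240.renumber.gz -poly dlp240.poly \
    -purged $DATA/purged7_withsm.txt -ideals $DATA/p240.ideals7.gz \
    -relsdel $DATA/relsdel7_withsm.txt -nrels 2380725637
```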
## Reproducing the individual logarithm result