Functional test of regression

BIGAUD Nathan requested to merge toy-regression into develop

Must have

  • review data creation
  • review regularization
  • create tf and torch model
  • use framework everywhere
  • use existing objects for the framework list
  • check data type conversion in dataset (is it actually needed?)
  • deal with framework-specific multiprocessing issues
  • Make sure run_declearn_baseline works
  • Ensure proper seeding of all frameworks (see the seeding sketch after this list)
  • Store global metrics for the tests, using R² (see the R² sketch after this list)
  • Transform the loss check into an actual assert on the loss value
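A minimal sketch of what seeding every framework could look like for this test, assuming a single helper covering the Python, NumPy, Torch and TensorFlow RNGs; the helper name and default seed are hypothetical.

```python
import random

import numpy as np
import tensorflow as tf
import torch


def seed_everything(seed: int = 0) -> None:
    """Seed every RNG source used by the functional test (hypothetical helper)."""
    random.seed(seed)         # python stdlib RNG
    np.random.seed(seed)      # numpy global RNG
    torch.manual_seed(seed)   # torch RNG (also seeds CUDA devices)
    tf.random.set_seed(seed)  # tensorflow global RNG
```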
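For the global-metrics and assert items, a sketch of how the end-of-test check could be phrased as an assert on an R² score rather than a manual look at the loss; the threshold and function names are placeholders, and the helper mirrors sklearn.metrics.r2_score.

```python
import numpy as np


def r2_score(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination, as in sklearn.metrics.r2_score."""
    ss_res = float(np.sum((y_true - y_pred) ** 2))
    ss_tot = float(np.sum((y_true - np.mean(y_true)) ** 2))
    return 1.0 - ss_res / ss_tot


def assert_regression_quality(y_true, y_pred, threshold: float = 0.9) -> None:
    """Fail the test when the fitted model explains too little variance."""
    score = r2_score(np.asarray(y_true), np.asarray(y_pred))
    assert score >= threshold, f"R² too low: {score:.3f} < {threshold}"
```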

Nice to have

  • further simplify the regression setup and make the client data distributions differ

Issues met during implementation that might require patching elsewhere:

  • Torch: numpy data gets cast to float64, while default weights in torch are float32 (see the dtype sketch after this list)
  • TensorFlow: naming of weights vs gradients in compute_batch_gradients
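A small reproduction of the Torch dtype mismatch noted above, with the explicit cast that works around it; the array shape and model are arbitrary placeholders.

```python
import numpy as np
import torch

data = np.random.normal(size=(8, 4))  # numpy arrays default to float64
model = torch.nn.Linear(4, 1)         # torch parameters default to float32

bad_inputs = torch.from_numpy(data)   # float64 tensor
# model(bad_inputs) raises a dtype mismatch (Double inputs vs Float weights)

good_inputs = torch.from_numpy(data.astype(np.float32))  # explicit downcast
outputs = model(good_inputs)          # works: both sides are float32
```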

Remarks:

  • Errors can be hard to trace: the error message is sent back, but not its exact location, which requires quite a bit of detective work to find the failing point
  • The logger is very verbose and one gets lost in the flow of information; a priority system plus a verbosity controller in the logger would be great
  • The original version of the examples bypassed declearn optimization and used the penalty argument of sklearn.linear_model.SGDRegressor for L2 regularization instead of our regularizer. It would be nice to catch this automatically and raise a warning, because the risk is applying the penalty twice (see the sketch below).
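To illustrate the double-regularization risk mentioned in the last remark, a sketch of the sklearn-side setting that should be disabled when L2 is handled by our regularizer; the variable names and alpha value are illustrative only.

```python
from sklearn.linear_model import SGDRegressor

# Risky: sklearn applies its own L2 penalty on top of the declearn-side
# regularizer, so the model effectively gets regularized twice.
double_penalty = SGDRegressor(penalty="l2", alpha=0.01)

# Safer when regularization is delegated to declearn: disable the
# sklearn-side penalty entirely ("none" as a string on older versions).
no_penalty = SGDRegressor(penalty=None)
```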