It's actually more ironic. The results of the one-time NAS (which took 88X less CO2 than the external paper estimated) are open-sourced and make training language models 1.3X faster & produce 1.3X less emissions (see Figure 4 in https://arxiv.org/abs/2104.10350): https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/models/evolved_transformer.py
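To make the "1.3X less emissions" phrasing concrete: a 1.3X reduction means dividing the baseline by 1.3, i.e. roughly a 23% cut, not a 130% one. A minimal sketch with made-up numbers (the baseline figure below is hypothetical, not from the paper):

```python
def reduced_emissions(baseline_tco2e: float, factor: float = 1.3) -> float:
    """Emissions after a `factor`X reduction: baseline divided by factor."""
    return baseline_tco2e / factor

# Hypothetical baseline of 100 tCO2e for a training run.
baseline = 100.0
after = reduced_emissions(baseline)
savings_pct = 100.0 * (1.0 - after / baseline)
print(round(after, 1), round(savings_pct, 1))  # ~76.9 tCO2e, ~23.1% less
```

The same arithmetic applies to the speedup: 1.3X faster means the run takes 1/1.3 of the original time.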
NAS Results Reduce Language Model Training Emissions by 1.3X