Keras Core now supports arbitrary PyTorch Modules with TorchModuleWrapper

It is now possible to use arbitrary PyTorch `Module`s inside a Keras Core model or layer (with the torch backend). The `TorchModuleWrapper` class turns them into Keras layers, keeping track of trainable weights, state, etc.