Fix workload targets and max runtimes in docs #685
priyakasimbeg merged 1 commit into mlcommons:dev from
Conversation
MLCommons CLA bot: All contributors have signed the MLCommons CLA ✍️ ✅
Hi Runa,
Right, they inherit from the Conformer JAX and PyTorch workloads to reuse the input pipeline and other methods. We actually had a bug until a few months ago where the DeepSpeech workloads were inheriting all of their properties from the Conformer workloads (including the targets). The targets and runtimes were specified in a LibriSpeech DeepSpeech parent workload that was not being used, so I removed it (#526).
Fixes #660 and #684.
@priyakasimbeg I noticed that we don't have a shared parent class for the LibriSpeech DeepSpeech workloads, so the targets, max runtime, and other properties are specified twice -- once for JAX and once for PyTorch. Is there any reason for that? If not, we should probably create a separate workload parent class. Otherwise it is easy to introduce a bug by accidentally making these values inconsistent.
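For illustration, a shared parent class might look like the following sketch. The class names, property names, and values here are assumptions for the example, not the repo's actual identifiers; the point is only that both framework subclasses inherit the targets and max runtime from one place, so they cannot drift apart.

```python
class BaseLibriSpeechDeepSpeechWorkload:
    """Hypothetical framework-agnostic parent holding shared properties.

    Defining the target and max runtime once here means the JAX and
    PyTorch subclasses cannot accidentally diverge.
    """

    @property
    def validation_target_value(self) -> float:
        return 0.1  # placeholder value for illustration

    @property
    def max_allowed_runtime_sec(self) -> int:
        return 50_000  # placeholder value for illustration


class LibriSpeechDeepSpeechJaxWorkload(BaseLibriSpeechDeepSpeechWorkload):
    """JAX implementation; inherits the shared target and runtime."""


class LibriSpeechDeepSpeechPytorchWorkload(BaseLibriSpeechDeepSpeechWorkload):
    """PyTorch implementation; inherits the same values by construction."""


# The two implementations now agree automatically:
jax_wl = LibriSpeechDeepSpeechJaxWorkload()
pt_wl = LibriSpeechDeepSpeechPytorchWorkload()
assert jax_wl.max_allowed_runtime_sec == pt_wl.max_allowed_runtime_sec
```

With this layout, updating a target is a one-line change in the parent class rather than two edits that must be kept in sync by hand.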