I'm using Llava15ChatHandler, but looking at the source code I don't see any Llava16ChatHandler. Moreover, the handler uses hard-coded templating instead of supporting the custom in-model template given by the tokenizer.chat_template metadata property; Nous Hermes 2 Yi 34B (Link), for example, ships a template that is quite different from the hard-coded one. Any plans for that? Is LLaVA 1.6 actually supported, or should I fall back to the parent project?
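For context, the tokenizer.chat_template that model declares is ChatML-style, whereas the hard-coded handler emits a different layout. A minimal sketch of what a ChatML render produces (format_chatml is a hypothetical helper for illustration, not part of llama-cpp-python, and assumes the standard ChatML markers):

```python
def format_chatml(messages):
    """Render a message list into a ChatML-style prompt string.

    Each message becomes an <|im_start|>role ... <|im_end|> block,
    and a trailing assistant header cues the model's reply.
    """
    prompt = ""
    for m in messages:
        prompt += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    prompt += "<|im_start|>assistant\n"
    return prompt

print(format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Describe this image."},
]))
```

A handler that read tokenizer.chat_template from the GGUF metadata and rendered it (e.g. via Jinja2) instead of hard-coding one layout would cover models like this automatically.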
Update: with the current codebase I can get fairly decent results during inference, but I'm not sure whether there is some regression; I still need to compare against the original implementation.