
Commit 8b29b88

refactor(chat_handler): extract MTMDChatHandler base class and simplify subsequent multimodal adaptation
- Extracted the core multimodal processing pipeline from `Llava15ChatHandler` into a generic `MTMDChatHandler` base class, separating pipeline logic from model-specific prompt formats.
- Updated all multimodal subclass handlers (e.g., Qwen2.5-VL, Qwen3-VL, MiniCPM, GLM4, LFM2-VL) to inherit from the new `MTMDChatHandler`.
- Implemented strict `**kwargs` validation in the base constructor to gracefully intercept and report unsupported parameters, significantly improving developer experience (DX).
- Introduced a dynamic `self.log_prefix` (`self.__class__.__name__`) for accurate and consistent logging across all subclasses.
- Cleaned up redundant state-clearing, image-count logic, and hardcoded print statements across subclass `__call__` implementations.
- Guarded the `exit_stack` teardown so that calling `close()` after a failed initialization no longer raises.
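The strict `**kwargs` validation and dynamic `log_prefix` described above might look like the following minimal sketch. The class and parameter names here are illustrative stand-ins, not the actual API of the commit:

```python
# Hypothetical sketch of the base-constructor kwargs validation and
# dynamic log prefix; names are illustrative, not the real signatures.
class MTMDChatHandler:
    def __init__(self, clip_model_path: str, verbose: bool = False, **kwargs):
        # Reject unknown parameters up front with a clear message,
        # instead of silently ignoring them.
        if kwargs:
            unsupported = ", ".join(sorted(kwargs))
            raise TypeError(
                f"{self.__class__.__name__} got unsupported parameter(s): {unsupported}"
            )
        self.clip_model_path = clip_model_path
        self.verbose = verbose
        # Subclasses inherit this, so log lines name the concrete handler.
        self.log_prefix = self.__class__.__name__


class Qwen25VLChatHandler(MTMDChatHandler):
    # Model-specific prompt formatting would live here.
    pass
```

Because `log_prefix` is derived from `self.__class__.__name__`, every subclass gets correctly attributed log output without overriding anything.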
1 parent 00da436 commit 8b29b88

3 files changed

Lines changed: 129 additions & 121 deletions

File tree

llama_cpp/_internals.py

Lines changed: 10 additions & 3 deletions
```diff
@@ -85,7 +85,9 @@ def close(self):
         self.model = None
         self.vocab = None

-        self._exit_stack.close()
+        if getattr(self, "_exit_stack", None) is not None and hasattr(self._exit_stack, "close"):
+            self._exit_stack.close()
+        self._exit_stack = None

     def __del__(self):
         self.close()
@@ -386,8 +388,11 @@ def close(self):
         except Exception:
             pass
         self.ctx = None
+        self.params = None

-        self._exit_stack.close()
+        if getattr(self, "_exit_stack", None) is not None and hasattr(self._exit_stack, "close"):
+            self._exit_stack.close()
+        self._exit_stack = None

     def __del__(self):
         self.close()
@@ -662,7 +667,9 @@ def close(self):
             pass
         self.batch = None

-        self._exit_stack.close()
+        if getattr(self, "_exit_stack", None) is not None and hasattr(self._exit_stack, "close"):
+            self._exit_stack.close()
+        self._exit_stack = None

     def __del__(self):
         self.close()
```

llama_cpp/llama.py

Lines changed: 3 additions & 1 deletion
```diff
@@ -682,7 +682,9 @@ def close(self) -> None:
         self._c_tensor_split = None
         self._kv_overrides_array = None

-        self._stack.close()
+        if getattr(self, "_stack", None) is not None and hasattr(self._stack, "close"):
+            self._stack.close()
+        self._stack = None

     def __del__(self) -> None:
         self.close()
```
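Setting `_stack = None` after closing also makes `close()` idempotent: once the stack is cleared, a later explicit `close()` or the `__del__` hook skips the teardown instead of raising. A minimal sketch of that behavior, using an illustrative stand-in class rather than the real `Llama`:

```python
from contextlib import ExitStack


class StackOwner:
    """Illustrative stand-in for the llama.py change; not the real class."""

    def __init__(self):
        self._stack = ExitStack()

    def close(self) -> None:
        # First call closes the stack and clears the reference;
        # subsequent calls see None and do nothing.
        if getattr(self, "_stack", None) is not None and hasattr(self._stack, "close"):
            self._stack.close()
        self._stack = None
```

This matters in practice because users may call `close()` explicitly and the object is still finalized later, so the teardown path runs more than once.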
