Closed
Labels
bug (Something isn't working)
Description
Bug Description
Configuration: llm_examples_main branch, torch==2.4, transformers==4.41.2
One of the subgraphs receives a SymInt node (s0 + 1) that depends on other SymInt nodes. As a result, node.meta does not have shape information for this node, although it does have it for s0. File a bug with PyTorch if this can be computed.
Error message:
Traceback (most recent call last):
File "/work/TensorRT/examples/dynamo/torch_export_llama2.py", line 38, in <module>
trt_model = torch_tensorrt.dynamo.compile(
File "/work/TensorRT/py/torch_tensorrt/dynamo/_compiler.py", line 227, in compile
trt_gm = compile_module(gm, inputs, settings)
File "/work/TensorRT/py/torch_tensorrt/dynamo/_compiler.py", line 365, in compile_module
submodule_inputs = partitioning.construct_submodule_inputs(submodule)
File "/work/TensorRT/py/torch_tensorrt/dynamo/partitioning/common.py", line 96, in construct_submodule_inputs
torchtrt_inputs.append(get_input(input_shape, input_meta.dtype))
File "/work/TensorRT/py/torch_tensorrt/dynamo/partitioning/common.py", line 69, in get_input
return construct_dynamic_input(
File "/work/TensorRT/py/torch_tensorrt/dynamo/partitioning/common.py", line 40, in construct_dynamic_input
assert var_range, var_val
AssertionError: None
To Reproduce
python examples/dynamo/torch_export_llama2.py
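For reference, a minimal sketch (not the example script itself) of the kind of dynamic-shape export and torch_tensorrt.dynamo.compile call the Llama-2 example performs. The toy module, vocabulary/hidden sizes, and shape bounds below are illustrative assumptions; the concat produces an output whose sequence length is a derived SymInt (s0 + 1), the same shape pattern described above. Whether the graph actually splits into multiple subgraphs and hits the assertion depends on the partitioner.

```python
import torch
import torch_tensorrt


class AppendStep(torch.nn.Module):
    """Toy module whose output sequence length is a derived SymInt (s0 + 1)."""

    def __init__(self) -> None:
        super().__init__()
        self.embed = torch.nn.Embedding(32000, 64)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        x = self.embed(input_ids)            # shape (B, s0, 64)
        extra = x[:, :1, :]                  # one additional step
        return torch.cat([x, extra], dim=1)  # shape (B, s0 + 1, 64)


model = AppendStep().eval().cuda()
input_ids = torch.randint(0, 32000, (1, 64), device="cuda")

# Export with a dynamic sequence dimension; the concat above yields s0 + 1.
seq = torch.export.Dim("seq", min=2, max=1024)
ep = torch.export.export(model, (input_ids,), dynamic_shapes={"input_ids": {1: seq}})

# Compile through the same dynamo path used by the example script.
trt_model = torch_tensorrt.dynamo.compile(
    ep,
    inputs=[
        torch_tensorrt.Input(
            min_shape=(1, 2),
            opt_shape=(1, 64),
            max_shape=(1, 1024),
            dtype=torch.int64,
        )
    ],
)
```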
Environment
Build information about Torch-TensorRT can be found by turning on debug messages
- Torch-TensorRT Version (e.g. 1.0.0):
- PyTorch Version (e.g. 1.0):
- CPU Architecture:
- OS (e.g., Linux):
- How you installed PyTorch (conda, pip, libtorch, source):
- Build command you used (if compiling from source):
- Are you using local sources or building from archives:
- Python version:
- CUDA version:
- GPU models and configuration:
- Any other relevant information:
Additional context