
Commit 7295ce2

Fix two dead links in README.md (#21)

* Fix dead links in README.md
* Update README.md
1 parent 9a5da6d

File tree

1 file changed, 2 insertions(+), 2 deletions(-)


README.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -56,11 +56,11 @@ and/or
 The Python API of TensorRT-LLM is architectured to look similar to the
 [PyTorch](https://pytorch.org) API. It provides users with a
 [functional](./tensorrt_llm/functional.py) module containing functions like
-`einsum`, `softmax`, `matmul` or `view`. The [layer](./tensorrt_llm/layer)
+`einsum`, `softmax`, `matmul` or `view`. The [layers](./tensorrt_llm/layers)
 module bundles useful building blocks to assemble LLMs; like an `Attention`
 block, a `MLP` or the entire `Transformer` layer. Model-specific components,
 like `GPTAttention` or `BertAttention`, can be found in the
-[model](./tensorrt_llm/model) module.
+[models](./tensorrt_llm/models) module.
 
 TensorRT-LLM comes with several popular models pre-defined. They can easily be
 modified and extended to fit custom needs. See below for a list of supported
```
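The README text in this diff describes a two-level design: a functional module of low-level ops (`einsum`, `softmax`, `matmul`, `view`) and a layers module of building blocks (`Attention`, `MLP`, `Transformer`) assembled from them. A minimal sketch of that layering, in plain NumPy rather than the actual TensorRT-LLM API (the `softmax`/`matmul` helpers and `MLP` class here are illustrative stand-ins, not the library's real classes):

```python
import numpy as np

# Functional-style ops, analogous in spirit to tensorrt_llm.functional.
def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def matmul(a, b):
    return a @ b

# A building block in the style of a layers-module class: it bundles
# functional ops into a reusable unit a model can assemble.
class MLP:
    def __init__(self, w_in, w_out):
        self.w_in = w_in
        self.w_out = w_out

    def __call__(self, x):
        h = np.maximum(matmul(x, self.w_in), 0.0)  # linear + ReLU
        return matmul(h, self.w_out)               # output projection
```

The point of the split is composability: models reuse the same small functional vocabulary, while layer classes fix the wiring once so model code stays short.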

0 commit comments
