Feat: adding rpc_servers parameter to Llama class #1477
Conversation
Force-pushed from f86d077 to 00a34ea
@chraac great work! Could you also include the build flags in the README as well as maybe a small section on running the rpc servers?
Yeah, sure, will add it to the README, thanks for the reply!
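For anyone following along, a minimal sketch of the build-and-run flow being discussed. The flag names are assumptions based on this PR and the llama.cpp RPC example (`LLAMA_RPC` here; newer llama.cpp versions renamed the equivalent flag to `GGML_RPC`), so double-check against the README section once it lands:

```shell
# On each worker machine: build llama.cpp with the RPC backend enabled
# and start an rpc-server (50052 is the upstream default port).
cmake -B build -DLLAMA_RPC=ON
cmake --build build --config Release
./build/bin/rpc-server -p 50052

# On the client machine: install llama-cpp-python with the RPC backend enabled.
CMAKE_ARGS="-DLLAMA_RPC=on" pip install --force-reinstall --no-cache-dir llama-cpp-python
```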
Force-pushed from 12bac9b to 7854795
@abetlen added a section in the README.
Force-pushed from d0a79b8 to 40e3247
@abetlen, could you have another look when convenient, please? I have added a section regarding how to build the RPC backend package.
Hi, I apologize for asking this here, but I'm trying to understand how to use the rpc_servers functionality to enable inference across multiple machines. Can anyone help clarify this for me or give me an example? Thanks in advance.
How do I use this...? |
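To sketch an answer to the questions above: with this PR, `rpc_servers` is a single comma-separated `host:port` string passed to the `Llama` constructor, and an rpc-server process must be running on every listed host. The helper below is purely illustrative (not part of the library); it just builds that string, assuming rpc-server's default port of 50052:

```python
def make_rpc_servers(hosts, default_port=50052):
    """Join host entries into the comma-separated "host:port,host:port"
    string that the Llama constructor's rpc_servers parameter expects.
    Entries without an explicit port get default_port (50052 is the
    default port of llama.cpp's rpc-server)."""
    parts = []
    for entry in hosts:
        host, _, port = entry.partition(":")
        parts.append(f"{host}:{port or default_port}")
    return ",".join(parts)

servers = make_rpc_servers(["192.168.1.10", "192.168.1.11:50053"])
# servers == "192.168.1.10:50052,192.168.1.11:50053"

# Usage sketch (requires llama-cpp-python built with the RPC flag and an
# rpc-server running on each host; the model path is a placeholder):
# from llama_cpp import Llama
# llm = Llama(model_path="./models/model.gguf", rpc_servers=servers)
```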
Hi @juanjfrancisco @statchamber, sorry for the inconvenience here, currently the
This PR includes these changes:
- Add `rpc_servers` to `Llama`, to pass the `rpc_servers` through to the llama.cpp lib
- Gate the feature behind the `LLAMA_RPC` build flag

Tested on my machine, works as expected.
Closes #1455