:orphan:

..
    _Auto-generated file, do not edit manually ...
    _Toolbox generate command: repo generate_toolbox_rst_documentation
    _ Source component: Mac_Ai.remote_llama_cpp_pull_model


mac_ai remote_llama_cpp_pull_model
==================================

Pulls a model with llama-cpp on a remote host.


Parameters
----------


``base_work_dir``

* The base directory where artifacts are stored


``path``

* The path to the llama-cpp binary


``name``

* The name of the model to fetch


``dest``

* If specified, the destination path of the pulled model
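As a usage sketch only: toolbox commands like this one are conventionally invoked through the repository's ``run_toolbox.py`` entry point, passing the parameters above as ``--key=value`` flags. The exact model name and paths below are placeholders, not values from this document.

.. code-block:: shell

    # Hypothetical invocation; parameter values are illustrative placeholders.
    ./run_toolbox.py mac_ai remote_llama_cpp_pull_model \
        --base_work_dir=/tmp/work \
        --path=/opt/llama.cpp/build/bin/llama-cli \
        --name=example-model \
        --dest=/tmp/work/models/example-model  # optional

Only ``--dest`` is optional here; the other flags correspond to the required parameters listed above.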