# Avoid issues when wine is installed.
sudo su -c 'echo 0 > /proc/sys/fs/binfmt_misc/status'
# Capture the full set of instructions to obtain the input length.
readonly INSTRUCT=$(
  join "${PATH_PREFIX_SYSTEM}" "${PATH_PROMPT_SYSTEM}" "${PATH_PREFIX_SYSTEM}"
  join "${PATH_SUFFIX_USER}" "${PATH_PROMPT_USER}" "${PATH_SUFFIX_USER}"
  join "${PATH_SUFFIX_ASSIST}" /dev/null "${PATH_SUFFIX_ASSIST}"
)
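Two assumptions in this snippet are worth flagging: `join` here can't be coreutils `join` (which merges two sorted files on a key field and takes exactly two files), so it's presumably a user-defined helper that wraps a prompt file between a prefix and a suffix; and `-c` further down expects a context size in tokens, while `${#INSTRUCT}` counts characters. A minimal sketch of both, under those assumptions (the helper body and the ~4-characters-per-token heuristic are mine, not from the post):

```shell
# Hypothetical helper matching the call sites above: emit prefix, body, suffix.
join() {
  cat "$1" "$2" "$3"
}

# Rough token estimate for sizing -c: assume ~4 characters per token of
# English text (a coarse heuristic), rounding up so the prompt isn't truncated.
estimate_tokens() {
  echo $(( (${#1} + 3) / 4 ))
}
```

Passing something like `estimate_tokens "${INSTRUCT}"` (plus headroom) to `-c` would size the context window in the unit llamafile actually expects, rather than the raw character count.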
(
  echo "${INSTRUCT}"
) | ./llamafile \
-m "${LINK_MODEL}" \
-e \
-f /dev/stdin \
-n 1000 \
-c ${#INSTRUCT} \
--repeat-penalty 1.0 \
--temp 1.5 \
--silent-prompt > output.txt

> sudo su -c 'echo 0 > /proc/sys/fs/binfmt_misc/status'

Please don't recommend this. If binfmt_misc is enabled, it's probably enabled for a reason, and disabling it will break things. I have a .NET/Mono app installed that this would break, for example; it's definitely not just Wine.
If binfmt_misc is causing problems, the proper solution is to register the executable type. https://github.com/mozilla-ai/llamafile#linux describes the steps.
I made myself a package containing /usr/bin/ape and the following /usr/lib/binfmt.d/ape.conf (fields: name:type:offset:magic:mask:interpreter:flags):

:APE:M::MZqFpD::/usr/bin/ape:
:APE-jart:M::jartsr::/usr/bin/ape:

1. Curate a continuously extended inventory of prebuilt llamafiles for models as they are released
2. Create both flexible builds (with dynamic backend loading for CPU and CUDA) and slim minimalist builds
3. Upstream as much as they can into llama.cpp and partner with the project
It seems people have moved on from Llamafile. I doubt Mozilla AI is going to bring it back.
This announcement didn't even come with a new code commit, just a wish. https://github.com/mozilla-ai/llamafile/commits/main/
jart•1d ago
bsenftner•7h ago
rvz•6h ago
I don't know if you were informed, but you do realize jart is no longer at Mozilla and is now at Google?
setheron•1h ago