With the new TabPFN v2.1 release, fine-tuning is now supported. As with LLMs, you can fine-tune the tabular foundation model on your proprietary data to boost predictive performance.
A community member wrote a great walkthrough here: https://medium.com/@iivalchev/how-to-fine-tune-tabpfn-on-you...
You can also check out the example files directly at these links:
> Fine-tune classifier: https://github.com/PriorLabs/TabPFN/blob/main/examples/finet...
> Fine-tune regressor: https://github.com/PriorLabs/TabPFN/blob/main/examples/finet...
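For context, here's a minimal sketch of the standard scikit-learn-style TabPFN interface that the fine-tuning examples build on. The synthetic data and variable names are illustrative only; the linked example files show the actual fine-tuning loop.

```python
# Sketch: baseline TabPFN usage (assumes `pip install tabpfn` and a
# downloaded model checkpoint; data here is synthetic for illustration).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

from tabpfn import TabPFNClassifier

# Toy tabular dataset standing in for your proprietary data.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Zero-shot prediction with the pretrained model; fine-tuning (per the
# linked examples) would further adapt the model weights to your data.
clf = TabPFNClassifier()
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.3f}")
```

A fine-tuned model is used the same way after training, so existing prediction code doesn't need to change.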
If you want to chat or see what others are experimenting with, there’s a Discord too: https://discord.com/invite/VJRuU3bSxt
Would love to hear what you think when you try it!