Hi there – I’m hosting Aleph locally and love it, but I was wondering whether there would be an easy way to feed it into an LLM like Mistral-7B that’s also hosted locally on my machine? I’m not a developer, so this is definitely beyond my expertise. I’ve been relying on ChatGPT to try to find an answer, but thought I’d see if anyone has ideas and could maybe point me in the right direction. Thanks!
Hi fluffheady, welcome to Aleph
Aleph is a fairly complex piece of software, so I doubt ChatGPT alone can help you integrate an LLM into the stack. It depends on what you want to achieve and which problems or tasks a local LLM should handle. If you want to use it to preprocess documents, you should look at integrating it into the library ingest-file, which takes care of all the document ingestion. If you want to use it to “chat” with your documents, the place to integrate it would be different, somewhere between the aleph app and its UI.
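To give a rough idea of what the "chat with your documents" side involves, here is a minimal sketch of calling a locally hosted model from Python. It assumes the model is served by something like Ollama at its default port and endpoint; the URL, model name, and function names are my illustrative assumptions, not part of Aleph or ingest-file.

```python
# Hedged sketch: send text extracted from a document, plus a question,
# to a locally hosted LLM. Assumes an Ollama-style server at the default
# port; none of this is Aleph's own API.
import json
import urllib.request


def build_payload(document_text: str, question: str, model: str = "mistral") -> dict:
    """Combine extracted document text and a user question into one prompt."""
    prompt = f"Document:\n{document_text}\n\nQuestion: {question}"
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_llm(document_text: str, question: str,
                  url: str = "http://localhost:11434/api/generate") -> str:
    """POST the prompt to the local model server and return its reply text."""
    data = json.dumps(build_payload(document_text, question)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

The hard part is not this call itself but the plumbing around it: getting the extracted text out of Aleph's pipeline (or its search index) and wiring the answer back into the UI, which is where the integration work really lives.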
That said, there is no simple answer. And on another note, I can’t speak for the open source Aleph version from OCCRP; I am not sure whether it is still actively developed. That’s why we at the Data and Research Center created an updated version of Aleph, calling it OpenAleph from now on, which is under active development of new features, and we are currently discussing how to work with AI – including LLMs – in this software. You can join the discussion over in our Discourse.
Cheers,
Simon
Thanks Simon, appreciate you following up!