Hi All,
I am keen to find out whether a high-end CPU + board + GPU combination can allow options like LLaMA 2 to support me in financial and systems research. Please pardon the noobish question. I have only a broad sense of what LLMs can and cannot do, based on ChatGPT and a few other well-known alternatives. However, the public alternatives are heavily censored and seem to be less useful than I would like them to be.
I know it is probably too ambitious, but can these models be used as a "personal assistant" that keeps track of different datasets and ongoing projects, compares newly published research papers, and gives brief summaries on request, so that I can decide whether or not to read them? That would be a great help, but I am conscious that I might be asking for the moon, since LLMs tend to "hallucinate".