
Adapting the App to work with Multiple LLMs #13

Open
TSSFL opened this issue Aug 10, 2024 · 4 comments
Labels
enhancement New feature or request good first issue Good for newcomers help wanted Extra attention is needed

Comments

@TSSFL

TSSFL commented Aug 10, 2024

How could one adapt the app to work with multiple LLMs, especially Llama 3 and Claude? It would be great to generalize the app's functionality and identify any necessary modifications.

Any guidance, suggestions, or technical assistance from the community would be greatly appreciated.


@fjosue4 fjosue4 added enhancement New feature or request help wanted Extra attention is needed good first issue Good for newcomers labels Aug 10, 2024
@fjosue4
Owner

fjosue4 commented Aug 10, 2024

Hey @TSSFL, in order to call different API services you could add a Selector with the available options in Setup.tsx.

For each option in the select input, you will need to associate it with a different endpoint. You can create a state in userSlice.tsx named selectedApi, which will store the string you need to call the API.
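The option-to-endpoint mapping could look something like the sketch below. Only the Gemini URL comes from this thread; the other keys and the `API_OPTIONS` name are hypothetical placeholders you'd fill in from each provider's documentation.

```typescript
// Hypothetical map from select-input options to endpoint strings.
// The chosen value is what would be stored in `selectedApi`.
const API_OPTIONS: Record<string, string> = {
  gemini:
    'https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key=',
  // llama3: '<Llama 3 provider endpoint here>',   // placeholder, not a real URL
  // claude: '<Anthropic endpoint here>',          // placeholder, not a real URL
};

// These keys would populate the <select> options in Setup.tsx.
const options = Object.keys(API_OPTIONS);
```

Each provider may also differ in auth style (query key vs. header) and request body shape, so the map alone won't be enough for providers whose APIs aren't Gemini-compatible.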

Using setUser as it is now, you can store selectedApi based on the user's selection.

    setUser: (state, action) => {
      state.name = action.payload.name
      state.API_KEY = action.payload.API_KEY
      state.proxy = action.payload.proxy
      state.selectedApi = action.payload.selectedApi
    }

Then use that selectedApi state in the dispatcher, where the Google endpoint is currently hardcoded:

    ${proxy ? proxy : ''}https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key=${apiKey}

Then replace the Google API URL with the restructured endpoint:

    const { API_KEY: apiKey, proxy, conversation, selectedApi } = currentState.user

    ${proxy ? proxy : ''}${selectedApi}${apiKey}
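The template literal above can be read as a small pure function composing the final request URL from the three pieces destructured from the user state. The function name below is mine, not from the repository; it is just a sketch of the same composition:

```typescript
// Compose the request URL from an optional proxy prefix, the selected
// endpoint string, and the API key, mirroring the template literal
// `${proxy ? proxy : ''}${selectedApi}${apiKey}` from the thread.
function buildEndpointUrl(
  proxy: string | undefined,
  selectedApi: string,
  apiKey: string,
): string {
  return `${proxy ? proxy : ''}${selectedApi}${apiKey}`;
}

// Without a proxy, the endpoint and key are simply concatenated:
const url = buildEndpointUrl(
  undefined,
  'https://generativelanguage.googleapis.com/v1beta/models/gemini-pro:generateContent?key=',
  'MY_KEY',
);
```

Note this concatenation style assumes the provider accepts the key as a URL suffix (as Gemini does with `?key=`); providers that authenticate via request headers would need the key passed differently.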

Let me know if this gives you a path to completing your objective.

@TSSFL
Author

TSSFL commented Aug 13, 2024

Thank you @fjosue4, I will work on this and come back for any help/clarification.

@fjosue4
Owner

fjosue4 commented Aug 29, 2024

@TSSFL with the last update I sent, there's a selector for the model. You could reuse it, but instead of choosing a Gemini model, ask on the setup page for the AI endpoint you want to point to, so the user provides the API key according to the AI provider.

@TSSFL
Author

TSSFL commented Aug 30, 2024

Thank you @fjosue4 for the updates. This is a great way to provide the API key based on the AI provider. I’ll go ahead and test the LLM web app.
