Innovative AI Solutions: GPT Engineer Powered by LM Studio
In this brief guide, we'll walk you through the simple steps to integrate GPT-Engineer with LM Studio, unlocking a world of possibilities for your projects.
By making a quick tweak in the AI module, you can seamlessly connect GPT-Engineer to LM Studio. Here's how it's done:
- Location: Navigate to your GPT Engineer installation directory. Typically, it resides in a path similar to `C:\Users\Admin\Downloads\gpt-engineer\core`.
- Open the AI Module: Locate the `ai.py` file within the core directory.
- Make the Swap:
Change this:

```python
return ChatOpenAI(
    model=self.model_name,
    temperature=self.temperature,
    streaming=self.streaming,
    callbacks=[StreamingStdOutCallbackHandler()],
)
```

To the following:

```python
return ChatOpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",
    model="mistral-7b-instruct-v0.2.Q4_K_S",
)
```
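If you'd rather not lose GPT Engineer's streaming output, the two snippets can also be merged instead of swapped outright. Below is a minimal sketch of that idea, assuming the `langchain_openai` / `langchain_core` imports (match whatever imports your `ai.py` already uses) and assuming `mistral-7b-instruct-v0.2.Q4_K_S` is the model you currently have loaded in LM Studio. The helper `create_lm_studio_chat_model` is purely hypothetical, standing in for the method inside GPT Engineer's `AI` class:

```python
from langchain_core.callbacks import StreamingStdOutCallbackHandler
from langchain_openai import ChatOpenAI


def create_lm_studio_chat_model(temperature: float = 0.1, streaming: bool = True) -> ChatOpenAI:
    """Hypothetical standalone stand-in for the return statement in ai.py."""
    return ChatOpenAI(
        base_url="http://localhost:1234/v1",      # LM Studio's default local endpoint
        api_key="lm-studio",                      # any placeholder key; the local server ignores it
        model="mistral-7b-instruct-v0.2.Q4_K_S",  # must match the model loaded in LM Studio
        temperature=temperature,                  # inside ai.py this stays self.temperature
        streaming=streaming,                      # inside ai.py this stays self.streaming
        callbacks=[StreamingStdOutCallbackHandler()],  # keeps tokens printing to stdout
    )
```

Inside `ai.py` itself you would keep the original `return` statement and the `self.temperature` / `self.streaming` references; only the `base_url`, `api_key`, and `model` arguments change.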
That's it! With this simple substitution, GPT Engineer now sends its requests to the model running locally in LM Studio instead of the OpenAI API.
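Before launching GPT Engineer, it's worth confirming that LM Studio's local server is actually reachable (the server must be started in LM Studio with a model loaded). Here is a quick sanity check, again assuming the default port 1234 and the example model name from above:

```python
from langchain_openai import ChatOpenAI

# Minimal connectivity check against LM Studio's OpenAI-compatible endpoint.
llm = ChatOpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",                      # the local server accepts any placeholder key
    model="mistral-7b-instruct-v0.2.Q4_K_S",  # use whatever model you have loaded
)

reply = llm.invoke("Reply with the single word: ready")
print(reply.content)  # if this prints a response, GPT Engineer's requests will get through too
```

If this call fails, check that LM Studio's local server is running and listening on port 1234 before revisiting the `ai.py` change.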