DONE Org mode: gptel-org-branching-context
[2024-12-12 Thu 13:25]
It is easier to make use of when pulled into a subheading.
gptel-org-branching-context should be in the inbox. Right, time to tidy up the inbox.
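A minimal sketch of what this note points at, assuming gptel-org is loaded along with gptel: with the option below set, gptel builds the prompt from the current heading's lineage only, so sibling subtrees behave as independent branches of one conversation.
#+begin_src emacs-lisp
;; Sketch: send only the current Org heading's lineage as context,
;; so each subtree becomes its own conversation branch.
(setq gptel-org-branching-context t)
#+end_src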
karthink/gptel: A simple LLM client for Emacs
(karthink [2023] 2024)
gptel is a simple Large Language Model chat client for Emacs, with support for multiple models and backends. It works in the spirit of Emacs, available at any time and uniformly in any buffer.
Of the backends gptel provides, I use the list below; the ones I mainly use are checked (a registration sketch for those follows after the feature list).
- [X] xAI API key
- [X] ChatGPT API key
- [X] Perplexity API key
- [ ] Anthropic (Claude) API key
- [ ] Github Models Token
- [ ] Gemini API key
- [ ] OpenRouter API key
- [ ] together.ai API key
- It's async and fast, streams responses.
- Interact with LLMs from anywhere in Emacs (any buffer, shell, minibuffer, wherever)
- LLM responses are in Markdown or Org markup.
- Supports multiple independent conversations and one-off ad hoc interactions.
- Supports multi-modal models (include images, documents)
- Save chats as regular Markdown/Org/Text files and resume them later.
- You can go back and edit your previous prompts or LLM responses when continuing a conversation. These will be fed back to the model.
- Don't like gptel's workflow? Use it to create your own for any supported model/backend with a simple API.
gptel uses Curl if available, but falls back to url-retrieve to work without external dependencies.
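A hedged registration sketch for the checked backends above, following the OpenAI-compatible pattern from the gptel README; the hostnames follow that pattern, but the model names and keys below are placeholders/assumptions to check against the current README.
#+begin_src emacs-lisp
;; ChatGPT is gptel's built-in default backend; it only needs a key
;; (a string here, but a function returning the key also works).
(setq gptel-api-key "your-openai-api-key")

;; xAI and Perplexity expose OpenAI-compatible APIs, so they are
;; registered with gptel-make-openai, like the Github Models example below.
(gptel-make-openai "xAI"
  :host "api.x.ai"
  :endpoint "/v1/chat/completions"
  :stream t
  :key "your-xai-api-key"
  :models '(grok-beta))            ; check current model names

(gptel-make-openai "Perplexity"
  :host "api.perplexity.ai"
  :endpoint "/chat/completions"
  :stream t
  :key "your-perplexity-api-key"
  :models '(sonar))                ; check current model names
#+end_src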
Others I have not used yet:
- Ollama: Ollama running locally
- Llama.cpp: Llama.cpp running locally
- Llamafile: Local Llamafile server
- GPT4All: GPT4All running locally
- Kagi FastGPT: API key
- Kagi Summarizer: API key
- Azure: Deployment and API key
- Groq: API key
- Anyscale: API key
- PrivateGPT: PrivateGPT running locally
- DeepSeek: API key
- Cerebras: API key
- Novita AI: Token
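If I try one of the local backends later, registration follows the same pattern; a sketch for Ollama, assuming the default local port and a model already pulled with ollama.
#+begin_src emacs-lisp
;; Sketch: local Ollama backend on the default port; the model name is
;; an assumption -- use whatever `ollama list` shows.
(gptel-make-ollama "Ollama"
  :host "localhost:11434"
  :stream t
  :models '(llama3.1))
#+end_src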
#gptel #Github Models - Copilot comparison
[2024-10-17 Thu 07:08]
What if gptel could hook into Github Copilot?! That would be great; I'm already paying the $10.
Register a backend with:
#+begin_src emacs-lisp
;; Github Models offers an OpenAI compatible API
(gptel-make-openai "Github Models" ;Any name you want
  :host "models.inference.ai.azure.com"
  :endpoint "/chat/completions"
  :stream t
  :key "your-github-token"
  :models '(gpt-4o-mini))
#+end_src
For all the available models, check the marketplace. You can pick this backend from the menu when using gptel (see Usage).
If I add this, how is it different from Copilot?
- Prototyping with AI models - GitHub Docs - docs.github.com: according to this, models are offered per rate-limit tier.
- So let's look at gpt-4o-mini: marketplace/models/azure-openai/gpt-4o-mini - github.com. Its tier is Low, so I'll configure with this one (see the sketch below this list).
- Also get a handle on the Github Models lineup.
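A hedged sketch of making that the default for new sessions, using gptel's documented gptel-backend and gptel-model variables together with the registration above.
#+begin_src emacs-lisp
;; Sketch: make the Github Models backend and gpt-4o-mini the default
;; (both can still be switched from gptel-menu per buffer).
(setq gptel-model 'gpt-4o-mini
      gptel-backend (gptel-make-openai "Github Models"
                      :host "models.inference.ai.azure.com"
                      :endpoint "/chat/completions"
                      :stream t
                      :key "your-github-token"
                      :models '(gpt-4o-mini)))
#+end_src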
gptel: Mindblowing integration between Emacs and ChatGPT
(Simon n.d.)
- Simon, Ben
- ChatGPT and its LLM brethren have been a potent source of FOMO: the AI revolution is happening, and I feel like I’m forever late to th...
- gptel
GPTEL: A simple LLM client for Emacs
(karthink [2023] 2024)
- "karthink/gptel" karthink 2024
GPTEL-QUICK: Quick LLM lookups in Emacs
(karthink [2024] 2024)
- "karthink/gptel-quick" karthink 2024
Keybinding: M-w then M-RET; also, the buffer needs to be pinned.
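A hedged sketch of the kind of binding this note is after; gptel-quick itself acts on the active region or the thing at point, but the key below is my own choice, not a package default.
#+begin_src emacs-lisp
;; Sketch: mark a region (or leave point on a word) and hit the key;
;; gptel-quick pops up a short explanation. The binding is an assumption.
(require 'gptel-quick)
(global-set-key (kbd "C-c q") #'gptel-quick)
#+end_src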
DONE Write gptel-quick prompts #prompt
NEXT gptel-context with evil in #spacemacs
Why? The keybindings for features I hadn't known about are solid.
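A hedged Spacemacs sketch for this NEXT item; the "o g" leader prefix is my own pick (Spacemacs reserves "o" for user bindings), and only the gptel commands themselves come from the package.
#+begin_src emacs-lisp
;; Sketch: evil/Spacemacs leader bindings for gptel, including context
;; management; the "o g" prefix is an assumption, not a Spacemacs default.
(spacemacs/declare-prefix "og" "gptel")
(spacemacs/set-leader-keys
  "ogg" #'gptel          ; open or switch to a chat buffer
  "ogs" #'gptel-send     ; send the region, or the buffer up to point
  "oga" #'gptel-add      ; add/remove the region or buffer to the context
  "ogf" #'gptel-add-file ; add a file to the context
  "ogm" #'gptel-menu)    ; transient menu for everything else
#+end_src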
Related notes
References
karthink. (2023) 2024. “Karthink/Gptel.” https://github.com/karthink/gptel.
———. (2024) 2024. “Karthink/Gptel-Quick.” https://github.com/karthink/gptel-quick.
Simon, Ben. n.d. “Gptel: Mindblowing Integration between Emacs and ChatGPT.” Accessed August 30, 2024. https://www.blogbyben.com/2024/08/gptel-mindblowing-integration-between.html.