progress.
This commit is contained in:
parent
611c27dc25
commit
d5204b6705
Makefile (2 changes)
@@ -42,6 +42,8 @@ run_rd:

run_mingw:
	./r.exe --verbose

docker: docker_make docker_run

docker_make:
README.md (70 changes)
@@ -1,64 +1,26 @@

# R
A note for the hackers: the OpenAI API key in this repo's history has been revoked. I am aware it is there. So far, thanks for checking the project out; I feel very safe and stuff. Something about this site attracts certain people, mostly with good intentions as far as I know.
This is a CLI vibe-coding tool written in C, available for Linux x86 only.

It targets the OpenAI API but is easy to convert to Ollama, since Ollama exposes an OpenAI-compatible API under `/v1`.
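As a sketch of that conversion: the URL below is Ollama's standard OpenAI-compatible chat completions endpoint on its default port, the variable names mirror those used in `r.h`, and the model name is only an example of something you might have pulled into Ollama.

```c
/* Pointing the tool at a local Ollama instance instead of OpenAI:
 * Ollama serves an OpenAI-compatible chat completions endpoint
 * under /v1 on its default port 11434. Variable names follow r.h;
 * "llama3" is an example model, not a requirement. */
char *completions_api_url = "http://localhost:11434/v1/chat/completions";
char *advanced_model = "llama3";
```
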
The tool is made for my private use, but since it works so well, I decided to publish it. It contains some experimental features, such as Claude support. Claude works.
## Downloads

You may need to install `python3.14-dev` to make it work!

Because of this, I am considering removing the Python support, or I will have to figure out how to link Python statically.

The Python support is not finished yet, but it can be handy: it can manipulate data being sent or received.
### Free version

Don't forget to `chmod +x`!

```
wget https://retoor.molodetz.nl/api/packages/retoor/generic/rf/1.0.0/rf
```
### Commercial version

This version requires an OpenAI API key in an environment variable named `R_KEY`.
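A minimal sketch of how a C program reads that variable, in the spirit of what the tool does at startup (the function name is illustrative, not the tool's actual code):

```c
#include <stdio.h>
#include <stdlib.h>

/* Read the API key from the R_KEY environment variable,
 * returning NULL (with a warning on stderr) when it is unset. */
const char *get_api_key(void) {
    const char *key = getenv("R_KEY");
    if (key == NULL || *key == '\0') {
        fprintf(stderr, "R_KEY is not set\n");
        return NULL;
    }
    return key;
}
```
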
Don't forget to `chmod +x`!

If you need help getting Ollama / Claude working, contact me on [Snek](https://snek.molodetz.nl).
What I vibed with it:

- A part of the tool itself. There was plenty of example code, so it was easy to add modifications.
- A WebSocket notification system in C#.
The application has a built-in `OPENAI_API_KEY` with limited resources, so people can try it out.
## Download

```
wget https://retoor.molodetz.nl/api/packages/retoor/generic/r/1.0.0/r
```
### TL;DR

Make your own advanced chatbot with context by describing how the bot should answer in `~/.rcontext.txt`.

For example, `~/.rcontext.txt`:

```
Respond the way the dark lord Voldemort would, but do provide a useful answer.
You do like pineapple on pizza.
Dogs are better than cats.
Epstein didn't kill himself.
```
Besides that, it gives nicely colored answers, as in the screenshot below. Have fun!
## Project description

R is a great and fast command-line interface for GPT. It is also optimized for integration into other tools like vim. It works with base context files, so your client always has the personality or writing style you prefer. For example, I always want it to give examples in the C programming language without my mentioning it. This is achievable by making a file like [.rcontext.txt](.rcontext.txt). A `.rcontext.txt` in your local directory takes priority over the one in your home directory, which is also an option.
## Configure OpenAI API key

Update your bashrc with `export OPENAI_API_KEY=sk-...`.
Essentially, you could write your own complete, and quite advanced, customized chatbot using the `.rcontext.txt` file.
## Working on an existing project

When starting on an existing project, use `init`. You can then ask it to make modifications to your files / system.
You can use this application interactively or in a script like this:

```bash
r where do ponies come from?
```
## Features

- Navigate through history using the arrow keys.
- Navigate through history with recursive search using Ctrl+R.
- Inception with Python for incoming and outgoing content.
- Markdown and syntax highlighting.
- Execute Python commands by prefixing them with `!`.
- List files of the current working directory using `ls`.
- Type `serve` to start a web server with directory listing. Easy for network transfers.
## Default configuration

- Model temperature is 0.5.
- Model name is `gpt-4o-mini`.
- Max tokens is 100.
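Gathered into one place, those defaults look roughly like this (the struct and field names are illustrative; `r` itself may store them differently):

```c
/* The default configuration values listed above, as one struct.
 * Names are illustrative, not the tool's actual identifiers. */
struct r_config {
    const char *model;
    double temperature;
    int max_tokens;
};

static const struct r_config r_defaults = {
    .model = "gpt-4o-mini",
    .temperature = 0.5,
    .max_tokens = 100,
};
```
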
## In development

- Google search and actions with those results.
- Reminders.
- Predefined templates for reviewing / refactoring, so you can personalize.
## Screenshot(s)


main.c (7 changes)
@@ -13,6 +13,8 @@
#include "utils.h"
#include "db_utils.h"

#include "tools.h"

volatile sig_atomic_t sigint_count = 0;
time_t first_sigint_time = 0;
bool SYNTAX_HIGHLIGHT_ENABLED = true;
@@ -150,7 +152,10 @@ void repl() {
        continue;
    }
    if (line && *line != '\n') line_add_history(line);

    if (!strncmp(line, "!tools", 6)) {
        printf("Available tools: %s\n", json_object_to_json_string(tools_descriptions()));
        continue;
    }
    if (!strncmp(line, "!models", 7)) {
        printf("Current model: %s\n", openai_fetch_models());
        continue;
r.h (8 changes)
@@ -4,7 +4,7 @@
#include "utils.h"
#include <stdbool.h>
#include <string.h>
bool is_verbose = false;
bool is_verbose = true;

#ifndef RD
#ifndef OLLAMA
@@ -19,11 +19,11 @@ char *fast_model = "gpt-3.5-turbo";
#ifdef RD

char *models_api_url = "https://api.openai.com/v1/models";
char *completions_api_url = "https://api.deepinfra.com/v1/openai/chat/completions";
char *advanced_model = "meta-llama/Llama-3.3-70B-Instruct-Turbo";
char *completions_api_url = "https://api.anthropic.com/v1/chat/completions";
char *advanced_model = "claude-3-5-haiku-20241022";
//char *advanced_model = "meta-llama/Meta-Llama-3.1-8B-Instruct";
//char *advanced_model = "google/gemini-1.5-flash";
char *fast_model = "Qwen/Qwen2.5-Coder-32B-Instruct";
char *fast_model = "claude-3-5-haiku-20241022";

#endif
#ifdef OLLAMA