Title: | Chat and FIM with 'Codestral' |
Version: | 0.0.1 |
Date: | 2025-05-07 |
Description: | Create an addin in 'RStudio' to do fill-in-the-middle (FIM) and chat with the latest Mistral AI models for coding, 'Codestral' and 'Codestral Mamba'. For more details about the 'Mistral AI API': https://docs.mistral.ai/getting-started/quickstart/ and https://docs.mistral.ai/api/. For more details about the 'Codestral' model: https://mistral.ai/news/codestral; about 'Codestral Mamba': https://mistral.ai/news/codestral-mamba. |
License: | MIT + file LICENSE |
Encoding: | UTF-8 |
RoxygenNote: | 7.3.2 |
Imports: | dplyr, httr, jsonlite, magrittr, rstudioapi, stringr, utils |
Depends: | R (≥ 4.1) |
LazyData: | true |
Suggests: | testthat (≥ 3.0.0) |
Config/testthat/edition: | 3 |
URL: | https://urbs-dev.github.io/codestral/ |
BugReports: | https://github.com/urbs-dev/codestral/issues |
NeedsCompilation: | no |
Packaged: | 2025-05-07 08:37:11 UTC; marcgrossouvre2 |
Author: | Marc Grossouvre [aut, cre], URBS company [cph, fnd] |
Maintainer: | Marc Grossouvre <marcgrossouvre@urbs.fr> |
Repository: | CRAN |
Date/Publication: | 2025-05-09 09:50:13 UTC |
codestral: Chat and FIM with 'Codestral'
Description
Create an addin in 'RStudio' to do fill-in-the-middle (FIM) and chat with the latest Mistral AI models for coding, 'Codestral' and 'Codestral Mamba'. For more details about the 'Mistral AI API': https://docs.mistral.ai/getting-started/quickstart/ and https://docs.mistral.ai/api/. For more details about the 'Codestral' model: https://mistral.ai/news/codestral; about 'Codestral Mamba': https://mistral.ai/news/codestral-mamba.
Author(s)
Maintainer: Marc Grossouvre marcgrossouvre@urbs.fr
Other contributors:
URBS company contact@urbs.fr [copyright holder, funder]
See Also
Useful links:
https://urbs-dev.github.io/codestral/
Report bugs at https://github.com/urbs-dev/codestral/issues
Endpoints for the Codestral API.
Description
Endpoints for the Codestral API.
Usage
ENDPOINTS
Format
A named list with elements chat and completion.
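Examples
The manual does not print the list's contents; the URLs shown below are assumptions based on the public Mistral AI API documentation, not guaranteed package values:
str(ENDPOINTS)
# Hypothetical output; actual URLs may differ:
#> List of 2
#>  $ chat      : chr "https://codestral.mistral.ai/v1/chat/completions"
#>  $ completion: chr "https://codestral.mistral.ai/v1/fim/completions"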
Fill in the middle with Codestral
Description
This function completes a given prompt using the Codestral API. It supports different models for fill-in-the-middle, chat with Codestral, and chat with Codestral Mamba. Default parameter values are read from environment variables, as set by codestral_init().
Usage
codestral(
prompt,
mistral_apikey = Sys.getenv(x = "R_MISTRAL_APIKEY"),
codestral_apikey = Sys.getenv(x = "R_CODESTRAL_APIKEY"),
fim_model = Sys.getenv(x = "R_CODESTRAL_FIM_MODEL"),
chat_model = Sys.getenv(x = "R_CODESTRAL_CHAT_MODEL"),
mamba_model = Sys.getenv(x = "R_MAMBA_CHAT_MODEL"),
temperature = as.integer(Sys.getenv(x = "R_CODESTRAL_TEMPERATURE")),
max_tokens_FIM = Sys.getenv(x = "R_CODESTRAL_MAX_TOKENS_FIM"),
max_tokens_chat = Sys.getenv(x = "R_CODESTRAL_MAX_TOKENS_CHAT"),
role_content = Sys.getenv(x = "R_CODESTRAL_ROLE_CONTENT"),
suffix = ""
)
Arguments
prompt | The prompt to complete. |
mistral_apikey, codestral_apikey | The API keys to use for accessing Codestral Mamba and Codestral. Default to the values of the R_MISTRAL_APIKEY and R_CODESTRAL_APIKEY environment variables. |
fim_model | The model to use for fill-in-the-middle. Defaults to the value of the R_CODESTRAL_FIM_MODEL environment variable. |
chat_model | The model to use for chat with Codestral. Defaults to the value of the R_CODESTRAL_CHAT_MODEL environment variable. |
mamba_model | The model to use for chat with Codestral Mamba. Defaults to the value of the R_MAMBA_CHAT_MODEL environment variable. |
temperature | The temperature to use. Defaults to the value of the R_CODESTRAL_TEMPERATURE environment variable. |
max_tokens_FIM, max_tokens_chat | Integers giving the maximum number of tokens to generate for FIM and chat. Default to the values of the R_CODESTRAL_MAX_TOKENS_FIM and R_CODESTRAL_MAX_TOKENS_CHAT environment variables. |
role_content | The role content to use. Defaults to the value of the R_CODESTRAL_ROLE_CONTENT environment variable. |
suffix | The suffix to use. Defaults to an empty string. |
Value
A character string containing the completed text.
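Examples
A hypothetical FIM call, assuming codestral_init() has already been run with a valid API key so that the R_CODESTRAL_* environment variables are set:
codestral(
  prompt = "fib <- function(n) {",
  suffix = "}"
)
# Returns the model's suggested function body as a character string.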
Initialize codestral
Description
Create environment variables for operating FIM and chat.
Usage
codestral_init(
mistral_apikey = Sys.getenv(x = "R_MISTRAL_APIKEY"),
codestral_apikey = Sys.getenv(x = "R_CODESTRAL_APIKEY"),
fim_model = "codestral-latest",
chat_model = "codestral-latest",
mamba_model = "open-codestral-mamba",
temperature = 0,
max_tokens_FIM = 100,
max_tokens_chat = "",
role_content = NULL
)
Arguments
mistral_apikey, codestral_apikey | The API keys to use for accessing Codestral Mamba and Codestral. Default to the values of the R_MISTRAL_APIKEY and R_CODESTRAL_APIKEY environment variables. |
fim_model | A string giving the model to use for FIM. |
chat_model | A string giving the model to use for Codestral chat. |
mamba_model | A string giving the model to use for Codestral Mamba chat. |
temperature | An integer giving the temperature to use. |
max_tokens_FIM, max_tokens_chat | Integers giving the maximum number of tokens to generate for each of these operations. |
role_content | A role to assign to the system. Default is "You write programs in R language only. You adopt a proper coding approach by strictly naming all the functions' parameters when calling any function with named parameters even when calling nested functions, by being straightforward in your answers." |
Details
The most important paremeters here are the ..._apikey
parameters
without which the Mistral AI API can not be used.
To start with, beginners may keep default values for other parameters. It
seems sound to use the latest models of each type. However with time, the
user may be willing to customize temperature
, max_tokens_FIM
, max_tokens_chat
and
role_content
for his/her own needs.
Value
Invisible 0.
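Examples
A minimal initialisation might look as follows; the key strings are placeholders, not real keys:
codestral_init(
  mistral_apikey = "YOUR_MISTRAL_KEY",
  codestral_apikey = "YOUR_CODESTRAL_KEY",
  temperature = 0,
  max_tokens_FIM = 100
)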
Analyses a prompt to re-build the dialog
Description
Analyses a prompt to re-build the dialog
Usage
compile_dialog(prompt)
Arguments
prompt |
The prompt to analyse. A vector of strings. |
Value
A list with the chatter (Codestral or Codestral Mamba) and the dialog in a data.frame with columns role and content.
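Examples
The markers the package uses to split turns and select the chatter are internal and not documented here, so the sketch below only illustrates the shape of the return value. It assumes a hypothetical "> " prefix for user turns and always reports Codestral as the chatter; it is not the package's internal code:
compile_dialog_sketch <- function(prompt) {
  # "> " is a hypothetical marker for user turns, assumed for illustration
  is_user <- startsWith(x = prompt, prefix = "> ")
  list(
    chatter = "Codestral",
    dialog = data.frame(
      role = ifelse(test = is_user, yes = "user", no = "assistant"),
      content = sub(pattern = "^> ", replacement = "", x = prompt)
    )
  )
}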
Fill in the middle or complete
Description
This function splits the current script into two parts: the part before the cursor and the part after the cursor.
Usage
complete_current_script()
Value
A character vector containing the two parts of the script.
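Examples
A minimal sketch of how such a split can be obtained with rstudioapi (one of the package's imports); this is an illustration under assumed conventions, not the package's actual implementation:
ctx <- rstudioapi::getActiveDocumentContext()
pos <- ctx$selection[[1]]$range$start   # cursor position: named row/column
row <- pos["row"]; col <- pos["column"]
line <- ctx$contents[row]
# Everything up to the cursor, then everything after it
before <- c(head(x = ctx$contents, n = row - 1),
            substr(x = line, start = 1, stop = col - 1))
after <- c(substr(x = line, start = col, stop = nchar(line)),
           tail(x = ctx$contents, n = -row))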
Read and include files in a prompt
Description
Read and include files in a prompt
Usage
include_file(prompt, anyFile)
Arguments
prompt | A vector of strings. |
anyFile | A boolean vector of the same length as prompt indicating whether the instruction "ff:" to include a file was detected in the corresponding element. |
Details
If anyFile[i] is TRUE, then the sequence of characters following the instruction "ff:" in prompt[i] is read until the next space or the end of the string. This extracted string is assumed to be a file name. This file is looked for in the current working directory or any of its sub-directories. Once detected, the file is read with readLines() and its content is inserted in prompt between prompt[i-1] and prompt[i+1]. Note that prompt[i] is therefore deleted. The result is returned.
Value
A vector of strings containing prompt augmented by the content of the files referred to in the original prompt.
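Examples
A minimal sketch of this mechanism, following the Details above; the file name "analysis.R" is hypothetical and this is an illustration, not the package's internal code:
prompt <- c("Please review this script", "ff:analysis.R", "then suggest fixes")
# Locate the element carrying the "ff:" instruction
i <- which(x = grepl(pattern = "ff:", x = prompt))[1]
# Read the file name up to the next space or the end of the string
file_name <- sub(pattern = ".*ff:(\\S+).*", replacement = "\\1", x = prompt[i])
# Search the working directory and its sub-directories for that file
path <- list.files(path = ".", pattern = file_name, recursive = TRUE,
                   full.names = TRUE)[1]
# Replace prompt[i] with the file's content
prompt <- append(x = prompt[-i], values = readLines(con = path), after = i - 1)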
Insert the model's answer
Description
This function inserts a Codestral fill-in-the-middle (FIM) completion into the current script.
Usage
insert_addin()
Value
0 (invisible).
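Examples
As an RStudio addin, this function is normally invoked through the Addins menu (optionally bound to a keyboard shortcut), but it can also be called directly from the console:
codestral::insert_addin()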