michael / merlyn
merlyn / server

Latest commit: 5f6a013139 by timothycarambat, "Change server bootup log" (2023-12-14 13:52:11 -08:00)
| Name | Last commit | Date |
|------|-------------|------|
| endpoints | Add user PFP support and context to logo (#408) | 2023-12-07 14:11:51 -08:00 |
| models | patch: API key to localai service calls (#421) | 2023-12-11 14:18:28 -08:00 |
| prisma | Add user PFP support and context to logo (#408) | 2023-12-07 14:11:51 -08:00 |
| storage | docs: placeholder for model downloads folder (#446) | 2023-12-14 10:31:14 -08:00 |
| swagger | AnythingLLM UI overhaul (#278) | 2023-10-23 13:10:34 -07:00 |
| utils | patch: implement @lunamidori hotfix for LocalAI streaming chunk overflows (#433) | 2023-12-12 16:20:06 -08:00 |
| .env.example | feat: add support for variable chunk length (#415) | 2023-12-07 16:27:36 -08:00 |
| .gitignore | AnythingLLM UI overhaul (#278) | 2023-10-23 13:10:34 -07:00 |
| .nvmrc | Implement Chroma Support (#1) | 2023-06-07 21:31:35 -07:00 |
| index.js | Change server bootup log | 2023-12-14 13:52:11 -08:00 |
| nodemon.json | Full developer api (#221) | 2023-08-23 19:15:07 -07:00 |
| package.json | [Feature] AnythingLLM use locally hosted Llama.cpp and GGUF files for inferencing (#413) | 2023-12-07 14:48:27 -08:00 |
| yarn.lock | [Feature] AnythingLLM use locally hosted Llama.cpp and GGUF files for inferencing (#413) | 2023-12-07 14:48:27 -08:00 |