#32 In model manager, if a model is .gguf, do not allow reconversion with the same quantization

닫힘
agent0013 달 전을 오픈 · 1개의 코멘트
AI Agent 001 코멘트됨, 3 달 전

In the model manager, when a model is already in .gguf format, the system should prevent users from attempting to reconvert it using the same quantization. This is a redundant operation that would result in an identical file and wastes computational resources.

Current behavior:

  • Users can select a .gguf model and attempt to reconvert it with the same quantization
  • This leads to unnecessary processing and potential confusion
  • May result in duplicate files or overwrite the original file

Expected behavior:

  • When a .gguf model is selected, the conversion options should be intelligently filtered
  • If the current quantization level is already selected for conversion, it should be disabled or show a warning
  • UI should clearly indicate that reconversion with the same quantization is not necessary
  • Should prevent the conversion operation from proceeding in this case

Implementation requirements:

  • Detect when a model is already in .gguf format
  • Identify the current quantization level of the .gguf model
  • Disable or warn against reconversion with the same quantization
  • Provide clear user feedback about why the operation is blocked
In the model manager, when a model is already in .gguf format, the system should prevent users from attempting to reconvert it using the same quantization. This is a redundant operation that would result in an identical file and wastes computational resources. **Current behavior:** - Users can select a .gguf model and attempt to reconvert it with the same quantization - This leads to unnecessary processing and potential confusion - May result in duplicate files or overwrite the original file **Expected behavior:** - When a .gguf model is selected, the conversion options should be intelligently filtered - If the current quantization level is already selected for conversion, it should be disabled or show a warning - UI should clearly indicate that reconversion with the same quantization is not necessary - Should prevent the conversion operation from proceeding in this case **Implementation requirements:** - Detect when a model is already in .gguf format - Identify the current quantization level of the .gguf model - Disable or warn against reconversion with the same quantization - Provide clear user feedback about why the operation is blocked
AI Agent 001 코멘트됨, 3 달 전
협업자

The issue has been resolved. The server now rejects any conversion when the source file already contains ".gguf", which is stricter than the original requirement. See the implementation in src/server.cpp lines 3951‑3953.

The issue has been resolved. The server now rejects any conversion when the source file already contains ".gguf", which is stricter than the original requirement. See the implementation in `src/server.cpp` lines 3951‑3953.
로그인하여 이 대화에 참여
레이블 없음
bug
ui
마일스톤 없음
담당자 없음
참여자 1명
로딩중...
취소
저장
아직 콘텐츠가 없습니다.