
feat(CTransformers): add support for CTransformers #1248

Open · wants to merge 4 commits into master from feat/ctransformer

Changes from 1 commit
Merge branch 'master' into feat/ctransformer
Aisuko authored Nov 7, 2023

Verified: this commit was created on GitHub.com and signed with GitHub's verified signature. The key has expired.
commit f837f1cef29bafd839b067236cbc5bc7c148f599
File renamed without changes.
File renamed without changes.
File renamed without changes.
@@ -16,6 +16,7 @@
 import backend_pb2_grpc

 from ctransformers import AutoModelForCausalLM
+from ctransformers.llm import Config

 # Adapted from https://github.com/marella/ctransformers/tree/main#supported-models
 # License: MIT
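For context, Config in ctransformers is a dataclass of model and generation settings (fields such as temperature, top_k, max_new_tokens, threads, and gpu_layers, per the library's README), and from_pretrained accepts the same fields as keyword arguments. Below is a minimal sketch of how the two imports above are typically used; the model path and model type are placeholders, not values from this PR.

from ctransformers import AutoModelForCausalLM
from ctransformers.llm import Config

# Config is a plain dataclass; settings are readable attributes.
cfg = Config(temperature=0.7, max_new_tokens=64, threads=4)
print(cfg.temperature)  # 0.7

# from_pretrained also takes the same settings directly as kwargs.
llm = AutoModelForCausalLM.from_pretrained(
    "/models/ggml-model.bin",  # placeholder local model file
    model_type="llama",        # placeholder architecture
    temperature=0.7,
    max_new_tokens=64,
)
print(llm("Hello,"))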
@@ -48,10 +49,10 @@ def LoadModel(self, request, context):
            model_path = request.Model
            if not os.path.exists(model_path):
                return backend_pb2.Result(success=False, message=f"Model path {model_path} does not exist")

            model_type = request.ModelType
            if model_type not in ModelType.__dict__.values():
                return backend_pb2.Result(success=False, message=f"Model type {model_type} not supported")

            llm = AutoModelForCausalLM.from_pretrained(model_file=model_path, model_type=model_type)
            self.model=llm
        except Exception as err:
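For reference, the ModelType.__dict__.values() guard above treats a plain class as a string registry. Here is a hedged sketch of that pattern; the member names are illustrative, while the real list elsewhere in this file is adapted from ctransformers' supported-models table.

# Hypothetical registry; the actual ModelType class in this backend mirrors
# https://github.com/marella/ctransformers/tree/main#supported-models
class ModelType:
    LLAMA = "llama"
    GPT2 = "gpt2"
    FALCON = "falcon"

print("llama" in ModelType.__dict__.values())    # True  -> request accepted
print("unknown" in ModelType.__dict__.values())  # False -> request rejected
# Caveat: a class __dict__ also holds dunder entries (__module__, __doc__, ...),
# so those values would pass this membership test as well.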
@@ -67,11 +68,11 @@ def PredictStream(self, request, context):

    def TokenizeString(self, request, context):
        try:
-            token: List[int]=self.model.tokenize(request.prompt, add_bos_token=False)
-            l=len(token)
+            tokens: List[int]=self.model.tokenize(request.prompt, add_bos_token=False)
+            l=len(tokens)
        except Exception as err:
            return backend_pb2.Result(success=False, message=f"Unexpected {err=}, {type(err)=}")
-        return backend_pb2.TokenizationResponse(length=l, token=token)
+        return backend_pb2.TokenizationResponse(length=l, tokens=tokens)

def serve(address):
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=MAX_WORKERS))
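The substance of the change shown here is the token -> tokens rename, aligning the local variable with the tokens field of TokenizationResponse. A quick sketch of the underlying library call, assuming llm is a ctransformers model loaded as in LoadModel above (add_bos_token=False asks the tokenizer not to prepend a beginning-of-sequence token):

from typing import List

tokens: List[int] = llm.tokenize("Hello, world", add_bos_token=False)
print(len(tokens), tokens)  # mirrors TokenizationResponse(length=..., tokens=...)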
File renamed without changes.
You are viewing a condensed version of this merge commit.