LlamaGrammar prints grammar on each iteration #1666

Closed

JoshLoecker opened this issue Aug 7, 2024 · 0 comments

JoshLoecker commented Aug 7, 2024

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

LlamaGrammar should not print its grammar rules when verbose=False.

Current Behavior

The grammar rules are printed on every inference call, even when verbose=False is passed to both the Llama() constructor and LlamaGrammar.from_string().
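
For clarity, the behavior I expected from the verbose flag is roughly the following (an illustrative sketch only, not the library's actual code; from_string_expected is a made-up name):

def from_string_expected(grammar_text: str, verbose: bool = True) -> str:
    # Stand-in for the real GBNF parsing step; the real method returns a LlamaGrammar.
    rules = grammar_text
    if verbose:
        # Any grammar rule dump should be gated here and skipped when verbose=False.
        print(rules)
    return rules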

Environment and Context

  • Physical (or virtual) hardware you are using, e.g. for Linux: M3 MacBook Pro

  • Operating System, e.g. for Linux: macOS

  • SDK version, e.g. for Linux:

$ python3 --version
Python 3.12.4

$ make --version
GNU Make 3.81
Copyright (C) 2006  Free Software Foundation, Inc.
This is free software; see the source for copying conditions.
There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE.

This program built for i386-apple-darwin11.3.0

$ g++ --version
Apple clang version 15.0.0 (clang-1500.3.9.4)
Target: arm64-apple-darwin23.5.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin

Failure Information (for bugs)

Please help provide information about the failure if this is a bug. If it is not a bug, please remove the rest of this template.

Steps to Reproduce

The following is a minimal reproducible example:

import llama_cpp
from llama_cpp.llama_grammar import JSON_GBNF


def main():
    # verbose=False is passed to both the model and the grammar constructors,
    # yet the grammar rules are still dumped to the console (see Failure Logs).
    llama = llama_cpp.Llama("/Users/joshl/Projects/MechanisticLLM/data/models/language/mistral-7b-instruct-v0.3-q4_k_m.gguf", verbose=False)
    grammar = llama_cpp.LlamaGrammar.from_string(JSON_GBNF, verbose=False)

    response = llama("What is the answer to life, the universe, and everything?")
    print(response)


if __name__ == '__main__':
    main()

Because I've set LlamaGrammar.from_string(verbose=False), I expected that the JSON grammar would not be printed to the screen.
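
A workaround I can use in the meantime is to silence the output while the grammar is parsed, assuming the dump is written to Python-level stdout/stderr (if it comes directly from the underlying C library's streams, this will not catch it):

import contextlib
import io

import llama_cpp
from llama_cpp.llama_grammar import JSON_GBNF

# Discard anything printed while the grammar is constructed.
with contextlib.redirect_stdout(io.StringIO()), contextlib.redirect_stderr(io.StringIO()):
    grammar = llama_cpp.LlamaGrammar.from_string(JSON_GBNF, verbose=False)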

Failure Logs

root ::= object 
object ::= [{] ws object_11 [}] ws 
value ::= object | array | string | number | value_6 ws 
array ::= [[] ws array_15 []] ws 
string ::= ["] string_18 ["] ws 
number ::= number_19 number_39 number_57 ws 
value_6 ::= [t] [r] [u] [e] | [f] [a] [l] [s] [e] | [n] [u] [l] [l] 
ws ::= | [ ] | [<U+000A>] ws_77 
object_8 ::= string [:] ws value object_10 
object_9 ::= [,] ws string [:] ws value 
object_10 ::= object_9 object_10 | 
object_11 ::= object_8 | 
array_12 ::= value array_14 
array_13 ::= [,] ws value 
array_14 ::= array_13 array_14 | 
array_15 ::= array_12 | 
string_16 ::= [^"\<U+007F><U+0000>-<U+001F>] | [\] string_17 
string_17 ::= ["\bfnrt] | [u] [0-9a-fA-F] [0-9a-fA-F] [0-9a-fA-F] [0-9a-fA-F] 
string_18 ::= string_16 string_18 | 
number_19 ::= number_20 number_21 
number_20 ::= [-] | 
number_21 ::= [0-9] | [1-9] number_36 
number_22 ::= [0-9] | 
number_23 ::= [0-9] number_22 | 
number_24 ::= [0-9] number_23 | 
number_25 ::= [0-9] number_24 | 
number_26 ::= [0-9] number_25 | 
number_27 ::= [0-9] number_26 | 
number_28 ::= [0-9] number_27 | 
number_29 ::= [0-9] number_28 | 
number_30 ::= [0-9] number_29 | 
number_31 ::= [0-9] number_30 | 
number_32 ::= [0-9] number_31 | 
number_33 ::= [0-9] number_32 | 
number_34 ::= [0-9] number_33 | 
number_35 ::= [0-9] number_34 | 
number_36 ::= [0-9] number_35 | 
number_37 ::= [.] [0-9] number_38 
number_38 ::= [0-9] number_38 | 
number_39 ::= number_37 | 
number_40 ::= [eE] number_41 [0-9] number_56 
number_41 ::= [-+] | 
number_42 ::= [1-9] | 
number_43 ::= [1-9] number_42 | 
number_44 ::= [1-9] number_43 | 
number_45 ::= [1-9] number_44 | 
number_46 ::= [1-9] number_45 | 
number_47 ::= [1-9] number_46 | 
number_48 ::= [1-9] number_47 | 
number_49 ::= [1-9] number_48 | 
number_50 ::= [1-9] number_49 | 
number_51 ::= [1-9] number_50 | 
number_52 ::= [1-9] number_51 | 
number_53 ::= [1-9] number_52 | 
number_54 ::= [1-9] number_53 | 
number_55 ::= [1-9] number_54 | 
number_56 ::= [1-9] number_55 | 
number_57 ::= number_40 | 
ws_58 ::= [ <U+0009>] | 
ws_59 ::= [ <U+0009>] ws_58 | 
ws_60 ::= [ <U+0009>] ws_59 | 
ws_61 ::= [ <U+0009>] ws_60 | 
ws_62 ::= [ <U+0009>] ws_61 | 
ws_63 ::= [ <U+0009>] ws_62 | 
ws_64 ::= [ <U+0009>] ws_63 | 
ws_65 ::= [ <U+0009>] ws_64 | 
ws_66 ::= [ <U+0009>] ws_65 | 
ws_67 ::= [ <U+0009>] ws_66 | 
ws_68 ::= [ <U+0009>] ws_67 | 
ws_69 ::= [ <U+0009>] ws_68 | 
ws_70 ::= [ <U+0009>] ws_69 | 
ws_71 ::= [ <U+0009>] ws_70 | 
ws_72 ::= [ <U+0009>] ws_71 | 
ws_73 ::= [ <U+0009>] ws_72 | 
ws_74 ::= [ <U+0009>] ws_73 | 
ws_75 ::= [ <U+0009>] ws_74 | 
ws_76 ::= [ <U+0009>] ws_75 | 
ws_77 ::= [ <U+0009>] ws_76 | 
{'id': 'cmpl-311149bb-b6ce-4f10-be58-ea5929998781', 'object': 'text_completion', 'created': 1723061341, 'model': '/Users/joshl/Projects/MechanisticLLM/data/models/language/mistral-7b-instruct-v0.3-q4_k_m.gguf', 'choices': [{'text': ' According to Douglas Adams, the answer is 42, but only if you', 'index': 0, 'logprobs': None, 'finish_reason': 'length'}], 'usage': {'prompt_tokens': 14, 'completion_tokens': 16, 'total_tokens': 30}}
abetlen closed this as completed in 0998ea0 on Aug 8, 2024
benniekiss pushed a commit to benniekiss/llama-cpp-python that referenced this issue Aug 23, 2024