
Commit ef08273

Merge pull request #15 from WhatTheFuzz/feature/settings

Implement user settings

2 parents: 53080f5 + 0eb8e4c

9 files changed: 162 additions & 33 deletions

.pylintrc

Lines changed: 4 additions & 1 deletion
```diff
@@ -103,7 +103,10 @@ disable=
         # We anticipate #3512 where it will become optional
         fixme,
         consider-alternative-union-syntax,
-        relative-beyond-top-level
+        relative-beyond-top-level,
+        # Remove import error for clients without the Binary Ninja plugin installed,
+        # as in non-commercial settings.
+        import-error
 
 
 [REPORTS]
```

README.md

Lines changed: 16 additions & 6 deletions
````diff
@@ -23,9 +23,16 @@ please submit a pull request if you've tested it.
 ## API Key
 
 This requires an [API token from OpenAI][token]. The plugin checks for the API
-key in two ways (in this order).
+key in three ways (in this order).
 
-First, it checks the environment variable `OPENAI_API_KEY`, which you can set
+First, it tries to read the key from Binary Ninja's preferences. You can
+access the entry in Binary Ninja via `Edit > Preferences > Settings > OpenAI`.
+Or, use the hotkey ⌘+, and search for `OpenAI`. You should see customizable
+settings like so.
+
+![Settings](https://github.com/WhatTheFuzz/binaryninja-openai/blob/main/resources/settings.png?raw=true)
+
+Second, it checks the environment variable `OPENAI_API_KEY`, which you can set
 inside of Binary Ninja's Python console like so:
 
 ```python
@@ -42,8 +49,8 @@ mkdir ~/.openai
 echo -n "INSERT KEY HERE" > ~/.openai/api_key.txt
 ```
 
-Note that if you have both set, the plugin defaults to the environment variable.
-If your API token is invalid, you'll receive the following error:
+Note that if you have all three set, the plugin defaults to the one set in Binary
+Ninja. If your API token is invalid, you'll receive the following error:
 
 ```python
 openai.error.AuthenticationError: Incorrect API key provided: <BAD KEY HERE>.
@@ -64,12 +71,14 @@ inside of the function.
 
 The output will appear in Binary Ninja's Log like so:
 
-![The output of running the plugin.](./resources/output.png)
+![The output of running the plugin.](https://github.com/WhatTheFuzz/binaryninja-openai/blob/main/resources/output.png?raw=true)
 
 ## OpenAI Model
 
 By default, the plugin uses the `text-davinci-003` model, you can tweak this
-inside of [entry.py][entry].
+inside Binary Ninja's preferences. You can access these settings as described in
+the [API Key](#api-key) section. It uses the maximum available number of tokens
+for each model, as described in [OpenAI's documentation][tokens].
 
 ## Known Issues
 
@@ -86,6 +95,7 @@ This project is licensed under the [MIT license][license].
 
 [default-plugin-dir]:https://docs.binary.ninja/guide/plugins.html
 [token]:https://beta.openai.com/account/api-keys
+[tokens]:https://beta.openai.com/docs/models/gpt-3
 [entry]:./src/entry.py
 [asyncio]:https://docs.python.org/3/library/asyncio.html
 [issue-8]:https://github.com/WhatTheFuzz/binaryninja-openai/issues/8
````
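The environment-variable tier described in the README can be tried from any Python console, not just Binary Ninja's. A minimal sketch; the key value below is a placeholder, not a real token:

```python
import os

# Placeholder value; a real key comes from the OpenAI account page.
os.environ['OPENAI_API_KEY'] = 'sk-placeholder'

# The plugin's environment-variable fallback reads it back like this.
key = os.getenv('OPENAI_API_KEY')
print(key)  # → sk-placeholder
```

Because this sets the variable in-process, it only lasts for the current Binary Ninja session; the preferences entry persists across restarts.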

__init__.py

Lines changed: 4 additions & 0 deletions
```diff
@@ -1,6 +1,10 @@
 from binaryninja import PluginCommand
+from . src.settings import OpenAISettings
 from . src.entry import check_function
 
+# Register the settings group in Binary Ninja to store the API key and model.
+OpenAISettings()
+
 PluginCommand.register_for_high_level_il_function("OpenAI\What Does this Function Do (HLIL)?",
     "Checks OpenAI to see what this HLIL function does." \
     "Requires an internet connection and an API key "
```

plugin.json

Lines changed: 2 additions & 2 deletions
```diff
@@ -9,7 +9,7 @@
         "python3"
     ],
     "description": "Queries OpenAI's GPT3 to determine what a given function does.",
-    "longdescription": "Generates a query that asks 'What does this function do?' followed by a list of the instructions in the function. Returns the result from GPT3 and displays it to the user in the Binary Ninja console. Requires an OpenAI API key.",
+    "longdescription": "",
     "license": {
         "name": "MIT",
         "text": "Copyright 2022 Sean Deaton (@WhatTheFuzz)\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."
@@ -27,6 +27,6 @@
             "openai"
         ]
     },
-    "version": "1.0.0",
+    "version": "1.1.0",
     "minimumbinaryninjaversion": 3200
 }
```

resources/settings.png

164 KB

src/agent.py

Lines changed: 67 additions & 21 deletions
```diff
@@ -3,16 +3,15 @@
 from pathlib import Path
 
 import openai
-from openai.api_resources.engine import Engine
+from openai.api_resources.model import Model
 from openai.error import APIError
 
 from binaryninja.lowlevelil import LowLevelILFunction
 from binaryninja.mediumlevelil import MediumLevelILFunction
 from binaryninja.highlevelil import HighLevelILFunction
+from binaryninja.settings import Settings
 from binaryninja import log
 
-from .exceptions import InvalidEngineException
-
 
 class Agent:
 
@@ -30,7 +29,6 @@ class Agent:
 
     def __init__(self,
                  function: Union[LowLevelILFunction, MediumLevelILFunction, HighLevelILFunction],
-                 engine: str,
                  path_to_api_key: Optional[Path]=None) -> None:
 
         # Read the API key from the environment variable.
@@ -44,33 +42,82 @@ def __init__(self,
                             f'LowLevelILFunction, MediumLevelILFunction, or '
                             f'HighLevelILFunction, got {type(function)}.')
 
-        # Get the list of available engines.
-        engines: list[Engine] = openai.Engine.list().data
-        # Ensure the user's selected engine is available.
-        if engine not in [e.id for e in engines]:
-            InvalidEngineException(f'Invalid engine: {engine}. Valid engines '
-                                   f'are: {[e.id for e in engines]}')
-
         # Set instance attributes.
         self.function = function
-        self.engine = engine
+        self.model = self.get_model()
 
     def read_api_key(self, filename: Optional[Path]=None) -> str:
-        if os.getenv('OPENAI_API_KEY'):
-            return os.getenv('OPENAI_API_KEY')
+        '''Checks for the API key in three locations.
+
+        First, it checks the openai.api_key key:value in Binary Ninja
+        preferences. This is accessed in Binary Ninja by going to Edit >
+        Preferences > Settings > OpenAI.
+        Second, it checks the OPENAI_API_KEY environment variable.
+        Finally, it checks the file specified by the filename argument.
+        Defaults to ~/.openai/api_key.txt.
+        '''
+
+        # First, check the Binary Ninja settings.
+        settings: Settings = Settings()
+        if settings.contains('openai.api_key'):
+            if key := settings.get_string('openai.api_key'):
+                return key
+
+        # If the settings don't exist, contain the key, or the key is empty,
+        # check the environment variable.
+        if key := os.getenv('OPENAI_API_KEY'):
+            return key
+
+        # Finally, if the environment variable doesn't exist, check the default
+        # file.
         if filename:
             log.log_info(f'No API key detected under the environment variable '
                          f'OPENAI_API_KEY. Reading API key from {filename}')
             try:
                 with open(filename, mode='r', encoding='ascii') as api_key_file:
                     return api_key_file.read()
-            except FileNotFoundError as error:
+            except FileNotFoundError:
                 log.log_error(f'Could not find API key file at {filename}.')
 
-        raise APIError('No API key found. Please set the environment '
-                       'variable OPENAI_API_KEY to your API key, or write '
-                       'it to ~/openai/api_key.txt.')
+        raise APIError('No API key found. Refer to the documentation to add the '
+                       'API key.')
+
+    def is_valid_model(self, model: str) -> bool:
+        '''Checks if the model is valid by querying the OpenAI API.'''
+        models: list[Model] = openai.Model.list().data
+        return model in [m.id for m in models]
 
+    def get_model(self) -> str:
+        '''Returns the model that the user has selected from Binary Ninja's
+        preferences. The default value is set by the OpenAISettings class. If
+        for some reason the user selected a model that doesn't exist, this
+        function defaults to 'text-davinci-003'.
+        '''
+        settings: Settings = Settings()
+        # Check that the key exists.
+        if settings.contains('openai.model'):
+            # Check that the key is not empty and get the user's selection.
+            if model := settings.get_string('openai.model'):
+                # Check that is a valid model by querying the OpenAI API.
+                if self.is_valid_model(model):
+                    return model
+        # Return a valid, default model.
+        assert self.is_valid_model('text-davinci-003')
+        return 'text-davinci-003'
+
+    def max_token_count(self, model: str) -> int:
+        '''Returns the maximum number of tokens that can be generated by the
+        model. Returns a default of 2,048 if the model is not found. '''
+        # TODO: This should be somewhere else, as it's also shared by Settings.
+        models: dict[str, int] = {
+            'text-davinci-003': 4_000,
+            'text-curie-001': 2_048,
+            'text-babbage-001': 2_048,
+            'text-ada-001': 2_048,
+            'code-davinci-002': 8_000,
+            'code-cushman-001': 2_048
+        }
+        return models.get(model, 2_048)
 
     def instruction_list(self, function: Union[LowLevelILFunction,
                                                MediumLevelILFunction,
@@ -101,9 +148,8 @@ def generate_query(self, function: Union[LowLevelILFunction,
     def send_query(self, query: str) -> str:
         '''Sends a query to the engine and returns the response.'''
         response: str = openai.Completion.create(
-            model=self.engine,
+            model=self.model,
             prompt=query,
-            max_tokens=2_048
+            max_tokens=self.max_token_count(self.model) - len(query),
         )
         return response.choices[0].text
-
```
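The lookup order that the new `read_api_key` implements can be sketched without Binary Ninja installed. In this illustrative sketch a plain `dict` stands in for Binary Ninja's `Settings` object, and the name `resolve_api_key` is hypothetical, not part of the plugin:

```python
import os
from pathlib import Path
from typing import Optional

def resolve_api_key(settings: dict, key_file: Optional[Path] = None) -> Optional[str]:
    # Tier 1: the Binary Ninja preference (a dict stands in for Settings here).
    if key := settings.get('openai.api_key'):
        return key
    # Tier 2: the OPENAI_API_KEY environment variable.
    if key := os.getenv('OPENAI_API_KEY'):
        return key
    # Tier 3: a key file on disk, e.g. ~/.openai/api_key.txt.
    if key_file and key_file.exists():
        return key_file.read_text(encoding='ascii').strip()
    # The real read_api_key raises APIError here; the sketch returns None.
    return None
```

Note how an empty-string value in the settings falls through to the next tier, matching the walrus-operator checks in the diff above.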

src/entry.py

Lines changed: 0 additions & 1 deletion
```diff
@@ -9,7 +9,6 @@
 def check_function(bv: BinaryView, func: Function) -> bool:
     agent: Agent = Agent(
         function=func,
-        engine='text-davinci-003',
         path_to_api_key=API_KEY_PATH
     )
     query: str = agent.generate_query(func)
```

src/exceptions.py

Lines changed: 3 additions & 2 deletions
```diff
@@ -1,4 +1,5 @@
-from openai.error import OpenAIError
+class RegisterSettingsGroupException(Exception):
+    pass
 
-class InvalidEngineException(OpenAIError):
+class RegisterSettingsKeyException(Exception):
     pass
```

src/settings.py

Lines changed: 66 additions & 0 deletions
```diff
@@ -0,0 +1,66 @@
+import json
+from binaryninja.settings import Settings
+from . exceptions import RegisterSettingsGroupException, \
+                         RegisterSettingsKeyException
+
+class OpenAISettings(Settings):
+
+    def __init__(self) -> None:
+        # Initialize the settings with the default instance ID.
+        super().__init__(instance_id='default')
+        # Register the OpenAI group.
+        if not self.register_group('openai', 'OpenAI'):
+            raise RegisterSettingsGroupException('Failed to register OpenAI '
+                                                'settings group.')
+        # Register the setting for the API key.
+        if not self.register_api_key_settings():
+            raise RegisterSettingsKeyException('Failed to register OpenAI API '
+                                               'key settings.')
+
+        # Register the setting for the model used to query.
+        if not self.register_model_settings():
+            raise RegisterSettingsKeyException('Failed to register OpenAI '
+                                               'model settings.')
+
+    def register_api_key_settings(self) -> bool:
+        '''Register the OpenAI API key settings in Binary Ninja.'''
+        # Set the attributes of the settings. Refer to:
+        # https://api.binary.ninja/binaryninja.settings-module.html
+        properties = {
+            'title': 'OpenAI API Key',
+            'type': 'string',
+            'description': 'The user\'s OpenAI API key used to make requests '
+                           'the server.'
+        }
+        return self.register_setting('openai.api_key', json.dumps(properties))
+
+    def register_model_settings(self) -> bool:
+        '''Register the OpenAI model settings in Binary Ninja.
+        Defaults to text-davinci-003.
+        '''
+        # Set the attributes of the settings. Refer to:
+        # https://api.binary.ninja/binaryninja.settings-module.html
+        properties = {
+            'title': 'OpenAI Model',
+            'type': 'string',
+            'description': 'The OpenAI model used to generate the response.',
+            # https://beta.openai.com/docs/models
+            'enum': [
+                'text-davinci-003',
+                'text-curie-001',
+                'text-babbage-001',
+                'text-babbage-002',
+                'code-davinci-002',
+                'code-cushman-001'
+            ],
+            'enumDescriptions': [
+                'Most capable GPT-3 model. Can do any task the other models can do, often with higher quality, longer output and better instruction-following. Also supports inserting completions within text.',
+                'Very capable, but faster and lower cost than Davinci.',
+                'Capable of straightforward tasks, very fast, and lower cost.',
+                'Capable of very simple tasks, usually the fastest model in the GPT-3 series, and lowest cost.',
+                'Most capable Codex model. Particularly good at translating natural language to code. In addition to completing code, also supports inserting completions within code.',
+                'Almost as capable as Davinci Codex, but slightly faster. This speed advantage may make it preferable for real-time applications.'
+            ],
+            'default': 'text-davinci-003'
+        }
+        return self.register_setting('openai.model', json.dumps(properties))
```
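Binary Ninja's `register_setting` takes its schema as a JSON string, which is why `settings.py` builds a `dict` and serializes it with `json.dumps`. A trimmed-down sketch of that payload, runnable without Binary Ninja (only the model entry, with fewer enum values than the real one):

```python
import json

# Reduced version of the schema settings.py registers for 'openai.model'.
properties = {
    'title': 'OpenAI Model',
    'type': 'string',
    'enum': ['text-davinci-003', 'text-curie-001'],
    'default': 'text-davinci-003',
}

# register_setting expects the serialized form, not the dict itself.
payload = json.dumps(properties)

# The schema survives the round trip intact.
restored = json.loads(payload)
```

Keeping the schema as a Python `dict` until the last moment makes it easy to validate or extend the enum list before registration.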
