Expose Lighthouse's internal state / DatabaseCoverage objects to disassembler interpreter #76
Can I get coverage-related data through IDA's Python API, so that it can be processed further?

PS: awesome work!
Thanks for the kind words, and sorry for the late response. I'm only catching up on Lighthouse maintenance now... I have been thinking about adding access to this through the IDA / Python API for a long time, and am hoping to sneak at least something primitive into the next release of Lighthouse. I am not sure what the best way to do this is yet, but it will probably involve exposing the live plugin state as a global so that you can extract information from it. I can't promise driving the plugin 'headlessly' will be stable, but at least reading info out should mostly be okay :-)
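To make that "expose the live state as a global" idea concrete, here is a minimal sketch of the pattern, illustrative only: the names below (_live_state, publish_state, get_state) are placeholders and not Lighthouse's actual API. A plugin module publishes a module-level reference to its live object, and user scripts import that name to read it.

# Illustrative sketch of the module-level-global pattern described above.
# The names here are placeholders, not part of Lighthouse's real API.
_live_state = None   # set by the plugin once it has finished loading

def publish_state(state):
    """Called from the plugin's own init code to expose its live state."""
    global _live_state
    _live_state = state

def get_state():
    """What a user script imports and calls to read the plugin's internals."""
    if _live_state is None:
        raise RuntimeError("plugin state has not been published yet")
    return _live_state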
Somewhat related to #34.
Great, the idea is not to drive it headlessly, but to get at the data set so that it can be used in automation!
With commit 0ef5c9d on the development branch, I have exposed the live instance of Lighthouse's CoverageDirector. Right now, the only use case I will pretend to support is read access to things like the coverage data. I would not try to 'drive' the plugin UI / state through the director, as things might explode.

# get access to the live, exposed CoverageDirector instance
from lighthouse import coverage_director
# get the list of NodeCoverage from the DatabaseCoverage object that is active in the UI
coverage_director.coverage.nodes
# request a loaded DatabaseCoverage object by name, or shorthand symbol:
cov_a = coverage_director.get_coverage('A')
print(cov_a.nodes)

Please refer to ...
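As a usage sketch for the automation angle, the snippet below dumps covered node addresses to a file and diffs two loaded coverage sets. It assumes that .nodes is keyed by node address and that a second coverage set is loaded under the shorthand symbol 'B'; neither is guaranteed by the snippet above, so treat this as illustrative rather than documented API.

from lighthouse import coverage_director

# dump the addresses of covered nodes (basic blocks) for offline processing;
# assumption: .nodes maps node addresses to NodeCoverage objects
with open("covered_nodes.txt", "w") as f:
    for node_address in sorted(coverage_director.coverage.nodes):
        f.write("0x%08X\n" % node_address)

# diff two loaded coverage sets by shorthand symbol (assumes 'B' is loaded)
cov_a = coverage_director.get_coverage('A')
cov_b = coverage_director.get_coverage('B')
only_in_a = set(cov_a.nodes) - set(cov_b.nodes)
print("%d nodes hit only by coverage A" % len(only_in_a))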