Don't require scipy for regular use #948

Merged
9 changes: 8 additions & 1 deletion bitsandbytes/functional.py
@@ -233,8 +233,15 @@ def create_linear_map(signed=True, total_bits=8, add_zero=True):
        l = values.numel()//2
        return torch.Tensor(values[:l].tolist() + [0]*gap + values[l:].tolist())

+
def create_normal_map(offset=0.9677083, use_extra_value=True):
-    from scipy.stats import norm
+    try:
+        from scipy.stats import norm
+    except ImportError as ie:
+        raise ImportError(
+            "Scipy is required for `create_normal_map`. "
+            "Install `bitsandbytes` with the `[test]` extra."
+        ) from ie

    if use_extra_value:
        # one more positive value, this is an asymmetric type
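As a rough illustration of what the guarded import buys (a sketch, assuming scipy is absent from the environment and bitsandbytes itself imports fine): importing the module no longer fails, and only an actual call to `create_normal_map` surfaces the error.

```python
# Sketch, assuming scipy is NOT installed but bitsandbytes/torch are:
# the module import succeeds; only calling create_normal_map raises.
import bitsandbytes.functional as F

try:
    code = F.create_normal_map()  # imports scipy.stats.norm internally
except ImportError as err:
    print(err)  # points the user at installing the optional dependency
```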
8 changes: 5 additions & 3 deletions setup.py
@@ -26,11 +26,13 @@ def read(fname):
    license="MIT",
    keywords="gpu optimizers optimization 8-bit quantization compression",
    url="https://github.com/TimDettmers/bitsandbytes",
-    install_requires=['scipy'],
    packages=find_packages(),
    package_data={"": libs},
-    install_requires=['torch', 'numpy', 'scipy'],
-    extras_require={'benchmark': ['pandas', 'matplotlib']},
+    install_requires=['torch', 'numpy'],
+    extras_require={
+        'benchmark': ['pandas', 'matplotlib'],
Contributor
Are these runtime dependencies at all? They seem to be used only in scripts and are not part of the wheel. Maybe the benchmark extra is not needed at all?

Contributor Author
@akx akx Jan 28, 2024
The idea is that when you want to run the benchmark scripts, you'd do

pip install -e .[benchmark]

to get the regular requirements plus the things required for benchmarking, or if you want that and the test deps,

pip install -e .[benchmark,test]

Contributor
Doesn’t this also make it part of the public metadata, e.g. “pip install bitsandbytes[benchmark]”?

Contributor Author
Yes, it does.
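For context, a hedged sketch of how declared extras end up in the installed package's public metadata (output is indicative only, assuming a build of this branch is installed):

```python
# Sketch: extras declared in setup.py become Provides-Extra / Requires-Dist
# entries in the distribution metadata, which `pip install bitsandbytes[benchmark]`
# then resolves.
from importlib.metadata import metadata

md = metadata("bitsandbytes")
print(md.get_all("Provides-Extra"))  # e.g. ['benchmark', 'test']
print(md.get_all("Requires-Dist"))   # e.g. ['pandas; extra == "benchmark"', 'scipy; extra == "test"', ...]
```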

+        'test': ['scipy'],
Contributor
Is create_normal_map only used for testing? Otherwise the name might be confusing?

Contributor Author
@akx akx Jan 28, 2024
The only call to the function is in a test (and it's mentioned in a couple copy-pasted comments too):

(bitsandbytes) ~/b/bitsandbytes (main) $ ag create_normal_map
tests/test_functional.py
2356:    code = F.create_normal_map()

bitsandbytes/nn/modules.py
306:        Implementation of the NF4 data type in bitsandbytes can be found in the `create_normal_map` function in

bitsandbytes/functional.py
236:def create_normal_map(offset=0.9677083, use_extra_value=True):
847:            Implementation of the NF4 data type in bitsandbytes can be found in the `create_normal_map` function 

I can't find other relevant references to it across all of GitHub either – and anyway, the function is still there for downstream users who might need it; they'll just need scipy installed.
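For reference, a small sketch of that remaining usage, assuming scipy is available (for example via the new `test` extra): the function keeps behaving as before for anyone who calls it.

```python
# Sketch, assuming scipy is installed (e.g. via the `test` extra):
# create_normal_map still builds the NF4 code table from normal quantiles.
import bitsandbytes.functional as F

code = F.create_normal_map()  # torch.Tensor of normalized code values
print(code.numel(), code.min().item(), code.max().item())
```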

+    },
    long_description=read("README.md"),
    long_description_content_type="text/markdown",
    classifiers=[