Commit message:

* docs
* Update README.md
* Delete quartz_solar_forecast/inverters/image/README directory
* mocks api
* changes ts to utc
* mock auth code
* mock access token
* auth code fixture
* inpmock
* escape
* mock inp
* test1
* process funcs
* process enphase data test
* rm redundant deps
* process pv test
1 parent 9eee595, commit 0e2b1af. Showing 6 changed files with 252 additions and 105 deletions.
@@ -0,0 +1,59 @@
# Adding an Inverter to Quartz Solar Forecast

The aim of this module is to let users add their own inverter brands to Quartz Solar Forecast and use live data instead of the default fake data.

Quartz Solar Forecast currently supports Enphase inverters, and we are working on adding support for a wider range of solar inverters.

## Important Directories & Files

```markdown
Open-Source-Quartz-Solar-Forecast/
├── example/
│   └── inverter_example.py
├── quartz_solar_forecast/
│   ├── data.py
│   ├── pydantic_models.py
│   └── inverters/
├── tests/
│   └── data/
│       └── test_make_pv_data.py
```

## What Each Directory Holds

1. `example/`
   * `inverter_example.py`: Builds input data for the chosen inverter type, compares it with the no-live-data case, runs the ML model, and draws a comparison plot using `plotly`. This is the file you need to run in order to run the ML model. An example output with Enphase is shown below:

   ![example_enphase_output](https://github.com/aryanbhosale/Open-Source-Quartz-Solar-Forecast/assets/36108149/7127a00e-c081-4f5e-a342-2be2e2efe00c)

2. `quartz_solar_forecast/`
   * `data.py`: Contains the `make_pv_data()` function, which checks the inverter type and constructs an `xarray` dataset accordingly
   * `pydantic_models.py`: Contains the `PVSite` class
   * `inverters/`:
     * This is the directory where you create a new file, alongside the existing `<inverter_name>.py` files, to add your inverter
     * You will need to follow the authentication flow described in the documentation of the inverter you are adding
     * The model needs the past 7 days of data in 5-minute intervals. An example with Enphase is shown below

       ![example_enphase_data](https://github.com/aryanbhosale/Open-Source-Quartz-Solar-Forecast/assets/36108149/436c688c-2e59-4047-abfc-754acb629343)

     * Once all the processing is done, make sure your function returns a `pd.DataFrame` with 2 columns (a full module sketch follows this list), namely

       * `timestamp`: `timestamp=datetime.fromtimestamp(interval_end_time_in_unix_epochs, tz=timezone.utc).strftime('%Y-%m-%d %H:%M:%S')`, then convert the timestamp column with `pd.to_datetime`
       * `power_kw`: Power in **kilowatts**. An example of the formatted `pd.DataFrame` is shown below

       ![example_enphase_formatted_dataframe](https://github.com/aryanbhosale/Open-Source-Quartz-Solar-Forecast/assets/36108149/482b2f2a-e3f5-4a1a-97f1-2d322a1444d5)

3. `tests/`
   * `data/`
     * `test_make_pv_data.py`: Tests the `make_pv_data()` function from the `data.py` file with `pytest`, mocking the various inverter types as well as the `None` case
     * Run it with `pytest tests/data/test_make_pv_data.py`

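To make the required return format concrete, here is a minimal sketch of a hypothetical `quartz_solar_forecast/inverters/<inverter_name>.py` module. The endpoint URL, token variable, query parameters, and response fields are illustrative placeholders rather than a real vendor API; follow your inverter's documented authentication flow and response schema instead.

```python
from datetime import datetime, timezone
import os

import pandas as pd
import requests

# Hypothetical placeholders: use your vendor's real endpoint and the auth flow
# from its documentation (OAuth, API key, etc.).
MYBRAND_API_URL = "https://api.example-inverter.com/v1/production"
MYBRAND_API_TOKEN = os.getenv("MYBRAND_API_TOKEN")  # e.g. loaded from the .env file


def get_mybrand_data() -> pd.DataFrame:
    """Return the past 7 days of production at 5-minute resolution."""
    response = requests.get(
        MYBRAND_API_URL,
        headers={"Authorization": f"Bearer {MYBRAND_API_TOKEN}"},
        params={"granularity": "week"},  # assumed query parameter
    )
    response.raise_for_status()
    intervals = response.json()["intervals"]  # assumed response shape

    # Build the two required columns: UTC timestamps and power in kilowatts
    df = pd.DataFrame(
        {
            "timestamp": [
                datetime.fromtimestamp(i["end_at"], tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
                for i in intervals
            ],
            "power_kw": [i["power_w"] / 1000 for i in intervals],  # assumed watts field
        }
    )
    df["timestamp"] = pd.to_datetime(df["timestamp"])
    return df
```

Once such a function exists, `make_pv_data()` in `data.py` can dispatch to it based on the site's `inverter_type`.
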
## How to Setup

1. Ensure you are on a Linux machine such as Ubuntu or Kali
2. Navigate inside `Open-Source-Quartz-Solar-Forecast` and create a virtual environment by entering `python -m venv venv`
3. Activate the virtual environment by entering `source venv/bin/activate`
4. Install the requirements by entering `pip install -r requirements.txt` and `pip install -e .`
5. Install `plotly` by entering `pip install plotly`
6. Create a `.env` file in the root directory, i.e. `Open-Source-Quartz-Solar-Forecast`
7. Add your solar inverter's user credentials and environment variables to the `.env` file; refer to the `.env.example` file for Enphase & SolarEdge credential examples
8. Run the `inverter_example.py` file by entering `python examples/inverter_example.py`
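
As a rough illustration of steps 6-8, the snippet below loads the `.env` credentials, builds a `PVSite` that uses live Enphase data, and runs the forecast. It assumes the package exposes `run_forecast(site=..., ts=...)` from `quartz_solar_forecast.forecast` and that `python-dotenv` is installed; treat `examples/inverter_example.py` as the authoritative reference.

```python
from datetime import datetime

from dotenv import load_dotenv  # pip install python-dotenv

from quartz_solar_forecast.forecast import run_forecast  # assumed public entry point
from quartz_solar_forecast.pydantic_models import PVSite

load_dotenv()  # makes the inverter credentials defined in .env visible to the client code

site = PVSite(
    latitude=51.75,
    longitude=-1.25,
    capacity_kwp=1.25,
    tilt=35,
    orientation=180,
    inverter_type="enphase",  # pull live Enphase data instead of the default fake data
)

# Run the forecast for today's date; the returned DataFrame can be plotted with plotly
predictions_df = run_forecast(site=site, ts=datetime.today().strftime("%Y-%m-%d"))
print(predictions_df.head())
```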
This file was deleted.
@@ -0,0 +1,66 @@
import pytest
import pandas as pd
import numpy as np
import xarray as xr
from datetime import datetime, timezone
from quartz_solar_forecast.data import process_pv_data
from quartz_solar_forecast.pydantic_models import PVSite


@pytest.fixture
def sample_site():
    return PVSite(
        latitude=51.75,
        longitude=-1.25,
        capacity_kwp=1.25,
        tilt=35,
        orientation=180,
        inverter_type="enphase"
    )


@pytest.fixture
def sample_timestamp():
    timestamp = datetime.now().timestamp()
    timestamp_str = datetime.fromtimestamp(timestamp, tz=timezone.utc).strftime('%Y-%m-%d %H:%M:%S')
    return pd.to_datetime(timestamp_str)


@pytest.fixture
def sample_live_generation():
    return pd.DataFrame({
        'timestamp': [
            pd.Timestamp('2024-06-16 10:00:00'),
            pd.Timestamp('2024-06-16 10:05:00'),
            pd.Timestamp('2024-06-16 10:10:00')
        ],
        'power_kw': [0.75, 0.80, 0.78]
    })


def test_process_pv_data_with_live_data(sample_site, sample_timestamp, sample_live_generation):
    result = process_pv_data(sample_live_generation, sample_timestamp, sample_site)

    assert isinstance(result, xr.Dataset)
    assert 'generation_kw' in result.data_vars
    assert set(result.coords) == {'longitude', 'latitude', 'timestamp', 'pv_id', 'kwp', 'tilt', 'orientation'}
    assert result.pv_id.values.tolist() == [1]
    assert result.longitude.values.tolist() == [sample_site.longitude]
    assert result.latitude.values.tolist() == [sample_site.latitude]
    assert result.kwp.values.tolist() == [sample_site.capacity_kwp]
    assert result.tilt.values.tolist() == [sample_site.tilt]
    assert result.orientation.values.tolist() == [sample_site.orientation]
    assert len(result.timestamp) <= len(sample_live_generation)
    assert np.all(result.timestamp.values <= sample_timestamp)


def test_process_pv_data_without_live_data(sample_site, sample_timestamp):
    result = process_pv_data(None, sample_timestamp, sample_site)

    assert isinstance(result, xr.Dataset)
    assert 'generation_kw' in result.data_vars
    assert set(result.coords) == {'longitude', 'latitude', 'timestamp', 'pv_id', 'kwp', 'tilt', 'orientation'}
    assert result.pv_id.values.tolist() == [1]
    assert result.longitude.values.tolist() == [sample_site.longitude]
    assert result.latitude.values.tolist() == [sample_site.latitude]
    assert result.kwp.values.tolist() == [sample_site.capacity_kwp]
    assert result.tilt.values.tolist() == [sample_site.tilt]
    assert result.orientation.values.tolist() == [sample_site.orientation]
    assert len(result.timestamp) == 1
    assert result.timestamp.values[0] == sample_timestamp
    assert np.isnan(result.generation_kw.values[0][0])
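
These tests pin down the interface of `process_pv_data()` from `quartz_solar_forecast/data.py`. A simplified sketch that would satisfy the assertions above is shown here for reference; it is illustrative only, not the repository's actual implementation.

```python
import numpy as np
import xarray as xr


def process_pv_data_sketch(live_generation_kw, ts, site):
    """Illustrative only: build the xr.Dataset shape the tests above assert."""
    if live_generation_kw is not None:
        # Keep only readings taken at or before the forecast timestamp
        recent = live_generation_kw[live_generation_kw["timestamp"] <= ts]
        timestamps = recent["timestamp"].values
        generation = recent["power_kw"].values.reshape(1, -1)
    else:
        # No live data: a single NaN reading at the forecast timestamp
        timestamps = [ts]
        generation = np.array([[np.nan]])

    return xr.Dataset(
        data_vars={"generation_kw": (("pv_id", "timestamp"), generation)},
        coords={
            "longitude": ("pv_id", [site.longitude]),
            "latitude": ("pv_id", [site.latitude]),
            "timestamp": timestamps,
            "pv_id": [1],
            "kwp": ("pv_id", [site.capacity_kwp]),
            "tilt": ("pv_id", [site.tilt]),
            "orientation": ("pv_id", [site.orientation]),
        },
    )
```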
@@ -0,0 +1,56 @@
import pytest
import pandas as pd
import numpy as np
from quartz_solar_forecast.inverters.enphase import process_enphase_data


@pytest.fixture
def sample_data():
    return {
        'system_id': 3136663,
        'granularity': 'week',
        'total_devices': 4,
        'start_at': 1718530896,
        'end_at': 1719134971,
        'items': 'intervals',
        'intervals': [
            {'end_at': 1718531100, 'devices_reporting': 4, 'powr': 624, 'enwh': 52},
            {'end_at': 1718531400, 'devices_reporting': 4, 'powr': 684, 'enwh': 57},
            {'end_at': 1718531700, 'devices_reporting': 4, 'powr': 672, 'enwh': 56},
        ]
    }


def test_process_enphase_data(sample_data):
    # Set start_at to just after the first interval ends
    start_at = sample_data['intervals'][0]['end_at'] + 1

    # Process the data
    result = process_enphase_data(sample_data, start_at)

    # Check if the result is a DataFrame
    assert isinstance(result, pd.DataFrame)

    # Check if the DataFrame has the expected columns
    assert set(result.columns) == {'timestamp', 'power_kw'}

    # Check if the timestamp column is of datetime type
    assert pd.api.types.is_datetime64_any_dtype(result['timestamp'])

    # Check if power_kw values are correctly calculated (divided by 1000)
    expected_power_values = [interval['powr'] / 1000 for interval in sample_data['intervals']]
    assert all(value in expected_power_values for value in result['power_kw'])

    # Convert start_at to a naive UTC timestamp
    start_at_timestamp = pd.Timestamp(start_at, unit='s').tz_localize('UTC').tz_convert(None)

    # Check if all timestamps are at or after the start_at time
    assert np.all(result['timestamp'] >= start_at_timestamp)

    # Check if the number of rows is less than or equal to the number of intervals
    assert len(result) <= len(sample_data['intervals'])

    # Check if timestamps are formatted correctly
    expected_timestamps = [
        pd.Timestamp(interval['end_at'], unit='s').tz_localize('UTC').tz_convert(None).strftime('%Y-%m-%d %H:%M:%S')
        for interval in sample_data['intervals']
    ]
    assert all(ts.strftime('%Y-%m-%d %H:%M:%S') in expected_timestamps for ts in result['timestamp'])
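
For reference, a simplified sketch of a function satisfying this test's assertions is shown below; the real `process_enphase_data()` lives in `quartz_solar_forecast/inverters/enphase.py` and may differ in detail.

```python
import pandas as pd


def process_enphase_data_sketch(data_json: dict, start_at: int) -> pd.DataFrame:
    """Illustrative only: convert Enphase telemetry intervals into the expected DataFrame."""
    rows = []
    for interval in data_json["intervals"]:
        # Interval end time as a naive UTC timestamp
        end_at = pd.Timestamp(interval["end_at"], unit="s").tz_localize("UTC").tz_convert(None)
        rows.append({"timestamp": end_at, "power_kw": interval["powr"] / 1000})  # W -> kW

    df = pd.DataFrame(rows, columns=["timestamp", "power_kw"])

    # Keep only intervals ending at or after the requested start time
    start_ts = pd.Timestamp(start_at, unit="s").tz_localize("UTC").tz_convert(None)
    return df[df["timestamp"] >= start_ts].reset_index(drop=True)
```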