
Any way to import all historical data? #12

Open
mikkyo9 opened this issue Mar 4, 2022 · 89 comments
Labels
enhancement New feature or request

Comments

@mikkyo9

mikkyo9 commented Mar 4, 2022

It seems the DB is populated from the time of install?
Any way to pull in all the history?

@jasonacox
Owner

Hi @mikkyo9 - good question. The Powerwall doesn't keep that history itself. When you install the "Powerwall-Dashboard" stack, the Telegraf service starts polling the Powerwall to get the metrics. The data is then stored in the DB for Grafana to display. The Tesla App has this historical data because it (Tesla Cloud) started polling the Powerwall when it was commissioned. I'm not aware of a way to pull this historical data into the database and even if we could, it would be missing some data like the String and Temperature data.

@dmayo305

dmayo305 commented Mar 5, 2022

You could probably manually download each day's data and then write a script to transfer that information into InfluxDB. I wish I could do an aggregate data download with fine granularity. The best I've found is to do a daily download in the phone app, upload that to my Dropbox, and then view the info on my desktop.

@dmayo305

dmayo305 commented Mar 5, 2022

@jasonacox - Your abilities greatly outpace mine, so I'm bringing an idea without specific testing. On https://www.teslaapi.io/powerwalls/state-and-settings and https://www.teslaapi.io/energy-sites/state-and-settings there appears to be an API call for historical data using a query period of day, week, month, year (similar to the granularity offered in the app). My thoughts are, could one develop a script with a start date, and then increment by day to download historical data?

@jasonacox
Owner

jasonacox commented Mar 7, 2022

Thanks @dmayo305 - I have been looking at those Tesla cloud APIs (used by the Tesla app) to add features to pyPowerwall (e.g. update battery reserve setting). However, it seems Tesla keeps making breaking changes so I haven't found a good example/library to use. TeslaPy seems to be close but I haven't been able to get it to work yet. If we can figure it out, it wouldn't be too difficult to do what you mention.
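
To make that idea concrete, below is a minimal, untested sketch of such a day-by-day download loop using TeslaPy. The call names (fetch_token, battery_list, get_calendar_history_data) match how the library ends up being used later in this thread, but the kind/period arguments, the dates, and the fixed UTC offset are illustrative assumptions only:

import json
from datetime import date, timedelta
import teslapy

with teslapy.Tesla("you@example.com") as tesla:
    tesla.fetch_token()                    # interactive login on first run; the token is cached
    battery = tesla.battery_list()[0]      # first Powerwall energy site on the account

    day = date(2022, 1, 1)                 # start date (example)
    while day <= date(2022, 1, 31):        # end date (example)
        history = battery.get_calendar_history_data(
            kind="power", period="day",
            end_date=f"{day.isoformat()}T23:59:59+11:00")   # offset shown for Australia/Sydney as an example
        with open(f"power-{day.isoformat()}.json", "w") as f:
            json.dump(history, f)          # dump raw response; reformat/import into InfluxDB later
        day += timedelta(days=1)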

@fracai

fracai commented Mar 11, 2022

I have a sample project that uses TeslaPy to download Powerwall and Solar site data as json files. I imagine it could be an example of how to get the data for adding to the Dashboard, but I don't know if it would need to be reformatted before loading into the database.

This example uses the "refresh token" method for logging in, so you'll need to generate that first with a 3rd party app or one of the other examples floating around.
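
For anyone curious, the refresh-token login mentioned above looks roughly like this with TeslaPy (a sketch only; the token value is a placeholder you would generate with one of the third-party apps):

import teslapy

with teslapy.Tesla("you@example.com") as tesla:
    if not tesla.authorized:
        # Paste a refresh token generated by a third-party app (placeholder below)
        tesla.refresh_token(refresh_token="EU_xxxxxxxx")
    print(tesla.battery_list())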

@fracai

fracai commented Mar 11, 2022

https://github.com/fracai/tesla-history

The sample project in question

@jasonacox
Owner

Thanks @fracai ! Nice work. Yes, I think we could use that to port the data into InfluxDB. What method did you use to get the "refresh token"? If we extend your project to drop that data into the Dashboard DB, we could include instructions on how to make it work.

Ideally, I want to find an automated way to get the token so I can include that in the pyPowerwall library.

@fracai

fracai commented Mar 12, 2022

I used "Auth for Tesla", an iOS app. There are a few apps listed on this TeslaMate FAQ.

I think an automated method is ideal, but I've gotten the impression that manually retrieving the refresh token is the best method, as Tesla frequently requires entering a captcha in order to log in.

Fortunately, I think the refresh token is valid for a very long time and, unless you need to fill in data gaps, it shouldn't be necessary to request historical data. It might not be ideal, but I think it'd be acceptable to use a manual process to fill in old data.

Then again, if the refresh token is stored, it'd be possible to make a daily request of historical data to regularly fill in any missing gaps.

@fracai

fracai commented Mar 12, 2022

There's also this python script for retrieving the token, though it does require manual intervention.

https://github.com/bntan/tesla-tokens

jasonacox added a commit that referenced this issue Apr 16, 2022
@jgleigh

jgleigh commented Apr 25, 2022

Any more work being done for historical data? It'd be really cool to pull in all the old data.

@jasonacox
Owner

I agree. It's not an easy fix, unfortunately. I'm doing more investigation and am open to any help. ;)

@dvanderb

dvanderb commented Aug 1, 2022

What I am most interested in is my NEM year data (which started a month ago). So even if I could just import the basic data (to grid, from grid, home usage) as a single point in history, it would be useful to fill in the gap from before I started using Powerwall-Dashboard.

@mcbirse
Collaborator

mcbirse commented Sep 21, 2022

I have some good news regarding this! 😀

Thanks to @jasonacox for pointing out the TeslaPy module and also @fracai for the sample project using that. I started testing the module and have worked out how to import historical data via the Tesla Owner API (cloud) directly into InfluxDB. 🎉

Having >1yr of historical data that I would certainly love to see in my dashboard was a good motivator. 😊

I'm still working on the program but hopefully it will be ready to release in the next few weeks. A full-time day job doesn't help 😞 - also, I've had to brush up on my Python knowledge, as these days I'm primarily coding in a language most people have probably never heard of!

So, I have developed this as a command line tool that accepts a start & end datetime range to retrieve history from the Tesla Owner API and import it into InfluxDB. @jasonacox - I'm thinking submitting a PR to add it to the tools directory seems appropriate?

I'm using the staged authorization process of TeslaPy as it seemed convenient and easy in my opinion. The login to your Tesla account is only required once, as after login a token is cached/saved and it appears to never expire (similar to the Tesla app which stays logged in).

Obviously not all data that the Powerwall-Dashboard logs is available for import. But, the main data shown in the "Energy Usage" and "Power Meters" panels is.

I think I can get the "Monthly Analysis" data working too (still testing). This is stored in the kwh retention policy, but it is populated by a CQ based on the data in autogen. So I just need to run a query to refresh it after importing history data.

Also, I've been able to extrapolate the grid status from the backup event history, which is nice. 😃

In case there are problems with an import, backups are obviously highly recommended. But the program will also tag all the data points with a "source=cloud" tag in InfluxDB. This makes it easy to reverse what was imported with a query like DELETE WHERE source='cloud' if something goes wrong (and it could be further limited to a specific time period if required). This would only delete the imported data, and not affect data that came from your gateway via the dashboard. Also note that original dashboard data is never overwritten or changed if you were to enter a datetime range that overlapped with the dashboard data (although I would not recommend it - it is best to fill only periods of missing data).
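
For reference, that manual cleanup could look something like the sketch below using the python influxdb client - the measurement name ('http') and the time range are examples/assumptions only, so adjust them to your own setup and import window:

# Sketch: manually remove imported (source='cloud') points for a specific window.
# Database and measurement names assume the default dashboard setup.
from influxdb import InfluxDBClient

client = InfluxDBClient(host="localhost", port=8086, database="powerwall")
client.query(
    "DELETE FROM http WHERE source='cloud' "
    "AND time >= '2022-10-01T00:00:00Z' AND time <= '2022-10-05T23:59:59Z'"
)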

The history data available from the Tesla cloud is in 5 minute intervals only (15 min for charge percentage), but it seems good enough. Here are some example comparisons.

Below shows data logged by Powerwall-Dashboard as normal (i.e. every 5 seconds or so):
[screenshot]

Compared with below where data was imported from the Tesla cloud - the graph and values (e.g. Power Meters) are pretty close:
[screenshot]

Example showing imported history data with a backup event - the grid status value is extrapolated from backup event history:
[screenshot]

If you have a data gap / missing data your graph might look like below:
[screenshot]

This can now be fixed by importing the history from the Tesla cloud for the specific time period:
[screenshot]

So it seems to be working well as far as I can tell, and it is useful for importing old data from before you installed Powerwall-Dashboard, as well as for when your system went down for some reason!

While I'm still working on this, if anyone has any feature requests, questions, tips etc. please send them my way. 😊

(side note: I'm already thinking about fetching historical weather data as well now, which is available from the OpenWeather One Call API 3.0 "timemachine" API call 😉 )
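
On that weather side note, the "timemachine" request itself is simple - a rough sketch with the requests module is below (coordinates, timestamp, and API key are placeholders; this is not part of the dashboard):

import requests

resp = requests.get(
    "https://api.openweathermap.org/data/3.0/onecall/timemachine",
    params={
        "lat": -33.87, "lon": 151.21,   # placeholder coordinates
        "dt": 1664582400,               # Unix timestamp of the hour to look up
        "units": "metric",
        "appid": "YOUR_API_KEY",        # One Call API 3.0 subscription key
    },
    timeout=10,
)
print(resp.json())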

@jasonacox
Owner

Hi @mcbirse Great work here!!! Please feel free to submit a PR in the /tools directory. I'm sure others would love this too.

@jgleigh

jgleigh commented Sep 22, 2022

I'm looking forward to testing this. I have several small data dropouts as well as several months of data before I got the dashboard setup.

@fracai

fracai commented Sep 22, 2022

Same here. Take your time to get it where you're happy, but I'm really looking forward to getting a look as well.
Thanks for working on it.

@mcbirse
Collaborator

mcbirse commented Oct 13, 2022

Finally, an update - I have been working on the script to import historical data from Tesla cloud over the past few weeks, and believe it is now good to go! 🤞

I will submit a PR to add it to the /tools directory shortly.

Hoping for some feedback from the community on this, as so far all testing has been with my system only.

Please see the below instructions for how to use the script (will also add this to the tools README).

The tool can be used to import historical data from Tesla cloud, for instance from before you started using Powerwall-Dashboard. It can also be used to fill in missing data (data gaps) for periods where Powerwall-Dashboard stopped working (e.g. system down, or lost communication with the Local Gateway). By default, the script will search InfluxDB for data gaps and fill gaps only, so you do not need to identify time periods where you have missing data, as this will be done automatically.

NOTE:

  • The available data from Tesla cloud is limited to 5 minute intervals of power usage, and backup events only. It will not be as accurate as data logged by Powerwall-Dashboard, however it is still sufficient to provide approximate usage history. The historical data is obtained via the same API used by the Tesla mobile app, so it should be very similar in accuracy to what is displayed in the app.
  • Data imported by this tool will not overwrite existing data, and will update the following panels only: Energy Usage, Grid Status, Power Meters, and Monthly Analysis

To use the script:

  • Install the required python modules: pip install python-dateutil teslapy influxdb
  • Follow the steps below

Setup and logging in to Tesla account

On first use, it is recommended to use the --login option. This will create the config file, save an auth token so you will not need to login again, and then display the energy site details associated with your Tesla account.

# Login to Tesla account
python3 tesla-history.py --login

It will run in an interactive mode. The example below shows the config creation:

Config file 'tesla-history.conf' not found

Do you want to create the config now? [Y/n] Y

Tesla Account Setup
-------------------
Email address: [email protected]
Save auth token to: [tesla-history.auth]

InfluxDB Setup
--------------
Host: [localhost]
Port: [8086]
User (leave blank if not used): [blank]
Pass (leave blank if not used): [blank]
Database: [powerwall]
Timezone (e.g. America/Los_Angeles): Australia/Sydney

Config saved to 'tesla-history.conf'

In most cases, the [default] values will be correct and can be accepted by pressing Enter; however, these can be changed if you have a custom setup.

Generally, only your Tesla account email address and your timezone will be required.

After the config is saved, you will be prompted to login to your Tesla account.

This is done by opening the displayed URL in your browser and then logging in:

----------------------------------------
Tesla account: [email protected]
----------------------------------------
Open the below address in your browser to login.

<copy URL to browser> e.g.: https://auth.tesla.com/oauth2/v3/authorize?response_type=code...etc.

After login, paste the URL of the 'Page Not Found' webpage below.

Enter URL after login: <paste URL from browser> e.g.: https://auth.tesla.com/void/callback?code=...etc.

After you have logged in successfully, the browser will show a 'Page Not Found' webpage. Copy the URL of this page and paste it at the prompt.

Once logged in successfully, you will be shown details of the energy site(s) associated with your account:

----------------------------------------
Tesla account: [email protected]
----------------------------------------
      Site ID: 1234567890
    Site name: My Powerwall
     Timezone: Australia/Sydney
    Installed: 2021-04-01 13:09:54+11:00
  System time: 2022-10-13 22:40:59+11:00
----------------------------------------

Once these steps are completed, you should not have to login again.

Basic script usage and examples

To import history data from Tesla cloud for a given start/end period, use the --start and --end options (date/time range is inclusive and in format YYYY-MM-DD hh:mm:ss):

# Get history data for start/end period
python3 tesla-history.py --start "2022-10-01 00:00:00" --end "2022-10-05 23:59:59"

The above example would retrieve history data for the first 5 full days of October.

You can run in test mode first which will not import any data, by using the --test option:

# Run in test mode with --test (will not import data)
python3 tesla-history.py --start "2022-10-01 00:00:00" --end "2022-10-05 23:59:59" --test

Example output:

----------------------------------------
Tesla account: [email protected]
----------------------------------------
      Site ID: 1234567890
    Site name: My Powerwall
     Timezone: Australia/Sydney
    Installed: 2021-04-01 13:09:54+11:00
  System time: 2022-10-13 22:40:59+11:00
----------------------------------------
Running for period: [2022-10-01 00:00:00+10:00] - [2022-10-05 23:59:59+11:00] (4 days, 22:59:59s)

Searching InfluxDB for data gaps (power usage)
* Found data gap: [2022-10-02 11:21:00+11:00] - [2022-10-02 12:41:00+11:00] (1:20:00s)
* Found data gap: [2022-10-04 06:09:00+11:00] - [2022-10-04 06:45:00+11:00] (0:36:00s)
* Found data gap: [2022-10-04 12:29:00+11:00] - [2022-10-04 14:56:00+11:00] (2:27:00s)

Searching InfluxDB for data gaps (grid status)
* Found data gap: [2022-10-02 11:21:00+11:00] - [2022-10-02 12:41:00+11:00] (1:20:00s)
* Found data gap: [2022-10-04 06:09:00+11:00] - [2022-10-04 06:45:00+11:00] (0:36:00s)
* Found data gap: [2022-10-04 12:29:00+11:00] - [2022-10-04 14:56:00+11:00] (2:27:00s)

Retrieving data for gap: [2022-10-02 11:22:00+11:00] - [2022-10-02 12:40:59+11:00] (1:18:59s)
* Loading daily history: [2022-10-02]
Retrieving data for gap: [2022-10-04 06:10:00+11:00] - [2022-10-04 06:44:59+11:00] (0:34:59s)
* Loading daily history: [2022-10-04]
Retrieving data for gap: [2022-10-04 12:30:00+11:00] - [2022-10-04 14:55:59+11:00] (2:25:59s)

Retrieving backup event history
* Creating grid status data: [2022-10-02 11:22:00+11:00] - [2022-10-02 12:40:59+11:00] (1:18:59s)
* Creating grid status data: [2022-10-04 06:10:00+11:00] - [2022-10-04 06:44:59+11:00] (0:34:59s)
* Creating grid status data: [2022-10-04 12:30:00+11:00] - [2022-10-04 14:55:59+11:00] (2:25:59s)

Writing to InfluxDB (*** skipped - test mode enabled ***)
Updating InfluxDB (*** skipped - test mode enabled ***)
Done.

If backup events are identified, this will be shown in the output, and the imported grid status data will include the outages:

----------------------------------------
Tesla account: [email protected]
----------------------------------------
      Site ID: 1234567890
    Site name: My Powerwall
     Timezone: Australia/Sydney
    Installed: 2021-04-01 13:09:54+11:00
  System time: 2022-10-13 22:40:59+11:00
----------------------------------------
Running for period: [2022-04-03 00:00:00+11:00] - [2022-04-20 00:00:00+10:00] (17 days, 1:00:00s)

Searching InfluxDB for data gaps (power usage)
* Found data gap: [2022-04-03 00:00:00+11:00] - [2022-04-20 00:00:00+10:00] (17 days, 1:00:00s)

Searching InfluxDB for data gaps (grid status)
* Found data gap: [2022-04-03 00:00:00+11:00] - [2022-04-20 00:00:00+10:00] (17 days, 1:00:00s)

Retrieving data for gap: [2022-04-03 00:00:00+11:00] - [2022-04-20 00:00:00+10:00] (17 days, 1:00:00s)
* Loading daily history: [2022-04-03]
* Loading daily history: [2022-04-04]
* Loading daily history: [2022-04-05]
* Loading daily history: [2022-04-06]
* Loading daily history: [2022-04-07]
* Loading daily history: [2022-04-08]
* Loading daily history: [2022-04-09]
* Loading daily history: [2022-04-10]
* Loading daily history: [2022-04-11]
* Loading daily history: [2022-04-12]
* Loading daily history: [2022-04-13]
* Loading daily history: [2022-04-14]
* Loading daily history: [2022-04-15]
* Loading daily history: [2022-04-16]
* Loading daily history: [2022-04-17]
* Loading daily history: [2022-04-18]
* Loading daily history: [2022-04-19]
* Loading daily history: [2022-04-20]

Retrieving backup event history
* Creating grid status data: [2022-04-03 00:00:00+11:00] - [2022-04-20 00:00:00+10:00] (17 days, 1:00:00s)
* Found backup event period: [2022-04-19 20:55:53+10:00] - [2022-04-19 22:00:16+10:00] (1:04:23s)
* Found backup event period: [2022-04-19 20:53:39+10:00] - [2022-04-19 20:54:46+10:00] (0:01:07s)
* Found backup event period: [2022-04-08 19:00:14+10:00] - [2022-04-08 19:02:55+10:00] (0:02:41s)
* Found backup event period: [2022-04-08 18:57:32+10:00] - [2022-04-08 18:58:28+10:00] (0:00:56s)
* Found backup event period: [2022-04-08 18:54:56+10:00] - [2022-04-08 18:56:21+10:00] (0:01:25s)
* Found backup event period: [2022-04-04 21:16:45+10:00] - [2022-04-04 21:19:10+10:00] (0:02:25s)

Writing to InfluxDB
Updating InfluxDB
Done.

Since grid status logging was only added to Powerwall-Dashboard in June, you can use the tool to import grid status history from before this time, without affecting your existing power usage data.

The search for missing power usage / grid status data is done independently, so power usage history retrieval will be skipped and the missing grid status data will still be retrieved from Tesla cloud and imported:

----------------------------------------
Tesla account: [email protected]
----------------------------------------
      Site ID: 1234567890
    Site name: My Powerwall
     Timezone: Australia/Sydney
    Installed: 2021-04-01 13:09:54+11:00
  System time: 2022-10-13 22:40:59+11:00
----------------------------------------
Running for period: [2022-04-03 00:00:00+11:00] - [2022-04-14 00:00:00+10:00] (11 days, 1:00:00s)

Searching InfluxDB for data gaps (power usage)
* None found

Searching InfluxDB for data gaps (grid status)
* Found data gap: [2022-04-03 00:00:00+11:00] - [2022-04-14 00:00:00+10:00] (11 days, 1:00:00s)

Retrieving backup event history
* Creating grid status data: [2022-04-03 00:00:00+11:00] - [2022-04-14 00:00:00+10:00] (11 days, 1:00:00s)
* Found backup event period: [2022-04-08 19:00:14+10:00] - [2022-04-08 19:02:55+10:00] (0:02:41s)
* Found backup event period: [2022-04-08 18:57:32+10:00] - [2022-04-08 18:58:28+10:00] (0:00:56s)
* Found backup event period: [2022-04-08 18:54:56+10:00] - [2022-04-08 18:56:21+10:00] (0:01:25s)
* Found backup event period: [2022-04-04 21:16:45+10:00] - [2022-04-04 21:19:10+10:00] (0:02:25s)

Writing to InfluxDB
Done.

Some convenience date options are also available (e.g. these could be used via cron):

# Convenience date options (both options can be used in a single command if desired)
python3 tesla-history.py --today
python3 tesla-history.py --yesterday
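
For example, a crontab entry like the one below would fill any gaps from the previous day each night (the install path and schedule here are illustrative only):

# Example: run at 00:10 every day and append output to a log file
10 0 * * * cd /home/user/Powerwall-Dashboard/tools/tesla-history && python3 tesla-history.py --yesterday >> tesla-history.log 2>&1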

Finally, if something goes wrong, the imported data can be removed from InfluxDB with the --remove option. Data logged by Powerwall-Dashboard will not be affected, only imported data will be removed:

# Remove imported data
python3 tesla-history.py --start "YYYY-MM-DD hh:mm:ss" --end "YYYY-MM-DD hh:mm:ss" --remove

For more usage options, run without arguments or with the --help option:

# Show usage options
python3 tesla-history.py --help
usage: tesla-history.py [-h] [-l] [-t] [-d] [--config CONFIG] [--site SITE] [--ignoretz] [--force] [--remove] [--start START] [--end END] [--today] [--yesterday]

Import Powerwall history data from Tesla Owner API (Tesla cloud) into InfluxDB

options:
  -h, --help       show this help message and exit
  -l, --login      login to Tesla account only and save auth token (do not get history)
  -t, --test       enable test mode (do not import into InfluxDB)
  -d, --debug      enable debug output (print raw responses from Tesla cloud)

advanced options:
  --config CONFIG  specify an alternate config file (default: tesla-history.conf)
  --site SITE      site id (required for Tesla accounts with multiple energy sites)
  --ignoretz       ignore timezone difference between Tesla cloud and InfluxDB
  --force          force import for date/time range (skip search for data gaps)
  --remove         remove imported data from InfluxDB for date/time range

date/time range options:
  --start START    start date and time ("YYYY-MM-DD hh:mm:ss")
  --end END        end date and time ("YYYY-MM-DD hh:mm:ss")
  --today          set start/end range to "today"
  --yesterday      set start/end range to "yesterday"

Advanced option notes

  • --debug can be used to enable debug output. This will print the raw responses from Tesla cloud which might be helpful in some circumstances
  • --force option can be used to import data regardless of existing data (i.e. the search for data gaps is skipped). Should not be required normally, but could be useful for testing purposes
  • --remove will remove any previously imported data from InfluxDB for the date/time range, without affecting data logged by Powerwall-Dashboard
  • --ignoretz may be required if your Tesla Powerwall timezone configuration is incorrect. Enabling it will ignore timezone differences between Tesla cloud and InfluxDB. The program will give an error if the detected Tesla site timezone doesn't match your configured InfluxDB timezone (please check you have configured the InfluxDB timezone correctly if you get this error). I'm not sure this would ever happen without more real-world data on installs - could an installer have configured it wrong? If the Tesla timezone is in fact wrong (installer error?), I'd be interested to know. Using this option is untested, but I did attempt to account for it in the program by converting datetimes from the configured InfluxDB timezone to the detected Tesla site timezone and vice versa (it should work...)

InfluxDB notes and Analysis Data

I am recording some technical notes below regarding InfluxDB and the Analysis data updates for anyone interested - although mainly because I'm likely to forget why I have done something a certain way in the program if I need to look at this again in the future. 😊

Importing data:

  • When writing large amounts of data to InfluxDB, I found it is significantly faster to write data points in Line Protocol format, instead of JSON
  • The power usage data retrieved from Tesla cloud is imported into autogen.http however with an added tag of source='cloud'
  • Any existing data points will not be affected at all, as the tags don't match (refer here for an explanation of how InfluxDB handles duplicate data points)
  • This also means it is easy to remove any imported data by simply specifying WHERE source='cloud' in a DELETE query - only imported data will be removed, as original data will not have this tag
  • When deleting data in InfluxDB, it is important to note that it is not possible to remove data from a specific retention policy; instead, a DELETE will remove data from all retention policies. However, due to the imported data being tagged with source='cloud' this will not be an issue, as the data in the other retention policies will not have this tag

For updating analysis data of InfluxDB (kwh, daily, and monthly retention policies):

  • After data is imported into autogen, the script will currently replicate the SELECT INTO queries used in the CQs for kwh, daily, and monthly (an illustrative sketch follows this list)
  • To update the measurement values of existing data points, however, it is important to ensure the data points have the exact same timestamp and tags (refer here again); therefore these points cannot be tagged with source='cloud', as this would add new points instead (NOTE: some of the CQs are a sum() of data points, so adding additional points would result in e.g. doubled values)
  • So as not to affect the analysis data unnecessarily, the tool will only execute the update queries for time ranges where new data points have been imported (+/- the group by time)
  • Finally, just noting there has been some discussion in Enhancement suggestion - Align Production and consumption data with tesla app and gateway data #87 regarding the accuracy of the analysis data using integral calculations, so the script may need updating once a more accurate method has been determined
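
To illustrate the shape of those refresh queries, a rough sketch is below - the field names, target measurement, and grouping are assumptions for illustration only, not the dashboard's actual CQ definitions:

# Sketch: re-run a CQ-style rollup over only the imported time window (names assumed)
from influxdb import InfluxDBClient

client = InfluxDBClient(host="localhost", port=8086, database="powerwall")
client.query(
    "SELECT integral(home)/1000/3600 AS home, integral(solar)/1000/3600 AS solar "
    "INTO powerwall.kwh.energy FROM powerwall.autogen.http "
    "WHERE time >= '2022-10-01T00:00:00Z' AND time < '2022-10-06T00:00:00Z' "
    "GROUP BY time(1h)"
)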

Some final notes for anyone using this tool to potentially import a lot of historical data:

If you have a lot of historical data you would like to import, it is probably advisable to split this up into batches of no more than a few months at a time. During my testing, when retrieving the daily history data, I found the Tesla cloud site may stop responding if you overload it with too many requests in short periods (this is noted in the TeslaPy module: "Be aware that Tesla may temporarily block your account if you are hammering the servers too much").

However, this only occurred when trying to request data as quickly as possible. As such, a default 1 second delay between requests has been added to the script (this can be changed by adding DELAY = x in the config file), after which I was able to retrieve a full year of historical data in one run without issue.

Please let us know if you find any issues or have questions.

@fracai

fracai commented Oct 14, 2022

This looks fantastic. I haven't had a chance to test it yet, but I want to soon. I love that you have it designed to search for and fill gaps. That's great.
Thanks for your work. I look forward to testing.

@jgleigh

jgleigh commented Oct 14, 2022

AMAZING!!! I imported over six months of data and filled a ton of small data gaps. No issues so far. THANKS!!!

@mikkyo9
Author

mikkyo9 commented Oct 14, 2022

Cool! I can't wait to try it out. I assume your work is based on the most current release schema, so I should upgrade first?

@mcbirse
Collaborator

mcbirse commented Oct 15, 2022

Cool! I can't wait to try it out. I assume your work is based on the most current release schema, so I should upgrade first?

@mikkyo9 - You will need to be running at least version 2.3.0 I believe, as that was the release when the CQ for grid status was added (released July). I'd still recommend upgrading to the latest version if you can though, as there have been many fixes.

AMAZING!!! I imported over six months of data and filled a ton of small data gaps. No issues so far. THANKS!!!

@jgleigh -

Awesome to hear, thanks for the feedback!

Yes, you may see a number of small data gaps identified even if you think your system had been logging non-stop without issues.

This is more likely with the grid status data, as the gap threshold is 1 minute for that, as opposed to 5 minutes for power usage.

Any time you do an upgrade, for instance, you probably lose a few minutes of data but generally not more than 5 mins in my experience.

I considered accepting longer gaps for grid data - but then looking at my own backup history, I've had a number of events that range from a few seconds to a few minutes... so this way these will still be filled in if the system did not log data during that time for whatever reason.

@mikkyo9
Author

mikkyo9 commented Oct 15, 2022

Great job on this!
Worked very well filling dozens of gaps and a missing year of data.

@youzer-name
Contributor

Works like a charm. Thanks for putting this together @mcbirse

@DMBlakeley

Worked quite well for me also, thanks for all your efforts!

@oralallen82

This worked like a charm. It's amazing what a community can do. I have all the data from system install in August 2021 - thanks @mcbirse and @jasonacox

@jasonacox
Owner

GREAT writeup! Thanks @clukas1967 !

sudo synogroup --add docker
sudo chown root:docker /var/run/docker.sock
sudo synogroup --member docker

I wonder if we should update the README.md instructions for Dashboard installation on Synology systems (here). Would you want to submit a PR for that so you can get credit?

One change:

modify the Grafana section of the file powerwall.yml

That will break the upgrade.sh process. Instead, you should be able to edit compose.env to do the same thing:

GRAFANAUSER="1000:1000"

Thanks!! 🙏

@clukas1967

Thanks @jasonacox. I should have also acknowledged you in my original post as we wouldn't have any of this without you!

I agree it makes sense to make the readme more explicit for PW Dashboard install. But I don't deserve any credit since I merely restated the procedure in the link I cited in more compact form. Regarding editing compose.env vs. powerwall.yml I would defer to your expertise here.

I do think it makes sense to have a Synology section in a readme specifically for the Tesla History utility. You could just take the last CLI block from my post and append the admonition to follow it up with the original procedure offered by @mcbirse on 10/13/2022. Probably that whole lengthy post should be the readme.

In a few days I will be starting a couple of new threads in here as I'm about to expand both my battery storage and my PV generation on the roof and have questions/suggestions about how that will work.

@mcbirse
Collaborator

mcbirse commented Apr 12, 2023

Thanks @clukas1967 - great to hear the script has been so useful!

The next iteration will support solar-only sites (as per the version under tools/solar-only). This is already working now for solar-only users, and at some point I will merge all these changes into the primary script so we are not maintaining multiple versions... (there are still some QoL updates to do for solar-only users, such as a daemon/background mode).

Excellent info for the Synology NAS users out there, of which there appears to be many! Personally I'm running Powerwall-Dashboard in an Alpine Linux VM on a QNAP NAS (since I had that set up for other purposes already). The versatility of Powerwall-Dashboard and number of platforms it can be run on is fantastic.

Once the download is done for the specified time interval, it writes out everything it has to InfluxDB so there is no residual state required. You can blow away the venv after you verify the history is showing up in Grafana.

You could - but I find it useful to have the script on hand so it can be easily run at any time. I have frequently used it myself for instances where my host server was down (e.g. due to OS updates, NAS firmware updates, etc.), so you can then fill in missing data from the Tesla cloud at least, whenever you need to.

@abd7786

abd7786 commented Sep 29, 2023

Just a heads up I had issues running the python3 tesla-history.py --login script on Windows using Git Bash. After I pasted the callback URL after login it would just get stuck/not do anything.

I used Windows command prompt instead and worked correctly. In case anyone else is trying to run this on Windows.

THANKS! Just had the same issue and this worked.

@wiltocking

Not sure if it's due to a change on Tesla's end, but I haven't been able to get past the Tesla account setup steps in the instructions. After copying the URL, logging into the Tesla account and getting "Page Not Found", the URL that I get results in the following error: Login failure - (mismatching_state) CSRF Warning! State not equal in request and response.

Is there another way to add the authentication token to the script?

@jasonacox
Owner

Hi @wiltocking - A few things:

  • Try clearing your cookies in your browser and re-authenticating
  • Do you get any more information when you supply the --debug option to tesla-history.py?

@wiltocking

Hi @wiltocking - A few things:

  • Try clearing your cookies in your browser and re-authenticating
  • Do you get any more information when you supply the --debug option to tesla-history.py?

Well, new day, new results - cleared cookies again and this time the URL worked! Downloading Tesla cloud history now. Thank you :)

@jasonacox
Owner

Awesome! Thanks @wiltocking !

@kylebrowning

I'm not able to run the script - I'm getting the error ERROR: Login failure - MissingTokenError('(missing_token) Missing access token parameter.') after login

@kylebrowning

I should also mention that my dashboard is up and running

@jasonacox
Owner

@simonrb2000

I thought I'd try this; I'm away from home at the moment so I'm doing it remotely, and I get this error when I try to log in. I have deleted the config file and started from the beginning and get the same error:

ERROR: Failed to retrieve PRODUCT_LIST - LoginRequired('(login_required) Login required')

@jasonacox
Owner

Interesting. Did you get any error from Tesla when you tried the login, or just the expected 404 page URL?

@porlcass

porlcass commented Apr 7, 2024

Hi All,

First, huge thanks to @jasonacox for the awesome dashboard, @mcbirse for the super useful history tool and @clukas1967 for the life saving tips on getting everything running on a Synology NAS!

Now the issue: I am in Sydney, Australia; my local system timezone is Australia/Sydney, as is my InfluxDB and site timezone. But for some reason that I cannot work out, when I run tesla-history, datetime values are output as Australia/Sydney time values but with a UTC offset (i.e. +00:00) instead of the Australia/Sydney offset (+11:00 or +10:00, depending on daylight savings). As a consequence, the calls to get_calendar_history_data only return 10 or 11 hours of data and I end up with 13 or 14 hour gaps every day.

I have worked around the issue with a filthy hack (I changed day.replace(tzinfo=sitetz).isoformat() to f"{day.isoformat()}+11:00", noting the only gaps I had were during daylight savings), but I would like to work out what the issue is and fix it so that I can use the tool without the hack.

Any ideas?

@mcbirse
Collaborator

mcbirse commented Apr 7, 2024

Hi @porlcass - I take it you are running the tesla-history script on a Synology NAS?

There seems to be some issue with the python dateutil module obtaining the timezone correctly on Synology NAS systems.

Refer issue #455 and the last post here.

I'm not sure of the exact cause, but it's possible the timezone files on Synology NAS are messed up in some way or incompatible with python dateutil.

If you are on Synology (or even another system), it would be great if you could run some of the python tests mentioned to see where the issue is occurring.

A subset of those, like below, is probably sufficient in fact. Run python and paste all of the below commands in the python shell, and post the results.

from dateutil.parser import isoparse
from dateutil import tz

start = isoparse("2023-06-26 00:00:00")
start
print(start)
type(start)
type(start.utcoffset())

influxtz = tz.gettz("Australia/Sydney")
influxtz
print(influxtz)
type(influxtz)

start = start.replace(tzinfo=influxtz)
start
print(start)
type(start)
type(start.utcoffset())

It would be interesting to see what is returned for the final print(start), like the below.

>>> start = start.replace(tzinfo=influxtz)
>>> start
datetime.datetime(2023, 6, 26, 0, 0, tzinfo=tzfile('/usr/share/zoneinfo/Australia/Sydney'))
>>> print(start)
2023-06-26 00:00:00+00:00

The above (from a Synology NAS) is showing the timestamp is output with a +00:00 offset (UTC), which is incorrect based on the timezone specified. It should show +10:00 for Australia/Sydney for the above example.

Is your system showing a similar issue?

Can any other Synology NAS users shed some light on this issue? Unfortunately, I do not have a Synology NAS to test with myself.

The script is currently using the python dateutil module, however from python 3.9 the built-in zoneinfo module is also available. If the above failed, it would be interesting to see what result the following gives in that case.

from dateutil.parser import isoparse
from zoneinfo import ZoneInfo

start = isoparse("2023-06-26 00:00:00")
start
print(start)

influxtz = ZoneInfo("Australia/Sydney")
influxtz
print(influxtz)
type(influxtz)

start = start.replace(tzinfo=influxtz)
start
print(start)

When originally writing the script, I tested the zoneinfo module vs. dateutil and found issues with zoneinfo, so I did not use it.

@porlcass

porlcass commented Apr 7, 2024

Hi @mcbirse , yes, I am running the script on a Synology NAS.

The results of the first script are as follows:
2023-06-26 00:00:00
tzfile('/usr/share/zoneinfo/Australia/Sydney')
2023-06-26 00:00:00+00:00

The second script was a bit more fiddly. The version of Python 3 currently installed in the Synology DSM is 3.8.15, so ZoneInfo is not available. As noted above by @clukas1967, you can install Python 3.9, but it does not include PIP, and installing it is...beyond my abilities, so I could not install python-dateutil in Python 3. I ended up installing backports.zoneinfo and updated the second line in your second script to
from backports.zoneinfo import ZoneInfo

The results of the modified second script are as follows:
2023-06-26 00:00:00
Australia/Sydney
2023-06-26 00:00:00+10:00

So ZoneInfo looks like a winner for Synology NAS users. I will modify my version of tesla-history.py to use ZoneInfo, but it will be referencing backports.zoneinfo, so it will have to be a special 'Synology NAS users' version of the script.
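
If it helps, a small import fallback like the sketch below (untested) could let one copy of the script prefer the built-in zoneinfo where available and fall back to backports.zoneinfo or dateutil otherwise:

# Sketch: prefer stdlib zoneinfo (Python 3.9+), then backports.zoneinfo, then dateutil
try:
    from zoneinfo import ZoneInfo
    def get_tz(name):
        return ZoneInfo(name)
except ImportError:
    try:
        from backports.zoneinfo import ZoneInfo   # pip install backports.zoneinfo
        def get_tz(name):
            return ZoneInfo(name)
    except ImportError:
        from dateutil import tz                   # current behaviour of the script
        def get_tz(name):
            return tz.gettz(name)

influxtz = get_tz("Australia/Sydney")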

@simonrb2000

Interesting. Did you get any error from Tesla when you tried the login, or just the expected 404 page URL?

I got nothing... I got no option to log in.

@jasonacox
Owner

no option to log in.

@simonrb2000 - Just to make sure we aren't missing anything, did you follow the login instructions here? https://github.com/jasonacox/Powerwall-Dashboard/tree/main/tools/tesla-history#setup-and-logging-in-to-tesla-account

@ceeeekay

Great tool! How did I not know about this? I've been using Powerwall Dashboard since January but I've had the system much longer, so getting the old data in the dash is amazing. Thanks!

Just one question: Is there any way to fetch historical SOC for each Powerwall? One of mine is reading much lower than it should (but not warranty-claim low yet - 10.4kWh) and some history on it would be great.

@mcbirse
Collaborator

mcbirse commented Sep 15, 2024

Hey @ceeeekay - Glad you found the tool and find it useful!

It's a godsend to be honest and was well worth the time spent developing and testing, and I still use it frequently myself... typically due to outages when you need to retrieve some missing data.

Unfortunately, as far as I'm aware there isn't any way to retrieve what you are after - which I'm guessing is the Powerwall capacity (full pack energy) of your PWs.

@jasonacox
Owner

It's a godsend to be honest and was well worth the time spent developing and testing, and I still use it frequently myself...

100%! ❤️ Same here. Thanks @mcbirse 🙏

@ceeeekay

@mcbirse - thanks. I sort of assumed it wouldn't be available if it wasn't already included, but you never know for sure.

@jasonacox - I wonder if it's worth having this automatically run with a ~1 day history after an upgrade or restart? I've always been aware I'm creating gaps whenever I upgrade the dash, or patch & reboot the host, and didn't know I could recover the data this easily.

Also I feel like this tool should be mentioned on the main README.md. I only found it because I was digging around in the install directory. For new users it would be nice to start out with a bunch of history already in the DB.

@clukas1967

clukas1967 commented Sep 25, 2024 via email

@clukas1967

Here is the image that goes with the comment I just made, showing two different days in the last week where I had a 4-5 hour gap overnight in the data.

[screenshot: 2024-09-25_11-38-00]

@jasonacox
Owner

I like the idea of having the history import at setup time (and updating during upgrades). It would require that the user signs in to the Tesla cloud, which would add another step during setup. It probably makes sense to make it an option.

TODO

@HomeLandGIT

Hi, thank you for this. I have got the dashboard up and running and would like to import my historical data. However, how do I install the required python modules (pip install python-dateutil teslapy influxdb)? When I run this I get the error "error: externally-managed-environment".

I have run https://github.com/netzero-labs/tesla-solar-download on another Linux VM.

@jasonacox
Owner

The "externally-managed-environment" error typically occurs when you're trying to install packages in a virtual environment that's not properly activated.

To resolve this, try activating your virtual environment before running the pip install command. If you're using a Python environment like conda, you can activate it with conda activate <env_name>. If you're using a virtual environment created with python -m venv, you can activate it with source <env_name>/bin/activate (on Linux/Mac) or <env_name>\Scripts\activate (on Windows).
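
For example (the directory name below is just an example):

# Create and activate a virtual environment, then install the modules inside it
python3 -m venv tesla-history-venv
source tesla-history-venv/bin/activate
pip install python-dateutil teslapy influxdb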
