### Summary

Gather data from TCP/IP (simple ASCII format) or serial port (WorldData format), process it, and make it available as:

- /data/json.html
- /data/history/YYYYMMDD.csv or /data/history/YYYYMMDD.txt
- /data/channels.json or /data/channels.dat
- /data/now.json or /data/now.dat
- /data/recent.json or /data/recent.dat
- /data/historyFiles.json or /data/historyFiles.dat
- /data/historyByDay.json or /data/historyByDay.dat
- /data/dayStats.json or /data/dayStats.dat
- /data/hostinfo.json or /data/hostinfo.dat

Command line arguments are documented in the following groups:

- DataGSCollector related arguments
- Serial port data source arguments
- Data output (JSON) arguments
- Local Logging arguments
- [Configuration File arguments](#configarg)
### Channel Description File Format

The channel description file is in JSON format. It specifies channel names, descriptions, and other attributes for each channel.

Example file with one element:
```json
{
  "data": [
    {
      "id": "age_inverter",
      "title": "Inverter Age (255 indicates old data)",
      "description": "Inverter Age (255 indicates old data)",
      "units": "none",
      "precision": 2,
      "sortOrder": 240,
      "dayStats": "false",
      "log": "true",
      "historyByDay": "false",
      "recent": "false",
      "mode": "SAMPLE"
    }
  ]
}
```
"id"
is the channel id or name. Data type:String
"title"
is used as the column name in the historical data table. Data type:String
"description"
is used as the column name in the log file. Data type:String
"units"
is the metric in which the channel's data is measured. Data type:String
* ex: MPH, kWh, or %"precision"
is the number of digits the data is displayed with. Positive numbers indicate how many decimal points to round to and negative numbers indicates what digit to round to. Data type:Integer
ex:
number | precision | result |
---|---|---|
12.2345 | 0 | 12 |
12.5345 | 0 | 13 |
12.2345 | 2 | 12.23 |
12.2355 | 2 | 12.24 |
12.2345 | -1 | 10 |
15.2345 | -1 | 20 |
5.2345 | -3 | 0 |
"sortOrder"
is the order in which the data appears in the log file. The channel with the lowest number is first, then the next lowest number is second, etc... Data type:Integer
"dayStats"
indicates if the channel's data will be included in the dayStats.json file. Data type:String
"log"
indicates if this channel will have its data saved in log files. Data type:String
"historyByDay"
indicates if the channel's data will be included in the historyByDay.json file. Data type:String
"recent"
indicates if the channel's data will be included in the recent.json file. Data type:String
"mode"
can be"SAMPLE"
or"AVERAGE"
and indicates how the channel's data will be presented in the now.json file. Data type:String
Note that channel id's will be single character letters for data received via TCP/IP / ASCII. Longer channel names are possible for data that comes in via WorldDataProcessor via reflection.
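For illustration, here is a minimal Python sketch (not part of DataGS) that loads a channel description file and rounds a raw value according to a channel's `precision`. The file path and the half-up rounding mode are assumptions; DataGS's exact rounding behavior is not documented here.

```python
import json
from decimal import Decimal, ROUND_HALF_UP

def apply_precision(value, precision):
    """Positive precision = decimal places; negative = digit to round to."""
    exp = Decimal(1).scaleb(-precision)              # 2 -> 0.01, -1 -> 1E+1
    return Decimal(str(value)).quantize(exp, rounding=ROUND_HALF_UP)

# Load the channel map (the path is whatever you pass to the -c argument).
with open("channelDescriptions/channels_magWebPro.json") as f:
    channels = {c["id"]: c for c in json.load(f)["data"]}

ch = channels["age_inverter"]
print(ch["title"], apply_precision(12.2345, ch["precision"]))    # -> 12.23
```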
### /data/json.html

Simple page with links to the different files, read from the file system. For information on parsing JSON in multiple programming and scripting languages, visit JSON.org.
### /data/now.json or /data/now.dat

Interval statistics or a sample of the last batch of data processed. Data is processed at the interval specified by the `-i` argument. This file is the primary way of getting current data.

The two modes, `"SAMPLE"` and `"AVERAGE"`, change how the data for a channel is presented.
Example file with both modes present:
```json
{
  "data": [
    {
      "channel": "r_parallel_threshold",
      "time": 1419258385610,
      "sampleValue": "6.0",
      "mode": "SAMPLE"
    },
    {
      "channel": "i_ac_volts_out",
      "time": 1419258385610,
      "n": 10,
      "avg": 118.3,
      "min": 117,
      "max": 119,
      "stddev": 0.8232726023485638,
      "mode": "AVERAGE"
    }
  ]
}
```
If the channel is using the mode `"SAMPLE"`, the channel data is obtained through `"sampleValue"`.

If the channel is using the mode `"AVERAGE"`, the channel data is split up among five values:

- `"n"` is the number of data points used to compute these values.
- `"avg"` is the average of the data points received within the given interval.
- `"min"` is the minimum value of the data points received within the given interval.
- `"max"` is the maximum value of the data points received within the given interval.
- `"stddev"` is the standard deviation of the data points received within the given interval.

Please note that `"sampleValue"` will always be represented in string format, whereas the data from `"AVERAGE"` is numeric.

Both modes contain `"time"`, which is a UNIX timestamp (in milliseconds) of when the data was generated. For more information on the UNIX timestamp, visit http://www.unixtimestamp.com/
Returned as `application/json` if the URI ends with `.json` or as `text/plain` if the URI ends with `.dat`.
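As a usage illustration, here is a minimal Python sketch that reads /data/now.json and prints the current value of every channel; it assumes DataGS is serving on localhost port 8080 (the `-j` value used in the startup script later in this document).

```python
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8080/data/now.json") as resp:
    now = json.loads(resp.read().decode("utf-8"))

for entry in now["data"]:
    if entry["mode"] == "SAMPLE":
        # SAMPLE channels carry a single string value.
        print(entry["channel"], entry["sampleValue"])
    else:
        # AVERAGE channels carry interval statistics.
        print(entry["channel"], "avg", entry["avg"], "min", entry["min"],
              "max", entry["max"], "n", entry["n"])
```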
### /data/history/YYYYMMDD.csv or /data/history/YYYYMMDD.txt

Example file:
"Data Date (UTC)","Milliseconds",b_dc_power,i_dc_power,calc_add_power,gen_power,load_power,b_dc_volts,b_dc_amps,i_ac_volts_in,b_amph_in_out,b_state_of_charge,i_dc_volts,i_dc_amps,i_ac_volts_out,i_amps_out,i_amps_in,i_ac_hz,i_status,i_fault,i_temp_transformer,i_temp_fet,i_temp_battery,a_gen_run_hours_since_boot,a_gen_runtime_minutes,age_inverter,a_temperature
"Data Date (UTC)","Milliseconds",DC Power (RMK),DC Power (Inverter),Calculated Renewable Power,AC In Power,AC Out Power,VDC (BMK),DC Amps (BMK),VAC In (Inverter),Ah In/Out (BMK),SOC (BMK),VDC (Inveter),DC Amps (Inverter),VAC Out (Inverter),AC Amps Out,AC Amps IN,AC Out Hz,Inverter Status,Fault Code,Transformer Temp (C ),FET Temp (C ),Battery Temp (C ),Total Generator Runtime (Since AGS Boot),Generator Runtime (Current Cycle),Inverter Age (255 indicates old data),AGS Temp (C )
2014-12-19 10:05:45,472,0,0,0,0,0,13.56,0.0,179.60,21.0,100,13.6,0,118,0.00,0.00,60.0,2.0,0.0,29.00,24.00,18.00,12.0,12.0,1.0,70.20
2014-12-19 10:05:55,473,0,0,0,0,0,13.57,0.0,179.30,21.0,100,13.6,0,118,0.00,0.00,60.1,2.0,0.0,29.00,24.00,18.00,12.0,12.0,1.0,70.20
This file uses the comma separated values (CSV) format. A column is created for every channel that has `log` set to true. The first line of the file uses the channel's `id` as the column header. The second line uses the channel's `title` as the column header. The third row and beyond are the values corresponding to their respective column headers. Please note that `"Data Date (UTC)"` and `"Milliseconds"` are the same on both the first and second line. This is because their values are generated when the log file is written and are not values received from an outside source.

CSV files can be opened in most spreadsheet applications such as Microsoft Excel and LibreOffice Calc. For more information on the CSV format, visit http://en.wikipedia.org/wiki/Comma-separated_values

Please note that there is no universal CSV standard governing how quotes and strings are applied.

Logged history files are read from the file system. Returned as MIME type `text/csv` if the URI ends with `.csv` or as `text/plain` if the URI ends with `.txt`.

History files are stored in the log local directory, which is set with the `-w` argument.
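A minimal Python sketch for reading one of these log files, using the channel ids from the first header line as column names and skipping the title line; the file name is taken from the example above.

```python
import csv

with open("20141219.csv", newline="") as f:
    reader = csv.reader(f)
    ids = next(reader)      # first line: channel ids
    next(reader)            # second line: channel titles (skipped here)
    for row in reader:
        record = dict(zip(ids, row))
        print(record["Data Date (UTC)"], "battery:", record["b_dc_volts"], "V")
```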
### /data/channels.json or /data/channels.dat

Channel description map as loaded from the file system. The file system location is set with the `-c` argument. For an example file, see Channel Description File Format.

Returned as `application/json` if the URI ends with `.json` or as `text/plain` if the URI ends with `.dat`.
### /data/recent.json or /data/recent.dat

Time series data covering from now back to the number of hours specified by the `-H` argument.

Example file:
```json
{
  "recent": [
    {
      "time": 1419174506269,
      "data": {
        "b_amph_in_out": 22,
        "calc_add_power": 0,
        "b_dc_watts": 0,
        "b_state_of_charge": 100
      }
    }
  ]
}
```
CAUTION: This file will contain every data point recorded for the specified channels over the last X hours, so it has the potential to use a lot of memory. For example, with an interval (argument `-i`) of 10000 milliseconds (10 seconds) and json-history-hours (argument `-H`) of 24 hours, every channel that has `"recent": "true"` will add 8640 data points to the file. Be mindful of your device's specs when configuring these settings. If you are looking to create a graph spanning a period of time, use this file; however, if you just want 24 hour summary data, consider using dayStats instead.

- `"time"` is a UNIX timestamp (in milliseconds) of when the data was generated.
- `"data"` is an object containing key-value pairs for the channels that have `"recent": "true"` in their channel description.
Returned as `application/json` if the URI ends with `.json` or as `text/plain` if the URI ends with `.dat`.
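For graphing, a minimal Python sketch that regroups /data/recent.json into one time series per channel; the host and port are assumptions.

```python
import json
import urllib.request
from datetime import datetime, timezone

with urllib.request.urlopen("http://localhost:8080/data/recent.json") as resp:
    recent = json.loads(resp.read().decode("utf-8"))

series = {}                          # channel id -> list of (timestamp, value)
for point in recent["recent"]:
    when = datetime.fromtimestamp(point["time"] / 1000, tz=timezone.utc)
    for channel, value in point["data"].items():
        series.setdefault(channel, []).append((when, value))

for channel, points in series.items():
    print(channel, len(points), "points, latest:", points[-1])
```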
### /data/historyFiles.json or /data/historyFiles.dat

Listing of the log files available in the log local directory.

Example file:
```json
{
  "history_files": {
    "files": [
      "20141222.csv",
      "20141221.csv",
      "20141220.csv",
      "20141219.csv"
    ]
  }
}
```
Returned as `application/json` if the URI ends with `.json` or as `text/plain` if the URI ends with `.dat`.
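As an illustration, a minimal Python sketch that lists the available log files and downloads the newest one through the /data/history/ URL shown in the summary; the host and port are assumptions.

```python
import json
import urllib.request

base = "http://localhost:8080"
with urllib.request.urlopen(base + "/data/historyFiles.json") as resp:
    files = json.loads(resp.read().decode("utf-8"))["history_files"]["files"]

newest = sorted(files)[-1]           # YYYYMMDD.csv names sort chronologically
with urllib.request.urlopen(base + "/data/history/" + newest) as resp:
    with open(newest, "wb") as out:
        out.write(resp.read())
print("saved", newest)
```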
### /data/historyByDay.json or /data/historyByDay.dat

Daily statistics that summarize the values of all of the files in the log local directory. Statistics are generated for all of the channels that have `historyByDay` set to true in the channel description map.

Example file:
```json
{
  "summary_stats": [
    {
      "day": "20141220",
      "n": 8638,
      "gen_power_min": 0,
      "gen_power_max": 0,
      "gen_power_avg": 0,
      "b_state_of_charge_min": 100,
      "b_state_of_charge_max": 100,
      "b_state_of_charge_avg": 100,
      "calc_add_power_min": -9.152,
      "calc_add_power_max": 44.7228,
      "calc_add_power_avg": 0.23513608345553358,
      "i_dc_power_min": 0,
      "i_dc_power_max": 38.08,
      "i_dc_power_avg": 0.05220331352422104,
      "b_dc_power_min": -9.152,
      "b_dc_power_max": 62.71640000000001,
      "b_dc_power_avg": 0.2874055799953692,
      "i_dc_volts_min": 12.69,
      "i_dc_volts_max": 13.670000000000002,
      "i_dc_volts_avg": 13.537876694708153
    }
  ]
}
```
"day"
YYYYMMDD representation of the date. Data type:String
"n"
is the number of data points used to compute these values. Data type:Integer
"avg"
is the average of the data points received on that day. Data type:Float
"min"
is the minimum value of the data points received on that day. Data type:Float
"max"
is the maximum value of the data points received on that day. Data type:Float
The results are computed at startup and then continually updated. If the results aren't yet available, this URI will return an HTTP response of NO CONTENT (HTTP status code 204).
Returned as `application/json` if the URI ends with `.json` or as `text/plain` if the URI ends with `.dat`.
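Because the per-day statistics are published as flat `<channel>_min`, `<channel>_max`, and `<channel>_avg` keys, it can help to regroup them per channel. A minimal Python sketch (host and port are assumptions):

```python
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8080/data/historyByDay.json") as resp:
    days = json.loads(resp.read().decode("utf-8"))["summary_stats"]

for day in days:
    stats = {}
    for key, value in day.items():
        if key in ("day", "n"):
            continue
        # "gen_power_min" -> channel "gen_power", stat "min"
        channel, _, stat = key.rpartition("_")
        stats.setdefault(channel, {})[stat] = value
    print(day["day"], "n =", day["n"], stats.get("b_state_of_charge"))
```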
### /data/dayStats.json or /data/dayStats.dat

Summarized data covering from now back to the number of hours specified by the `-H` argument. Statistics are generated for all of the channels that have `dayStats` set to true in the channel description map.

Example file:
```json
{
  "dayStats": {
    "i_ac_volts_out": {
      "n": 8640,
      "min": 117.1,
      "max": 121.6,
      "avg": 119.18382330246877
    }
  }
}
```
"n"
is the number of data points used to compute these values. Data type:Integer
"avg"
is the average of the data points received within 24 hours. Data type:Float
"min"
is the minimum value of the data points received within 24 hours. Data type:Float
"max"
is the maximum value of the data points received within 24 hours. Data type:Float
Returned as `application/json` if the URI ends with `.json` or as `text/plain` if the URI ends with `.dat`.
### /data/hostinfo.json or /data/hostinfo.dat

Hostname, firmware date, and the drives of the server.

Example file:
```json
{
  "hostname": "A3432",
  "firmware_date": "2014-11-19",
  "drives": [
    {
      "total": 3023728,
      "used": 1872024,
      "avail": 978392,
      "readOnly": false,
      "name": "rootfs",
      "type": "rootfs",
      "description": "/ (rootfs)"
    },
    {
      "total": 4418624,
      "used": 57032,
      "avail": 4345208,
      "readOnly": false,
      "name": "/dev/mmcblk0p3",
      "type": "ext4",
      "description": "/data (/dev/mmcblk0p3)"
    }
  ]
}
```
- `hostname` comes from the java.net method `InetAddress.getLocalHost().getHostName();`
- `firmware_date` is the most recent build date of DataGS.
- `drives` is a list of the server's drives, each with the following fields:
  - `total` is the total amount of storage the drive has, in kilobytes.
  - `used` is the amount of storage used on the drive, in kilobytes.
  - `avail` is the amount of storage available on the drive, in kilobytes.
  - `readOnly` is true if the drive is read-only and false if it is not.
  - `name` is the name of the drive.
  - `type` is the file system or architecture of the drive.
  - `description` is the location (mount point) of the file system together with a description.
Returned as `application/json` if the URI ends with `.json` or as `text/plain` if the URI ends with `.dat`.
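As an illustration, a minimal Python sketch that reports how full each drive in /data/hostinfo.json is; the host and port are assumptions, and sizes are in kilobytes as described above.

```python
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8080/data/hostinfo.json") as resp:
    host = json.loads(resp.read().decode("utf-8"))

print("host:", host["hostname"], "firmware:", host["firmware_date"])
for drive in host["drives"]:
    pct_used = 100.0 * drive["used"] / drive["total"]
    print("%-28s %6.1f%% used (%d of %d kB)"
          % (drive["description"], pct_used, drive["used"], drive["total"]))
```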
### Website

DataGS has a built-in web server. The website included in the www/ directory was created to display information received from a Magnum inverter. The website includes pages for Current Conditions, Current Settings, and Historical Data. These web pages use Ajax to retrieve data from the JSON pages created by DataGS.

The Current Conditions page displays current data as well as summarized data from the last 24 hours. The data displayed includes data from the battery monitor (example: ME-BMK), Magnum inverter, and Auto Generator Start. Also included on the Current Conditions page is an alarm feature that warns the user of any faults that may occur with the inverter by playing an alarm sound and displaying an error message on the screen.

The Current Settings page displays the current settings of devices connected to the Magnum network, such as inverter settings, remote settings, Automatic Generator Start settings, and battery monitor settings.

The Historical Data page allows the user to view data by month. The user may select up to 6 months to view on the same page. The charts display the average, min, and max values over each month for the battery state of charge, renewable watts, battery watts, load watts, and generator watts. Beneath the charts, the same data is presented in a table, along with links to the raw data in both text and CSV format.
### DataGSCollector related arguments

| Argument | Title | Requires Argument | Description |
|---|---|---|---|
| i | interval | true | Interval to summarize over (milliseconds) |
| l | listen-port | true | DataGSCollector listening port |
| t | socket-timeout | true | DataGSCollector connection socket timeout |
| c | channel-map | true | Location of channel map JSON file |
| a | process-all-data | false | Process all data, even if it isn't in the channel map |
### Serial port data source arguments

| Argument | Title | Requires Argument | Description |
|---|---|---|---|
| r | serialPortWorldData | true | Serial port to listen on for WorldData packets |
| R | serialPortWorldDataSpeed | true | Serial port speed |
### Data output (JSON) arguments

| Argument | Title | Requires Argument | Description |
|---|---|---|---|
| b | http-document-root | true | Webserver document root directory |
| j | http-port | true | Webserver port, 0 to disable |
| H | json-history-hours | true | Hours of history data to make available, 0 to disable |
### Local Logging arguments

| Argument | Title | Requires Argument | Description |
|---|---|---|---|
| w | loglocal-directory | true | Directory for logging CSV files |
### Configuration File arguments

| Argument | Title | Requires Argument | Description |
|---|---|---|---|
| configFile | | false | File to use for JSON configuration. |
| configLockFile | | false | File that must exist in order to modify the JSON configuration. |
Example startup script:

```bash
#!/bin/bash
cd /home/aprs/DataGS
java \
    -DSERIAL_PORT_LIST=/dev/ttyAMA0 \
    -cp .:jars:bin:jars/commons-cli-1.2.jar:jars/commons-lang3-3.3.2.jar:jars/commons-math3-3.3.jar:jars/gson-2.3.jar:jars/mysql-connector-java-5.1.7-bin.jar:jars/commons-collections4-4.0.jar:jars/jspComm.jar:jars/Serialio.jar:jars/json-lib-2.4-jdk15.jar:jars/commons-io-2.4.jar:jars/commons-csv-1.0.jar \
    dataGS.DataGS -j 8080 -l 4010 -i 10000 -c channelDescriptions/channels_magWebPro.json \
    -r /dev/ttyAMA0 -R 57600 -a -w /data/logLocal
```
To enable remote JMX monitoring, add the following options to the `java` command line:

```
-Djava.rmi.server.hostname=192.168.10.201
-Dcom.sun.management.jmxremote
-Dcom.sun.management.jmxremote.port=9010
-Dcom.sun.management.jmxremote.local.only=false
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.ssl=false
```
Replace `192.168.10.201` with the IP of your public interface. Replace `9010` with a unique local port. Use software such as VisualVM (from Oracle) for monitoring.
The serialio.com library we use for accessing serial ports doesn't support auto-detecting serial ports under most operating systems, but we can tell the Java VM what serial ports are available. For example:

```
-DSERIAL_PORT_LIST=/dev/ttyUSB0
```

Multiple ports can be separated with a colon. Example:

```
-DSERIAL_PORT_LIST=/dev/ttyUSB0:/dev/ttyUSB1
```
Java needs to have the current directory, the bin directory, and all of the required JAR files in the `-cp` argument. The program itself is then started with `packageName.className`, which here is `dataGS.DataGS`.
(NOTE: The current setup is a work in progress.) In the utilities directory you can find a directory called DataGSJar. Within that directory you will find a basic example setup for starting the software using a jar file:
- `DataGS.jar` - the DataGS jar file.
- `startDataGS_jar` - the bash script to run the jar file.
- `channels_magWebPro.json` - the channels.json file read in by DataGS.jar, given it is unchanged from `startDataGS_jar`.
- `www` - local copy of the website.
- `logLocal` - the directory DataGS.jar will save log files to, given it is unchanged from `startDataGS_jar`.
- `makeJar` - the bash script that creates the jar file.
- `build.xml` - contains the `ant` script that creates the jar file.
You should be able to simply run `./start` from the terminal.
In the root directory of DataGS there is a bash script called `changeDate.sh`. When added to the build scripts in Eclipse, this script will automatically change the `FIRMWARE_DATE` variable in the Java source code, and update the string recording which computer it was last modified on, when you compile the program. If you are using Eclipse, follow these directions:
1. Right-click on `DataGS` in the project browser and hover over `Build Path` in the drop-down menu. Select `Configure Build Path` at the bottom of the `Build Path` menu. From there, select `Builders` on the left side of the screen.
2. Now click the `New...` button on the right side of the screen. Select `Program` for the configuration type and click `OK`. This should bring up the `Edit Configuration` window. You can give this program whatever name you want; I chose `changeDate` for mine.
3. Under `Location`, select `Browse File System...` and find `changeDate.sh` in the root directory of DataGS.
4. For the `Working Directory`, select `Browse Workspace...` and choose `DataGS`. Click `OK` at the bottom of the window.
5. Click on the program you just added (recall I called it `changeDate`) once to highlight it. With the program highlighted, click `Up` on the right side of the window until it is at the top (assuming you only had `Java Builder` before, you may only have to click `Up` once).
6. Click `OK` at the bottom of the screen and you should be good to go.
Please note:

- This script will not make any changes if the date is already set to today.
- When this script does change the date, it recompiles the program. You should see the ant script's messages in the console briefly before DataGS starts.
- You will also have to refresh the `DataGS.java` file in Eclipse when the date is changed. This is as simple as right-clicking `DataGS.java` in the project browser and selecting `Refresh`. If the file is already open, you can simply click the editing window and Eclipse will prompt you to refresh.