With the new polling rate on drivebrain, the log files are getting very large. As we add more signals and put more data onto the VehicleState topic, the file size will only keep growing. Neither my computer nor our cloud instance has enough RAM to process all of this data in one go. As an immediate fix, we can downsample the data to a lower rate (e.g. 200 Hz); a sketch of what that could look like is below.
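A minimal sketch of that downsampling, assuming the parsed log ends up in a pandas DataFrame with a monotonically increasing `timestamp` column in seconds; the column name and the 200 Hz target are placeholders, not anything fixed in drivebrain:

```python
import pandas as pd

TARGET_HZ = 200  # assumed target rate; adjust to whatever we settle on

def downsample(df: pd.DataFrame, target_hz: int = TARGET_HZ) -> pd.DataFrame:
    """Keep roughly one sample per 1/target_hz seconds by bucketing timestamps."""
    period = 1.0 / target_hz
    # Assign each row to a fixed-width time bucket, then keep the first row per bucket.
    buckets = (df["timestamp"] // period).astype(int)
    return df.groupby(buckets).first().reset_index(drop=True)
```

Bucketing by timestamp (rather than taking every Nth row) keeps the output rate consistent even if the source polling rate drifts.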
Going forward, we will need to parse all of the raw information in these files quickly. One way to keep memory usage bounded is to analyze the data in batches, write out multiple intermediate files, and combine them at the end. We should also explore writing to HDF5, since MATLAB can import data from it directly. A rough sketch of the batch-then-write approach is below.
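This is one way the batching plus HDF5 idea could look, using h5py with resizable datasets so each batch is appended without holding the whole log in RAM. The dataset path, the message fields, and the batch size are assumptions for illustration, not anything defined on the VehicleState topic:

```python
import h5py
import numpy as np

BATCH_SIZE = 100_000  # messages per in-memory batch; tune to available RAM

def write_batches(message_iter, out_path="vehicle_state.h5"):
    """Accumulate messages in fixed-size batches and append each batch to an HDF5 file."""
    with h5py.File(out_path, "w") as f:
        dset = f.create_dataset(
            "vehicle_state/speed",         # hypothetical signal name
            shape=(0,), maxshape=(None,),  # resizable so batches can be appended
            dtype="f8", chunks=True,
        )
        batch = []
        for msg in message_iter:
            batch.append(msg.speed)        # hypothetical field on the message
            if len(batch) >= BATCH_SIZE:
                _append(dset, batch)
                batch.clear()
        if batch:
            _append(dset, batch)

def _append(dset, batch):
    """Grow the dataset and write the batch into the new tail region."""
    start = dset.shape[0]
    dset.resize(start + len(batch), axis=0)
    dset[start:] = np.asarray(batch, dtype=np.float64)
```

MATLAB should then be able to pull a signal in with something like `h5read('vehicle_state.h5', '/vehicle_state/speed')`.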