
Running CSV provider using fms-demo-compose.yaml #16

Open
Hariprasath-TE opened this issue Oct 20, 2023 · 11 comments

Comments

@Hariprasath-TE

Using the CSV-Provider
With the CSV-provider one can replay signals from a CSV-file to an instance of the Kuksa.val data broker. More details are available in the upstream repository. To execute the CSV-provider with the Docker Compose setup, add the argument '--profile csv':

docker compose -f ./fms-demo-compose.yaml --profile direct --profile csv up --detach

The above command requires the "fms-demo-compose.yaml" file, which I could not find in the git repo. Moreover, when I tried to feed data from signals.csv using the provider.py file from the upstream repository (i.e., the kuksa.val.feeders repo), I could not see the data in Grafana or in InfluxDB.

However, I can see the data via the Kuksa Databroker CLI. From there on, it is not clear. Could anybody help me with clear steps and proper commands to set it up end to end (through to visualization of the data)?

Any advice on what I am missing and possible ways to proceed would be appreciated.

@sophokles73
Contributor

I am not sure what you want to achieve. Have you followed the instructions given in the README.md?

@Hariprasath-TE
Author

I want to push the data from the CSV provider to InfluxDB and Grafana. For that I have done the following steps:

1) I started all services by running "docker compose -f ./fms-blueprint-compose.yaml up --detach".
2) Then I sent the data from signals.csv to the Kuksa Databroker.

Now, how can I send that data from the Kuksa Databroker to InfluxDB and Grafana through the FMS Forwarder?

@sophokles73
Contributor

Then I sent the data from signals.csv to the Kuksa Databroker.

You do not need to do that manually; the Docker Compose file already starts a CSV-Provider container which publishes the data from the csv-provider/signalsFmsRecording.csv file to the Databroker. The FMS Forwarder component then retrieves the data and writes it to InfluxDB.
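As a troubleshooting sketch, the container logs show whether each hop of that pipeline is working. The csv-provider service name is taken from the compose snippet later in this thread; fms-forwarder is an assumption, so check fms-blueprint-compose.yaml for the actual service names. These commands require the running deployment:

```
# Confirm the CSV provider is publishing to the Databroker
docker compose -f ./fms-blueprint-compose.yaml logs csv-provider

# Confirm the forwarder is reading from the Databroker and
# writing to InfluxDB (service name is an assumption)
docker compose -f ./fms-blueprint-compose.yaml logs fms-forwarder
```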

@Hariprasath-TE
Author

Yes, it does publish the data to the Databroker. But I am not sure if the FMS Forwarder is forwarding the data to InfluxDB, because when I checked InfluxDB, no data matching the fed data showed up.
1. Any reason for that?
2. Could you kindly outline a working solution, please?

@sophokles73
Contributor

Because when I checked InfluxDB, no data matching the fed data showed up

What data are you talking about here? The data from csv-provider/signalsFmsRecording.csv? Or are you trying to feed in data from a CSV file that you have created yourself?

@Hariprasath-TE
Author

How can I feed a dataset that I have created myself? Can you please explain, @sophokles73?
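For context, the upstream kuksa.val.feeders CSV provider appears to expect rows with a field type, a VSS signal path, a value, and a delay. The column layout (field, signal, value, delay) and the VSS path used below are assumptions; check csv-provider/signalsFmsRecording.csv for the exact format. A minimal sketch generating such a file:

```python
import csv

# Rows in the layout the CSV provider appears to use (assumption):
# field: "current" for a current value; signal: a VSS path;
# value: the datapoint; delay: seconds to wait before sending
rows = [
    ("current", "Vehicle.Speed", "42.5", "1"),
    ("current", "Vehicle.Speed", "43.0", "1"),
]

with open("my-signals.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["field", "signal", "value", "delay"])
    writer.writerows(rows)
```

The resulting file can then be mounted into the csv-provider container in place of the recorded signals file.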

@sophokles73
Contributor

In fms-blueprint-compose.yaml in the section

csv-provider:
    image: "ghcr.io/eclipse/kuksa.val.feeders/csv-provider:main"
    container_name: "csv-provider"
    cap_drop: *default-drops
    networks:
    - "fms-vehicle"
    depends_on:
      databroker:
        condition: service_started
    volumes:
    - "./csv-provider/signalsFmsRecording.csv:/dist/signals.csv"
    environment:
      PROVIDER_INFINITE: 1
      PROVIDER_LOG_LEVEL: "INFO"
      KUKSA_DATA_BROKER_ADDR: "databroker"
      KUKSA_DATA_BROKER_PORT: "55556"

just replace the reference to ./csv-provider/signalsFmsRecording.csv with the path to your own file. Reading the Docker Compose documentation would also help you better understand how things work together ...
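For example, with a hypothetical file my-signals.csv next to the compose file, the volumes section would become the following; keep the container-side path /dist/signals.csv unchanged, since that is where the provider reads the file from:

```
    volumes:
    - "./my-signals.csv:/dist/signals.csv"
```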

@Hariprasath-TE
Author

Hariprasath-TE commented Nov 1, 2023

I sent the data from the in-vehicle side to the cloud-side InfluxDB as mentioned in the documentation. Now I have tried to connect a C2E Hono instance instead of using the Hono sandbox to send the data to Ditto; however, I couldn't. Can you suggest a way to send in-vehicle data to Ditto?

@sophokles73
Contributor

I am not a Ditto expert, but I guess you will need to create a digital twin in Ditto for the vehicle and then create a Ditto Connection to Hono's Kafka broker for the Ditto tenant that the vehicle twin belongs to. The data published by the vehicle is a protocol buffer, so you will need to add a Ditto mapping script to the Connection which parses the protocol buffer and transforms it into Ditto's (JSON-based) data format ...

@sophokles73
Contributor

@Hariprasath-TE can this be closed?

@Hariprasath-TE
Author

My issue is actually still pending. Please keep it open for a little while, as a request from me, in case I need any more clarifications. I will let you know when it can be closed. Thank you @sophokles73
