The Sample Solution App provides all required resources to deploy a fully functional SP-API application to the AWS cloud. Use this application to test the proposed solution, modify it, and/or integrate it into your own product.
The Selling Partner API for Data Kiosk helps you programmatically submit GraphQL queries from a variety of schemas to help selling partners manage their businesses.
If you haven't already, we recommend you review the following resources:
This Sample Solution offers a streamlined Data Kiosk experience. Simply create your query; the solution automatically processes the incoming DATA_KIOSK_QUERY_PROCESSING_FINISHED notification, downloads the processed data, and securely stores it in an S3 bucket to be fetched and shared with the end user later.
The solution consists of the following components:
- A Step Functions state machine with a fully functional document retrieval workflow.
- Lambda functions that support each of the steps of the state machine.
- An SQS queue to receive DATA_KIOSK_QUERY_PROCESSING_FINISHED notifications.
- An S3 bucket to store generated data kiosk reports.
- A DynamoDB table to store fetched documents for the selling partner's queried data.
- A Secrets Manager secret to securely store SP-API app credentials.
To kickstart the solution, execute the SPAPISubscribeNotificationsLambdaFunction with an input containing the notification type DATA_KIOSK_QUERY_PROCESSING_FINISHED. This subscribes the SQS queue to the Data Kiosk notification and returns the subscription_id and destination_id. For each query, use the Schema Explorer to generate a GraphQL query and pass it to the SPAPICreateQueryLambdaFunction, making sure to escape any quotation marks so the query remains a valid string. After submission, the automated workflow begins. When the DATA_KIOSK_QUERY_PROCESSING_FINISHED notification message is received, it triggers the SPAPIProcessNotificationLambdaFunction, which parses the message. If no documentId is returned, the flow ends with a no-data message; otherwise, it extracts the documentId and starts the state machine's execution. The SPAPIGetDocumentLambdaFunction then fetches the documentId and documentUrl and passes them to the SPAPIStoreDocumentLambdaFunction for storage and processing. The SPAPIStoreDocumentLambdaFunction retrieves the JSONL file from the documentUrl, stores it in the S3 bucket, and creates an item in DynamoDB with the relevant data and S3 URI, concluding the execution.
Post-execution, all data document ids and S3 links are stored in DynamoDB, and developers can access this content from S3 as needed, ensuring an efficient and structured workflow.
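For orientation, the sketch below (Python, illustrative only) shows roughly what the notification-processing step does. It assumes the notification payload carries a `dataDocumentId` key, per the Data Kiosk notification schema; the environment variable name, input shape, and error handling are simplified placeholders rather than the sample app's exact code:

```python
import json
import os

import boto3

# Placeholder environment variable; the deployed solution wires in
# the actual state machine ARN at stack-creation time.
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]

sfn = boto3.client("stepfunctions")


def lambda_handler(event, context):
    """Triggered by the SQS queue subscribed to Data Kiosk notifications."""
    for record in event["Records"]:
        notification = json.loads(record["body"])
        payload = notification.get("payload", {})
        # Queries that produce no results return no data document id.
        document_id = payload.get("dataDocumentId")
        if not document_id:
            print(f"Query {payload.get('queryId')} returned no data")
            continue
        # Hand the document id to the retrieval state machine, which runs
        # the get-document and store-document steps described above.
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"documentId": document_id}),
        )
```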
The prerequisites for deploying the Sample Solution App to the AWS cloud are:
- Registering as a developer for SP-API, and registering an SP-API application.
- An IAM user with permissions to create a new user, a policy, and attach it to the user.
- If you don't have one, you can create one by following the steps under Usage - 2. Configure Sample Solution App's IAM user.
- The AWS CLI
- If not present, it will be installed as part of the deployment script.
- Maven
- Required only for deploying the Java-based application.
- If not present, it will be installed as part of the deployment script.
To allow the Sample Solution App to connect to SP-API, the config file has to be updated to match the setup of your SP-API application.
- Open the app.config file and replace all occurrences of `<dev_value>` following the instructions below.
- Update the `ClientId` and `ClientSecret` attribute values with the Client Id and Client Secret of the SP-API application, respectively.
- Update the `RefreshToken` attribute value with the refresh token of the selling partner you will be using for testing.

Note: While updating the config file, don't leave blank spaces before or after `=`, and don't use quotation marks.

Example:

```
ClientId=amzn1.application-oa2-client.abc123def456xyz789
ClientSecret=amzn1.oa2-cs.v1.abc123def456xyz789
RefreshToken=Atzr|Abc123def456xyz789
```
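To see why this matters, consider a naive key=value loader like the sketch below (purely illustrative; this is not the sample app's actual config reader). Any blank spaces or quotation marks around `=` would end up inside the parsed keys or values and break authentication:

```python
def load_config(path="app.config"):
    """Parse KEY=VALUE lines naively.

    Stray spaces or quotation marks around '=' are kept verbatim,
    which is why the config file must not contain them.
    """
    config = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition("=")
            config[key] = value  # 'ClientId = "abc"' yields key 'ClientId ' and value ' "abc"'
    return config
```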
In order to execute the deployment script, an IAM user with `IAMFullAccess` permissions is needed.

To create a new IAM user with the required permissions, follow the steps below. If you already have a user with the `IAMFullAccess` policy, you can skip to the Configure IAM user credentials section.
- Open the AWS console.
- Navigate to IAM Users console.
- Click Add users.
- Select a name for your user.
- On the Set permissions page, select Attach policies directly.
- In the Permissions policies, search for `IAMFullAccess`.
- Check the policy, and click Next.
- Review the changes and click Create user.
Security credentials for the IAM user will be requested during the deployment script execution. To create a new access key pair, follow the steps below. If you already have a valid access key and secret access key, you can skip this section.
- Open the AWS console.
- Navigate to IAM Users console.
- Select your IAM user, which has `IAMFullAccess` permissions.
- Go to the Security credentials tab.
- Under Access keys, click Create access key.
- On the Access key best practices & alternatives page, select Command Line Interface (CLI).
- Acknowledge the recommendations, and click Next.
- Click Create access key.
- Copy the `Access key` and `Secret access key`. This is the only time that these keys can be viewed or downloaded, and you will need them while executing the deployment script.
- Click Done.
The deployment script will create a Sample Solution App in the AWS cloud. To execute the deployment script, follow the steps below.
- Identify the deployment script for the programming language you want for your Sample Solution App.
- For example, for the Python application the file is app/scripts/python/python-app.sh.
- Execute the script from your terminal.
- For example, to execute the Python deployment script in a Unix-based system, run `bash python-app.sh`.
- Wait for the CloudFormation stack creation to finish.
- Navigate to CloudFormation console.
- Wait for the stack named sp-api-app-<language>-random_suffix to show status `CREATE_COMPLETE`.
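If you prefer to poll from code instead of the console, a minimal boto3 check (stack name placeholder assumed) reports the same status:

```python
import boto3

cloudformation = boto3.client("cloudformation")

# Replace with the actual stack name printed by the deployment script.
stacks = cloudformation.describe_stacks(StackName="sp-api-app-python-abc123")
print(stacks["Stacks"][0]["StackStatus"])  # expect CREATE_COMPLETE
```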
The deployment script creates a Sample Solution App in the AWS cloud. The solution consists of a Step Functions state machine with a fully functional workflow. To test the sample solution, follow the steps below.
- Open the AWS console.
- Navigate to Lambda console.
- Select the notification subscriber function, named SPAPISubscribeNotificationsLambdaFunction-random_suffix.
- Select Test tab.
- Under Event JSON, insert the following payload. Replace `NotificationType` with the notification type you want to subscribe to.

  ```json
  {
    "NotificationType": "DATA_KIOSK_QUERY_PROCESSING_FINISHED"
  }
  ```
- Click Test.
- The function will return `destination Id` and `subscription Id`.
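The same test can be run without the console by invoking the function with boto3; the function name suffix below is a placeholder you must replace with your deployment's actual suffix:

```python
import json

import boto3

lambda_client = boto3.client("lambda")

response = lambda_client.invoke(
    FunctionName="SPAPISubscribeNotificationsLambdaFunction-random_suffix",
    Payload=json.dumps({"NotificationType": "DATA_KIOSK_QUERY_PROCESSING_FINISHED"}),
)
# The returned payload contains the destination id and subscription id.
print(response["Payload"].read().decode())
```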
- Open the AWS console.
- Navigate to Lambda console.
- Select the create query function, named SPAPICreateQueryLambdaFunction-random_suffix.
- Select Test tab.
- Under Event JSON, insert the following payload. Replace `Query` with the minified query you created in the Schema Explorer. Make sure to escape any nested double quotes (") to ensure the query input is a valid string.

  ```json
  {
    "Query": "query MyQuery{analytics_salesAndTraffic_2023_11_15{salesAndTrafficByAsin(endDate:\"2024-01-31\" marketplaceIds:[\"A2Q3Y263D00KWC\"]aggregateBy:CHILD ..."
  }
  ```
- Click Test.
- The function will return `query id`.
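Under the hood, this function calls the Data Kiosk createQuery operation. A minimal sketch of that request is shown below, assuming you already hold a valid LWA access token and are targeting the North America endpoint; the sample app itself obtains its credentials from Secrets Manager:

```python
import requests

ACCESS_TOKEN = "Atza|..."  # placeholder LWA access token
ENDPOINT = "https://sellingpartnerapi-na.amazon.com"

response = requests.post(
    f"{ENDPOINT}/dataKiosk/2023-11-15/queries",
    headers={"x-amz-access-token": ACCESS_TOKEN},
    json={"query": "query MyQuery{...}"},  # the minified GraphQL query
)
response.raise_for_status()
print(response.json()["queryId"])  # createQuery returns the new query's id
```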
- Open the AWS console.
- Navigate to Lambda console.
- Select the create schedule function, named SPAPICreateScheduleLambdaFunction-random_suffix.
- Select Test tab.
- Under Event JSON, insert the following payload. Replace `query` with the minified query you created in the Schema Explorer. Make sure to escape any nested double quotes (") to ensure the query input is a valid string.

  ```json
  {
    "query": "query MyQuery{analytics_salesAndTraffic_2023_11_15{salesAndTrafficByAsin(endDate:\"2024-01-31\" marketplaceIds:[\"A2Q3Y263D00KWC\"]aggregateBy:CHILD ...",
    "scheduleStartDate": "2024-07-09T14:34:00",
    "scheduleEndDate": "2024-07-23T08:00:00",
    "minuteRate": "2",
    "accountId": "ABC123DE"
  }
  ```
- Click Test.
- The function will return `schedule id`.
- Check the schedule under EventBridge Schedules.
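The recurring behavior corresponds to an EventBridge Scheduler schedule. As a rough illustration of the moving parts, the boto3 sketch below creates a comparable schedule; the schedule name, ARNs, and IAM role are placeholders, and the sample app's own wiring may differ:

```python
import json
from datetime import datetime

import boto3

scheduler = boto3.client("scheduler")

scheduler.create_schedule(
    Name="spapi-data-kiosk-schedule",      # hypothetical schedule name
    ScheduleExpression="rate(2 minutes)",  # corresponds to minuteRate=2
    StartDate=datetime.fromisoformat("2024-07-09T14:34:00"),
    EndDate=datetime.fromisoformat("2024-07-23T08:00:00"),
    FlexibleTimeWindow={"Mode": "OFF"},
    Target={
        # Placeholder ARNs: the Lambda to invoke and a role allowed to invoke it.
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:SPAPICreateQueryLambdaFunction-random_suffix",
        "RoleArn": "arn:aws:iam::123456789012:role/scheduler-invoke-role",
        "Input": json.dumps({"Query": "query MyQuery{...}"}),
    },
)
```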
- Navigate to Step Functions console.
- Select the state machine created by the deployment script, named SPAPIStateMachine-random_suffix.
- Under Executions, you will see a workflow for the notification received through SQS.
- To check the workflow status and navigate into the individual steps, select the workflow and use the Graph view and Step Detail panels.
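Executions can also be listed programmatically; a minimal boto3 sketch (state machine ARN placeholder assumed):

```python
import boto3

sfn = boto3.client("stepfunctions")

# Replace with the ARN of SPAPIStateMachine-random_suffix from your deployment.
executions = sfn.list_executions(
    stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:SPAPIStateMachine-random_suffix"
)
for execution in executions["executions"]:
    print(execution["name"], execution["status"])
```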
- Open the AWS console.
- Navigate to DynamoDB console.
- Under Tables, click on Explore items.
- Select the table created by the deployment script, named SPAPISellerItemsTable-random_suffix.
- Locate the item based on the query that was returned.
- Check the S3 URI to open the document and fetch the JSONL content.
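Once you have an item's S3 URI, the document can be fetched and parsed line by line. A minimal sketch, assuming the URI follows the usual `s3://bucket/key` form and the bucket and key below are placeholders:

```python
import json

import boto3

s3 = boto3.client("s3")


def fetch_jsonl(s3_uri):
    """Download a JSONL document referenced by an s3://bucket/key URI."""
    bucket, _, key = s3_uri.removeprefix("s3://").partition("/")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    # Each non-empty line of a JSONL file is an independent JSON record.
    return [json.loads(line) for line in body.splitlines() if line]


records = fetch_jsonl("s3://your-data-kiosk-bucket/your-document-key.jsonl")
print(len(records), "records")
```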
The deployment script also creates a Lambda function that cancels an ongoing query. To test the function, follow the steps below.
- Open the AWS console.
- Navigate to Lambda console.
- Select the cancel query function, named SPAPICancelQueryLambdaFunction-random_suffix.
- Select Test tab.
- Under Event JSON, insert the following payload. Replace `QueryId` with the query_id you want to cancel.

  ```json
  {
    "QueryId": "1232942023"
  }
  ```
- Click Test.
- The function will return that the query has been successfully canceled.
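Internally, cancellation maps to the Data Kiosk cancelQuery operation, a DELETE on the query resource. A minimal sketch with requests, assuming a valid LWA access token:

```python
import requests

ACCESS_TOKEN = "Atza|..."  # placeholder LWA access token
ENDPOINT = "https://sellingpartnerapi-na.amazon.com"
QUERY_ID = "1232942023"    # the query id to cancel

response = requests.delete(
    f"{ENDPOINT}/dataKiosk/2023-11-15/queries/{QUERY_ID}",
    headers={"x-amz-access-token": ACCESS_TOKEN},
)
# cancelQuery returns no body; a 2xx status means the cancellation was accepted.
response.raise_for_status()
```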
The deployment script creates a number of resources in the AWS cloud which you might want to delete after testing the solution. To clean up these resources, follow the steps below.
- Identify the clean-up script for the programming language of the Sample Solution App deployed to the AWS cloud.
- For example, for the Python application the file is app/scripts/python/python-app-clean.sh.
- Execute the script from your terminal.
- For example, to execute the Python clean-up script in a Unix-based system, run `bash python-app-clean.sh`.
If the state machine execution fails, follow the steps below to identify the root cause and retry the workflow.
- Navigate to Step Functions console.
- Select the state machine created by the deployment script, named SPAPIStateMachine-random_suffix.
- Under Executions, you can use the Status column to identify failed executions.
- To troubleshoot errors, select the corresponding workflow execution and use the Graph view and Step Detail panels.
- After fixing the issues that caused the error, retry the workflow by clicking on New execution. The original input parameters will be automatically populated.
- Click Start execution, and validate the results.
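A failed execution can also be retried from code by replaying its original input; a minimal boto3 sketch (execution ARN placeholder assumed):

```python
import boto3

sfn = boto3.client("stepfunctions")

# Replace with the ARN of the failed execution from the Step Functions console.
failed = sfn.describe_execution(
    executionArn="arn:aws:states:us-east-1:123456789012:execution:SPAPIStateMachine-random_suffix:failed-execution-id"
)
# Start a fresh execution with the same input that failed.
sfn.start_execution(
    stateMachineArn=failed["stateMachineArn"],
    input=failed["input"],
)
```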