
Initialize katana grpc api #2682

Merged
merged 1 commit into main on Nov 13, 2024
Conversation

tarrencev
Contributor

@tarrencev tarrencev commented Nov 12, 2024

Initializing a Starknet gRPC API in Katana for use with Torii and the rest of the Dojo stack.

Summary by CodeRabbit

  • New Features

    • Introduced a new gRPC package for the Katana project, enabling enhanced blockchain interactions.
    • Added multiple RPC methods for operations such as retrieving block information, transaction details, and contract class definitions.
    • Included a new member in the workspace to support gRPC functionality.
  • Bug Fixes

    • Updated dependencies for improved stability and performance.
  • Documentation

    • Added documentation comments to provide clarity on gRPC implementations.


coderabbitai bot commented Nov 12, 2024

Walkthrough

Ohayo! This pull request introduces significant updates to the Cargo.toml files and adds new files to the crates/katana/grpc directory. A new workspace member for gRPC functionality is added, and the num-traits dependency is updated. A new Cargo.toml file for the katana-grpc package is created, along with a build script for compiling Protocol Buffers. Additionally, several .proto files are introduced to define gRPC services and messages, establishing the groundwork for blockchain interactions.

Changes

File Change Summary
Cargo.toml Added new member "crates/katana/grpc" to workspace; updated num-traits dependency.
crates/katana/grpc/Cargo.toml New package declaration and sections for dependencies, development dependencies, and build dependencies.
crates/katana/grpc/build.rs Introduced build script using tonic_build for compiling protobuf files; includes feature checks.
crates/katana/grpc/proto/starknet.proto Defined gRPC service and multiple RPC methods for interacting with Starknet blockchain.
crates/katana/grpc/proto/types.proto Introduced various messages, enums, and structures for gRPC context, including transaction-related definitions.
crates/katana/grpc/src/lib.rs Added documentation comment indicating the file contains gRPC implementations.

Possibly related PRs

Suggested labels

sozo

Suggested reviewers

  • glihm
  • steebchen

Warning

There were issues while running some tools. Please review the errors and either fix the tool’s configuration or disable the tool if it’s a critical failure.

🔧 buf
crates/katana/grpc/proto/starknet.proto

Config file JSON parsing error: invalid character 'v' looking for beginning of value. Please check your buf configuration file for JSON syntax errors.

crates/katana/grpc/proto/types.proto

Config file JSON parsing error: invalid character 'v' looking for beginning of value. Please check your buf configuration file for JSON syntax errors.





@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

🧹 Outside diff range and nitpick comments (12)
crates/katana/grpc/build.rs (3)

1-3: Ohayo! Consider using a more specific error type.

While using Box<dyn std::error::Error> works, we could make the error handling more explicit for build scripts.

-fn main() -> Result<(), Box<dyn std::error::Error>> {
+fn main() -> std::io::Result<()> {

4-8: Enhance error messages for environment variables.

The current error messages could be more descriptive to help developers understand what went wrong during the build process.

-    let out_dir =
-        PathBuf::from(std::env::var("OUT_DIR").expect("OUT_DIR environment variable not set"));
-    let target = std::env::var("TARGET").expect("TARGET environment variable not set");
+    let out_dir = PathBuf::from(
+        std::env::var("OUT_DIR").expect("OUT_DIR not set. This should be set by cargo build"),
+    );
+    let target = std::env::var("TARGET")
+        .expect("TARGET not set. This should be automatically set by cargo build");

16-16: Consider more specific rebuild triggers.

While watching the entire proto directory is valid, you might want to be more specific about which files trigger rebuilds.

-    println!("cargo:rerun-if-changed=proto");
+    println!("cargo:rerun-if-changed=proto/starknet.proto");
+    println!("cargo:rerun-if-changed=proto/types.proto");
Cargo.toml (1)

Line range hint 246-252: Ohayo! Some architectural guidance for the gRPC implementation, sensei!

Since you're initializing a new gRPC API, consider these points:

  1. Align with existing gRPC patterns in the torii-grpc crate
  2. Utilize the existing gRPC dependencies (tonic, prost, etc.) consistently
  3. Consider implementing reflection service using tonic-reflection for better API discoverability
  4. Plan for both sync and async clients (note the presence of tonic-web-wasm-client)
crates/katana/grpc/proto/starknet.proto (3)

79-80: Suggestion: Use google.protobuf.Empty for Empty Request Messages

Ohayo sensei! To simplify the code and adhere to best practices, consider using google.protobuf.Empty instead of defining empty messages like SpecVersionRequest.

You can modify the code as follows:

  • Import the Empty proto:

    +import "google/protobuf/empty.proto";
  • Update the SpecVersion RPC method:

    -rpc SpecVersion(SpecVersionRequest) returns (SpecVersionResponse);
    +rpc SpecVersion(google.protobuf.Empty) returns (SpecVersionResponse);
  • Remove the SpecVersionRequest message:

    -message SpecVersionRequest {}

This approach reduces redundancy and leverages standard protobuf definitions.


207-211: Clarify Simulation Flags Field

Ohayo sensei! The simulation_flags field in EstimateFeeRequest is defined as repeated string. To enhance clarity and prevent invalid flag values, consider using an enum or a predefined set of constants.

For example, define an enum for simulation flags:

enum SimulationFlag {
  SIMULATION_FLAG_UNSPECIFIED = 0;
  SIMULATION_FLAG_SKIP_VALIDATE = 1;
  SIMULATION_FLAG_SKIP_FEE_CHARGE = 2;
  // Add more flags as needed
}

Then update the request message:

 message EstimateFeeRequest {
     repeated types.Transaction transactions = 1;
-    repeated string simulation_flags = 2;
+    repeated SimulationFlag simulation_flags = 2;
     types.BlockID block_id = 3;
 }

This ensures only valid flags are used and helps with type checking.


118-121: Parameter Order Consistency in GetStorageAtRequest

Ohayo sensei! For consistency and clarity, consider placing contract_address before block_id in GetStorageAtRequest, as the contract address is the primary focus of the request.

Here's the adjusted message:

 message GetStorageAtRequest {
+    types.Felt contract_address = 1;
+    types.Felt key = 2;
+    types.BlockID block_id = 3;
-    types.BlockID block_id = 1;
-    types.Felt contract_address = 2;
-    types.Felt key = 3;
 }

This order aligns with other requests where the primary identifier comes first.

crates/katana/grpc/proto/types.proto (5)

108-121: Ohayo sensei! Potential inconsistency in field naming

In DeployAccountTxnV3 (lines 108-121), the field contract_address_salt (line 113) is consistent with DeployAccountTxn, but ensure that all similar fields across transaction messages use consistent naming conventions; this aids readability and maintainability of the codebase.


278-290: Ohayo sensei! Inconsistent field naming in ExecutionResources

In the ExecutionResources message (lines 278-290), most fields are named with the suffix _builtin_applications, except for segment_arena_builtin (line 288). For consistency, consider renaming segment_arena_builtin to segment_arena_builtin_applications.

Apply this diff to rename the field:

 message ExecutionResources {
   // ...
-  uint64 segment_arena_builtin = 10;
+  uint64 segment_arena_builtin_applications = 10;
 }

123-127: Ohayo sensei! Review JSON field names for consistency

In ResourceBoundsMapping (lines 123-127), the json_name of fields l1_gas and l2_gas are set as "L1_GAS" and "L2_GAS" (lines 124-125) in uppercase. Elsewhere in the file, JSON names use snake_case. For consistency and to follow standard JSON naming conventions, consider updating them to lowercase with underscores.

Apply this diff to update the json_name:

 message ResourceBoundsMapping {
-  ResourceBounds l1_gas = 1 [json_name = "L1_GAS"];
-  ResourceBounds l2_gas = 2 [json_name = "L2_GAS"];
+  ResourceBounds l1_gas = 1 [json_name = "l1_gas"];
+  ResourceBounds l2_gas = 2 [json_name = "l2_gas"];
 }

317-322: Ohayo sensei! Enhance flexibility of EventFilter fields

In the EventFilter message (lines 317-322), all fields are required by default. To allow for more flexible event filtering, consider marking fields as optional by using wrappers (e.g., google.protobuf.StringValue) or oneof constructs. This way, clients can specify only the filters they need.


226-237: Ohayo sensei! Clarify field naming in BlockHeader

In BlockHeader (lines 226-237), the field l1_da_mode (line 235) might be unclear to some readers. For improved clarity, consider renaming it to l1_data_availability_mode to match the naming convention used elsewhere.

Apply this diff to rename the field:

 message BlockHeader {
   // ...
-  string l1_da_mode = 9;
+  string l1_data_availability_mode = 9;
 }
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between caaf06b and 3e33e97.

⛔ Files ignored due to path filters (1)
  • Cargo.lock is excluded by !**/*.lock
📒 Files selected for processing (6)
  • Cargo.toml (2 hunks)
  • crates/katana/grpc/Cargo.toml (1 hunks)
  • crates/katana/grpc/build.rs (1 hunks)
  • crates/katana/grpc/proto/starknet.proto (1 hunks)
  • crates/katana/grpc/proto/types.proto (1 hunks)
  • crates/katana/grpc/src/lib.rs (1 hunks)
✅ Files skipped from review due to trivial changes (2)
  • crates/katana/grpc/Cargo.toml
  • crates/katana/grpc/src/lib.rs
🔇 Additional comments (5)
crates/katana/grpc/build.rs (2)

18-19: LGTM! Clean error propagation.

The error handling here is appropriately simple for a build script, using the ? operator to propagate errors up the chain.


10-14: Verify the proto file path exists.

The hardcoded path to the proto file should be verified to ensure it exists in the expected location.

✅ Verification successful

ohayo sensei,

Proto file path exists at the expected location relative to build.rs.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if the proto file exists and verify its content

# Check if the proto file exists
if [ -f "proto/starknet.proto" ]; then
    echo "Proto file exists"
    # Show the service definitions
    grep -A 5 "service" proto/starknet.proto
else
    echo "Error: proto/starknet.proto not found"
    # List available proto files
    find . -name "*.proto"
fi

Length of output: 339

Cargo.toml (2)

184-184: Verify dependency optimization impact, sensei!

The optimization of num-traits by disabling default features is good for reducing unnecessary dependencies. However, let's ensure this change doesn't affect existing functionality.

Let's check the usage:

#!/bin/bash
# Description: Check num-traits usage across the codebase

# Find all uses of num-traits
rg --type rust "use.*num_traits" 

# Find any potential trait implementations
rg --type rust "impl.*num_traits" 

22-22: Ohayo! New workspace member looks good, sensei!

The addition of crates/katana/grpc is well-structured and aligns perfectly with the PR objective to initialize the Starknet gRPC API.

Let's verify the crate structure:

✅ Verification successful

Ohayo! There was an issue with the previous verification script. Let's try again, sensei!


Ohayo! Everything looks good, sensei!

The crates/katana/grpc directory exists and contains the essential files as expected.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the new crate structure and its basic files

# Check if the crate directory exists and has the expected structure
fd --type d "^grpc$" crates/katana

# Check for essential files
fd --type f "^(Cargo.toml|build.rs|lib.rs)$" crates/katana/grpc

Length of output: 1428


Script:

#!/bin/bash
# Description: Verify the new crate structure and its basic files

# Check if the grpc directory exists
fd --type d "^grpc$" crates/katana

# Check for essential files
fd --type f "^(Cargo.toml|build.rs|lib.rs)$" crates/katana/grpc

Length of output: 210

crates/katana/grpc/proto/starknet.proto (1)

222-223: Consistent Use of google.protobuf.Empty

Ohayo sensei! Following the earlier suggestion, you can replace BlockNumberRequest with google.protobuf.Empty to maintain consistency and simplify the code.

Comment on lines +131 to +134
message GetTransactionStatusResponse {
string finality_status = 1;
string execution_status = 2;
}

🛠️ Refactor suggestion

Use Enums Instead of Strings for Status Fields

Ohayo sensei! To enhance type safety and prevent invalid status values, consider using enum types for finality_status and execution_status instead of plain string.

Here's how you can define the enums:

enum FinalityStatus {
  FINALITY_STATUS_UNSPECIFIED = 0;
  FINALITY_STATUS_PENDING = 1;
  FINALITY_STATUS_ACCEPTED_ON_L2 = 2;
  FINALITY_STATUS_ACCEPTED_ON_L1 = 3;
  FINALITY_STATUS_REJECTED = 4;
}

enum ExecutionStatus {
  EXECUTION_STATUS_UNSPECIFIED = 0;
  EXECUTION_STATUS_NOT_RECEIVED = 1;
  EXECUTION_STATUS_RECEIVED = 2;
  EXECUTION_STATUS_PENDING = 3;
  EXECUTION_STATUS_ACCEPTED = 4;
  EXECUTION_STATUS_REJECTED = 5;
}

Then update the message:

 message GetTransactionStatusResponse {
-    string finality_status = 1;
-    string execution_status = 2;
+    FinalityStatus finality_status = 1;
+    ExecutionStatus execution_status = 2;
 }

This change ensures that only valid status values are used and helps with code maintenance.

Comment on lines +243 to +248
message SyncingResponse {
oneof result {
bool not_syncing = 1;
types.SyncStatus status = 2;
}
}

🛠️ Refactor suggestion

Clarify the SyncingResponse Structure

Ohayo sensei! The current SyncingResponse uses a oneof to return either a boolean or a SyncStatus. To improve clarity, consider defining a single response that always includes the syncing status.

Here's a possible revision:

message SyncingResponse {
  bool is_syncing = 1;
  types.SyncStatus status = 2;
}

In this structure:

  • When is_syncing is false, the status field can be omitted.
  • When is_syncing is true, the status field provides detailed sync information.

This approach simplifies client-side handling of the response.

Comment on lines +23 to +33
message Transaction {
oneof transaction {
InvokeTxnV1 invoke_v1 = 1;
InvokeTxnV3 invoke_v3 = 2;
DeclareTxnV1 declare_v1 = 3;
DeclareTxnV2 declare_v2 = 4;
DeclareTxnV3 declare_v3 = 5;
DeployAccountTxn deploy_account = 6;
DeployAccountTxnV3 deploy_account_v3 = 7;
}
}

🛠️ Refactor suggestion

Ohayo sensei! Reduce duplication by extracting common transaction fields

The transaction messages within the Transaction oneof (lines 24-32) share several common fields like type, version, signature, and nonce. Consider creating a CommonTransactionFields message to encapsulate these fields, reducing duplication and enhancing maintainability.

Here's how you might define a common message and use composition:

message CommonTransactionFields {
  string type = 1 [json_name = "type"];
  string version = 2 [json_name = "version"];
  repeated Felt signature = 3 [json_name = "signature"];
  Felt nonce = 4 [json_name = "nonce"];
}

message InvokeTxnV1 {
  CommonTransactionFields common = 1;
  Felt max_fee = 2 [json_name = "max_fee"];
  Felt sender_address = 3 [json_name = "sender_address"];
  repeated Felt calldata = 4 [json_name = "calldata"];
}

// Update other transaction messages similarly

Comment on lines +70 to +79
message DeclareTxnV2 {
string type = 1 [json_name = "type"];
Felt sender_address = 2 [json_name = "sender_address"];
Felt compiled_class_hash = 3 [json_name = "compiled_class_hash"];
Felt max_fee = 4 [json_name = "max_fee"];
string version = 5 [json_name = "version"];
repeated Felt signature = 6 [json_name = "signature"];
Felt nonce = 7 [json_name = "nonce"];
bytes class = 8 [json_name = "contract_class"];
}

⚠️ Potential issue

Ohayo sensei! Consider renaming the class field to avoid reserved keyword conflicts

In the DeclareTxnV2 message (lines 70-79), the field named class (line 78) may cause issues during code generation in languages where class is a reserved keyword (e.g., Python, Java, C++). To enhance cross-language compatibility, it's advisable to rename this field to something like contract_class.

Apply this diff to rename the field:

 message DeclareTxnV2 {
   // ...
-  bytes class = 8 [json_name = "contract_class"];
+  bytes contract_class = 8 [json_name = "contract_class"];
 }


codecov bot commented Nov 12, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 57.58%. Comparing base (caaf06b) to head (2376fc6).
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2682      +/-   ##
==========================================
+ Coverage   57.57%   57.58%   +0.01%     
==========================================
  Files         404      404              
  Lines       51139    51139              
==========================================
+ Hits        29441    29447       +6     
+ Misses      21698    21692       -6     

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Outside diff range and nitpick comments (7)
crates/katana/grpc/Cargo.toml (3)

1-7: Add package description field

Ohayo! The package configuration looks clean, but adding a description field would help other developers understand this package's role in the Katana ecosystem.

 [package]
 edition.workspace = true
 license-file.workspace = true
 name = "katana-grpc"
+description = "gRPC API implementation for Katana, providing Starknet API compatibility"
 repository.workspace = true
 version.workspace = true

11-12: Add testing utilities for gRPC development

For proper testing of our gRPC services, we should include these testing utilities:

 [dev-dependencies]
+tokio-test = { workspace = true }
+tonic-build = { workspace = true }

1-15: Consider API versioning strategy

Ohayo sensei! Since this is initializing a new gRPC API for Starknet compatibility, we should consider:

  1. Adding version prefixes to service definitions
  2. Implementing backward compatibility measures
  3. Documenting breaking changes policy

This will help maintain API stability as the Dojo stack evolves.
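As a hedged sketch of point 1 above, one common way to carry a version in a gRPC API is to embed it in the proto package name. The package name starknet.v1 and service name StarknetService below are illustrative assumptions, not definitions taken from this PR:

```protobuf
// Hypothetical sketch: embedding the API version in the proto package name
// so a future v2 can coexist with v1. All names here are illustrative.
syntax = "proto3";

package starknet.v1;

service StarknetService {
  // RPC methods would be declared here. Introducing a starknet.v2 package
  // later lets a server expose both versions side by side during migration.
}
```

Clients address the service by its fully qualified name (e.g. starknet.v1.StarknetService), so adding a v2 package is additive rather than a breaking change.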

crates/katana/grpc/proto/types.proto (4)

17-21: Ohayo sensei! Consider making enum values more explicit

The SimulationFlag enum values implicitly start from 0. For better maintainability and explicit intent, consider assigning values explicitly:

 enum SimulationFlag {
-  SKIP_FEE_CHARGE = 0;
-  SKIP_EXECUTE = 1;
-  SKIP_VALIDATE = 2;
+  SIMULATION_FLAG_UNSPECIFIED = 0;
+  SKIP_FEE_CHARGE = 1;
+  SKIP_EXECUTE = 2;
+  SKIP_VALIDATE = 3;
 }

142-173: Ohayo sensei! Consider adding documentation for block types

The different block representations (BlockWithTxHashes, BlockWithTxs, BlockWithReceipts) could benefit from comments explaining their specific use cases and when to use each type.


226-237: Ohayo sensei! Consider adding timestamp validation

The BlockHeader's timestamp field might benefit from validation to ensure it's not in the future and within acceptable bounds.


258-258: Consider structured error handling for revert_reason

The revert_reason field as a string might benefit from a more structured approach, such as an error code enum with associated messages.

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

📥 Commits

Reviewing files that changed from the base of the PR and between 3e33e97 and 2376fc6.

⛔ Files ignored due to path filters (1)
  • Cargo.lock is excluded by !**/*.lock
📒 Files selected for processing (6)
  • Cargo.toml (2 hunks)
  • crates/katana/grpc/Cargo.toml (1 hunks)
  • crates/katana/grpc/build.rs (1 hunks)
  • crates/katana/grpc/proto/starknet.proto (1 hunks)
  • crates/katana/grpc/proto/types.proto (1 hunks)
  • crates/katana/grpc/src/lib.rs (1 hunks)
✅ Files skipped from review due to trivial changes (1)
  • crates/katana/grpc/proto/starknet.proto
🚧 Files skipped from review as they are similar to previous changes (3)
  • Cargo.toml
  • crates/katana/grpc/build.rs
  • crates/katana/grpc/src/lib.rs
🔇 Additional comments (5)
crates/katana/grpc/Cargo.toml (1)

13-15: Verify workspace-level protobuf build configuration

The build dependencies look good for both native and WASM targets, sensei! Let's verify the workspace configuration:

✅ Verification successful

ohayo, sensei! Workspace-level protobuf build configuration verified.

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check workspace-level dependency configuration
rg -A 5 'tonic-build|wasm-tonic-build' Cargo.toml

# Verify proto files existence
fd -e proto . crates/katana/grpc/

Length of output: 564

crates/katana/grpc/proto/types.proto (4)

1-4: Ohayo sensei! The file structure looks good!

The proto3 syntax declaration and package naming follow the Protocol Buffers best practices.


23-121: Previous comment about reducing transaction field duplication is still valid

The transaction messages (InvokeTxnV1, InvokeTxnV3, DeclareTxnV1, etc.) share common fields like type, version, signature, and nonce. Consider extracting these into a common message type.


123-140: Ohayo sensei! Resource and fee structures are well-designed!

The separation of L1/L2 gas and comprehensive fee breakdown provide clear resource management.


78-78: Previous comment about renaming the class field is still valid

The class field should be renamed to avoid conflicts with reserved keywords in various programming languages.

Comment on lines +8 to +10
[dependencies]
tonic.workspace = true


🛠️ Refactor suggestion

Consider adding essential gRPC-related dependencies

Ohayo sensei! For a robust gRPC implementation, we should consider adding these common dependencies:

 [dependencies]
 tonic.workspace = true
+prost.workspace = true
+tokio = { workspace = true, features = ["rt-multi-thread", "macros"] }
+tonic-types.workspace = true

message StateDiff {
repeated StorageDiff storage_diffs = 1;
repeated Felt deprecated_declared_classes = 2;

💡 Codebase verification

ohayo sensei!

Deprecated deprecated_declared_classes field is still present

  • The deprecated_declared_classes field remains in multiple files without removal or replacement.
🔗 Analysis chain

Verify the usage of deprecated declared classes

The deprecated_declared_classes field suggests a migration or deprecation process. Let's verify its usage:


ohayo sensei!

The previous search for deprecated_declared_classes didn't work due to an unrecognized file type. Let's try searching without specifying the file type:


ohayo sensei!

Let's gather more details on how deprecated_declared_classes is being used:


ohayo sensei!

Let's gather more details on how deprecated_declared_classes is being handled:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Search for references to deprecated declared classes
rg -l "deprecated_declared_classes" --type proto

Length of output: 78


Script:

#!/bin/bash
# Search for references to deprecated declared classes in all files
rg -l "deprecated_declared_classes"

Length of output: 308


Script:

#!/bin/bash
# Find all instances and their contexts of 'deprecated_declared_classes' in the listed files
rg "deprecated_declared_classes" crates/saya/provider/src/rpc/state.rs crates/saya/core/src/prover/program_input.rs crates/torii/grpc/src/types/mod.rs crates/torii/grpc/src/client.rs crates/katana/grpc/proto/types.proto crates/katana/primitives/src/state.rs crates/katana/rpc/rpc-types/src/state_update.rs

Length of output: 1978


Script:

#!/bin/bash
# Search for comments and functions related to 'deprecated_declared_classes'
rg "deprecated_declared_classes" crates/saya/provider/src/rpc/state.rs crates/saya/core/src/prover/program_input.rs crates/torii/grpc/src/types/mod.rs crates/torii/grpc/src/client.rs crates/katana/grpc/proto/types.proto crates/katana/primitives/src/state.rs crates/katana/rpc/rpc-types/src/state_update.rs

Length of output: 1978

@kariy
Member

kariy commented Nov 12, 2024

We still want to maintain JSON-RPC (mainly for compatibility reasons), so we should expose both protocols, with JSON-RPC being the default option. Something like katana --rpc.protocols grpc,json, which would expose gRPC at path /grpc and JSON-RPC at /, on the same port.
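The routing rule proposed above can be sketched as a plain path check. This is a hypothetical illustration in std-only Rust, not code from this PR; in a real server the dispatch would be handled by the HTTP layer in front of tonic (e.g. a tower service), but the decision itself reduces to:

```rust
// Hypothetical sketch of path-based protocol selection for a combined
// endpoint: requests under /grpc go to the gRPC service, everything else
// falls through to the default JSON-RPC handler. The function name
// `select_protocol` and the example RPC path are assumptions.
fn select_protocol(path: &str) -> &'static str {
    if path == "/grpc" || path.starts_with("/grpc/") {
        "grpc"
    } else {
        "jsonrpc"
    }
}

fn main() {
    // The service path segment below is illustrative only.
    assert_eq!(select_protocol("/grpc/StarknetService/SpecVersion"), "grpc");
    assert_eq!(select_protocol("/"), "jsonrpc");
    println!("ok");
}
```

Keeping both protocols on one port this way means JSON-RPC clients need no changes, while gRPC clients simply point at the /grpc prefix.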

Collaborator

@glihm glihm left a comment


@tarrencev did you generate this from the spec, or is it handmade? If auto-generated, it would be great to have the script somewhere to follow upcoming RPC upgrades of the network. 👍

We should also have some protos defined globally in a crate for the Dojo stack, since Torii and other parts of the stack will share those types and base their implementations on them.

We could do that in a second pass, though.

@kariy
Member

kariy commented Nov 13, 2024

We should also have some protos defined globally in a crate for the Dojo stack, since Torii and other parts of the stack will share those types and base their implementations on them.

We could do that in a second pass, though.

Good point. Let's focus on getting it implemented in Katana, Torii, and Sozo, in that particular order.

@glihm glihm merged commit c16f3c9 into main Nov 13, 2024
14 checks passed
@glihm glihm deleted the initkatanagrpc branch November 13, 2024 02:50
@tarrencev
Contributor Author

@tarrencev did you generate this from the spec, or is it handmade? If auto-generated, it would be great to have the script somewhere to follow upcoming RPC upgrades of the network. 👍

We should also have some protos defined globally in a crate for the Dojo stack, since Torii and other parts of the stack will share those types and base their implementations on them.

We could do that in a second pass, though.

Most of these are protos we're using internally for the paymaster. They are a combination of hand-rolled and GPT-generated from the spec. I imagine they may evolve further, but I wanted to get things started.
