cache #2558

Open · wants to merge 6 commits into base: next
10 changes: 5 additions & 5 deletions client/.env.preview
@@ -3,19 +3,19 @@ VITE_PUBLIC_MASTER_PRIVATE_KEY=0x075362a844768f31c8058ce31aec3dd7751686440b4f220
VITE_PUBLIC_WORLD_ADDRESS="0x00fd85ef42eaed3b90d02d2cdc7417d6cae189ff4ba876aa5608551afbf1fb47"
VITE_PUBLIC_ACCOUNT_CLASS_HASH="0x07dc7899aa655b0aae51eadff6d801a58e97dd99cf4666ee59e704249e51adf2"
VITE_PUBLIC_FEE_TOKEN_ADDRESS=0x49d36570d4e46f48e99674bd3fcc84644ddd6b96f7c741b1562b82f9e004dc7
-VITE_PUBLIC_TORII=https://api.cartridge.gg/x/sepolia-rc-18/torii
-VITE_PUBLIC_NODE_URL=https://api.cartridge.gg/x/starknet/sepolia
+VITE_PUBLIC_TORII=https://api.cartridge.gg/x/mainnet-cache-2/torii
+VITE_PUBLIC_NODE_URL=https://api.cartridge.gg/x/starknet/mainnet
VITE_PUBLIC_DEV=false
VITE_PUBLIC_GAME_VERSION="v1.0.0-rc7"
VITE_PUBLIC_SHOW_FPS=false
VITE_PUBLIC_GRAPHICS_DEV=false
-VITE_PUBLIC_TORII_RELAY=/dns4/api.cartridge.gg/tcp/443/x-parity-wss/%2Fx%2Fsepolia-rc-18%2Ftorii%2Fwss
+VITE_PUBLIC_TORII_RELAY=/dns4/api.cartridge.gg/tcp/443/x-parity-wss/%2Fx%2Fmainnet-cache-2%2Ftorii%2Fwss
VITE_SEASON_PASS_ADDRESS=0x23cc88996a5f9c7bcb559fdcffc257c0f75abe60f2a7e5d5cd343f8a95967f7
VITE_REALMS_ADDRESS=0x3205f47bd6f0b5e9cd5c79fcae19e12523a024709776d0a9e8b375adf63468d
VITE_LORDS_ADDRESS=0x0342ad5cc14002c005a5cedcfce2bd3af98d5e7fb79e9bf949b3a91cf145d72e

-VITE_PUBLIC_CHAIN=sepolia
-VITE_PUBLIC_SLOT=sepolia-rc-17
+VITE_PUBLIC_CHAIN=mainnet
+VITE_PUBLIC_SLOT=mainnet-cache-2

VITE_SOCIAL_LINK=http://bit.ly/3Zz1mpp

14 changes: 7 additions & 7 deletions client/package.json
@@ -19,14 +19,14 @@
"@bibliothecadao/eternum": "workspace:^",
"@cartridge/connector": "0.5.6",
"@cartridge/controller": "0.5.6",
-"@dojoengine/core": "1.0.4-alpha.3.1.0",
-"@dojoengine/create-burner": "1.0.4-alpha.3.1.0",
-"@dojoengine/react": "1.0.4-alpha.3.1.0",
+"@dojoengine/core": "1.0.4-alpha.3.1.1",
+"@dojoengine/create-burner": "1.0.4-alpha.3.1.1",
+"@dojoengine/react": "1.0.4-alpha.3.1.1",
"@dojoengine/recs": "^2.0.13",
-"@dojoengine/state": "1.0.4-alpha.3.1.0",
-"@dojoengine/torii-client": "1.0.4-alpha.3.1.0",
-"@dojoengine/torii-wasm": "1.0.4-alpha.3.1.0",
-"@dojoengine/utils": "1.0.4-alpha.3.1.0",
+"@dojoengine/state": "1.0.4-alpha.3.1.1",
+"@dojoengine/torii-client": "1.0.4-alpha.3.1.1",
+"@dojoengine/torii-wasm": "1.0.4-alpha.3.1.1",
+"@dojoengine/utils": "1.0.4-alpha.3.1.1",
"@headlessui/react": "^1.7.18",
"@latticexyz/utils": "^2.0.0-next.12",
"@radix-ui/react-collapsible": "^1.1.1",
30 changes: 30 additions & 0 deletions client/src/dojo/entityWorker.js
@@ -0,0 +1,30 @@
import { debounce } from "lodash";

let entityBatch = {};
let logging = false;
const DEBOUNCE_DELAY = 1000;

let debouncedSendBatch = debounce(() => {
  if (Object.keys(entityBatch).length > 0) {
    console.log("Worker: Sending batch", entityBatch);
    self.postMessage({ updatedEntities: entityBatch });
    entityBatch = {};
  }
}, DEBOUNCE_DELAY);
Comment on lines +7 to +13
🛠️ Refactor suggestion

Add error handling and consistent logging.

The batch sending logic needs improvement in error handling and logging consistency:

  1. Console.log should respect the logging flag
  2. Add error handling for postMessage
  3. Consider using const for the debounced function
-let debouncedSendBatch = debounce(() => {
+const debouncedSendBatch = debounce(() => {
   if (Object.keys(entityBatch).length > 0) {
-    console.log("Worker: Sending batch", entityBatch);
+    if (logging) console.log("Worker: Sending batch", entityBatch);
+    try {
       self.postMessage({ updatedEntities: entityBatch });
       entityBatch = {};
+    } catch (error) {
+      if (logging) console.error("Worker: Failed to send batch", error);
+    }
   }
 }, DEBOUNCE_DELAY);

Committable suggestion skipped: line range outside the PR's diff.


self.onmessage = async (e) => {
  const { type, entities, logging: logFlag } = e.data;
  if (type === "update") {
    logging = logFlag;
    if (logging) console.log("Worker: Received entities update");
    handleUpdate(entities.fetchedEntities, entities.data);
  }
};
Comment on lines +15 to +22
⚠️ Potential issue

Add input validation and error handling for messages.

The message handler needs proper validation and error handling:

  1. Remove unnecessary async since there are no await operations
  2. Add validation for required message properties
  3. Add error handling for malformed messages
-self.onmessage = async (e) => {
+self.onmessage = (e) => {
+  try {
+    if (!e.data || typeof e.data !== 'object') {
+      throw new Error('Invalid message format');
+    }
+
     const { type, entities, logging: logFlag } = e.data;
+
+    if (!type || !entities) {
+      throw new Error('Missing required message properties');
+    }
+
     if (type === "update") {
       logging = logFlag;
       if (logging) console.log("Worker: Received entities update");
       handleUpdate(entities.fetchedEntities, entities.data);
     }
+  } catch (error) {
+    if (logging) console.error("Worker: Message processing error", error);
+    self.postMessage({ error: error.message });
+  }
 };


function handleUpdate(fetchedEntities, data) {
  entityBatch[fetchedEntities] = {
    ...entityBatch[fetchedEntities],
    ...data,
  };
  debouncedSendBatch();
}
Comment on lines +24 to +30
🛠️ Refactor suggestion

Ensure uniqueness of dictionary key for fetchedEntities.

Using fetchedEntities directly as a property name can be risky if the object is not a string or has unexpected formatting. Consider converting it into a string or generating a unique key to avoid collisions or unexpected indexing.

-function handleUpdate(fetchedEntities, data) {
-  entityBatch[fetchedEntities] = {
-    ...entityBatch[fetchedEntities],
-    ...data,
-  };
+function handleUpdate(fetchedEntities, data) {
+  const entityKey = String(fetchedEntities);
+  entityBatch[entityKey] = {
+    ...entityBatch[entityKey],
+    ...data,
+  };
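A minimal, runnable sketch of the keying concern the suggestion raises (the `batch` object and helper names here are assumptions for illustration, not the PR's exports): `String()` coercion means an array of IDs and its comma-joined string map to the same property, so distinct updates merge under one key.

```javascript
// Sketch of the batch/key behavior discussed above (names assumed).
const batch = {};

function handleUpdate(fetchedEntities, data) {
  // String() coercion: an array like ["0x1", "0x2"] becomes the key "0x1,0x2"
  const entityKey = String(fetchedEntities);
  batch[entityKey] = { ...batch[entityKey], ...data };
}

handleUpdate(["0x1", "0x2"], { a: 1 }); // array key
handleUpdate("0x1,0x2", { b: 2 });      // string key (collides with the array above)
```

Both calls land under the single key `"0x1,0x2"`, which is why an explicit, stable key scheme is preferable to implicit coercion.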

115 changes: 115 additions & 0 deletions client/src/dojo/indexedDB.ts
@@ -0,0 +1,115 @@
import { Component, Metadata, Schema } from "@dojoengine/recs";
import { setEntities } from "@dojoengine/state";
import { Entities } from "@dojoengine/torii-client";

const DB_NAME = "eternum-db";
const DB_VERSION = 1;

function openDatabase(): Promise<IDBDatabase> {
  let db: IDBDatabase;

  return new Promise((resolve, reject) => {
    const request: IDBOpenDBRequest = indexedDB.open(DB_NAME, DB_VERSION);

    request.onupgradeneeded = (event: IDBVersionChangeEvent) => {
      const db = (event.target as IDBOpenDBRequest).result;
      if (!db.objectStoreNames.contains("entities")) {
        db.createObjectStore("entities", { keyPath: "id" });
      }
    };

    request.onsuccess = (event: Event) => {
      db = (event.target as IDBOpenDBRequest).result;
      resolve(db);
    };

    request.onerror = (event: Event) => {
      console.error("Database error:", (event.target as IDBOpenDBRequest).error);
      reject((event.target as IDBOpenDBRequest).error);
    };
  });
}

async function syncEntitiesFromStorage<S extends Schema>(
  dbConnection: IDBDatabase,
  components: Component<S, Metadata, undefined>[],
): Promise<void> {
  return new Promise((resolve, reject) => {
    const transaction = dbConnection.transaction(["entities"], "readonly");
    const store = transaction.objectStore("entities");
    const request = store.getAll();

    request.onsuccess = () => {
      const entities = request.result;

      const CHUNK_SIZE = 50000;

      // Process entities in chunks
      for (let i = 0; i < entities.length; i += CHUNK_SIZE) {
        const chunk = entities.slice(i, i + CHUNK_SIZE);
        const chunkMap: Entities = {};

        for (const entity of chunk) {
          const { id, ...data } = entity;
          chunkMap[id] = data;
        }

        setEntities(chunkMap, components, false);
      }
Comment on lines +37 to +58
🛠️ Refactor suggestion

Consider streaming or asynchronous iteration for large datasets.
Processing 50,000 entries per chunk may be memory-intensive for large databases. If the dataset grows significantly, consider streaming or an asynchronous iteration approach (e.g., openCursor) to manage memory usage more gracefully.


      resolve();
    };

    request.onerror = () => {
      console.log("Error fetching entities from storage:", request.error);
      reject(request.error);
    };
  });
}
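The slice-based chunk loop above can be factored into a small generator; this is a hedged sketch (the `chunked` helper is an assumption, not part of the PR). A cursor-based variant, as the review suggests, would go further and replace `store.getAll()` with `store.openCursor()` so the full result set is never materialized at once.

```typescript
// Sketch: isolate the chunking performed above into a reusable generator (assumed helper).
function* chunked<T>(items: T[], size: number): Generator<T[]> {
  for (let i = 0; i < items.length; i += size) {
    yield items.slice(i, i + size); // one bounded slice at a time
  }
}

// e.g. iterate batches of 50_000 entities:
// for (const chunk of chunked(entities, 50_000)) { ... }
```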

async function insertEntitiesInDB(db: IDBDatabase, entities: Entities): Promise<void> {
  return new Promise((resolve, reject) => {
    const transaction = db.transaction(["entities"], "readwrite");
    const store = transaction.objectStore("entities");

    let error: Error | null = null;

    // Handle transaction completion
    transaction.oncomplete = () => {
      if (error) {
        reject(error);
      } else {
        resolve();
      }
    };

    transaction.onerror = () => {
      reject(transaction.error);
    };

    // Store each entity
    for (const [entityId, data] of Object.entries(entities)) {
      const entityData = {
        id: entityId,
        ...data,
      };

      const request = store.put(entityData);

      request.onerror = () => {
        error = request.error;
      };
    }
  });
}
Comment on lines +70 to +104
🛠️ Refactor suggestion

Consolidate transaction-level error handling.
Here, you track write errors per entity but also rely on the top-level transaction error event. In some scenarios, partial success with multiple items can be tricky. Consider capturing partial success more explicitly (e.g., storing or returning the IDs that fail to persist). This can provide better feedback in case a subset of entities fails to be inserted.
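One way to surface partial success, as suggested, is to collect per-entity outcomes and fold them into a summary the caller can act on. The shapes below are hypothetical illustrations, not part of the PR:

```typescript
// Hypothetical result shape for reporting which entity writes failed (assumed).
interface InsertResult {
  inserted: string[];
  failed: { id: string; reason: string }[];
}

// Pure helper: summarize per-entity outcomes collected from the put() requests.
function summarizeWrites(results: { id: string; error?: string }[]): InsertResult {
  return {
    inserted: results.filter((r) => !r.error).map((r) => r.id),
    failed: results.filter((r) => r.error).map((r) => ({ id: r.id, reason: r.error as string })),
  };
}
```

Resolving the promise with such a summary (instead of void) would let callers retry or report only the failed subset.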


async function clearCache() {
  Object.keys(localStorage)
    .filter((x) => x.endsWith("_query"))
    .forEach((x) => localStorage.removeItem(x));

  indexedDB.deleteDatabase(DB_NAME);
  location.reload();
}

Comment on lines +106 to +114
🛠️ Refactor suggestion

Reevaluate forcing page reload after cache removal.
Calling “location.reload()” may disrupt users if done unexpectedly (for instance, if triggered by a background or user-driven action). Depending on your application’s workflow, consider a more graceful approach—such as notifying the user that a reload is needed, or elegantly reinitializing the UI after clearing the cache.
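A sketch of the gating idea (function and parameter names are assumptions): since `IDBFactory.deleteDatabase` returns a request whose handlers fire asynchronously, the reload can wait for the success event instead of racing the deletion.

```typescript
// Sketch: reload only after deletion reports success (names assumed).
type DeleteRequest = { onsuccess?: () => void; onerror?: () => void };

function clearCacheAndReload(
  deleteDatabase: (name: string) => DeleteRequest, // e.g. (n) => indexedDB.deleteDatabase(n)
  reload: () => void,                              // e.g. () => location.reload()
): void {
  const req = deleteDatabase("eternum-db");
  req.onsuccess = () => reload(); // safe point to refresh the UI
  req.onerror = () => console.error("Cache deletion failed; keeping current session");
}
```

Injecting the two callbacks also keeps the logic testable without a browser environment.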

export { clearCache, insertEntitiesInDB, openDatabase, syncEntitiesFromStorage };
36 changes: 11 additions & 25 deletions client/src/dojo/queries.ts
@@ -4,35 +4,11 @@ import { Component, Metadata, Schema } from "@dojoengine/recs";
import { getEntities } from "@dojoengine/state";
import { PatternMatching, ToriiClient } from "@dojoengine/torii-client";

// on hexception -> fetch below queries based on entityID

// background sync after load ->

export const syncPosition = async <S extends Schema>(
  client: ToriiClient,
  components: Component<S, Metadata, undefined>[],
  entityID: string,
) => {
  await getEntities(
    client,
    {
      Keys: {
        keys: [entityID],
        pattern_matching: "FixedLen" as PatternMatching,
        models: ["s0_eternum-Position"],
      },
    },
    components,
    [],
    [],
    30_000,
  );
};

export const addToSubscriptionTwoKeyModelbyRealmEntityId = async <S extends Schema>(
  client: ToriiClient,
  components: Component<S, Metadata, undefined>[],
  entityID: string[],
  db: IDBDatabase,
) => {
  await getEntities(
    client,
@@ -54,13 +30,16 @@ export const addToSubscriptionTwoKeyModelbyRealmEntityId = async <S extends Sche
    [],
    [],
    30_000,
    false,
    { dbConnection: db, timestampCacheKey: `entity_two_key_${entityID}_query` },
  );
};

export const addToSubscriptionOneKeyModelbyRealmEntityId = async <S extends Schema>(
  client: ToriiClient,
  components: Component<S, Metadata, undefined>[],
  entityID: string[],
  db: IDBDatabase,
) => {
  await getEntities(
    client,
@@ -82,13 +61,16 @@ export const addToSubscriptionOneKeyModelbyRealmEntityId = async <S extends Sche
    [],
    [],
    30_000,
    false,
    { dbConnection: db, timestampCacheKey: `entity_one_key_${entityID}_query` },
  );
};

export const addToSubscription = async <S extends Schema>(
  client: ToriiClient,
  components: Component<S, Metadata, undefined>[],
  entityID: string[],
  db: IDBDatabase,
  position?: { x: number; y: number }[],
) => {
  const start = performance.now();
@@ -121,6 +103,8 @@ export const addToSubscription = async <S extends Schema>(
    [],
    [],
    30_000,
    false,
    { dbConnection: db, timestampCacheKey: `entity_${entityID}_query` },
Comment on lines +106 to +107
🛠️ Refactor suggestion

Ensure stable or unique cache keys for multiple entities.
When an array of entities is passed, the interpolation 'entity_${entityID}_query' might produce unexpected collisions if entityID is itself an array. Consider a more robust approach, such as JSON-stringifying the array or hashing it.

  );
  const end = performance.now();
  console.log("AddToSubscriptionEnd", end - start);
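For the cache-key concern above, a stable key could be derived explicitly rather than by interpolating the raw array. This helper is a hypothetical illustration, not part of the PR:

```typescript
// Hypothetical helper: derive an order-independent cache key from entity IDs.
function cacheKey(prefix: string, ids: string[]): string {
  return `${prefix}_${[...ids].sort().join("-")}_query`; // sorted so [a,b] and [b,a] match
}
```

With this, `cacheKey("entity", ["0x2", "0x1"])` and `cacheKey("entity", ["0x1", "0x2"])` produce the same key, avoiding collisions and ordering surprises; hashing the sorted list would serve equally well for very long ID arrays.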
@@ -129,6 +113,7 @@
export const addMarketSubscription = async <S extends Schema>(
  client: ToriiClient,
  components: Component<S, Metadata, undefined>[],
  db: IDBDatabase,
) => {
  const start = performance.now();
  await getEntities(
@@ -145,6 +130,7 @@
    [],
    30_000,
    false,
    { dbConnection: db, timestampCacheKey: "market_query" },
  );
  const end = performance.now();
  console.log("MarketEnd", end - start);