Bridges of LI.FI across chains, the contract is LiFiDiamond_v2 #7172

Merged 46 commits on Dec 9, 2024

Commits (46)
a26fe6c
Add Odos on Arbitrum to dex_aggregator
lequangphu Sep 6, 2024
1d5e1ef
Change to WETH address on Arbitrum
lequangphu Sep 6, 2024
33ebaf0
Add sources of Odos on Arbitrum
lequangphu Sep 6, 2024
906a5ef
Merge branch 'main' into main
Hosuke Sep 8, 2024
2ca58da
Add odos/arbitrum seeds
lequangphu Sep 9, 2024
786b086
Merge branch 'main' of github.com:lequangphu/duneanalytics-spellbook
lequangphu Sep 9, 2024
a3a2c51
Change data type
lequangphu Sep 9, 2024
aa9fba9
Merge branch 'main' into main
lequangphu Sep 10, 2024
b07e65b
Merge branch 'main' into main
Hosuke Sep 10, 2024
60630de
Merge branch 'main' into main
Hosuke Sep 11, 2024
8cdfbd6
Merge branch 'duneanalytics:main' into main
lequangphu Sep 18, 2024
a8f7192
Merge branch 'duneanalytics:main' into main
lequangphu Nov 9, 2024
db1fe05
Merge branch 'duneanalytics:main' into main
lequangphu Nov 22, 2024
85a0295
Add bridges of LiFi across chains
lequangphu Nov 23, 2024
1addb58
replace tests which is deprecated
lequangphu Nov 24, 2024
8c51ec5
fix name error of avalanche source
lequangphu Nov 24, 2024
23f5516
add index to unique test
lequangphu Nov 24, 2024
986119e
fix concat issue
lequangphu Nov 24, 2024
06ccc49
fix unique test of the main model
lequangphu Nov 24, 2024
e78574c
try to fix unique key
lequangphu Nov 24, 2024
2fa1d10
use surrogate key for data tests
lequangphu Nov 25, 2024
338117e
use generate_surrogate_key inside the models
lequangphu Nov 25, 2024
6691146
fix and remove redundant schema properties
lequangphu Nov 25, 2024
50dbfd5
data_tests
lequangphu Nov 25, 2024
72af65d
move generate_surrogate_key to schema.yml
lequangphu Nov 25, 2024
8a5a222
1. create macro lifi_extract_bridge_data_macro.sql
lequangphu Nov 26, 2024
6d6a6f0
data_ again
lequangphu Nov 26, 2024
d904ab2
fix naming of the main model
lequangphu Nov 26, 2024
8f89bad
replace avalanche with avalanche_c
lequangphu Nov 26, 2024
2d46cb1
avalanche_c again
lequangphu Nov 26, 2024
92f3d1d
1. add block_date column to the macro to use it in
lequangphu Nov 26, 2024
1dcdbb5
rename columns to use add_tx_columns macro
lequangphu Nov 26, 2024
c6bfdbf
remove evt_
lequangphu Nov 26, 2024
43978c9
Merge branch 'duneanalytics:main' into main
lequangphu Nov 26, 2024
6a159a8
Merge branch 'main' into main
jeff-dude Nov 27, 2024
02c8aee
a few minor changes
lequangphu Nov 28, 2024
44883fb
add amount_usd column
lequangphu Nov 28, 2024
bea8d6b
data_tests again
lequangphu Nov 28, 2024
8f0852f
map native token to wrapped token to avoid null
lequangphu Dec 3, 2024
6c210a9
update the main model and schema
lequangphu Dec 3, 2024
48ef240
move type conversion from models to macro
lequangphu Dec 4, 2024
695320a
fix source name
lequangphu Dec 4, 2024
ab36cfc
varbinary type doesn't need single quotes
lequangphu Dec 5, 2024
561e083
correct column name
lequangphu Dec 5, 2024
ae53e6c
add post_hook
lequangphu Dec 6, 2024
b449ce7
Merge branch 'main' into main
lequangphu Dec 7, 2024
lifi_extract_bridge_data_macro.sql
@@ -0,0 +1,37 @@
{% macro lifi_extract_bridge_data(blockchain) %}

{% set bridge_data_fields = [
'transactionId',
'bridge',
'integrator',
'referrer',
'sendingAssetId',
'receiver',
'minAmount',
'destinationChainId'
] %}

select
contract_address,
evt_tx_hash as tx_hash,
evt_index,
evt_block_time as block_time,
evt_block_number as block_number,
cast(date_trunc('day', evt_block_time) as date) as block_date,
{% for field in bridge_data_fields %}
{% if field in ['transactionId', 'referrer', 'sendingAssetId', 'receiver'] %}
from_hex(json_extract_scalar(bridgedata, '$.{{ field }}')) as {{ field }},
{% elif field == 'minAmount' %}
cast(json_extract_scalar(bridgedata, '$.{{ field }}') as double) as {{ field }},
{% else %}
json_extract_scalar(bridgedata, '$.{{ field }}') as {{ field }},
{% endif %}
{% endfor %}
'{{ blockchain }}' as source_chain,
{{ dbt_utils.generate_surrogate_key(['evt_tx_hash', 'evt_index']) }} as transfer_id
from {{ source('lifi_' ~ blockchain, 'LiFiDiamond_v2_evt_LiFiTransferStarted') }}
{% if is_incremental() %}
where {{ incremental_predicate('evt_block_time') }}
{% endif %}

{% endmacro %}
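The transfer_id surrogate key generated here is what the PR's uniqueness data tests key on (see the "use surrogate key for data tests" and "move generate_surrogate_key to schema.yml" commits; the schema.yml itself is not part of this excerpt). As an illustration only, the same guarantee could be expressed as a singular dbt data test; the file name below is an assumption, not taken from the diff.

-- Hypothetical singular test, e.g. tests/lifi/assert_transfer_id_unique.sql:
-- returns rows (and therefore fails) if any transfer_id occurs more than once.
select
    transfer_id,
    count(*) as occurrences
from {{ ref('lifi_arbitrum_transfers') }}
group by transfer_id
having count(*) > 1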
lifi_arbitrum_transfers.sql
@@ -0,0 +1,49 @@
{{ config(
schema = 'lifi_arbitrum',
alias = 'transfers',
materialized = 'incremental',
file_format = 'delta',
incremental_strategy = 'merge',
unique_key = ['transfer_id'],
incremental_predicates = [incremental_predicate('DBT_INTERNAL_DEST.block_time')]
)
}}

with source_data as (
{{ lifi_extract_bridge_data('arbitrum') }}
),

tokens_mapped as (
select
*,
case
when sendingAssetId = 0x0000000000000000000000000000000000000000
then 0x82af49447d8a07e3bd95bd0d56f35241523fbab1 -- WETH
when sendingAssetId = 0x3405a1bd46b85c5c029483fbecf2f3e611026e45
then 0xff970a61a04b1ca14834a43f5de4533ebddb5cc8 -- USDC
else sendingAssetId
end as sendingAssetId_adjusted
from source_data
),

price_data as (
select
tokens_mapped.*,
p.price * minAmount / power(10, p.decimals) as amount_usd
from tokens_mapped
left join {{ source('prices', 'usd') }} p
on p.contract_address = tokens_mapped.sendingAssetId_adjusted
and p.blockchain = 'arbitrum'
and p.minute = date_trunc('minute', tokens_mapped.block_time)
{% if is_incremental() %}
and {{ incremental_predicate('p.minute') }}
{% endif %}
)

{{
add_tx_columns(
model_cte = 'price_data'
, blockchain = 'arbitrum'
, columns = ['from']
)
}}
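The add_tx_columns call above pulls the transaction sender into the model as tx_from, the column later selected by the combined view. The macro's definition is not part of this diff; under that caveat, a rough hand-written equivalent would be a join against the chain's raw transactions table, roughly as sketched here (join type and column names are assumptions).

-- Approximate equivalent of add_tx_columns(model_cte = 'price_data', blockchain = 'arbitrum', columns = ['from']);
-- the real macro lives elsewhere in Spellbook, so the details below are assumptions.
select
    pd.*,
    t."from" as tx_from
from price_data as pd
inner join {{ source('arbitrum', 'transactions') }} as t
    on t.hash = pd.tx_hash
    and t.block_number = pd.block_number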
lifi_avalanche_c_transfers.sql
@@ -0,0 +1,47 @@
{{ config(
schema = 'lifi_avalanche_c',
alias = 'transfers',
materialized = 'incremental',
file_format = 'delta',
incremental_strategy = 'merge',
unique_key = ['transfer_id'],
incremental_predicates = [incremental_predicate('DBT_INTERNAL_DEST.block_time')]
)
}}

with source_data as (
{{ lifi_extract_bridge_data('avalanche_c') }}
),

tokens_mapped as (
select
*,
case
when sendingAssetId = 0x0000000000000000000000000000000000000000
then 0xb31f66aa3c1e785363f0875a1b74e27b85fd66c7 -- WAVAX
else sendingAssetId
end as sendingAssetId_adjusted
from source_data
),

price_data as (
select
tokens_mapped.*,
p.price * minAmount / power(10, p.decimals) as amount_usd
from tokens_mapped
left join {{ source('prices', 'usd') }} p
on p.contract_address = tokens_mapped.sendingAssetId_adjusted
and p.blockchain = 'avalanche_c'
and p.minute = date_trunc('minute', tokens_mapped.block_time)
{% if is_incremental() %}
and {{ incremental_predicate('p.minute') }}
{% endif %}
)

{{
add_tx_columns(
model_cte = 'price_data'
, blockchain = 'avalanche_c'
, columns = ['from']
)
}}
lifi_bnb_transfers.sql
@@ -0,0 +1,47 @@
{{ config(
schema = 'lifi_bnb',
alias = 'transfers',
materialized = 'incremental',
file_format = 'delta',
incremental_strategy = 'merge',
unique_key = ['transfer_id'],
incremental_predicates = [incremental_predicate('DBT_INTERNAL_DEST.block_time')]
)
}}

with source_data as (
{{ lifi_extract_bridge_data('bnb') }}
),

tokens_mapped as (
select
*,
case
when sendingAssetId = 0x0000000000000000000000000000000000000000
then 0xbb4CdB9CBd36B01bD1cBaEBF2De08d9173bc095c -- WBNB
else sendingAssetId
end as sendingAssetId_adjusted
from source_data
),

price_data as (
select
tokens_mapped.*,
p.price * minAmount / power(10, p.decimals) as amount_usd
from tokens_mapped
left join {{ source('prices', 'usd') }} p
on p.contract_address = tokens_mapped.sendingAssetId_adjusted
and p.blockchain = 'bnb'
and p.minute = date_trunc('minute', tokens_mapped.block_time)
{% if is_incremental() %}
and {{ incremental_predicate('p.minute') }}
{% endif %}
)

{{
add_tx_columns(
model_cte = 'price_data'
, blockchain = 'bnb'
, columns = ['from']
)
}}
lifi_ethereum_transfers.sql
@@ -0,0 +1,47 @@
{{ config(
schema = 'lifi_ethereum',
alias = 'transfers',
materialized = 'incremental',
file_format = 'delta',
incremental_strategy = 'merge',
unique_key = ['transfer_id'],
incremental_predicates = [incremental_predicate('DBT_INTERNAL_DEST.block_time')]
)
}}

with source_data as (
{{ lifi_extract_bridge_data('ethereum') }}
),

tokens_mapped as (
select
*,
case
when sendingAssetId = 0x0000000000000000000000000000000000000000
then 0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2 -- WETH
else sendingAssetId
end as sendingAssetId_adjusted
from source_data
),

price_data as (
select
tokens_mapped.*,
p.price * minAmount / power(10, p.decimals) as amount_usd
from tokens_mapped
left join {{ source('prices', 'usd') }} p
on p.contract_address = tokens_mapped.sendingAssetId_adjusted
and p.blockchain = 'ethereum'
and p.minute = date_trunc('minute', tokens_mapped.block_time)
{% if is_incremental() %}
and {{ incremental_predicate('p.minute') }}
{% endif %}
)

{{
add_tx_columns(
model_cte = 'price_data'
, blockchain = 'ethereum'
, columns = ['from']
)
}}
lifi_fantom_transfers.sql
@@ -0,0 +1,47 @@
{{ config(
schema = 'lifi_fantom',
alias = 'transfers',
materialized = 'incremental',
file_format = 'delta',
incremental_strategy = 'merge',
unique_key = ['transfer_id'],
incremental_predicates = [incremental_predicate('DBT_INTERNAL_DEST.block_time')]
)
}}

with source_data as (
{{ lifi_extract_bridge_data('fantom') }}
),

tokens_mapped as (
select
*,
case
when sendingAssetId = 0x0000000000000000000000000000000000000000
then 0x21be370d5312f44cb42ce377bc9b8a0cef1a4c83 -- WFTM
else sendingAssetId
end as sendingAssetId_adjusted
from source_data
),

price_data as (
select
tokens_mapped.*,
p.price * minAmount / power(10, p.decimals) as amount_usd
from tokens_mapped
left join {{ source('prices', 'usd') }} p
on p.contract_address = tokens_mapped.sendingAssetId_adjusted
and p.blockchain = 'fantom'
and p.minute = date_trunc('minute', tokens_mapped.block_time)
{% if is_incremental() %}
and {{ incremental_predicate('p.minute') }}
{% endif %}
)

{{
add_tx_columns(
model_cte = 'price_data'
, blockchain = 'fantom'
, columns = ['from']
)
}}
lifi_gnosis_transfers.sql
@@ -0,0 +1,47 @@
{{ config(
schema = 'lifi_gnosis',
alias = 'transfers',
materialized = 'incremental',
file_format = 'delta',
incremental_strategy = 'merge',
unique_key = ['transfer_id'],
incremental_predicates = [incremental_predicate('DBT_INTERNAL_DEST.block_time')]
)
}}

with source_data as (
{{ lifi_extract_bridge_data('gnosis') }}
),

tokens_mapped as (
select
*,
case
when sendingAssetId = 0x0000000000000000000000000000000000000000
then 0xe91d153e0b41518a2ce8dd3d7944fa863463a97d -- WXDAI
else sendingAssetId
end as sendingAssetId_adjusted
from source_data
),

price_data as (
select
tokens_mapped.*,
p.price * minAmount / power(10, p.decimals) as amount_usd
from tokens_mapped
left join {{ source('prices', 'usd') }} p
on p.contract_address = tokens_mapped.sendingAssetId_adjusted
and p.blockchain = 'gnosis'
and p.minute = date_trunc('minute', tokens_mapped.block_time)
{% if is_incremental() %}
and {{ incremental_predicate('p.minute') }}
{% endif %}
)

{{
add_tx_columns(
model_cte = 'price_data'
, blockchain = 'gnosis'
, columns = ['from']
)
}}
lifi_transfers.sql
@@ -0,0 +1,61 @@
{{
config(
schema = 'lifi',
alias = 'transfers',
materialized = 'view',
post_hook='{{ expose_spells(\'[
"arbitrum"
, "avalanche_c"
, "bnb"
, "ethereum"
, "fantom"
, "gnosis"
, "zksync"
]\',
"project",
"lifi",
\'["lequangphu"]\') }}'
)
}}

{% set chains = [
'ethereum',
'arbitrum',
'avalanche_c',
'bnb',
'fantom',
'gnosis',
'zksync'
] %}

with chain_transfers as (
{% for chain in chains %}
select
contract_address,
tx_hash,
evt_index,
block_time,
block_number,
block_date,
transactionId,
bridge,
integrator,
referrer,
sendingAssetId,
receiver,
minAmount,
destinationChainId,
source_chain,
transfer_id,
sendingAssetId_adjusted,
amount_usd,
tx_from
from {{ ref('lifi_' ~ chain ~ '_transfers') }}
{% if not loop.last %}
union all
{% endif %}
{% endfor %}
)

select *
from chain_transfers
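Once merged, this combined view is exposed as a project spell via the expose_spells post_hook. A minimal example query against it on Dune, assuming it surfaces under the schema.alias pair configured above (lifi.transfers); the date filter is only illustrative.

-- Daily bridged volume in USD by source chain.
select
    block_date,
    source_chain,
    sum(amount_usd) as volume_usd
from lifi.transfers
where block_date >= date '2024-11-01'
group by 1, 2
order by 1, 2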