Update website/docs/sql-reference/aggregate-functions/sql-array-agg.md
nghi-ly authored Nov 10, 2023
1 parent c77fe1e commit 5013b67
Showing 1 changed file with 1 addition and 1 deletion.
@@ -59,4 +59,4 @@ Looking at the query results—this makes sense! We’d expect newer orders to l
There are definitely too many use cases of the ARRAY_AGG function in your dbt models to list out here, but it’s very likely that ARRAY_AGG is used fairly far downstream in your <Term id="dag" />, since bundling your data up earlier in your DAG works against modularity and <Term id="dry">dryness</Term>. A few downstream use cases for ARRAY_AGG:

- In [`export_` models](https://www.getdbt.com/open-source-data-culture/reverse-etl-playbook) that are used to send data to platforms via a <Term id="reverse-etl" /> tool, to pare down multiple rows into a single row. Some downstream platforms, for example, require values that we’d usually keep as separate rows to instead be one single row per customer or user. ARRAY_AGG is handy for bringing multiple column values together by a single id, such as creating an array of all items a user has ever purchased and sending that array downstream to an email platform to build a custom email campaign (see the sketch after this list).
-- Similar to export models, you may see ARRAY_AGG used in [mart tables](https://docs.getdbt.com/best-practices/how-we-structure/4-marts) to create final aggregate arrays per a singular dimension; performance concerns of ARRAY_AGG in these likely larger tables can potentially be bypassed with use of [incremental models in dbt](https://docs.getdbt.com/docs/build/incremental-models).
+- Similar to export models, you may see ARRAY_AGG used in [mart tables](/best-practices/how-we-structure/4-marts) to create final aggregate arrays per a singular dimension; performance concerns of ARRAY_AGG in these likely larger tables can potentially be bypassed with use of [incremental models in dbt](/docs/build/incremental-models).
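
A minimal sketch of the pattern described above, paring order-line rows down to one row per customer; the `orders` and `order_items` models and all column names here are hypothetical and only meant to illustrate how ARRAY_AGG might look in a downstream mart or export model:

```sql
-- Hypothetical downstream model: one row per customer, with an array of every
-- product they have purchased. Model and column names are illustrative only.
select
    orders.customer_id,
    array_agg(order_items.product_name) as purchased_products
from {{ ref('orders') }} as orders
inner join {{ ref('order_items') }} as order_items
    on orders.order_id = order_items.order_id
group by 1
```

If a table like this grows large, materializing it as an incremental model can limit how much data ARRAY_AGG has to rescan on each run.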
