Merge pull request #169 from poap-xyz/development
Release of 12 December 2023
actuallymentor authored Dec 12, 2023
2 parents fc21e15 + 2631ec6 commit bc6b613
Showing 45 changed files with 2,721 additions and 840 deletions.
7 changes: 7 additions & 0 deletions .github/workflows/deploy-changes.yml
@@ -192,6 +192,13 @@ jobs:
echo -e "${{ secrets.DOTENV_PRODUCTION }}" > .env
sed -i "s;%%cypress_cloud_service_url%%;${{ secrets.CYPRESS_CLOUD_SERVICE_URL }};g" currents.config.js
# Disable emulator for testing
- name: Disable emulator for testing
if: steps.changed-frontend-files.outputs.any_changed == 'true'
run: |
sed -i 's/^REACT_APP_useEmulator=.*$//g' .env
sed -i 's/^VITE_useEmulator=.*$//g' .env
# Check linting
- name: Check for linting errors
if: steps.changed-frontend-files.outputs.any_changed == 'true'
12 changes: 12 additions & 0 deletions .ncurc.js
@@ -0,0 +1,12 @@
/* ///////////////////////////////
// .ncurc configures how ncu behaves by default
// docs: https://www.npmjs.com/package/npm-check-updates
// /////////////////////////////*/
module.exports = {
upgrade: false,
target: 'minor',
doctorTest: 'npm test',
reject: [
'cypress' // After Cypress 11 a bunch of issues were introduced. It can no longer access non-localhost:3000 URLs (like emulators at localhost:8080) and doesn't allow "cross-origin" redirects to localhost:5051
]
}
42 changes: 37 additions & 5 deletions README.md
@@ -10,6 +10,19 @@ Product owner: @actualymentor

[Internal documentation](https://www.notion.so/poap/POAP-Kiosk-formerly-QR-Dispenser-3956e66a0b0742d49dab58e7b2fd0644)

[Password vault (Engineering folder)](https://team-poap.1password.com/vaults/qo2bydganq3dw7dzedyi6h3fwu/allitems/2ey6mleluwnp4mnxq4rkqcknpm)

## Table of contents

- [Requirements](#requirements)
- [Frontend usage](#frontend-usage)
- [Backend usage](#backend-usage)
- [Maintenance](#maintenance)
- [Tech stack](#tech-stack)
- [Developer flow](#developer-flow)
- [Setting up new Firebase projects](#setting-up-new-firebase-projects)


## Requirements

- Have [nvm](https://github.com/nvm-sh/nvm) installed as a Node.js version manager
@@ -25,19 +38,37 @@ Product owner: @actualymentor
1. Populate `.env.development`
1. CHOICE MOMENT. Either:
1. Run the firebase functions backend locally, see next section
1. Comment out the `REACT_APP_useEmulator` line in `.env.development` (this will make the frontend use the live backend)
1. Comment out the `VITE_useEmulator` line in `.env.development` (this will make the frontend use the live backend)
1. `npm start`
1. `npm run cypress` to run tests locally; if they all pass then your setup was successful

## Backend usage

1. `cd functions`
2. `nvm use`
3. `npm i`
4. Populate `functions/.env.development` with the 1Password note `[ function/.env.development ] POAP Kiosk functions`
5. `npm run serve`

## Maintenance

Most of the internal and external packages we use are semver compliant, which is `major.minor.patch` and can be read as `danger.attention.safe`. So `v1.2.5` -> `v1.9.535` is safe, but `v1.2.5` -> `v2.0.0` is dangerous.

Minor versions are upgraded in bulk, blindly, and tested automatically. Major versions require you to read the release notes of the dependency to check whether the announced breaking changes affect us; only after that do we upgrade and test them. You can usually find release notes by going to the package's GitHub repository or searching for `package_name changelog`; sometimes you need to find the `CHANGELOG.md` in the repo. Pay special attention to major version bumps (scroll to the part where the major version changes, don't just read the latest release notes).
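
As an illustration of the policy above, the upgrade decision roughly boils down to the following sketch (a hypothetical snippet, not part of the codebase):

```js
// Illustrative only: classify a version bump according to the policy above.
// Assumes plain `vMAJOR.MINOR.PATCH` version strings.
const classify_bump = ( from, to ) => {

    const [ from_major ] = from.replace( /^v/, '' ).split( '.' ).map( Number )
    const [ to_major ] = to.replace( /^v/, '' ).split( '.' ).map( Number )

    // Major bumps are dangerous: read the dependency's release notes first
    if( to_major > from_major ) return 'danger'

    // Minor and patch bumps are upgraded in bulk and verified by the automated tests
    return 'safe'

}

classify_bump( 'v1.2.5', 'v1.9.535' ) // 'safe'
classify_bump( 'v1.2.5', 'v2.0.0' ) // 'danger'
```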

**Dependency bumps (frontend)**

- Minor updates are done by running `npm run upgrade`
- Major updates are done by running `ncu --doctor -u PACKAGE1,PACKAGE2,...`, which will incrementally install each package and run tests to check that everything works
- Double-check that the emulator suite is running if your `.env.development` includes `VITE_useEmulator=true`

**Dependency bumps (backend)**

- Updates are assisted by `ncu`, tested locally after bumps, then tested in CI through pull requests

## Tech stack

App code based on `create-react-app`, styling based on `styled-components`, routing using `react-router`.
App code based on `vite`, styling based on `styled-components`, routing using `react-router`.

Backend runs on a Firebase project.

Expand All @@ -56,9 +87,10 @@ This app follows the [POAP git flow]( https://app.gitbook.com/o/-Mdt3oJeD814je5S
To prevent the psycho from pursuing you:

1. Write for clarity and comprehension.
2. Leave a "what this does" comment for **every** function.
3. Be as descriptive as possible in commits and PRs.
1. Write for clarity and comprehension; self-documenting code does not exist
2. Leave a "what this does" comment for **every** function, preferably in JSDoc (see the sketch after this list)
3. Be as descriptive as possible in commits and PRs
4. Assume the next person working on your code is much more junior than you
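
A minimal sketch of what point 2 looks like in practice (the function below is a hypothetical example, not an actual Kiosk helper):

```js
/**
 * Shortens a wallet address for display by keeping only the first and last characters.
 * @param {string} address - the full 0x address
 * @param {number} [visible=4] - how many characters to keep on each side
 * @returns {string} a shortened address like `0x1234...abcd`
 */
const shorten_address = ( address, visible=4 ) => {

    // Leave non-addresses (like ENS names) untouched
    if( !address?.startsWith( '0x' ) ) return address

    // Keep the `0x` prefix plus `visible` characters, then the last `visible` characters
    return `${ address.slice( 0, visible + 2 ) }...${ address.slice( -visible ) }`

}
```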

## Setting up new Firebase projects

29 changes: 15 additions & 14 deletions cypress/e2e/user-claimer/4-mint-poap-within-kiosk.js
@@ -4,14 +4,25 @@ context( "Minting POAPs within Kiosk", () => {

it( "Rejects invalid claim codes", () => {

cy.on( 'window:alert', response => {
expect( response ).to.contain( 'Failed' )
} )

cy.visit( `/mint/not-a-valid-claim-code` )
cy.get( "#address-to-mint-to" ).type( eth_address )
cy.contains( "Collect your POAP" ).click()
// Show error Toast
cy.contains( "Failed" )

// Show failed landing
cy.contains( "Oh no! Something went wrong." )
} )

it( "Auto-mint fails with an invalid claim code", () => {

cy.visit( `/mint/not-a-valid-claim-code?user_address=${ eth_address }` )

// Show error Toast
cy.contains( "Failed" )

// Show failed landing
cy.contains( "Oh no! Something went wrong." )
} )

it( "Mints valid test claim codes", () => {
@@ -20,25 +31,15 @@ context( "Minting POAPs within Kiosk", () => {
cy.get( "#address-to-mint-to" ).type( ens_address )
cy.contains( "Collect your POAP" ).click()
cy.contains( "The minting process has started" )

} )

it( "Auto-mints when the address is passed through the query string", () => {

cy.visit( `/mint/testing-123?user_address=${ eth_address }` )
cy.contains( "The minting process has started" )

} )

it( "Auto-mint fails with an invalid claim code", () => {

cy.on( 'window:alert', response => {
expect( response ).to.contain( 'Failed' )
} )

cy.visit( `/mint/not-a-valid-claim-code?user_address=${ eth_address }` )

} )


} )
19 changes: 8 additions & 11 deletions functions/index.js
@@ -1,12 +1,6 @@
// V1 Dependencies
const functions = require( "firebase-functions" )

// V1 Runtime config
const generousRuntime = {
timeoutSeconds: 540,
memory: '4GB'
}

const { log, dev } = require( './modules/helpers' )
log( `⚠️ Verbose mode on, ${ dev ? '⚙️ dev mode on' : '🚀 production mode on' }` )

@@ -27,10 +21,10 @@ exports.getEventDataFromCode = v1_oncall( getEventDataFromCode )
exports.check_code_status = v1_oncall( check_code_status )

// Refresh all codes ( trigger from frontend on page mount of EventView )
exports.requestManualCodeRefresh = v1_oncall( [ 'memory_512MiB', 'long_timeout' ], refresh_unknown_and_unscanned_codes )
exports.requestManualCodeRefresh = v1_oncall( [ 'memory_512MB', 'long_timeout' ], refresh_unknown_and_unscanned_codes )

// Allow frontend to trigger updates for scanned codes, ( triggers on a periodic interval from EventView ), is lighter than requestManualCodeRefresh as it checks only scanned and claimed == true codes
exports.refreshScannedCodesStatuses = v1_oncall( [ 'memory_512MiB', 'long_timeout' ], refreshScannedCodesStatuses )
exports.refreshScannedCodesStatuses = v1_oncall( [ 'memory_512MB', 'long_timeout' ], refreshScannedCodesStatuses )

// Directly mint a code to an address
const { mint_code_to_address } = require( './modules/minting' )
@@ -40,12 +34,16 @@ exports.mint_code_to_address = v2_oncall( [ 'memory_512MiB', 'long_timeout' ], m
const { recalculate_available_codes_admin } = require( './modules/codes' )
exports.recalculate_available_codes = v2_oncall( recalculate_available_codes_admin )

// On event registration, recalculate available codes
const { recalculate_available_codes } = require( './modules/codes' )
exports.recalculate_available_codes_on_event_registration = functions.runWith( { timeoutSeconds: 540, memory: '4GB' } ).firestore.document( `events/{eventId}` ).onCreate( ( { params } ) => recalculate_available_codes( params.eventId ) )

// ///////////////////////////////
// Event data
// ///////////////////////////////

const { registerEvent, deleteEvent, getUniqueOrganiserEmails } = require( './modules/events' )
exports.registerEvent = v1_oncall( [ 'memory_1GiB', 'long_timeout' ], registerEvent )
exports.registerEvent = v2_oncall( [ 'memory_1GiB', 'long_timeout' ], registerEvent )
exports.deleteEvent = v1_oncall( deleteEvent )

// Email export to update event organisers
@@ -72,14 +70,13 @@ const { delete_data_of_deleted_event, updatePublicEventData } = require( './modu
const { clean_up_expired_items } = require( './modules/health' )

// Delete items where parents were deleted
exports.clean_up_expired_items = functions.runWith( generousRuntime ).pubsub.schedule( 'every 24 hours' ).onRun( clean_up_expired_items )
exports.clean_up_expired_items = functions.runWith( { timeoutSeconds: 540, memory: '4GB' } ).pubsub.schedule( 'every 24 hours' ).onRun( clean_up_expired_items )
exports.delete_data_of_deleted_event = functions.firestore.document( `events/{eventId}` ).onDelete( delete_data_of_deleted_event )

// Update items where parents were updated
exports.updatePublicEventData = functions.firestore.document( `events/{eventId}` ).onWrite( updatePublicEventData )
exports.updateEventAvailableCodes = functions.firestore.document( `codes/{codeId}` ).onUpdate( updateEventAvailableCodes )


/* ///////////////////////////////
// Security
// /////////////////////////////*/
97 changes: 73 additions & 24 deletions functions/modules/events.js
@@ -63,6 +63,7 @@ async function validate_and_write_event_codes( event_id, expiration_date, codes,

// Parse out codes that are expected to be new, so keep only codes that are not found in the existing_code array
const new_codes = saneCodes.filter( ( { qr_hash } ) => !existing_codes?.find( existing_code => existing_code.qr_hash == qr_hash ) )

// First check if all codes are unused by another event
const code_clash_queue = new_codes.map( code => async () => {

@@ -75,30 +76,78 @@

} )

/* ///////////////////////////////
// Step 2: Throttled code writing, see https://cloud.google.com/firestore/docs/best-practices and https://cloud.google.com/firestore/quotas#writes_and_transactions */

// Check for code clashes in a throttled manner
await Throttle.all( code_clash_queue, { maxInProgress } )

// Load the codes into firestore
const code_writing_queue = saneCodes.map( code => async () => {
/* ///////////////////////////////
// Step 2: Throttled code writing using firestore patches */

// Batch config
const batch_size = 499

// Split into chunks of batch_size
const code_chunks = []
for( let index = 0; index < saneCodes.length; index += batch_size ) {
const chunk = saneCodes.slice( index, index + batch_size )
code_chunks.push( chunk )
}

return db.collection( 'codes' ).doc( code.qr_hash ).set( {
claimed: !!code.claimed,
scanned: false,
amountOfRemoteStatusChecks: 0,
created: Date.now(),
updated: Date.now(),
updated_human: new Date().toString(),
event: event_id,
expires: new Date( expiration_date ).getTime() + weekInMs
}, { merge: true } )
// Create batches for each chunk
const code_batches = code_chunks.map( chunk => {

// Make a batch for this chunk
const batch = db.batch()

// For each entry in the chunk, add a batch set
chunk.forEach( code => {

if( !code ) return

const ref = db.collection( `codes` ).doc( code.qr_hash )
batch.set( ref, {
claimed: !!code.claimed,
scanned: false,
amountOfRemoteStatusChecks: 0,
created: Date.now(),
updated: Date.now(),
updated_human: new Date().toString(),
event: event_id,
expires: new Date( expiration_date ).getTime() + weekInMs
}, { merge: true } )

} )

// Return batch
return batch

} )

// Write codes to firestore with a throttle
await Throttle.all( code_writing_queue, { maxInProgress } )
// Create writing queue
const writing_queue = code_batches.map( batch => () => batch.commit() )

// Write the batches with retry
const { throttle_and_retry } = require( './helpers' )
await throttle_and_retry( writing_queue, maxInProgress, `validate_and_write_event_codes`, 2, 5 )

// Old non-batchified way
// // Load the codes into firestore
// const code_writing_queue = saneCodes.map( code => async () => {

// return db.collection( 'codes' ).doc( code.qr_hash ).set( {
// claimed: !!code.claimed,
// scanned: false,
// amountOfRemoteStatusChecks: 0,
// created: Date.now(),
// updated: Date.now(),
// updated_human: new Date().toString(),
// event: event_id,
// expires: new Date( expiration_date ).getTime() + weekInMs
// }, { merge: true } )

// } )

// // Write codes to firestore with a throttle
// await Throttle.all( code_writing_queue, { maxInProgress } )

// Return the sanitised codes
return saneCodes
@@ -217,9 +266,10 @@ const update_event_data_of_kiosk = async ( kiosk_id, public_kiosk_data ) => {

exports.update_event_data_of_kiosk = update_event_data_of_kiosk

exports.registerEvent = async function( data, context ) {
exports.registerEvent = async request => {

let new_event_id = undefined
const { data } = request

try {

@@ -229,9 +279,6 @@ exports.registerEvent = async function( data, context ) {
// Add a week grace period in case we need to debug anything
const weekInMs = 1000 * 60 * 60 * 24 * 7

// Appcheck validation
throw_on_failed_app_check( context )

// Validations
const { name='', email='', date='', dropId, codes=[], challenges=[], game_config={ duration: 30, target_score: 5 }, css, collect_emails=false, claim_base_url } = data
if( !codes.length ) throw new Error( 'Csv has 0 entries' )
@@ -241,7 +288,7 @@

// Get the ip this request came from
const { get_ip_from_request } = require( './firebase' )
const created_from_ip = get_ip_from_request( context ) || 'unknown'
const created_from_ip = get_ip_from_request( request ) || 'unknown'

// Create event document
const authToken = uuidv4()
@@ -280,9 +327,11 @@
// Check code validity and write to firestore
await validate_and_write_event_codes( id, date, formatted_codes )

// NOTE: this was moved to an onCreate trigger in index.js, leaving this here as a debug breadcrumb in case it did not solve our issue with creating large events
// NOTE: safe to delete after jan 2024
// Calculate publicly available codes for this new event
const { recalculate_available_codes } = require( './codes' )
await recalculate_available_codes( id )
// const { recalculate_available_codes } = require( './codes' )
// await recalculate_available_codes( id )

// Grab the latest drop data from the API; adding the event_data is important because the publicEventData does not exist yet
await update_event_data_of_kiosk( id, event_data )