Update SSO index table (resolver) #898

Merged: 2 commits, Oct 17, 2024

8 changes: 5 additions & 3 deletions scheduler/README.md
@@ -3,21 +3,23 @@
Operations for night N start at 21:45 UTC on night N-1. The following cron jobs are scheduled:

```bash
# Paris time @ VD

# Fink real-time
45 23 * * * /home/julien.peloton/fink-broker/scheduler/launch_fink.sh

# Database service
35 20 * * * /home/julien.peloton/fink-broker/scheduler/database_service.sh
30 21 * * * /home/julien.peloton/fink-broker/scheduler/database_auxilliary.sh

# Fink MM
0 01 * * * /home/julien.peloton/Fink_MM/scheduler/science2grb.sh
0 01 * * * /home/julien.peloton/Fink_MM/scheduler/grb2distribution.sh
1 17 * * * /home/julien.peloton/Fink_MM/scheduler/science2grb_offline.sh

# SSOFT - once a month
0 0 1 * * /home/julien.peloton/fink-broker/scheduler/launch_ssoft.sh
0 12 1 * * /home/julien.peloton/fink-broker/scheduler/launch_sso_resolver.sh
```
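
The two SSOFT entries run once a month, on the first day of the month: the table generation at 00:00 and the SSO resolver index update at 12:00 (Paris time). A quick way to check that the entries are active on the scheduler machine (a minimal sketch; it only greps the installed crontab) is:

```bash
# List the installed cron entries and keep the monthly SSOFT-related jobs
crontab -l | grep -E 'launch_ssoft|launch_sso_resolver'
```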

The first script is for live operations:
26 changes: 26 additions & 0 deletions scheduler/launch_sso_resolver.sh
@@ -0,0 +1,26 @@
#!/bin/bash
set -e

source ~/.bash_profile

source ${FINK_HOME}/conf_cluster/fink.conf.ztf_stream2raw

NCORES=8

# Point spark-submit at the Fink egg matching the installed fink-broker and Python versions
FINK_VERSION=`fink --version`
PYTHON_VERSION=`python -c "import platform; print(platform.python_version()[:3])"`
PYTHON_EXTRA_FILE="--py-files ${FINK_HOME}/dist/fink_broker-${FINK_VERSION}-py${PYTHON_VERSION}.egg"

spark-submit \
    --master mesos://vm-75063.lal.in2p3.fr:5050 \
    --conf spark.mesos.principal=$MESOS_PRINCIPAL \
    --conf spark.mesos.secret=$MESOS_SECRET \
    --conf spark.mesos.role=$MESOS_ROLE \
    --conf spark.executorEnv.HOME='/home/julien.peloton' \
    --driver-memory 8G --executor-memory 4G \
    --conf spark.cores.max=$NCORES --conf spark.executor.cores=2 \
    --conf spark.sql.execution.arrow.pyspark.enabled=true \
    --conf spark.kryoserializer.buffer.max=512m \
    --packages ${FINK_PACKAGES} --jars ${FINK_JARS} ${PYTHON_EXTRA_FILE} \
    ${FINK_HOME}/bin/index_sso_resolver.py
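
Unlike launch_ssoft.sh below, this script does not redirect its output to a log file. When running it by hand outside of cron, a simple way to keep a trace (a sketch assuming FINK_HOME points at the fink-broker checkout and that the broker_logs directory used by the other scheduler scripts exists) is:

```bash
# Run the SSO resolver index update manually and keep the driver output
bash ${FINK_HOME}/scheduler/launch_sso_resolver.sh \
    > ${FINK_HOME}/broker_logs/sso_resolver_$(date +"%Y%m%d").log 2>&1
```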
69 changes: 69 additions & 0 deletions scheduler/launch_ssoft.sh
@@ -0,0 +1,69 @@
#!/bin/bash
set -e

source ~/.bash_profile

NCORES=100
AGGREGATE="--pre_aggregate_data"
EXTRA_OPT=

# In case of trouble, fall back to these options (no pre-aggregation, explicit version):
#AGGREGATE=""
#EXTRA_OPT="-version 2023.12"

FINK_VERSION=`fink --version`
PYTHON_VERSION=`python -c "import platform; print(platform.python_version()[:3])"`
PYTHON_EXTRA_FILE="--py-files ${FINK_HOME}/dist/fink_broker-${FINK_VERSION}-py${PYTHON_VERSION}.egg"

# Generate the SSOFT with the SHG1G2 model (pre-aggregation enabled by default)
spark-submit \
    --master mesos://vm-75063.lal.in2p3.fr:5050 \
    --conf spark.mesos.principal=$MESOS_PRINCIPAL \
    --conf spark.mesos.secret=$MESOS_SECRET \
    --conf spark.mesos.role=$MESOS_ROLE \
    --conf spark.executorEnv.HOME='/home/julien.peloton' \
    --driver-memory 8G --executor-memory 4G \
    --conf spark.cores.max=$NCORES --conf spark.executor.cores=2 \
    --conf spark.sql.execution.arrow.pyspark.enabled=true \
    --conf spark.kryoserializer.buffer.max=512m \
    ${PYTHON_EXTRA_FILE} \
    ${FINK_HOME}/bin/generate_ssoft.py \
    -model SHG1G2 $AGGREGATE $EXTRA_OPT > ${FINK_HOME}/broker_logs/ssoft_SHG1G2.log 2>&1

# Generate the SSOFT with the HG1G2 model
spark-submit \
    --master mesos://vm-75063.lal.in2p3.fr:5050 \
    --conf spark.mesos.principal=$MESOS_PRINCIPAL \
    --conf spark.mesos.secret=$MESOS_SECRET \
    --conf spark.mesos.role=$MESOS_ROLE \
    --conf spark.executorEnv.HOME='/home/julien.peloton' \
    --driver-memory 8G --executor-memory 4G \
    --conf spark.cores.max=$NCORES --conf spark.executor.cores=2 \
    --conf spark.sql.execution.arrow.pyspark.enabled=true \
    --conf spark.kryoserializer.buffer.max=512m \
    ${PYTHON_EXTRA_FILE} \
    ${FINK_HOME}/bin/generate_ssoft.py \
    -model HG1G2 $EXTRA_OPT > ${FINK_HOME}/broker_logs/ssoft_HG1G2.log 2>&1

# Generate the SSOFT with the HG model
spark-submit \
    --master mesos://vm-75063.lal.in2p3.fr:5050 \
    --conf spark.mesos.principal=$MESOS_PRINCIPAL \
    --conf spark.mesos.secret=$MESOS_SECRET \
    --conf spark.mesos.role=$MESOS_ROLE \
    --conf spark.executorEnv.HOME='/home/julien.peloton' \
    --driver-memory 8G --executor-memory 4G \
    --conf spark.cores.max=$NCORES --conf spark.executor.cores=2 \
    --conf spark.sql.execution.arrow.pyspark.enabled=true \
    --conf spark.kryoserializer.buffer.max=512m \
    ${PYTHON_EXTRA_FILE} \
    ${FINK_HOME}/bin/generate_ssoft.py \
    -model HG $EXTRA_OPT > ${FINK_HOME}/broker_logs/ssoft_HG.log 2>&1

# Copy the new monthly tables to HDFS as the livy user
sudo su livy <<'EOF'
source ~/.bashrc
YEAR=`date +"%Y"`
MONTH=`date +"%m"`
/opt/hadoop-2/bin/hdfs dfs -put ssoft_SHG1G2_${YEAR}.${MONTH}.parquet SSOFT/
/opt/hadoop-2/bin/hdfs dfs -put ssoft_HG1G2_${YEAR}.${MONTH}.parquet SSOFT/
/opt/hadoop-2/bin/hdfs dfs -put ssoft_HG_${YEAR}.${MONTH}.parquet SSOFT/
EOF

# Move the generated parquet files to the local ssoft area
mv ssoft_*.parquet /spark_mongo_tmp/julien.peloton/ssoft/
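
Once the job has finished, the three monthly tables should be visible both on HDFS and in the local ssoft area. A quick check, reusing the paths from the script above (the SSOFT/ directory on HDFS is relative to the livy user's home):

```bash
# Verify that the monthly SSOFT tables were uploaded and archived
sudo su livy -c '/opt/hadoop-2/bin/hdfs dfs -ls SSOFT/'
ls -lh /spark_mongo_tmp/julien.peloton/ssoft/
```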