Question: kubernetes init container for spark-submit
I am running spark-submit against a Kubernetes cluster with the Spark 3.2.1 image, and it works. My question is: can I run an init container along with the spark-submit? What I am trying to achieve is an init container that checks whether another service is up; if it is, the spark-submit runs, otherwise it fails.
I can see a conf parameter "spark.kubernetes.initContainer.image" for Spark 2.3 (https://spark.apache.org/docs/2.3.0/running-on-kubernetes.html), but not for 3.2.1.
Is there any mechanism I can use to check whether other services are up before I submit a Spark job?
I can see init container usage for Spark in the links below, but they do not provide an accurate answer:
https://docs.bitnami.com/kubernetes/infrastructure/spark/configuration/configure-sidecar-init-containers/ https://doc.lucidworks.com/spark-guide/11153/running-spark-on-kubernetes
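For reference, what I am hoping for is something along the lines of the pod template mechanism in Spark 3.x (`spark.kubernetes.driver.podTemplateFile`); below is a rough sketch of what I have in mind, not something I have working. The service name `my-service` and port `8080` are placeholders for the actual dependency:

```yaml
# driver-pod-template.yaml -- sketch only; service name and port are placeholders
apiVersion: v1
kind: Pod
spec:
  initContainers:
    - name: wait-for-service
      image: busybox:1.35
      # Block until the dependent service is reachable; the driver's main
      # container will not start until this init container exits successfully.
      command:
        - sh
        - -c
        - "until nc -z my-service 8080; do echo waiting for my-service; sleep 2; done"
```

The idea would then be to pass this template to spark-submit via `--conf spark.kubernetes.driver.podTemplateFile=driver-pod-template.yaml` (and similarly `spark.kubernetes.executor.podTemplateFile` for executors), but I am not sure whether this is the intended replacement for the old 2.3 init container conf.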
Any help will be much appreciated, thanks.