I have a server (Debian 10) running Docker containers for Airflow and Spark, both attached to the same Docker network. I also installed the Spark provider package in Airflow. However, I am not able to run a SparkSubmitOperator task in Airflow; it keeps failing with an error. I need somebody to take a look at the setup and identify the issue, or to suggest a better configuration.
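For reference, here is a minimal sketch of how I understand the pieces are supposed to fit together. This is an assumption about the setup, not my exact files: the container name `spark-master`, the port `7077`, the connection id `spark_default`, and the application path are all placeholders that would need to match the real environment.

```python
# Minimal example DAG using SparkSubmitOperator (apache-airflow-providers-apache-spark).
# Assumptions (adjust to your setup):
#   - The Spark master container is reachable from the Airflow container as
#     "spark-master" on the shared Docker network, listening on port 7077.
#   - An Airflow connection "spark_default" exists pointing at that master,
#     e.g. created with:
#       airflow connections add spark_default \
#           --conn-type spark --conn-host spark://spark-master --conn-port 7077
#   - /opt/airflow/jobs/example_job.py is a PySpark script mounted into the
#     Airflow container (spark-submit runs from the Airflow side, so the
#     application file must be visible there).
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="spark_submit_example",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,  # trigger manually while debugging
    catchup=False,
) as dag:
    submit_job = SparkSubmitOperator(
        task_id="submit_example_job",
        conn_id="spark_default",                      # must point at spark://spark-master:7077
        application="/opt/airflow/jobs/example_job.py",
        verbose=True,                                 # surfaces the spark-submit output in the task log
    )
```

A common source of the error in this kind of setup is that the default `spark_default` connection points at `yarn` or `localhost` rather than the Spark master container, or that the `spark-submit` binary and a matching Java runtime are not installed inside the Airflow container image; checking the full task log for the exact exception would narrow it down.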