python - Airflow: How to SSH and run BashOperator from a different server
Is there a way to SSH to a different server and run a BashOperator using Airbnb's Airflow? I am trying to run a Hive SQL command with Airflow, but I need to SSH to a different box in order to run the Hive shell. My tasks should look like this:
- SSH to server1
- Start the Hive shell
- Run the Hive command
Thanks!
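Before Airflow enters the picture, note that the three steps above collapse into a single remote command: `ssh` runs one command string, and `hive -e` starts the Hive shell, runs one query, and exits. A minimal sketch in plain Python (the host name and query are hypothetical placeholders):

```python
import shlex


def build_remote_hive_command(host, query):
    """Compose the argv for running a Hive query on a remote host over SSH.

    Because `hive -e` runs the query and exits, the three manual steps
    (ssh, start hive, run the command) become one remote invocation.
    """
    # Quote the query so spaces and special characters survive the
    # remote shell.
    remote_cmd = "hive -e {}".format(shlex.quote(query))
    return ["ssh", host, remote_cmd]


# Hypothetical host and query, matching the steps in the question.
argv = build_remote_hive_command("server1", "SELECT COUNT(*) FROM logs;")
print(argv)
```

This argv could be handed to `subprocess.run`, and it is also the shape of the `bash_command` that the Airflow answer below hands to its SSH task.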
I think I figured it out:

1. Create an SSH connection in the UI under Admin > Connections. Note: the connection will be deleted if you reset the database.

2. In the Python DAG file, add the following:

    from airflow.contrib.hooks import SSHHook
    sshHook = SSHHook(conn_id=<YOUR CONNECTION ID FROM THE UI>)

3. Add the SSH operator task:

    t1 = SSHExecuteOperator(
        task_id="task1",
        bash_command=<YOUR COMMAND>,
        ssh_hook=sshHook,
        dag=dag)

Thanks!
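Putting the pieces together, a complete DAG file might look like the sketch below. It assumes the pre-2.0 contrib API used in this answer (`SSHExecuteOperator` was later replaced by `SSHOperator` in the SSH provider package); the connection id `my_ssh_conn`, the DAG id, and the Hive query are placeholders, not names from the original post:

```python
from datetime import datetime

from airflow import DAG
from airflow.contrib.hooks import SSHHook
from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator

# Connection id as configured under Admin > Connections in the UI
# (hypothetical name).
sshHook = SSHHook(conn_id="my_ssh_conn")

dag = DAG(
    dag_id="remote_hive_query",
    start_date=datetime(2016, 1, 1),
    schedule_interval=None,
)

# `hive -e` runs the query and exits, so one remote command covers all
# three steps: ssh to the box, start the Hive shell, run the query.
t1 = SSHExecuteOperator(
    task_id="task1",
    bash_command='hive -e "SELECT COUNT(*) FROM my_table;"',
    ssh_hook=sshHook,
    dag=dag,
)
```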