python - Airflow: How to SSH and run BashOperator from a different server -


Is there a way to SSH to a different server and run a BashOperator using Airbnb's Airflow? I am trying to run a Hive SQL command with Airflow, but I need to SSH to a different box in order to run the Hive shell. My tasks should look like this:

  1. SSH to server1
  2. Start the Hive shell
  3. Run a Hive command

Thanks!

I think I figured it out:

  1. Create an SSH connection in the UI under Admin > Connections. Note: the connection will be deleted if you reset the database.

  2. In the Python file, add the following:

    from airflow.contrib.hooks import SSHHook
    sshHook = SSHHook(conn_id=<YOUR CONNECTION ID FROM THE UI>)
  3. Add the SSH operator task:

    t1 = SSHExecuteOperator(
        task_id="task1",
        bash_command=<YOUR COMMAND>,
        ssh_hook=sshHook,
        dag=dag)
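Putting the steps above together, a complete DAG file might look like the sketch below. This assumes Airflow 1.x with the contrib `SSHExecuteOperator`; the connection id `server1_ssh` and the Hive query are placeholders you would replace with your own.

```python
# Minimal sketch of a DAG that runs a Hive query over SSH.
# Assumptions: Airflow 1.x contrib modules, and an SSH connection
# named "server1_ssh" (hypothetical) already created in the UI.
from datetime import datetime

from airflow import DAG
from airflow.contrib.hooks import SSHHook
from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator

sshHook = SSHHook(conn_id="server1_ssh")  # hypothetical connection id

dag = DAG(
    dag_id="hive_over_ssh",
    start_date=datetime(2017, 1, 1),
    schedule_interval=None,
)

# Rather than starting an interactive Hive shell, run the query
# non-interactively with `hive -e` so the task can exit cleanly.
run_hive = SSHExecuteOperator(
    task_id="run_hive_query",
    bash_command='hive -e "SHOW TABLES;"',  # placeholder query
    ssh_hook=sshHook,
    dag=dag,
)
```

Note that in later Airflow releases the contrib `SSHExecuteOperator` was replaced by `SSHOperator` (with an `ssh_conn_id` parameter), so check which version you are on before copying this pattern.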

Thanks!

