Set up Airflow + sample DAG + simple Flask server on a Google Kubernetes Engine cluster -- 3

Set up Airflow on a Google Kubernetes Engine cluster:

You need to have Airflow deployment experience and Google Kubernetes Engine experience.

- Airflow 1.10.2 or newer

- MySQL 5.7

- Executor: KubernetesExecutor

- 1 cluster

- 1 node (to be confirmed)

I want to use pure Airflow and open-source libraries. I want to have the option to move the whole project to another provider, so I prefer not to use GKEPodOperator and similar Google-specific components unless there is no alternative.

Part A

Prepare the necessary configuration files (YAML, docker-compose, Helm, ..., Airflow config file) plus full instructions to deploy Airflow on GKE (the cluster already exists; the instructions should cover setup on the existing cluster) with a MySQL DB (Dockerfile will be provided by me, already deployed in the cluster) and the KubernetesExecutor. Persistence, logging, an Ingress controller/LoadBalancer, and port exposure inside and outside the cluster are required. I should be able to access the Airflow webserver to manually run DAGs.

- 1 pod will contain the Airflow webserver, the Airflow scheduler, the MySQL database (with persistence, StatefulSet), and a MongoDB container (Dockerfile will be provided by me); alternatively, use 2 pods (one for Airflow, one for MySQL and MongoDB)

- 1 pod for every task instance. Workers are created dynamically in containers and disappear when the task/DAG finishes

- DAG sync mode: PersistentVolume

- 3 PersistentVolumes: 1 for logs, 1 for DAGs and plugins, 1 for the databases (MySQL and MongoDB); or 1 PersistentVolume with 3 subfolders (one for each)

- Ability to trigger DAG runs via the Airflow REST API

- Ability to pass variables (in JSON/dict format) and data (as pandas DataFrames) when triggering a DAG, and between tasks inside a DAG

- Ability to access the Airflow webserver through the internet (authentication required)
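As a rough sketch, the executor and DAG-sync requirements above map to an airflow.cfg along these lines (for the 1.10 series named in the requirements; the mount paths, image name, PVC names, and namespace are assumptions, not givens):

```ini
[core]
executor = KubernetesExecutor
dags_folder = /opt/airflow/dags          ; on the DAGs/plugins PersistentVolume
base_log_folder = /opt/airflow/logs      ; on the logs PersistentVolume
sql_alchemy_conn = mysql://airflow:<password>@mysql-service:3306/airflow

[kubernetes]
namespace = airflow
worker_container_repository = <your-airflow-image>   ; assumed custom image
worker_container_tag = latest
; DAG sync mode: PersistentVolume (no git-sync)
dags_volume_claim = airflow-dags-pvc
logs_volume_claim = airflow-logs-pvc
delete_worker_pods = True                ; worker pods disappear after the task
```

With this configuration the scheduler launches one short-lived worker pod per task instance, mounting the same DAG and log volumes as the webserver/scheduler pod.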

Part B

Create a sample DAG with 3 tasks (1 PythonOperator, 1 BashOperator, and 1 KubernetesPodOperator) to demo the process (it should use XCom, and include sample code in a plugin to demo how to import external Python scripts into a DAG)

Create instructions and code to test the DAG (trigger it) using the Airflow REST API, or by calling Airflow from bash/Python, with the code running in a container deployed on GKE
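Triggering over the REST API could be sketched as below, against the experimental API that ships with the Airflow 1.10 series; the base URL and DAG id are placeholders, and authentication headers would be added per the webserver's auth setup:

```python
# trigger_dag.py -- sketch of triggering a DAG run over the Airflow 1.10
# experimental REST API. Host, port, and dag id are illustrative placeholders.
import json
import urllib.request


def build_trigger_request(base_url, dag_id, conf):
    """Build the URL and JSON body for a dag_runs POST.

    base_url: webserver address, e.g. "http://airflow-web:8080" (placeholder).
    dag_id:   id of the DAG to trigger.
    conf:     dict passed to the run; tasks can read it from
              context["dag_run"].conf.
    Returns (url, body_bytes).
    """
    url = "{}/api/experimental/dags/{}/dag_runs".format(base_url, dag_id)
    body = json.dumps({"conf": conf}).encode("utf-8")
    return url, body


def trigger(base_url, dag_id, conf):
    """POST the trigger request and return the decoded JSON response."""
    url, body = build_trigger_request(base_url, dag_id, conf)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Usage (inside the cluster, against a live webserver):
# trigger("http://airflow-web:8080", "sample_dag", {"rows": 3})
```

Because `conf` is JSON, a pandas DataFrame would be sent as `df.to_json()` and rebuilt with `pandas.read_json()` on the task side.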


Other details and info required will be discussed as needed

All code should be documented (functions should have comments explaining all parameters and return values, as well as the main parts of the code).
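The documentation convention asked for here could look like this (the function itself is just an illustration, not part of the spec):

```python
def filter_rows(rows, min_value):
    """Keep the rows whose "value" field is at least min_value.

    rows:      list of dicts, each with a numeric "value" key.
    min_value: inclusive lower bound.
    Returns a new list with the qualifying rows, in their original order.
    """
    # Main part: a single pass over the input, keeping order.
    return [row for row in rows if row["value"] >= min_value]
```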

Python 3.6+ should be used

All Python code should have [login to view URL] generated using pipreqs

Instructions should include how to update code without stopping the server (on GKE).

All access over the internet (outside the cluster) should be secure: create the necessary secure connections and provide setup instructions (certificates needed).
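One way to meet this without Google-specific operators is a standard Kubernetes Ingress terminating TLS with a certificate stored as a secret; every name below (host, secret, service, port) is a placeholder:

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: airflow-web
  namespace: airflow
spec:
  tls:
    - hosts:
        - airflow.example.com        # placeholder host
      secretName: airflow-tls        # created from your certificate, e.g.
                                     # kubectl create secret tls airflow-tls \
                                     #   --cert=tls.crt --key=tls.key
  rules:
    - host: airflow.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: airflow-webserver   # Service exposing the webserver
                port:
                  number: 8080
```

The same host would then be used by the REST API clients, with webserver authentication enabled on top.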

Other skills required: Airflow, Flask, Docker, Kubernetes, Google Kubernetes Engine, MySQL

Skills: Docker, Kubernetes, Python


About the employer:
( 3 reviews ) Beirut, Lebanon

Project ID: #22586045

1 freelancer is bidding an average of $250 for this job


Hello. I am an excellent web developer. I read your description carefully. Having various skills in Python, PHP, and Google Kubernetes, I can complete your project. Please send me your message and let us discuss i More

$250 USD in 7 days
(1 review)