Application Developer
Hyderabad
Job No. atci-4423684-s1723486
Full-time
Job Description
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : PySpark
Good to have skills : AWS Architecture, Python (Programming Language)
Minimum 5 years of experience is required
Educational Qualification : Any technical graduation
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable and efficient applications.
Key Responsibilities:
• Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions
• Build and operate very large data warehouses or data lakes
• Optimize, design, code, and tune ETL and big data processes using Apache Spark
• Build data pipelines and applications to stream and process datasets at low latencies
• Handle data efficiently: track data lineage, ensure data quality, and improve discoverability of data
Technical Experience:
• Minimum of 1 year of experience in Databricks engineering solutions on the AWS Cloud platform using PySpark
• Minimum of 3 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture and delivery
• Minimum of 2 years of experience in one or more programming languages: Python, Java, Scala
• Experience using Airflow for data pipelines in at least one project
• 1 year of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, shell scripting, and Terraform
• Should be comfortable working in B shift
Qualifications
Any technical graduation
Please note that at any given point in time, you can have only one "Active" application.