Data Engineer
Bengaluru
Job No. atci-4074278-s1581559
Full-time
Job Description
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : GCP Dataflow
Good to have skills : Google BigQuery
Minimum 3 year(s) of experience is required
Educational Qualification : Any Graduate
Summary:
As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems. You will play a crucial role in managing and optimizing data infrastructure to support business needs and enable data-driven decision-making.

Roles & Responsibilities:
- Expected to perform independently and become a subject-matter expert (SME).
- Actively participate in and contribute to team discussions.
- Contribute to solutions for work-related problems.
- Design and develop scalable data pipelines to extract, transform, and load data from various sources.
- Ensure data quality and integrity by implementing data validation and cleansing processes.
- Collaborate with cross-functional teams to understand data requirements and design efficient data models.
- Optimize data infrastructure and performance to support data-driven decision-making.
- Troubleshoot and resolve data-related issues and provide technical support to stakeholders.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in GCP Dataflow and Google BigQuery.
- Strong understanding of data engineering concepts and best practices.
- Experience with cloud-based data platforms such as Google Cloud Platform (GCP).
- Hands-on experience with data integration and ETL tools.
- Familiarity with data warehousing and data modeling techniques.
- Knowledge of programming languages such as Python or Java.

Additional Information:
- The candidate should have a minimum of 3 years of experience in GCP Dataflow.
- This position is based at our Bengaluru office.
- Any Graduate qualification is required.
Qualifications
Any Graduate
Please be informed that at any given point in time, you can only have one "Active" application.