Key Responsibilities:
- Design, implement, and optimize big data pipelines using AWS services.
- Ensure data integrity, security, and compliance.
- Collaborate with cross-functional teams to integrate solutions.
- Develop and maintain technical documentation.
- Conduct thorough testing and validation of data solutions.
- Stay updated with industry trends and best practices.
Qualifications:
- Relevant IT/Business/Engineering Degree.
- AWS Certified Cloud Practitioner or equivalent.
- Strong proficiency in Terraform, Python, SQL, and ETL development.
- Experience with Docker, Linux/Unix, and big data technologies.
- Excellent communication and problem-solving skills.
If you're ready to shape the future of data engineering and make an impact, apply now! Exciting challenges and opportunities await as part of our dynamic team.
Apply Now