Develop solutions using Python, focusing on data pipelines (ETL, SQL). Support business intelligence efforts with Python programming and building big data pipelines (ETL, SQL). Knowledge of QlikView and awareness of other BI tools.
Gather requirements and design data models. Implement and manage ETL processes for data extraction, transformation, and loading. 10 years of experience in data warehousing, ETL development, and data modeling. Proficiency in SQL database technologies (e.g. Oracle, SQL Server) and ETL tools (e.g. Informatica, Talend). Experience with
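The extraction–transformation–loading cycle these requirements describe can be sketched in a few lines of Python with an embedded SQL store. This is a minimal illustration only: the table, column names, and sample records are invented for the example, and `sqlite3` stands in for a production database such as Oracle or SQL Server.

```python
import sqlite3

# Extract: raw records as they might arrive from a source system
# (field names and values are invented for illustration).
raw_rows = [
    {"id": "1", "amount": " 19.99", "region": "emea"},
    {"id": "2", "amount": "5.00", "region": "AMER"},
]

def transform(row):
    """Normalize types, whitespace, and casing before loading."""
    return (int(row["id"]), float(row["amount"]), row["region"].strip().upper())

# Load: write the cleaned rows into a SQL table and query them back.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [transform(r) for r in raw_rows])

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

In a real pipeline the extract step would read from files, APIs, or an upstream database, and the load step would target the warehouse; the shape of the code stays the same.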
Engineer: You'll drive innovation by developing APIs and ETL pipelines using Python and AWS technologies at BMW. What You'll Do: Develop APIs and ETL pipelines using Python and AWS (Lambda, DynamoDB)
Architecture development. Data modelling, including solid ETL and designing and building packages. Developing interactive
Experience with databases (Oracle/PostgreSQL), PySpark, and Boto3.
Experience with ETL processes, Docker, Linux/Unix, and Big Data tools
Reporting, PowerBI and Business Intelligence, ETL, Azure data processing, and relevant non-SQL data (SSIS, SSAS). Strong hands-on experience with SQL, ETL processes, and data warehousing methodologies. Experience
Understanding of data flows, data architecture, ETL, and the processing of structured and unstructured data.