business objectives. Utilise business intelligence tools (such as Tableau, Power BI, or similar) to create … Stay abreast of advancements in business intelligence tools, technologies, and methodologies, and recommend … data manipulation. Experience with data visualisation tools such as Tableau, Power BI, or similar.
working with large datasets and data manipulation tools (e.g., Pandas, NumPy) will be beneficial. Familiarity with big-data frameworks (e.g., Hadoop, Spark). Experience with data visualization tools (e.g., Tableau, Power BI, Grafana). Analyse complex …
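To make the "data manipulation with Pandas/NumPy" requirement concrete, here is a minimal sketch; the dataset and column names are invented for the example:

```python
import pandas as pd

# Hypothetical sales records; the columns are made up for this sketch.
df = pd.DataFrame({
    "region": ["north", "south", "north", "south"],
    "units": [10, 4, 7, 12],
    "price": [2.5, 3.0, 2.5, 3.0],
})

# Vectorised column arithmetic on the NumPy-backed columns.
df["revenue"] = df["units"] * df["price"]

# Aggregate per region -- a typical "analyse a large dataset" step.
summary = df.groupby("region")["revenue"].sum()
```

The same groupby/aggregate pattern scales from this toy frame to the large datasets the posting mentions.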
debugged. Working with enterprise collaboration tools such as Confluence, JIRA, etc. Developing technical documentation and artefacts. Working with data quality tools such as Great Expectations. Developing and working …
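Great Expectations centres on declarative checks of this shape; the sketch below imitates the idea in plain pandas (the function names and data are hypothetical illustrations, not the library's actual API):

```python
import pandas as pd

# Toy records; the field names are invented for this sketch.
df = pd.DataFrame({"order_id": [1, 2, 3], "amount": [9.99, 0.0, 15.5]})

def expect_column_values_not_null(frame: pd.DataFrame, column: str) -> bool:
    """Great-Expectations-style check: the column contains no nulls."""
    return not frame[column].isna().any()

def expect_column_values_between(frame: pd.DataFrame, column: str,
                                 low: float, high: float) -> bool:
    """Check that every value falls inside [low, high]."""
    return bool(frame[column].between(low, high).all())

# Each named expectation yields a pass/fail result, like a GE validation run.
results = {
    "order_id not null": expect_column_values_not_null(df, "order_id"),
    "amount in range": expect_column_values_between(df, "amount", 0, 1000),
}
```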
Data Science concepts and principles. Experience in tools like pandas, SciPy, and possibly Pyomo. At least 5 years' experience with Continuous Integration and Delivery tools (e.g. GitLab, Terraform, Ansible, Concourse). Added …
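SciPy and Pyomo both target numerical optimisation work; a minimal SciPy sketch of the kind of task implied (the objective is a toy example chosen for illustration):

```python
from scipy.optimize import minimize

# Toy objective: f(x) = (x - 3)^2, minimised at x = 3.
result = minimize(lambda x: (x[0] - 3.0) ** 2, x0=[0.0])
```

Pyomo would express the same problem declaratively and hand it to an external solver, whereas `scipy.optimize.minimize` runs a local gradient-based method (BFGS by default) in-process.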
Strong working knowledge of software development tools, techniques, and approaches used to build application solutions.
visualization/exploration tools (Power BI, Tableau, etc.). Familiarity with common Python-based ETL tools such as PySpark.
warehousing, data lakes, cloud technologies, and ETL tools, we encourage you to apply. Responsibilities: Design … support business needs. Implement ETL processes using tools such as AWS Glue, Azure Data Factory, Matillion …
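AWS Glue, Azure Data Factory, and Matillion all orchestrate the same extract-transform-load pattern; below is a hand-rolled sketch of that pattern under stated assumptions (an inline CSV stands in for the source, SQLite for the warehouse target, and the schema is invented):

```python
import io
import sqlite3

import pandas as pd

# Extract: read raw records (inline CSV stands in for an S3/file source).
raw = io.StringIO("id,amount\n1,10.0\n2,-5.0\n3,7.5\n")
df = pd.read_csv(raw)

# Transform: drop invalid rows and derive a cents column.
df = df[df["amount"] > 0]
df = df.assign(amount_cents=(df["amount"] * 100).astype(int))

# Load: insert into a warehouse table (SQLite stands in for the target).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL, amount_cents INTEGER)")
rows = [(int(i), float(a), int(c))
        for i, a, c in df.itertuples(index=False, name=None)]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
conn.commit()
loaded = conn.execute("SELECT COUNT(*), SUM(amount_cents) FROM sales").fetchone()
```

The managed services replace each stage with a configured connector, but the extract/transform/load boundaries stay the same.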
of Nagios or similar technologies as a monitoring tool. Requirements: Min. 5 years' working experience …
(Handlebars). Proficient understanding of code versioning tools (Git, GitHub). Solid expertise in PostgreSQL and …
development. Access to cutting-edge technologies and tools. Collaborative and inclusive company culture. If