I am a systems engineer trained as an OOP developer, with 5 years of data engineering experience. I have strong analysis, design, and problem-solving skills and a fast-learning aptitude.
My experience spans ETL/ELT development, data engineering, data warehousing, data management, and visualization, at companies of different sizes and on both in-house and third-party projects.
I have also worked with machine learning engineers, and I am currently pursuing a postgraduate degree in this field to gain a better understanding of the full data pipeline landscape.
I am always interested in working in teams, facing challenges, and continuing to improve as a data engineer.
2021
- Support GCP data solutions (Dataproc, Dataflow, BigQuery) with Cloud Monitoring, Cloud Operations, and alerting.
- Design and advise on the construction of data solutions.
2021
- Integrate data from a microservices application built on Kubernetes.
- Select the right tools for data flow, integration, and visualization.
- Enable data integration, exploration, and exploitation across multiple teams and tools (BI, data scientists, marketing teams).
2020-2021
- Migrate 350+ tables from 14 different data sources to the cloud (GCP) to build a data lake and data warehouse.
- Build data pipelines and ELTs with Matillion.
- Build stored procedures in BigQuery and reports in Data Studio.
2020
- Design an end-to-end data pipeline for a data-related project, including extraction, data centralization, and visualization.
- Integrate and visualize information in Splunk.
2019-2020
- Design, test, and maintain data in Snowflake
- Work closely with a data science team, retrieving data and creating queries to feed different predictive models built in Python.
- Create dashboards in Tableau to visualize the predictive models' behavior, results, and precision.
- Identify user fraud from existing data to increase efficiency and reduce costs.
2017-2019
- Designed, tested, maintained, cleaned, and managed a Teradata DW with 2,000+ tables.
- Created efficient and scalable data models compatible with the company's data ecosystem and integrated them into the DW.
- Oversaw the complete ETL process.
- Worked closely with team members, consultants, stakeholders, and the solution architect following an agile methodology.
- Served as lead consultant in the building of new data models and ETLs.
- Improved the performance of existing queries and ETLs.