My name is Alexander, but most people call me Alex. I’m 29 years old and I’m from Buenos Aires, Argentina. I’m passionate about everything related to data and programming.
Throughout my 7 years of professional experience, I’ve had the privilege of working with diverse companies and people from different parts of the world. My mission has always been to help them find value in data-driven decision-making and to automate their processes.
I have held various roles within the data field, which has given me an excellent understanding of data flow. I started as a Research Analyst, performing root cause analysis and troubleshooting issues in real time. Later, I transitioned to a Product Analyst role at Google, specifically working on the Google My Business product. After that, I took on the role of Operations Analyst, where I conducted studies both on the product and the overall operation and presented metrics directly to Google.
Seeking more technical knowledge, I worked at Fluid Truck as a Data Analyst, where I also gained experience as a Data Engineer. This allowed me to understand the entire process of data extraction, loading, and transformation (ELT), as well as data analysis and the creation of metrics and dashboards.
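The extract-load-transform flow described above can be sketched in a few lines. This is a minimal illustration only: the table, columns, and sample trip data are hypothetical, and SQLite stands in for a real warehouse such as BigQuery, with the final SQL step playing the role a dbt or Dataform model would.

```python
import csv
import io
import sqlite3

# Hypothetical raw export; in practice this would come from an API or file drop.
raw_csv = """trip_id,city,revenue
1,Denver,120.50
2,Boulder,80.00
3,Denver,45.25
"""

# Extract: parse the raw feed into rows.
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Load: land the raw rows in a warehouse table (SQLite as a stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trips (trip_id INTEGER, city TEXT, revenue REAL)")
conn.executemany("INSERT INTO trips VALUES (:trip_id, :city, :revenue)", rows)

# Transform: derive a metric table in SQL, as a dbt/Dataform model would.
conn.execute(
    """
    CREATE TABLE revenue_by_city AS
    SELECT city, ROUND(SUM(revenue), 2) AS total_revenue
    FROM trips
    GROUP BY city
    """
)
result = dict(conn.execute("SELECT city, total_revenue FROM revenue_by_city"))
# result == {'Denver': 165.75, 'Boulder': 80.0}
```

Loading the raw data first and transforming inside the warehouse (rather than before loading) is what distinguishes ELT from classic ETL, and it is the pattern tools like dbt and Dataform are built around.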
After three years at Fluid Truck, I transitioned to a Data Analyst role at Dialpad, where I continue to perform tasks similar to those in my previous position, but with different tools. Additionally, I am currently working with First Day Life Inc. as a Looker Developer, where I develop their Looker ecosystem and ensure that the project follows industry best practices and standards. I also create content for stakeholders to support data-driven decision-making.
Worked extensively in the GCP environment, utilizing tools such as Dataform, BigQuery, Airflow, and Dataplex to manage, process, and analyze large-scale data efficiently.
Applied MySQL in backend development and data analysis, optimizing queries to support robust and scalable data operations.
Leveraged PostgreSQL for backend development and data analysis, ensuring data integrity and performance in diverse projects.
Utilized Python for data ingestion, process automation, and the development of efficient data pipelines.
Used dbt for data transformation.
Designed and implemented DAGs in Apache Airflow to automate data workflows and schedule script executions seamlessly.
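The kind of DAG referred to above can be sketched as a minimal two-task pipeline definition. This is an illustrative configuration fragment, not a production DAG: the DAG id, schedule, and task callables are assumptions for the example.

```python
# Minimal Airflow DAG sketch: two dependent tasks on a daily schedule.
# dag_id, schedule, and the callables below are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from the source system")


def load():
    print("write data into the warehouse")


with DAG(
    dag_id="example_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # The >> operator declares the dependency: extract runs before load.
    extract_task >> load_task
```

Dropping a file like this into Airflow's `dags/` folder is enough for the scheduler to pick it up and run the tasks in dependency order on the declared schedule.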
Applied JavaScript in web development projects and the creation of custom Dataform ecosystems for efficient data workflows.
Developed server-side applications and APIs using Node.js to support dynamic and scalable web applications.
Created interactive dashboards and data visualizations, mostly as a freelancer, to provide actionable insights tailored to specific client needs.
Built dynamic dashboards and data visualizations at Fluid Truck, utilizing LookML to model complex datasets effectively.
Developed dashboards and visualizations for clients and educational purposes, turning complex datasets into clear insights.
Designed and structured web pages using HTML to create intuitive and user-friendly interfaces.
Styled and enhanced web applications with CSS, ensuring responsive and visually appealing designs.
Performed detailed data analysis and created ad-hoc reports for fast decision-making using Excel.
Managed version control and collaborated on projects using GitHub to maintain code integrity and history.
Organized projects and governed data using Docs for documentation, planning, and collaboration.
Ingested data from Zoho into a Data Warehouse through API integrations, ensuring data availability for analysis.
Automated data ingestion workflows from multiple sources into a Data Warehouse using Fivetran.
Utilized Confluence for project management, documentation, task assignment, and SCRUM-based workflows.