Key Experiences
Full Experience
Python
Python - Web Scraping - FastAPI - Scrapy - Beautiful Soup - Pandas - Regex - Flask - JSON - Git - Bash - Docker - Kubernetes - Linux - Virtual Machines - ArgoCD - Kubeflow - Streamlit - Selenium - REST APIs - Amazon Web Services - Jupyter Notebooks - Click - PyMongo - Pika
Data Engineer
Apache Airflow - AWS Glue - Data Modeling - ETL - Data Extraction - Data Lakes - Retrieval-Augmented Generation (RAG) - Generative AI - LangChain - Flowise - Amazon EC2 - Amazon Bedrock - GitHub Actions
Saving Data
SQLAlchemy - PostgreSQL - MySQL - MongoDB - Microsoft SQL Server - AWS S3 - Azure Blob Storage - Neo4j - ChromaDB
AI Related
Large Language Models (LLMs) - Computer Vision - Generative AI - Vector Databases - llama.cpp - PyTorch - ChatGPT - Retrieval-Augmented Generation (RAG) - LangChain - Flowise - CUDA - Hugging Face Transformers - OpenAI API - GPT-3/4 - TensorFlow Hub - BERT - DALL·E - Keras - Word2Vec - OpenCV - Detectron2
My Journey
2011
START
I started by creating small apps and games in Unity.
This hands-on experience helped me understand how Object-Oriented Programming (OOP) works and what it takes to build an application.
It made programming feel less like magic and more like a skill I could master.
2014
Small Applications
I created small mobile applications and a first-person shooter (FPS) game that received over 5,000 positive votes on Steam Greenlight.
This experience showed me the real-world potential of programming languages and how they can bring creative ideas to life.
2016
University
I began studying Computer Science to deepen my understanding of technology and learn how to design effective systems.
Mastering the basics has helped me grasp more complex solutions with greater confidence.
2017
First Freelance Jobs
With a solid foundation in programming, I developed small software solutions, including data extraction and automation tools.
This process taught me how to understand client needs, manage time effectively, and deliver results efficiently.
2018
Intern - System Analyst
As an Intellectual Property Protection Analyst, I analyzed and removed illegal content from the internet.
I also worked as a Developer, building Python task-automation tools to streamline processes and strengthen intellectual property protection.
Dec 2019
The real start
To complete my degree, I contributed to the Sparrow Project: an expert system for supervised learning and regression-based inference, designed to identify patterns of products violating intellectual property rights on online marketplaces.
This work was presented at CONIC-SEMESP, Brazil's largest scientific initiation congress, which provided greater visibility for the entire team.
2020
Full-Time Python Developer
My first big challenges involving data extraction.
At Jive, my first major challenge was designing a completely new architecture for all data extraction bots.
This experience taught me how to develop bots for various industries while delivering rich datasets quickly, even with limited resources and a small team.
Despite being a Junior developer, I handled responsibilities and tasks typically assigned to Senior developers, which significantly accelerated my growth and expertise.
Mid 2020
Python Course
I developed a Python course called "Python for Jivers," which over 30 people completed.
The course enabled them to create small automation tools and low-code applications using Python.
This project was especially valuable to me, as it helped me learn how to establish clear, accessible communication with people who had never coded before.
Mar 2021
New Challenges
I joined Kognita Lab, a company focused on AI-driven solutions, where my main challenge was bypassing anti-bot systems, managing proxies, solving captchas, and capturing over one billion rows of data each month.
This experience helped me develop the skills needed to create scalable applications, ensure they remain undetected by anti-bot systems, and maintain high-quality data.
2022
Year of Optimization
At Kognita Lab, the company was undergoing a radical transformation to optimize the infrastructure of multiple processes. As data volumes grew exponentially, I played a key role in migrating geospatial data from PostgreSQL to AWS data lakes.
I used Docker and Kubernetes to manage and scale data extraction pipelines and created a custom private library to handle common functions and project-specific requirements.
This experience enabled me to manage over 70 cost-effective and scalable data pipelines.
Jan 2023
Year of Experiments
As an enthusiast of AI and the early advancements of Large Language Models (LLMs), I began experimenting with LLMs on my personal computer.
I conducted numerous experiments using GPT-3 for data extraction processes, which made me confident that this combination could shape the future of the field.
May 2023
Exploring the unknown
This led me to join GEP Worldwide, where my focus expanded beyond data extraction to integrating AI into challenging solutions that the company had previously been unable to address.
Working with unstructured documents and images allowed me to sharpen my skills in computer vision and natural language processing, while also applying effective methodologies to maximize the potential of AI.
2024
Year of Accomplishments
As AI models became more efficient and cost-effective, they enabled greater accuracy and scalability in AI-driven processes.
I developed several experimental AI tools, including:
Code-Regen: An AI-powered tool that automatically fixes code while it is still running.
Advanced Data Extraction Tools: Combining LLMs’ vision capabilities with OCR tools to extract data from complex images.
Advanced Agentic AI Workflows: Designed sophisticated workflows for data extraction from challenging documents and texts that require deep interpretation.
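The agentic extraction workflows above follow a validate-and-escalate pattern: run a cheap extractor first, validate the result, and fall back to a more expensive LLM-vision pass only on failure. The sketch below is a minimal, self-contained illustration of that loop; `cheap_ocr` and `llm_vision` are hypothetical stand-ins for real OCR and model calls, not actual components of these projects.

```python
import re

def cheap_ocr(document: str) -> str:
    """Stand-in for a fast, cheap OCR pass (e.g. a Tesseract call)."""
    return document  # a real implementation would return raw, possibly noisy text

def llm_vision(document: str) -> str:
    """Stand-in for a slower LLM-vision call, used only when the cheap pass fails."""
    # Pretend the model fixes digit/letter confusion such as 'O' read for '0'.
    return re.sub(r"(?<=\d)O(?=\d)", "0", document)

def parse_invoice_total(text: str):
    """Validation step: pull a well-formed numeric total out of the extracted text."""
    match = re.search(r"TOTAL:\s*(\d+\.\d{2})", text)
    return float(match.group(1)) if match else None

def extract(document: str):
    """Validate-and-escalate loop: escalate to the next extractor only on failure."""
    for extractor in (cheap_ocr, llm_vision):
        total = parse_invoice_total(extractor(document))
        if total is not None:  # validation gate passed
            return {"total": total, "extractor": extractor.__name__}
    return None  # every extractor failed validation

print(extract("INVOICE 42 TOTAL: 1O3.50"))
# -> {'total': 103.5, 'extractor': 'llm_vision'}
```

The design point is that the expensive model is only invoked when the cheap path fails validation, which keeps per-document cost low at scale.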
Additionally, I remained active in AI forums and dedicated over 150 hours to English conversations on Preply, which helped improve my communication skills and expand my global network.
2025
What's next?
I continue to take on challenging projects because I view them as valuable opportunities for growth.
Alongside this, I’ve worked to improve my communication skills. Having achieved a C2 level in English, my next goals are to learn Spanish and French.
I now focus less on the AI hype and more on creating practical, effective solutions with it.