Creation of a Static Professional Profile Page
This project involved creating a static page to present my professional profile using AWS cloud infrastructure.
Several technologies and services were used: Amazon S3 to host the static files, CloudFront for content distribution with high availability and low latency,
Route 53 for domain management, and ACM (AWS Certificate Manager) for SSL/TLS certificates. The pages were built with HTML, CSS and JavaScript following web development best practices.
Additionally, I used Artificial Intelligence during development to assist in code creation and to translate content into English,
enabling a more global reach.
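To illustrate the deployment step, below is a minimal sketch of publishing the static files to the S3 bucket behind the CloudFront distribution using boto3; the bucket name and file list are hypothetical, and the same result can equally be achieved with the AWS CLI or the console.

```python
import mimetypes
import boto3

# Hypothetical bucket backing the CloudFront distribution.
BUCKET = "my-profile-site-bucket"

s3 = boto3.client("s3")

def publish(files):
    """Upload static files with the correct Content-Type so browsers render them."""
    for path in files:
        content_type, _ = mimetypes.guess_type(path)
        s3.upload_file(
            path,
            BUCKET,
            path,  # keep the S3 key equal to the local relative path
            ExtraArgs={"ContentType": content_type or "binary/octet-stream"},
        )

publish(["index.html", "css/style.css", "js/main.js"])
```

Setting the Content-Type explicitly matters because S3 stores it with the object and CloudFront passes it through to the browser.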
Technologies Used: AWS S3, CloudFront, Route 53, ACM, HTML, CSS, JavaScript, AI for development and translation.
Integration of Azure Environment with Company Headquarters
In this project we interconnected the Azure environment with the company's headquarters using an Azure Virtual Network Gateway for site-to-site VPNs with BGP in an active-active model, routing the links through Barracuda firewalls and configuring link weights for contingency.
A new AD DS instance running Windows Server 2025 was deployed in Azure, added to the existing forest, and the functional level was raised. We migrated Entra ID Connect to the new AD DS instance and propagated the Azure networks via BGP to the partner datacenter networks,
reaching our other cloud environment in GCP. On the AD server in Azure, we also configured the Power BI Data Gateway so dataflows could access private resources such as databases, Samba shares, Trino, MongoDB and others.
All VPN infrastructure was properly tagged, allocated into separate resource groups, and connected to the VPN virtual networks via peering.
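As a rough sketch of the Azure side of this setup, the snippet below provisions an active-active, BGP-enabled virtual network gateway with the Azure SDK for Python; the resource names, region, ASN, SKU and public IP IDs are placeholders, and the local network gateways and connections toward the Barracuda firewalls are omitted.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RG, GW = "rg-vpn", "vgw-hq"             # hypothetical names

client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Active-active, route-based VPN gateway with BGP enabled (two public IPs required).
poller = client.virtual_network_gateways.begin_create_or_update(
    RG,
    GW,
    {
        "location": "eastus",
        "gateway_type": "Vpn",
        "vpn_type": "RouteBased",
        "active_active": True,
        "enable_bgp": True,
        "sku": {"name": "VpnGw2", "tier": "VpnGw2"},
        "bgp_settings": {"asn": 65515},  # hypothetical ASN
        "ip_configurations": [
            {
                "name": f"ipconfig{i}",
                "subnet": {"id": f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RG}"
                                 "/providers/Microsoft.Network/virtualNetworks/vnet-hub"
                                 "/subnets/GatewaySubnet"},
                "public_ip_address": {"id": pip_id},
            }
            for i, pip_id in enumerate(["<public-ip-1-id>", "<public-ip-2-id>"], start=1)
        ],
    },
)
gateway = poller.result()  # long-running operation; gateway provisioning can take ~30-45 minutes
```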
Technologies Used: Azure, Azure Virtual Network Gateway, BGP, Barracuda Firewall, AD DS, Windows Server 2025, Entra ID Connect, Power BI, GCP.
Automation and Monitoring of the Critical QlikView Environment
QlikView powered the company’s main products, accessed daily by external customers for critical business reports and dashboards.
The infrastructure was based on a 3-node Windows Server failover cluster connected to shared storage.
Although designed for high availability, this legacy environment presented several fragilities in practice.
Main pain points included manual maintenance processes, lengthy cluster and load-balancing validations, difficulty restoring services after failures, and limited monitoring that often detected issues only after customer complaints.
This required on-call shifts (including weekends), making QlikView the weakest link in the stack and creating a high risk of unavailability for customers.
Interim solutions such as manual checklists and isolated PowerShell/Bash scripts provided limited gains.
The turning point was implementing an automation and monitoring ecosystem orchestrated by Rundeck, running on an Ubuntu server in GCP connected via Interconnect to datacenters and partners.
The environment included scheduled and manual pipelines using multiple technologies: Bash, Python, PowerShell,
Node.js with Puppeteer, and Ansible (playbooks and ad-hoc commands over SSH and WinRM). Implemented automations included:
- Monitored and forced reboots via iDRAC in case of failure.
- Automated movement of Cluster Shared Volumes (CSV) to maintain continuity.
- Service startup and validation after reboots.
- Integrity checks via QlikView Server health endpoints.
- Load balancing tests with synthetic traffic generated by Puppeteer.
- Chained error handlers to restart web services and recover clusters.
- Continuous monitoring in Python validating product loads and triggering alerts via email and Microsoft Teams.
- Recognition of known failures by hashing error screenshots and executing automatic fixes.
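To illustrate the screenshot-hash recognition above, here is a minimal sketch that hashes a captured error screenshot and, when it matches a catalog of known failures, runs the mapped remediation. The catalog entries, script names and timeout are hypothetical; an exact byte-level match like this assumes the known error screens render deterministically.

```python
import hashlib
import subprocess
from pathlib import Path

# Hypothetical catalog: SHA-256 of a known error screenshot -> remediation command.
KNOWN_FAILURES = {
    "3f5a0c9e...": ["powershell", "-File", "restart-qvs-services.ps1"],
    "a81b44d2...": ["powershell", "-File", "move-csv-owner.ps1"],
}

def identify_and_fix(screenshot: Path) -> bool:
    """Hash the screenshot; if it matches a known failure, run the mapped fix."""
    digest = hashlib.sha256(screenshot.read_bytes()).hexdigest()
    fix = KNOWN_FAILURES.get(digest)
    if fix is None:
        return False  # unknown failure: leave it to the alerting / human path
    subprocess.run(fix, check=True, timeout=600)
    return True
```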
Additionally, a Rundeck credential vault was used to protect credentials and keys, and job-level variables were declared to promote code reuse and standardization.
All scripts included timeouts, retries and fail-safe mechanisms.
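A minimal sketch of that pattern (timeout, retries with backoff, then alerting) applied to the health-endpoint checks; the health URL and Teams webhook below are placeholders for the ones used in the environment.

```python
import time
import requests

HEALTH_URL = "http://qvs-node01/health"                    # placeholder health endpoint
TEAMS_WEBHOOK = "https://example.webhook.office.com/..."   # placeholder incoming webhook

def check_health(retries: int = 3, timeout: int = 10) -> bool:
    """Probe the health endpoint with timeouts and retries before declaring failure."""
    for attempt in range(1, retries + 1):
        try:
            resp = requests.get(HEALTH_URL, timeout=timeout)
            if resp.ok:
                return True
        except requests.RequestException:
            pass
        time.sleep(2 ** attempt)  # simple exponential backoff between retries
    return False

def alert(message: str) -> None:
    """Send an alert to the Teams channel via incoming webhook."""
    requests.post(TEAMS_WEBHOOK, json={"text": message}, timeout=10)

if not check_health():
    alert("QlikView health check failed after retries; triggering recovery job.")
```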
Job logs and statuses were integrated into Power BI Service data pipelines via Gateway, creating real-time failure and availability dashboards.
Code was versioned in Azure DevOps with Markdown documentation for each job and script.
The result was transformative: availability rose to 99.997%, on-call duties were eliminated, and the team gained full predictability and visibility over the environment.
The team could focus on strategic projects while the legacy QlikView environment ceased to be a critical bottleneck.
The approach was later expanded to QlikSense servers with significant improvements as well.
Technologies Used: Rundeck, Ubuntu Server (GCP), Bash, Python, PowerShell, Node.js, Puppeteer, Ansible (SSH/WinRM), iDRAC, Power BI, Azure DevOps.
Expansion of Automations for Corporate Environments and Services
Following the success of automating the legacy QlikView environment, we expanded the approach to other critical environments and services across the company.
The Ubuntu server running Rundeck, hosted in GCP and connected via Interconnect with BGP, already had access to the corporate environments,
enabling centralized, standardized execution of automations.
The first focus was monitoring SQL Server jobs, which frequently hung. Pipelines were created to validate job execution and, when a hang was detected, automatically apply fixes based on previously mapped errors.
Corrections triggered email notifications and Teams alerts to the IT Operations team and to the business Operations teams impacted by those specific job failures.
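A simplified sketch of the hang detection, assuming read access to msdb on the monitored instance; the connection string and runtime threshold are placeholders, and filtering to the latest SQL Server Agent session is omitted for brevity.

```python
from datetime import datetime, timedelta
import pyodbc

# Placeholders: connection string and per-environment runtime threshold.
CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=sql01;"
            "DATABASE=msdb;Trusted_Connection=yes")
MAX_RUNTIME = timedelta(hours=2)

# Jobs that started but have not finished (latest-session filtering omitted).
RUNNING_JOBS_SQL = """
SELECT j.name, a.start_execution_date
FROM msdb.dbo.sysjobactivity AS a
JOIN msdb.dbo.sysjobs AS j ON j.job_id = a.job_id
WHERE a.start_execution_date IS NOT NULL
  AND a.stop_execution_date IS NULL
"""

def hung_jobs():
    """Return (job name, start time) for jobs exceeding the allowed runtime."""
    with pyodbc.connect(CONN_STR) as conn:
        rows = conn.cursor().execute(RUNNING_JOBS_SQL).fetchall()
    now = datetime.now()
    return [(name, started) for name, started in rows if now - started > MAX_RUNTIME]

for name, started in hung_jobs():
    print(f"Job '{name}' running since {started}: applying mapped fix / sending alert")
```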
Active monitoring of FTP and SFTP services was also implemented, including tests for access, upload, write and connectivity to identify issues before they impacted customers and partners.
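A compact sketch of the SFTP probe (the FTP variant with ftplib follows the same access/upload/read-back pattern); host, credentials and the test path are placeholders.

```python
import io
import paramiko

HOST, USER, KEYFILE = "sftp.example.internal", "monitor", "/etc/rundeck/keys/monitor"  # placeholders

def sftp_probe() -> bool:
    """Check connectivity, write (upload) and read-back on the SFTP service."""
    try:
        transport = paramiko.Transport((HOST, 22))
        transport.connect(username=USER, pkey=paramiko.RSAKey.from_private_key_file(KEYFILE))
        sftp = paramiko.SFTPClient.from_transport(transport)
        payload = io.BytesIO(b"rundeck-healthcheck")
        sftp.putfo(payload, "/upload/.healthcheck")       # upload / write test
        with sftp.open("/upload/.healthcheck") as f:      # read-back test
            ok = f.read() == b"rundeck-healthcheck"
        sftp.remove("/upload/.healthcheck")
        transport.close()
        return ok
    except Exception:
        return False
```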
Another milestone was creating self-service jobs that automated day-to-day operational tasks previously done manually by interns or junior analysts.
This removed the need to grant them critical access to portals such as GCP, Cloudflare and internal DNS.
Jobs required input variables, validated syntax and data coherence, and executed tasks in a controlled manner. Examples:
- Automated download of files from GCP buckets.
- DNS registration in Cloudflare with propagation testing and evidence generation for tickets (see the sketch after this list).
- Internal DNS registration with logs and audit trails.
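To illustrate the Cloudflare item above, here is a sketch that creates an A record through the Cloudflare v4 API and then checks propagation against a public resolver with dnspython. The token, zone ID, record values and resolver choice are placeholders.

```python
import requests
import dns.resolver  # dnspython

CF_TOKEN = "<api-token>"  # placeholder scoped API token
ZONE_ID = "<zone-id>"     # placeholder zone

def create_record(name: str, ip: str) -> dict:
    """Create an A record in Cloudflare and return the API response."""
    resp = requests.post(
        f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/dns_records",
        headers={"Authorization": f"Bearer {CF_TOKEN}"},
        json={"type": "A", "name": name, "content": ip, "ttl": 300, "proxied": False},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def propagated(name: str, ip: str) -> bool:
    """Resolve the name against a public resolver to confirm propagation."""
    resolver = dns.resolver.Resolver()
    resolver.nameservers = ["1.1.1.1"]
    try:
        return ip in [r.to_text() for r in resolver.resolve(name, "A")]
    except Exception:
        return False  # treat any resolution error as not yet propagated
```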
For security, execution ACLs based on AD profiles were applied. Users authenticated via LDAP only saw jobs permitted to their group and could execute them only if input data passed validation.
The expansion also included processes integrated with Power BI Service: dataflow and dataset refreshes triggered via the Power Automate API, automatically chaining pipeline steps.
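The refreshes here were driven through Power Automate; for reference, the equivalent direct call against the Power BI REST API looks roughly like the sketch below (the bearer token, workspace and dataset IDs are placeholders).

```python
import requests

TOKEN = "<bearer token for the Power BI API scope>"       # placeholder
GROUP_ID, DATASET_ID = "<workspace-id>", "<dataset-id>"   # placeholders

# Queue a dataset refresh in the given workspace; dataflow refreshes use the
# analogous /dataflows/{id}/refreshes endpoint.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
```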
Jobs for server data cleanup, automatic log backups to GCP buckets and many other automations reduced manual effort and increased operational reliability.
As a result, the company evolved from manual and decentralized processes to a highly automated environment with standardization, security,
traceability and scalability. Gains were not only in availability and response time but also cultural, establishing automation as a core operational pillar.
Technologies Used: Rundeck, Ubuntu Server (GCP), Bash, Python, PowerShell, Node.js, Puppeteer, Ansible, SQL Server, FTP/SFTP,
Cloudflare API, GCP Storage, AD/LDAP, Power Automate, Power BI Service.
Lift-and-Shift Migration to Cloud (Hyper-V → GCP/Azure)
Project to decommission the on-premises Hyper-V environment and perform a lift-and-shift migration of the main servers to the cloud,
preserving application compatibility and configuring the new infrastructure with static IPs, automatic snapshots and the existing integrations intact.
Main scope: Rundeck and Bookstack (KB) migrated to GCP; OCS Inventory NG migrated to Azure.
Planning and Preparation
- Inventory of servers, dependencies and application mapping in Hyper-V.
- Definition of network architecture and static IP assignment per workload at destinations.
- Creation of GCP buckets for staging images during migration.
- Complete budgeting (storage, VMs, traffic, licenses) and presentation to responsible areas.
Execution and Conversions
- Export of Hyper-V VHDX files and upload to migration project buckets.
- Disk conversion to the formats required by each destination (e.g., VHDX → RAW/IMG compatible with GCE/Azure); see the sketch after this list.
- Provisioning VMs in Google Compute Engine (Rundeck/Bookstack) and Azure (OCS Inventory NG), with static IPs and firewall rules.
- Due to a Linux version incompatibility on the OCS server, the Linux VM was rebuilt and the MariaDB database was migrated to the new environment.
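A sketch of the GCE-bound conversion path, assuming qemu-img and the gcloud CLI are available on the migration host; file, image and bucket names are placeholders (the Azure path typically requires a fixed-size VHD instead).

```python
import subprocess

VHDX = "rundeck.vhdx"              # exported from Hyper-V (placeholder name)
BUCKET = "gs://migration-staging"  # placeholder staging bucket

# 1. Convert the Hyper-V disk to a raw image named disk.raw (required for GCE import).
subprocess.run(["qemu-img", "convert", "-f", "vhdx", "-O", "raw", VHDX, "disk.raw"], check=True)

# 2. Package the raw disk as a sparse tarball and upload it to the staging bucket.
subprocess.run(["tar", "--format=oldgnu", "-Sczf", "rundeck.tar.gz", "disk.raw"], check=True)
subprocess.run(["gcloud", "storage", "cp", "rundeck.tar.gz", f"{BUCKET}/rundeck.tar.gz"], check=True)

# 3. Create a GCE image from the uploaded tarball, then boot a VM from it.
subprocess.run(
    ["gcloud", "compute", "images", "create", "rundeck-migrated",
     f"--source-uri={BUCKET}/rundeck.tar.gz"],
    check=True,
)
```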
Configuration and Tests
- Connectivity validations (internal/external), database dependencies and integration checks.
- Refactoring data pipelines for new IPs/hosts to ensure access to dependent databases and services.
- DNS updates (Cloudflare and internal) to point to new IPs and controlled shutdown of old servers.
Service and Operational Adjustments
- OS and application package updates during migration.
- Installation of Webmin on Linux instances for browser-based administration.
- Configuration of automatic snapshots for quick recovery.
Results
- Consolidation in cloud with high availability and simplified recovery via snapshots.
- Modernization of the technology stack and elimination of dependence on local hardware.
- Standardization and simplified management via Webmin and adjusted data pipelines.
- OCS Inventory NG redeployed and fully operational in Azure with MariaDB migrated.
Technologies Used: Hyper-V, VHDX, Google Cloud Platform (Compute Engine, Cloud Storage), Microsoft Azure, static IP, Cloudflare DNS, internal DNS,
Webmin, Linux, Windows Server, LDAP/AD, Rundeck, Bookstack, OCS Inventory NG, MariaDB, automatic snapshots, data pipelines.
Management Dashboard for Monitoring Email Response Time — Finance
This project originated from the finance team's need to monitor and manage response times for emails sent to the team's standard address.
The goal was to provide the area manager with a clear view of SLA, efficiency and per-customer handling.
Planning and Design
- Creation of an Azure Entra ID app for secure authentication and permissions to access the Exchange API.
- Granting the necessary permissions to read email metadata without exposing sensitive content.
- Definition of management metrics: response time, SLA, business days, holidays, view by customer and by email.
ETL and Data Processing
- Connection to the Exchange API to extract email metadata (sender, recipient, send/receive timestamps, subject, etc.); see the sketch after this list.
- Development of dataflows in Power BI Service to perform ETL: filter, clean and transform data for relevant analyses.
- Application of rules to calculate response time considering business days, holidays and SLA defined by the manager.
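A condensed sketch of the metadata extraction, assuming Microsoft Graph as the Exchange Online API and the client-credentials flow for the Entra ID app; tenant, app credentials and the mailbox address are placeholders, and only metadata fields are selected, never message bodies.

```python
import msal
import requests

TENANT, CLIENT_ID, SECRET = "<tenant-id>", "<app-id>", "<client-secret>"  # placeholders
MAILBOX = "finance@example.com"                                           # placeholder address

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT}",
    client_credential=SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])["access_token"]

# Pull only metadata fields; message bodies are never requested.
url = (f"https://graph.microsoft.com/v1.0/users/{MAILBOX}/messages"
       "?$select=subject,from,toRecipients,receivedDateTime,sentDateTime&$top=100")
messages = []
while url:
    page = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30).json()
    messages.extend(page.get("value", []))
    url = page.get("@odata.nextLink")  # follow paging until exhausted
```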
Management Dashboard
- Creation of interactive dashboards in Power BI Service, with charts, detailed email lists and key performance indicators (KPIs).
- Dynamic filters by customer, email and period.
- Automatic updates via scheduling, keeping data current without manual intervention.
- Calculated metrics: average response time, pending emails, emails responded within SLA, individual performance per owner.
Results and Benefits
- Complete visibility of the finance team's email handling performance.
- Reduced manual analysis and monitoring time for the manager.
- Easy identification of bottlenecks and customers with higher demand.
- Project delivered successfully, generating positive feedback and increased confidence in data-driven strategic decisions.
Technologies Used: Azure Entra ID, Exchange Online API, Power BI Service, Power BI Dataflows, ETL, interactive dashboards, SLA calculations and management metrics.
Commercial and Negotiation Dashboard — RD Station Integration
Project requested by the commercial area to fill gaps in analysis and specific views that were not directly available in RD Station.
The goal was to generate strategic insights for Sales, Finance and Executive Management, centralizing information in Power BI Service dashboards.
Planning and Design
- Mapping RD Station API endpoints relevant to sales, negotiations and opportunities.
- Controls to avoid rate limits and excessive requests, ensuring ETL process stability (see the sketch after this list).
- Development of an API monitoring dashboard with alerts about approaching request limits to enable preventive action.
- Separation of dimension dataflows and fact dataflows for organization and later consolidation in the dataset.
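A sketch of the rate-limit handling, written against a generic paginated REST endpoint: the URL and token parameter below are placeholders rather than the exact RD Station contract.

```python
import time
import requests

BASE_URL = "https://crm.example.com/api/v1/deals"  # placeholder for the RD Station endpoint
TOKEN = "<api-token>"                              # placeholder

def fetch_page(page: int, max_retries: int = 5) -> dict:
    """GET one page, backing off politely when the API signals rate limiting (HTTP 429)."""
    for attempt in range(max_retries):
        resp = requests.get(BASE_URL, params={"token": TOKEN, "page": page}, timeout=30)
        if resp.status_code == 429:
            # Respect Retry-After when present, otherwise use exponential backoff.
            wait = int(resp.headers.get("Retry-After", 2 ** attempt))
            time.sleep(wait)
            continue
        resp.raise_for_status()
        return resp.json()
    raise RuntimeError("rate limit not cleared after retries")
```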
ETL and Data Processing
- Advanced ETL to clean, transform and unify data, creating complex measures and indicators.
- Historical capture of negotiations via Power Automate, calling the API weekly to retrieve previous states.
- Storage of semi-structured JSON files in SharePoint with access control, avoiding NoSQL complexity.
- Mashups inside Power BI Service dataflows to consolidate historical and current data, allowing analysis of date, value and negotiation changes.
Dashboard and Power BI App
- Interactive dashboards with KPIs, charts and advanced indicators.
- Detailed filters by seller, customer, lost deals, time periods and other relevant criteria.
- An app within Power BI Service centralizing all commercial area analyses.
- Automated updates via Power Automate to ensure data from the shared Azure server was refreshed regularly, even during periods of instability.
Results and Benefits
- The dashboard became the central reference for commercial strategic meetings.
- Ability to analyze past negotiations, price changes and delays, providing actionable insights.
- API monitoring ensured ETL continuity and prevented interruptions in the data flow.
- Support for sales, finance and executive decisions, making the project a pillar of data analysis.
Technologies Used: Power BI Service, Power BI Dataflows, Power Automate, SharePoint, RD Station API, ETL, JSON, interactive dashboards, KPIs, API monitoring.
Inventory and License Control Dashboard — Microsoft 365 & Security
Project aimed at centralizing and automating inventory and license control, as well as monitoring security resources like Intune, Defender and Defender for Endpoint Server.
The goal was to provide comprehensive management reports for Finance and IT, giving visibility into costs, usage and registry inconsistencies.
Planning and Design
- Creation of an Azure Entra ID app for secure authentication and endpoint access.
- Mapping Microsoft endpoints (users, licenses, direct managers, roles and registry metadata).
- Integration with OCS Inventory NG's MariaDB, the Bookstack API (corporate KB) and Kanbanize (card system) for cross-referencing information.
- Building advanced data mashups to consolidate user, machine, license and cost information.
ETL and Data Processing
- Creation of dataflows in Power BI Service to transform, clean and consolidate data.
- Automated cross-referencing to produce financial reports for Microsoft 365 license allocation and inventory, considering depreciation, warranty, brand, model and device condition.
- Querying Microsoft endpoints to obtain the SKU ID table and generate friendly license names for reports (see the sketch after this list).
- Development of per-department and per-user cost calculations with detailed resource and license cost views.
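A sketch of the SKU-name step, assuming an app-only Microsoft Graph token (obtained as in the earlier mail-metadata example); paging via @odata.nextLink is omitted and the flattening shown is illustrative.

```python
import requests

TOKEN = "<app-only Graph token>"  # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# 1. SKU table for the tenant: skuId -> skuPartNumber (e.g. "SPE_E3").
skus = requests.get("https://graph.microsoft.com/v1.0/subscribedSkus",
                    headers=HEADERS, timeout=30).json()["value"]
sku_names = {s["skuId"]: s["skuPartNumber"] for s in skus}

# 2. Users with their assigned licenses and org metadata for the cost breakdown.
users = requests.get(
    "https://graph.microsoft.com/v1.0/users"
    "?$select=displayName,department,assignedLicenses&$top=999",
    headers=HEADERS, timeout=30,
).json()["value"]

# 3. Flatten to (user, department, license) rows ready for the dataflow.
rows = [
    (u["displayName"], u.get("department"), sku_names.get(lic["skuId"], lic["skuId"]))
    for u in users
    for lic in u.get("assignedLicenses", [])
]
```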
Data Monitoring and Governance
- Implementation of an inconsistency monitoring system identifying:
  - Non-standard names
  - Unused or depreciated machines
  - Devices assigned to more than one person
  - Outdated operating systems
  - Users without a manager, department or KB profile
  - Name mismatches in Kanbanize
- Daily alerts sent by email to the responsible team for quick correction and to keep the environment organized.
Dashboard and Reports
- Interactive dashboards published on SharePoint for detailed inventory, license and cost views.
- Managed and automated reports for Finance with scheduled distribution.
- Compliance and inventory health indicators, improving governance and reducing registry errors.
Results and Benefits
- Complete visibility of the technology park and Microsoft 365 licenses, with cost and usage breakdowns.
- Proactive detection of inconsistencies and registry issues, preventing audit and financial process failures.
- Automated reports and dashboards reducing manual compilation and analysis time.
- Centralized information in dashboards and SharePoint enabling faster, more reliable strategic decisions.
Technologies Used: Azure Entra ID, Power BI Service, Power BI Dataflows, MariaDB (OCS Inventory), Bookstack API, Kanbanize API, SharePoint, interactive dashboards, inconsistency monitoring, advanced ETL.
Mapping License Inconsistencies — QlikView and QlikSense
Project aimed at identifying and correcting inconsistencies in QlikView and QlikSense license assignments to avoid exceeding contracted licenses and to ensure active users have proper access.
Planning and Design
- Identification of issues: deactivated users still assigned licenses and active users without licenses.
- Definition of extraction and cross-referencing processes between systems to produce inconsistency reports.
Automation with Rundeck
- All reports were extracted via Rundeck jobs, ensuring scheduled and manual execution when needed.
- AD and application user reports were collected automatically and saved to Samba shares or CSV files.
- These datasets were consumed by Power BI Service to cross-check active users against assigned licenses and generate consistent paginated reports.
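A minimal sketch of the cross-check itself, assuming the two CSV extracts produced by the Rundeck jobs; file names and column names are hypothetical.

```python
import pandas as pd

# Extracts produced by the Rundeck jobs (hypothetical file and column names).
ad = pd.read_csv("ad_users.csv")              # columns: username, enabled
licensed = pd.read_csv("licensed_users.csv")  # column: username

merged = ad.merge(licensed.assign(has_license=True), on="username", how="outer")
merged["enabled"] = merged["enabled"].fillna(False).astype(bool)
merged["has_license"] = merged["has_license"].fillna(False).astype(bool)

# Inconsistencies: disabled users still holding a license, active users without one.
disabled_with_license = merged[(~merged["enabled"]) & (merged["has_license"])]
active_without_license = merged[(merged["enabled"]) & (~merged["has_license"])]

disabled_with_license.to_csv("report_disabled_with_license.csv", index=False)
active_without_license.to_csv("report_active_without_license.csv", index=False)
```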
QlikSense
- Rundeck connected to the QlikSense PostgreSQL repository database and extracted the licensed users.
- Reports were combined with active OU users from AD and sent to Power BI Service via Azure Data Gateway.
- Daily paginated reports were generated and distributed to the responsible team for proactive license correction.
QlikView
- Because there was no API or direct database access, a Node.js script with Puppeteer was created to access the QlikView Management Console (QMC), navigate to the licensed users and export the HTML table to CSV.
- The CSV was processed similarly: crossed with AD, consumed by Power BI Service and used to generate a daily paginated report.
Results and Benefits
- Effective control of QlikView and QlikSense licenses, preventing contract overruns.
- Complete automation of data collection and processing via Rundeck, reducing manual errors.
- Consistent automatic reports in Power BI Service enabling proactive license maintenance.
- Integration with management dashboards for continuous analysis and monitoring.
Technologies Used: Rundeck, PostgreSQL, Node.js, Puppeteer, Power BI Service, Azure Data Gateway, Active Directory, CSV, report automation, data mashups.
Hybrid Active Directory Automation — Dynamic Groups & Provisioning
Project focused on automating hybrid Active Directory management by synchronizing additional fields and enabling dynamic groups based on user metadata rules.
Planning and Design
- Synchronization of additional local AD fields to Azure AD, expanding automation and provisioning possibilities.
- Creation of dynamic groups based on rules derived from user metadata (see the sketch after this list).
- Automation of user provisioning to SSO and third-party applications, including auto-provisioning systems like GCP's Cloud Identity.
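For illustration, a dynamic security group driven by a metadata rule can be created through Microsoft Graph roughly as below; the group name, membership rule and required permission are shown as examples, not as the exact rules used in the project.

```python
import requests

TOKEN = "<app-only Graph token with Group.ReadWrite.All>"  # placeholder

# Hypothetical rule: every enabled user in the Finance department.
group = {
    "displayName": "dyn-finance-users",
    "mailEnabled": False,
    "mailNickname": "dyn-finance-users",
    "securityEnabled": True,
    "groupTypes": ["DynamicMembership"],
    "membershipRule": '(user.department -eq "Finance") and (user.accountEnabled -eq true)',
    "membershipRuleProcessingState": "On",
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/groups",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=group,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["id"])  # the new group can then drive license and app assignment
```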
Integration with Microsoft 365
- Dynamic groups feed both security groups and mail-enabled groups in Microsoft 365.
- Automatic assignment of Microsoft 365 licenses and Defender for Business based on dynamic group membership.
- Implementation of retention rules and management of mail group size limits to prevent delivery issues caused by excessively large memberships.
Automation and Benefits
- Reduction of manual work in group and license administration.
- Ensured consistency and compliance with security and provisioning policies.
- Full automation of licensing, SSO and group management processes, minimizing errors and increasing efficiency.
- Integration with hybrid infrastructure, allowing provisioning policies to be applied transparently across on-premises and cloud environments.
Technologies Used: Azure AD, Hybrid Active Directory, PowerShell, Microsoft 365, Defender for Business, Dynamic Groups, SSO, Cloud Identity (GCP), email retention rules, provisioning automation.
Immersion Project in Networks and Telecommunications
This project aimed to provide technical and theoretical training for internal employees, exploring networks and telecommunications.
Besides increasing sector understanding, the initiative aligned participants with the products and solutions sold by the organization.
The program was structured as biweekly two-hour sessions covering topics from basics to advanced subjects, always contextualized with practical applications.
Each session's materials were converted into digital content to ensure knowledge preservation and accessibility.
Methodology and Deliverables:
- Creation of a technical glossary and detailed handouts.
- Development of didactic content and interactive presentations.
- Recording and editing presentations, published on a private YouTube channel.
- Integration of content into a Moodle platform for learning management.
- Application of exams and automatic issuance of digital certificates.
Results:
- High participant engagement rates.
- An overall rating of "very good", with participants highlighting a significant increase in technical knowledge.
- Strengthening of a continuous learning culture and alignment with company products.
Technologies and Resources Used: Moodle, YouTube (private channel), video production and editing, interactive presentations, glossary and digital handouts.