$ shohanuzzaman
Infrastructure & Cloud · Dhaka, Bangladesh · available for work
kubectl get engineer

Md. Shohanuzzaman

Associate Software Engineer

DevOps Engineer — Kubernetes, Terraform & CI/CD pipelines

800+ CP problems · 3 AWS certs · 4+ stacks deployed · 2y+ industry experience
md.-shohanuzzaman@portfolio:~
$ cat profile.yaml
name:      Md. Shohanuzzaman
role:      Associate Software Engineer
location:  Dhaka, Bangladesh
focus:     Infrastructure & Cloud
email:     shishirshohanuzzaman@gmail.com
phone:     +880 1713772335
$ cat intro.md

Short introduction

I'm Shohanuzzaman — a software engineer focused on turning raw operational data into reliable, queryable intelligence. Day to day I design data lakehouse systems, write Terraform for full AWS stacks, and automate multi-stack deployments with GitLab CI/CD. I love DevOps ergonomics, Agentic AI patterns, and shipping infrastructure that teammates actually enjoy using. Currently based in Dhaka, Bangladesh, and open to remote engagements.
Data Lakehouse

Medallion-architecture pipelines with ClickHouse, dbt, Airbyte, MinIO and Airflow.

Cloud & IaC

Config-driven Terraform for full-stack AWS environments. Kubernetes on self-managed EC2.
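A config-driven Terraform setup of this kind is usually steered by a single declarative file. The fragment below is a hypothetical sketch of what such a JSON config might look like — the keys and values are illustrative assumptions, not the actual schema:

```json
{
  "project": "example-app",
  "environment": "staging",
  "vpc": { "cidr": "10.0.0.0/16", "azs": 2 },
  "compute": { "instance_type": "t3.medium", "count": 2 },
  "database": { "engine": "aurora-mysql" },
  "dns": { "zone": "example.com" }
}
```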

CI/CD Automation

Shared GitLab pipeline libraries covering .NET, Flutter, React Native, Angular with Docker.
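A shared pipeline library like this is typically consumed through GitLab's `include:` keyword. The snippet below is a hedged sketch of a consumer `.gitlab-ci.yml` — the project path, template name, and variable are placeholders, not the real library layout:

```yaml
# Hypothetical consumer .gitlab-ci.yml; project path and template
# name are illustrative placeholders, not the real library layout.
include:
  - project: "platform/ci-templates"     # shared pipeline library repo
    ref: main
    file: "templates/dotnet-docker.yml"  # stack-specific template

variables:
  DEPLOY_TARGET: ec2                     # consumed by the shared jobs
```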

Agentic AI

LLM-assisted dbt model generation from source schemas; MindsDB natural-language SQL.

$ cat about.md

About me

DevOps & AI/ML Engineer — cloud infrastructure, data pipelines, and intelligent automation.

Motivated software engineer with hands-on experience building and deploying cloud-native data platforms on Kubernetes, specialising in Medallion Architecture (Bronze → Silver → Gold) for banking-scale analytical workloads. Proficient in CDC-based ELT ingestion with Airbyte, columnar data modelling in ClickHouse, dbt-driven transformation pipelines, and pipeline orchestration with Apache Airflow. Experienced in AI-assisted automation — including LLM-based dbt model generation from source schemas — and integrating natural language query interfaces via MindsDB. Skilled in AWS infrastructure provisioning with Terraform and GitLab CI/CD automation across heterogeneous technology stacks. Passionate about competitive programming and driven by the challenge of turning raw operational data into reliable, queryable intelligence.

Tools & Technologies

Data & Warehousing: ClickHouse · dbt · Airbyte OSS · Apache Airflow · Apache Superset · MindsDB · MinIO
Infrastructure & Cloud: Terraform · AWS · GCP · Docker · Kubernetes
CI/CD: GitLab CI/CD · Nginx
Databases: Oracle · MongoDB · MySQL · PostgreSQL · SQLite
Languages: Python · Bash · C++ · Dart · JavaScript · C#
Frameworks: ASP.NET Core · Flutter · React · Next.js

Education

Shahjalal University of Science and Technology, Sylhet
B.Sc. in Computer Science & Engineering

Certifications

Interests

DevOps · Data Lakehouse · Cloud Computing · Agentic AI · Workflow Automation · Competitive Programming
$ ls ./projects

Selected work

Hand-picked projects covering infrastructure, data, and full-stack development.

Banking Data Lakehouse on Kubernetes with CDC-driven Medallion Architecture.

ClickHouse · dbt · Airbyte · Airflow · MinIO +5

Modular config-driven Terraform framework for full-stack AWS environments.

Terraform · AWS IAM · VPC · EC2 · Aurora +5

Shared GitLab CI/CD pipeline library for .NET, Angular, React Native and Flutter.

GitLab CI/CD · AWS S3 · EC2 · Nginx · Let's Encrypt +2

Production desktop healthcare management app used by a medical practitioner.

.NET · WPF · C# · SQL

On-demand mechanic platform with real-time availability.

Flutter · Dart · SQL
$ git log --experience

Work experience

Aug 2025 – Present

Associate Software Engineer

Nifty Coders Pvt. Ltd. · Dhaka, Bangladesh

Designing banking-scale data lakehouse systems and CI/CD automation across multi-stack projects.

  • Designed and deployed a Banking Data Lakehouse on a self-managed single-node Kubernetes cluster (kubeadm + Calico) on AWS EC2, implementing a Kappa-aligned Medallion Architecture with Airbyte, MinIO, ClickHouse, dbt, Airflow, Apache Superset, and MindsDB.
  • Configured dual CDC sources in Airbyte OSS — Oracle XE via LogMiner and MySQL 8.0 via GTID binlog — landing Parquet into MinIO; ClickHouse reads directly from MinIO via S3-compatible external tables with dbt materialising Silver and Gold.
  • Orchestrated pipelines with Apache Airflow; implemented AI-assisted dbt model generation from source schema definitions.
  • Integrated MindsDB AI agent for natural-language SQL over Gold tables and built Apache Superset dashboards for business metrics and anomaly reporting.
  • Developed Deploy Fusion, a modular config-driven Terraform framework covering the full AWS stack — networking, IAM, compute, managed DBs, CDN, DNS — deployable from a single JSON config.
  • Built a shared GitLab CI/CD pipeline library for multi-stack deployments (.NET, Angular, React Native, Flutter) to AWS EC2 via Docker and ECR, with automated Nginx, DNS, Let's Encrypt SSL and SES notifications.
  • Wrote reusable Bash libraries for AWS STS role assumption, Secrets Manager integration, and EC2 lifecycle management, shared across pipelines.
Kubernetes · ClickHouse · dbt · Airbyte · Airflow · MinIO · MindsDB · Superset · Terraform · AWS · GitLab CI/CD · Docker
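The reusable Bash helpers for STS role assumption described above might look something like this minimal sketch — the function names and the default session name are assumptions for illustration, not the actual library:

```shell
#!/usr/bin/env bash
# Illustrative sketch of an STS role-assumption helper; function names
# and defaults are assumptions, not the real pipeline library.
set -euo pipefail

# Convert `aws sts assume-role` JSON output into export statements.
# Kept as a pure function so it can be exercised without AWS access.
credentials_to_exports() {
  python3 -c '
import json, sys
c = json.load(sys.stdin)["Credentials"]
print("export AWS_ACCESS_KEY_ID=" + c["AccessKeyId"])
print("export AWS_SECRET_ACCESS_KEY=" + c["SecretAccessKey"])
print("export AWS_SESSION_TOKEN=" + c["SessionToken"])
'
}

# Assume a role and emit exports for the calling job to eval.
assume_role() {
  local role_arn="$1" session_name="${2:-ci-pipeline}"
  aws sts assume-role \
    --role-arn "$role_arn" \
    --role-session-name "$session_name" \
    --output json | credentials_to_exports
}
```

A pipeline job would then run `eval "$(assume_role "$ROLE_ARN")"` before any subsequent AWS CLI calls.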
$ curl -X POST /contact

Get in touch

Whether it's a DevOps engagement, a data platform build-out, or an AI-automation project — drop a line and I'll get back to you within a day.

Response time
< 24h
on weekdays
Location
Dhaka, Bangladesh
UTC+6 · remote-friendly
Availability
Open
accepting new engagements
FAQ
> What time zones do you work in?

Dhaka (UTC+6). I overlap comfortably with EU mornings and APAC all day; US evenings work fine for syncs.

> Do you take short engagements?

Yes — from a week-long Terraform audit to multi-month platform builds. Whatever fits the problem.

> How fast do you reply?

Usually within 24 hours on weekdays. If it's urgent, mention it in the subject.