Required Skills and Experience:
• 5+ years of experience in data engineering or software development.
• Strong proficiency in Java and experience building scalable data pipelines.
• Hands-on experience with Apache NiFi for data ingestion and orchestration.
• Expertise in Apache Flink for real-time data processing and streaming.
• Familiarity with distributed systems, data lake architectures, and big data ecosystems (e.g., Hadoop, Kafka, Spark).
• Strong understanding of data integration, ETL processes, and data transformation techniques.
• Experience with cloud platforms (e.g., Azure) is a plus.
• Excellent problem-solving skills and a proactive approach to identifying and resolving issues.
• Strong communication and collaboration skills to work effectively in a cross-functional team.