Hadoop Developer

Candidate must be willing to work in Columbus, IN on a daily basis.

Fusion Alliance is looking for a strong Hadoop Developer who has SQL skills, experience working in Azure, and familiarity with Agile projects. AWS experience is considered a plus for this role.

Unique Qualifications and Competencies:

  • A minimum of 3 years' experience in design and development on a Big Data platform using open-source and third-party tools, including but not limited to Spark, Databricks, Scala, MapReduce, Hive, and Event Hubs; or equivalent coursework with an advanced degree.
  • Proficiency with ODBC and the SQL query language is required.
  • Experience implementing in a Microsoft Azure cloud environment is required.
  • Experience developing in an Amazon Web Services (AWS) cloud environment is a plus.
  • Experience developing SFTP and large-file-movement protocols for a cloud-based environment is required.
  • Experience with Agile software development is desired.

Unique Key Responsibilities:

  • Implement data ingestion, transformation, and storage for various evergreen data sources.
  • Implement data quality and veracity assurance methods, rules, and measures.
  • Implement data pipeline development methods that combine internal Cummins data sources as well as external and third-party sources.