Data Engineer (Geospatial)
JOB DESCRIPTION
The Data Engineer will manipulate data and data flows for both existing and new systems. Prior experience must include a geospatial and telemetry focus. The engineer will also provide support in the areas of data extraction, transformation, and loading (ETL), data mapping, analytical support, operational support, database support, and maintenance of data and associated systems. As a member of the team, candidates will work in a multi-tasking, fast-paced, dynamic, process-improvement environment that requires experience with the principles of large-scale (terabyte) database development, large-scale file manipulation, data modeling, data mapping, data testing, data quality, and documentation preparation.
QUALIFICATIONS
Bachelor’s Degree in Computer Science, Electrical or Computer Engineering, or a related technical discipline, or an equivalent combination of education, technical training, and work/military experience
Minimum eight (8) years of related software engineering and ETL experience
REQUIRED KNOWLEDGE/SKILLS
Experience building and maintaining data flows in NiFi, Pentaho, and Kafka
Experience with the following languages and data formats: Java/J2EE, C, C++, SQL, XML, XQuery, XPath, Python, and JSON
Experience with geospatial data visualization using ESRI and related technologies (e.g., Elastic)
Analytic and targeting methodologies related to geospatial data
Familiarity with NoSQL datastores
Excellent organizational, coordination, interpersonal, and team-building skills
DESIRED KNOWLEDGE/SKILLS
Familiarity with executing jobs in big data technologies (e.g., Hadoop or Spark)
Knowledge of server operating systems (Windows, Linux), distributed computing, blade centers, and cloud infrastructure
Strong problem-solving skills
Ability to comprehend database methodologies
Focus on continual process improvement with a proactive approach to problem solving
KEY RESPONSIBILITIES
Research, design, and develop better ways of leveraging geospatial and telemetry data flows in our enterprise-wide systems and/or applications
Use Java to manage and improve existing NiFi processor code (an illustrative sketch follows this list)
Troubleshoot Oracle and Elastic datastores in the event of an outage
Develop complex data flows or make significant enhancements to existing pipelines
Resolve complex hardware/software compatibility and interface design considerations
Conduct investigations and tests of considerable complexity
Provide input to staff involved in writing and updating technical documentation
Troubleshoot complex problems and provide customer support for the ETL process
Prepare reports on analyses, findings, and project progress
Provide guidance and work leadership to less-experienced engineers
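As an illustration of the Java/NiFi work referenced above, the following is a minimal sketch of a custom NiFi processor that tags flow files whose latitude/longitude attributes fall inside a region of interest. It assumes the standard Apache NiFi processor API; the class name, attribute names, and bounding-box values are hypothetical and are shown only to indicate the kind of data-flow code this role builds and maintains, not an actual implementation used on the program.

    import java.util.Set;

    import org.apache.nifi.flowfile.FlowFile;
    import org.apache.nifi.processor.AbstractProcessor;
    import org.apache.nifi.processor.ProcessContext;
    import org.apache.nifi.processor.ProcessSession;
    import org.apache.nifi.processor.Relationship;
    import org.apache.nifi.processor.exception.ProcessException;

    // Hypothetical example: routes telemetry flow files based on geospatial attributes.
    public class TagGeoRegionProcessor extends AbstractProcessor {

        static final Relationship REL_IN_REGION = new Relationship.Builder()
                .name("in-region")
                .description("Flow files whose coordinates fall inside the region of interest")
                .build();

        static final Relationship REL_OUT_OF_REGION = new Relationship.Builder()
                .name("out-of-region")
                .description("Flow files whose coordinates fall outside the region of interest")
                .build();

        @Override
        public Set<Relationship> getRelationships() {
            return Set.of(REL_IN_REGION, REL_OUT_OF_REGION);
        }

        @Override
        public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
            FlowFile flowFile = session.get();
            if (flowFile == null) {
                return;
            }

            // Attribute names are assumptions; a real processor would match the upstream
            // telemetry schema and handle missing or malformed values (e.g., route to a
            // failure relationship) instead of parsing them directly.
            double lat = Double.parseDouble(flowFile.getAttribute("telemetry.latitude"));
            double lon = Double.parseDouble(flowFile.getAttribute("telemetry.longitude"));

            // Illustrative bounding box (roughly the continental United States).
            boolean inRegion = lat >= 24.0 && lat <= 50.0 && lon >= -125.0 && lon <= -66.0;

            // Tag the flow file and route it to the matching relationship.
            flowFile = session.putAttribute(flowFile, "geo.in.region", Boolean.toString(inRegion));
            session.transfer(flowFile, inRegion ? REL_IN_REGION : REL_OUT_OF_REGION);
        }
    }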