Full-time, Remote / Telecommute

Data Engineer

Dayton, Ohio
Created: September 6, 2022

Description

We are looking for an experienced Data Engineer to join our team. You will use a variety of methods to transform raw data into useful data systems, build an understanding of existing data systems and structures and their relationships, and strive for efficiency by aligning data systems with business goals.

To succeed in this data engineering position, you should have strong analytical skills and the ability to combine data from different sources. Familiarity with programming languages and knowledge of analysis methods and tooling are also desired. The candidate should be able to handle backend data engineering tasks with minimal supervision.

Requirements

  • Proficiency with at least one object-oriented or functional scripting language: Python, Java, C++, Scala, etc.
  • Proficiency in SQL.
  • Ability to work as part of a team and to break down requirements into actionable tasks.
  • 3-7 years of related experience.
  • BS degree or equivalent experience.
  • Ability to obtain a DoD security clearance (or an active clearance).

Desired

  • Experience with relational SQL and NoSQL databases, such as Postgres and MongoDB.
  • Experience with data pipeline and workflow management tools such as Dagster and Airflow.
  • Working knowledge of message queuing and stream processing.
  • Ability to build data systems and pipelines.
  • Ability to implement data pipelines (typically designed by others).
  • Ability to evaluate business needs and objectives.
  • Ability to work with Software Engineers to write connectors to data sources.
  • Ability to identify and apply good data governance practices, and to interpret trends and patterns.
  • Ability to analyze and organize raw data.
  • Ability to conduct complex data analysis and report on results.
  • Ability to prepare data for prescriptive and predictive modeling.
  • Ability to combine raw information from different sources.
  • Ability to explore ways to enhance data quality and reliability.
  • Ability to identify opportunities for data acquisition.
  • Ability to develop analytical tools and programs.
  • Ability to collaborate with data scientists and architects on several projects.
  • Experience with message queuing tools such as Kafka, Logstash, and RabbitMQ.
  • Experience with stream processing tools such as Spark.
  • Experience with search engines such as Elasticsearch.
  • Experience with visualization tools such as Tableau or Kibana.
  • Experience with cloud offerings such as GCP or Azure.
  • Knowledge of data lake and data lakehouse architectures.

Metadata

Published: Thursday, September 8, 2022 15:08 UTC

Last updated: Thursday, September 8, 2022 15:08 UTC