Senior Data Engineer
Xecta
Houston, TX US

We are looking for dynamic and passionate data engineers to join the Xecta team.

Xecta is where physics intersects artificial intelligence to create next-generation cognitive digital solutions for the energy industry.

As a data engineer on our Data Platform and Infrastructure team, the candidate will be responsible for building data ingestion solutions, expanding and optimizing our data pipeline architecture, and optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

The data engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that the data delivery architecture remains consistent and optimal across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or redesigning data architecture to support our next generation of products and data initiatives.

The solutions will typically be based on commonly used ETL/ingestion tools such as Apache Airflow, Kafka, or similar. The candidate will use these tools to engineer pipelines that provide batch and scheduled ingestion and, in specific cases, push-based delivery for relational data, unstructured files, REST, and socket-based sources. The candidate will use their programming skills (e.g., Python) and knowledge of the data engineering landscape to apply the best solution to our customers' requirements. The candidate will work closely with our Lead Data Engineers to design, articulate, and implement solutions, working with both Xecta engineers and customer data teams.
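
To give a concrete, purely illustrative flavor of this kind of work, the sketch below shows a minimal Airflow DAG that runs a daily batch ingestion with a simple cleansing step. The DAG id, file paths, table, and column names are hypothetical and do not describe the actual Xecta platform.

    # Illustrative sketch only: a minimal Airflow DAG for a daily batch ingestion.
    # All names (DAG id, paths, columns) are hypothetical.
    from datetime import datetime

    import pandas as pd
    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def ingest_wells_table():
        # Placeholder extract step; a real pipeline would pull from the
        # customer-hosted source (SQL Server, Oracle, a REST API, etc.).
        df = pd.read_csv("/tmp/wells_extract.csv")
        # Simple per-pipeline cleansing: drop duplicates, fill missing rates.
        df = df.drop_duplicates().fillna({"oil_rate": 0.0})
        df.to_parquet("/tmp/wells_clean.parquet", index=False)


    with DAG(
        dag_id="wells_daily_ingest",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="ingest_wells", python_callable=ingest_wells_table)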

Responsibilities

  • Design data engineering pipelines that ingest data from customer-hosted systems into the Xecta Cloud Platform
  • Write Python scripts as needed to apply logic and replace missing or bad data on a per-pipeline basis
  • Deploy, test, and refine data engineering solutions
  • Work with customer data teams to ensure the robustness and security of each data pipeline solution
  • Set up triggers and ingestion schedules for batch loading on a per-pipeline basis

Qualifications

  • Solid working experience with Python and scripting
  • Good understanding of data cleansing concepts
  • Prior experience with one or more of Apache Airflow, Kafka, Apache NiFi, Databricks or similar
  • Strong fundamentals and practical experience with object-oriented programming
  • Broad experience working with SQL Server, Oracle, AWS Redshift, and MongoDB
  • Experience using Python to retrieve data from SQL and NoSQL databases (see the brief sketch after this list)
  • Experience with pandas and NumPy
  • Experience consuming REST/web API data using Python frameworks (e.g., Flask)
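
As referenced in the list above, the short sketch below illustrates retrieving data from a relational source with Python; it is a hedged example only, and the connection string, table, and column names are hypothetical.

    # Illustrative sketch only: pulling a batch of rows from a relational source
    # with pandas and SQLAlchemy. Connection string, table, and columns are hypothetical.
    import pandas as pd
    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql+psycopg2://user:password@host:5432/productiondb")

    # Read recent production records and apply a basic cleansing pass.
    df = pd.read_sql_query(
        text(
            "SELECT well_id, measured_at, oil_rate "
            "FROM daily_production WHERE measured_at >= :since"
        ),
        engine,
        params={"since": "2024-01-01"},
    )
    df = df.dropna(subset=["well_id"]).fillna({"oil_rate": 0.0})
    print(df.head())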

Compensation
$100,000 - $150,000 Yearly
About Xecta

Xecta Digital Labs is where oil and gas physics intersects artificial intelligence to create the next-gen cognitive digital oilfield. Our software products and digital solutions provide oil and gas companies with practical and robust workflows for upstream, midstream, and downstream engineering.
