
Data Engineer

Location: San Diego, CA, United States
Job # 12159848
Date Posted: 04-04-2019
Data Engineer - San Diego, CA
Pay Rate: DOE
Direct Hire
Start Date: ASAP

Our client is a leading IT consulting firm headquartered in San Diego, CA. With well-known clients, they seek to build their team of data experts to work directly with those clients. At this time, they are looking for a Data Engineer with key skill sets in Kafka/Confluent and StreamSets who will be responsible for building infrastructure as code and automating ingestion pipelines to speed the ingestion of new data sources and increase process repeatability.

Ideal candidates are experienced data pipeline builders and data wranglers who enjoy optimizing and building data systems from the ground up, moving data from source systems to analytic systems, and modeling data for analytical reporting. They also understand how to manage customer relationships and how internal and external perceptions of their work product affect customer satisfaction. Having the confidence and knowledge to recommend solutions, and the experience to know what will and won't work, are important traits for this consultant.
Tasks:

1.  Topic Discovery
      Deliverables:
  • Build a service to search for and find topics defined by a naming standard and ingest them into the data services architecture (a minimal sketch follows this list).
2.  Kafka Topic Destinations
      Deliverables:
  • Build Kafka Connect sink automation for archiving Kafka topics to S3 upon ingestion and post-transformation to new data models (a second sketch follows this list).
  • Build Kafka Connect sink automation for loading Elasticsearch from Kafka topics, with one sink per cluster to control ingestion speed and keep Elasticsearch ingestion consistent.
3.  Pipeline Standardization Tooling
      Deliverables:
  • Develop tooling for common component creation in StreamSets data collectors, e.g., Snowflake-to-Kafka pipeline creation with standardized names, Kafka error topics, metadata, and monitoring; and a Kafka-to-Kafka cleansing pipeline that deduplicates, trims, and standardizes incoming data to UTF-8 characters.
4.  Documentation
      Deliverables:
  • Diagrams of all process automations
  • How-to documentation for using the automations
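
A minimal sketch of the Task 1 topic-discovery service, assuming Python and the confluent-kafka client; the broker address and the "<domain>.<dataset>.raw" naming standard are illustrative placeholders, not part of the posting:

    import re

    from confluent_kafka.admin import AdminClient

    # Hypothetical naming standard: topics eligible for ingestion end in ".raw".
    INGEST_PATTERN = re.compile(r"^[a-z0-9_]+\.[a-z0-9_]+\.raw$")

    def discover_ingest_topics(bootstrap_servers):
        """Return all topic names that match the ingestion naming standard."""
        admin = AdminClient({"bootstrap.servers": bootstrap_servers})
        metadata = admin.list_topics(timeout=10)  # ClusterMetadata
        return sorted(t for t in metadata.topics if INGEST_PATTERN.match(t))

    if __name__ == "__main__":
        for topic in discover_ingest_topics("localhost:9092"):
            print(topic)

The returned list would feed whatever component registers each topic with the data services architecture; the regex is the one piece the naming standard actually dictates.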

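A minimal sketch of the Task 2 sink automation, assuming Python, the requests library, Confluent's S3 and Elasticsearch sink connectors, and a Kafka Connect worker at a placeholder URL; the topic, bucket, region, and connector names are illustrative only:

    import requests

    CONNECT_URL = "http://localhost:8083/connectors"  # placeholder Connect worker

    def register_connector(name, config):
        """Create a connector via the Kafka Connect REST API."""
        resp = requests.post(CONNECT_URL, json={"name": name, "config": config})
        resp.raise_for_status()

    # Archive a post-transformation topic to S3.
    s3_sink = {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "tasks.max": "1",
        "topics": "orders.transformed",              # placeholder topic
        "s3.region": "us-west-2",                    # placeholder region
        "s3.bucket.name": "example-archive-bucket",  # placeholder bucket
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        "flush.size": "1000",
    }

    # Load the same topic into Elasticsearch; one sink per cluster keeps
    # ingestion speed and consistency under a single connector's control.
    es_sink = {
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "tasks.max": "1",
        "topics": "orders.transformed",
        "connection.url": "http://localhost:9200",   # placeholder cluster
        "key.ignore": "true",
        "schema.ignore": "true",
    }

    if __name__ == "__main__":
        register_connector("s3-archive-orders", s3_sink)
        register_connector("es-load-orders", es_sink)
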
Requirements:

Universal Skills
Must possess the following set of fundamental skills:
  • Uses technology to contribute to development of customer objectives and to achieve goals in creative and effective ways.
  • Communicates clearly and effectively, with careful consideration of the audience and in terms and tone appropriate to them.
  • Accepts responsibility for the successful delivery of a superior work product.
  • Gathers requirements and composes estimates in collaboration with the customer.
  • Respects coworkers and has a casual, friendly attitude.
  • Has an interest and passion for technology. This is not a joke, and yes, it’s a requirement.
Technology:
  • The primary skill sets are Kafka/Confluent and StreamSets (or something similar to StreamSets, such as Kinesis).
  • Terraform, Lambda, DynamoDB, and Athena architecture.
  • Experience with data warehouse tools (Teradata, Oracle, Netezza, SQL, etc.) as well as cloud-based data warehouse tools (Snowflake, Redshift, Google BigQuery).
  • Experience building and optimizing traditional and/or event-driven data pipelines.
  • Advanced working SQL knowledge and experience working with relational databases.
  • Familiarity with data processing tools such as Hadoop, Apache Spark, Hydra, etc.
  • Knowledge of cloud-based or streaming solutions such as Confluent/Kafka, Databricks, and Spark Streaming.
  • Experience with ETL/ELT tools such as Matillion, Fivetran, Talend, Informatica, Oracle Data Integrator, or IBM InfoSphere, and an understanding of the pros/cons of transforming data in an ETL or ELT fashion.
  • Good understanding of data warehouse concepts of schemas, tables, views, materialized views, stored procedures, and roles/security.
  • Adept at building processes to support data transformation, data structures, metadata, dependency and workload management.
  • Experience with BI tools such as Looker, Tableau, Power BI, and MicroStrategy.
  • Familiarity with StreamSets is a plus.
  • Investigate emerging technologies.
  • Research the most appropriate technology solutions to solve complex and unique business problems.
  • Research and manage important and complex design decisions.
Consulting:
  • Direct interaction with the customer regarding significant matters often involving coordination among groups.
  • Work on complex issues where analysis of situations or data requires an in-depth evaluation of variable factors.
  • Exercise good judgment in selecting methods, techniques and evaluation criteria for obtaining solutions.
  • Attend sales calls as a technical expert and offer advice or qualified recommendations based on clear and relevant information.
  • Research and vet project requirements with customer and technical leadership.
  • Assist in the creation of SOWs, proposals, estimates and technical documentation.
  • Act as a vocal advocate for Fairway and pursue opportunities for continued work from each customer.
Supervision:
  • Determine methods and procedures on new or special assignments.
  • Requires minimal day-to-day supervision from the client management team.
Experience:
  • Typically requires 5+ years of related experience.
  • Typically requires a BS in Computer Science or higher.

Benefits:

  • Work from Home
  • Flexible Hours
  • 100% covered employee health insurance
  • 401(k) with employer match
  • Fun team building events/days/activities
  • New HQ with adjustable desks