We are looking for an exceptional Software Engineer to join our Data Services team. The individual in this role will work closely with product, business, and technology teams to source, integrate, and structure data from a variety of data sources into CNB’s core data platform using AWS.
You will have the opportunity to create something BIG, as we work on building the next generation data & analytics foundation in the company that gives CNB greater speed to insights and the ability to scale.
● Design, build, and support big data pipelines and service layers within the data platform
o Partner with business, technology, and product teams to understand needs, provide estimates and create design specs
o Build big data pipelines following software engineering best practices and utilizing the right tool for the job
o Push code to production using continuous integration, continuous deployment, and build automation
o Build the data serving layer within the core data platform utilizing appropriate approaches (REST APIs, extracts, views, etc.)
o Document the design and architecture of the data pipeline
o Provide day-to-day production support and on-call coverage as needed
● Define, document, and enforce data engineering standards
o Define best practices and standards for coding in Python, Airflow, AWS, and other tools as applicable
o Define quality metrics for the code, and create code review checklist(s)
o Define standards for version control, continuous integration, continuous deployment, and build automation
o Document your findings as specifications and implement the corresponding ETL processes
● Bachelor’s degree in computer science, information systems, or a related technical field. Relevant experience in lieu of a degree will be considered.
● 1+ years of experience building data pipelines as part of a modern cloud-based data stack (example tools: AWS, Matillion, Airflow, Segment, Snowflake, dbt)
● Experience with data ingestion from a variety of data sources (relational databases, APIs, files, spreadsheets, queues, unstructured sources, etc.)
● Knowledge of ETL/ELT, data lake, and data warehousing concepts
● Experience with semi-structured data formats (XML, JSON)
● Experience with Python or Scala
● Advanced skills in writing and optimizing SQL queries
● Experience working in an Agile environment
● Highly accountable self-starter and a team player
● Excellent communication skills, both verbal and written
● Able to independently manage multiple work streams while paying strict attention to detail and producing quality code
● Experience with distributed computing systems and serverless architectures
● Knowledge of data modeling techniques (Star, Snowflake, 3NF, etc.)
● Experience working with CI/CD pipelines (example tools: GitHub Actions, Terraform, AWS CloudFormation) and knowledge of infrastructure as code (IaC)
● Familiarity with message queue systems, e.g., SQS/SNS or Kafka
● Understanding of the data lifecycle and concepts such as lineage, governance, privacy, retention, anonymity, etc.
● Knowledge of building RESTful API layers on top of data stores
● Experience with marketing, advertising, and lead data
Career Now Brands brings creative marketing solutions to recruitment and enrollment practices in the trucking, education and job distribution industries. Our flagship products - CDL Job Now, Career School Now and RocketPost - help build our clients' businesses while impacting millions of lives by boosting employment and enrollments across the U.S.
This position will be in our new downtown Royal Oak headquarters. We recently renovated a large industrial warehouse into a modern, new office that embodies the culture and character of Detroit’s most innovative tech company.
Career Now Brands is an equal opportunity employer and does not discriminate on the basis of race, color, gender, religion, age, sexual orientation, national or ethnic origin, disability, marital status, veteran status, or any other occupationally irrelevant criteria.