Remote Data Engineer (Python, AWS, ETL, RDS, Redshift, and S3), Turing.com | #python | #jobs


Job Title

Remote Data Engineer (Python, AWS, ETL, RDS, Redshift, and S3)

Job Description

Apply to this job here:

Only candidates with 5+ years of software development experience are eligible for this role.

Turing.com is looking for a Data Engineer on behalf of a US-based company that virtually connects beauty and wellness professionals with clients. The developer will collaborate with different teams on various project-related tasks. The platform lets professionals showcase their businesses and talents while enabling customers to discover new services and providers and book appointments online. The company has raised more than $35mn in its Series C round of funding. This role requires a 6+ hour overlap with the PST time zone and is a full-time, long-term position.

Here’s everything you need to know to become a Remote Data Engineer at Turing: turing.com/s/5rhm56

All geared up to become a Turing remote developer at a leading company? Take our exciting coding challenge:

Job Responsibilities:

  • Contribute to new and existing Data and Infrastructure projects
  • Should have an interest in Big Data technologies and services
  • Develop and extend Big Data processing pipelines for data sources containing structured and unstructured data
  • Monitor and optimize key infrastructure components like databases, EC2 clusters, containers, and other aspects of the stack (see the monitoring sketch after this list)

  • Implement best practices for Big Data development
  • Function as a bridge between the Data Engineering department and the wider engineering organization
  • Actively collaborate with Data Analytics and Data Science teams
  • Adopt an agile approach, working with business users, data analysts, and data scientists to understand and discover the potential business value of new and existing data sets
  • Work on analyzing requirements and architecture specifications for creating detailed designs
  • Keep researching areas of interest that help facilitate better solutions
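
The monitoring responsibility above typically starts with pulling CloudWatch metrics. As a minimal sketch, assuming boto3 is installed and AWS credentials are configured, here is how average CPU utilization for an RDS instance could be fetched; the instance identifier "example-db" and the one-hour window are placeholder assumptions, not details from this job post.

# Minimal CloudWatch monitoring sketch; all identifiers are placeholders.
from datetime import datetime, timedelta, timezone

import boto3  # AWS SDK for Python

cloudwatch = boto3.client("cloudwatch")
now = datetime.now(timezone.utc)

# Average CPU for a hypothetical RDS instance over the last hour,
# one datapoint per five minutes.
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/RDS",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "example-db"}],
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=300,
    Statistics=["Average"],
)
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], f"{point['Average']:.1f}%")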

How to Become a Turing Developer:

  • Create your account on Turing.com
  • Fill in your basic information (name, number, location, previous salary, experience, etc.)
  • Solve multiple-choice questions
  • Schedule a technical interview
  • Complete the final onboarding

Perks & Benefits:

  • Earn salaries higher than local standards
  • Work alongside a community of Google, Facebook, and Microsoft engineers
  • Experience rapid career growth
  • No visa requirements to work with the best US companies
  • Better work-life balance

Apply today before the vacancies are filled. Looking for more exciting job opportunities? Find more Turing US software jobs now:

Hear from developers themselves. Read Turing.com reviews here:

Restrictions

  • Telecommuting is OK
  • No Agencies Please

Requirements

Job Requirements:

  • Bachelor’s/Master’s degree in Engineering, Computer Science (or equivalent experience)
  • Must have at least 5 years of experience in back-end development using Python and other scripting languages
  • Must possess thorough knowledge of Python, including functional and modular programming styles
  • At least 2 years of experience working with AWS Data Infrastructure, including RDS, Redshift, and S3
  • Ability to design, develop, and own ETL pipelines and processes for delivering data with measurable quality under a pre-defined SLA (see the ETL sketch after this list)
  • Ability to build data pipelines in a high-ingestion environment utilizing different data infrastructure technologies
  • Experience working with in-demand technologies like SQL to scale and optimize schemas and performance-tune ETL pipelines
  • Extensive experience working with AWS Lambda and Docker
  • Familiarity with technologies like Kinesis, Redshift, and more would be an added advantage
  • Should be fluent in English to communicate efficiently with teammates and other stakeholders
  • Should be able to monitor metrics and analyze data while partnering with team members to resolve critical problems
  • Familiarity with data streaming technologies like Spark, Storm, and Flink, along with message queue systems like Kafka and Kinesis, will be an added advantage
  • Familiarity with message or file formats like Parquet, Avro, and Protocol Buffers would be a bonus
  • Ability to work with Redis, Cassandra, MongoDB, or similar NoSQL databases is preferred
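
To make the ETL requirement concrete, here is a minimal sketch of an S3-to-Redshift pipeline in Python, assuming boto3 and psycopg2 are installed and AWS credentials are configured. Every bucket, key, table, host, and IAM role name below is a hypothetical placeholder, not something specified in this job post.

# Illustrative S3 -> transform -> Redshift ETL sketch.
# All identifiers below are hypothetical placeholders.
import csv
import io

import boto3       # AWS SDK for Python
import psycopg2    # Redshift speaks the PostgreSQL wire protocol

S3_BUCKET = "example-raw-data"                   # placeholder bucket
RAW_KEY = "appointments/2024-01-01.csv"          # placeholder source object
CLEAN_KEY = "clean/appointments/2024-01-01.csv"  # placeholder cleaned object


def extract_and_transform() -> None:
    """Extract the raw CSV from S3, drop rows missing an id, re-upload."""
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=S3_BUCKET, Key=RAW_KEY)["Body"].read().decode("utf-8")
    rows = [r for r in csv.DictReader(io.StringIO(body)) if r.get("id")]
    if not rows:
        return  # nothing valid to load
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    s3.put_object(Bucket=S3_BUCKET, Key=CLEAN_KEY, Body=out.getvalue().encode("utf-8"))


def load_into_redshift() -> None:
    """Load the cleaned file into Redshift with a server-side COPY."""
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-west-2.redshift.amazonaws.com",  # placeholder
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="REDACTED",
    )
    with conn, conn.cursor() as cur:  # commits on clean exit
        cur.execute(
            f"""
            COPY appointments
            FROM 's3://{S3_BUCKET}/{CLEAN_KEY}'
            IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
            CSV IGNOREHEADER 1;
            """
        )
    conn.close()


if __name__ == "__main__":
    extract_and_transform()
    load_into_redshift()

In practice a pipeline like this would run under an orchestrator with retries and data-quality checks, which is what makes the measurable SLA the post mentions possible.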

Contact Info
