
Data Engineer

  • Remote
  • Bangalore, Karnātaka, India
  • Engineering

Job description

About Toku

At Toku, we create enterprise cloud communications and customer engagement solutions to reimagine customer experiences for enterprises. We provide an end-to-end approach to help businesses overcome the complexity of digital transformation in APAC markets and enhance their CX with mission-critical cloud communication solutions. Toku combines local strategic consulting expertise, bespoke technology, regional in-country infrastructure, connectivity and global reach to serve the diverse needs of enterprises.

About the Role

As we continue creating momentum for our products in the APAC region and helping customers with their communications needs, we are looking for a Data Engineer to ensure the high quality and reliability of our cutting-edge contact center and unified communication platforms, and contribute to the seamless delivery of exceptional customer experiences.

This is an impactful position during a growth phase for the business. You will be instrumental in shaping new processes, bringing new ideas, and selecting tools in a collaborative and highly visible team environment. You will thrive in this role if you have a passion for quality, an eye for detail, and the experience to help take a growing Engineering function to the next level.

What you will be doing

As a Data Engineer, reporting to an Engineering Manager or potentially the VP of Engineering, you will collaborate with stakeholders across the organization. You will be responsible for Data Pipeline Design and Development, Data Infrastructure Management, Data Quality Management, and Data Security and Privacy management.

Delivery

This axis refers to the reliable delivery of impactful results across various scopes, including tasks, features, projects, initiatives, teams, and the organization. For a Data Engineer, this involves:

  • Building robust and efficient data pipelines to extract, transform, and load data from various sources.

  • Ensuring optimal performance and scalability of the data warehouse.

  • Designing and implementing a scalable data architecture to support future growth and innovation.

  • Providing clean, reliable datasets that help stakeholders build and optimize innovative, industry-leading products.

  • Identifying opportunities to automate manual tasks and optimize data delivery.

  • Assisting stakeholders in leveraging data to drive product innovation.

Strategic Alignment

This involves the ability to prioritize work and influence goals and direction for oneself, the team, and the organization. In the Data Engineer role, this means:

  • Ensuring the data infrastructure can handle future growth and maintain high availability.

  • Maintaining data accuracy, integrity, and consistency to support reliable decision-making.

  • Adhering to data standards, security protocols, and compliance regulations.

  • Staying informed about emerging technologies and their potential benefits.

  • Following industry best practices to optimize data pipelines and processes.

Talent

This axis focuses on contributions to raising the bar by strengthening oneself and others, and by attracting talent. Data Engineers are expected to:

  • Actively participate in knowledge sharing and contribute to the growth and development of the Data Engineering team.

  • Provide guidance and mentorship to fellow data engineers, offering support and training to enhance their skills and performance.

Culture

This describes the level of participation in Toku's culture and collaboration across different functions, teams, and organizations. For a Data Engineer, this means:

  • Maintaining excellent interpersonal skills, with strong written and oral communication abilities in English.

  • Working independently and effectively in a fast-paced, dynamic startup environment.

  • Fostering a continuous learning mindset, staying up-to-date with the latest trends and technologies.

Technical Excellence

This refers to the knowledge and fluency within one's technical functional area of expertise that enables engineering and operational excellence. Key technical proficiencies for a Data Engineer include:

  • Programming Languages: Proficiency in languages like Python and SQL for data manipulation, analysis, and automation.

  • Data Technologies: Expertise in tools like Databricks, Spark, and Kafka for handling large and complex datasets. Experience with Amazon Redshift is also beneficial.

  • Data Warehousing and ETL/ELT: Knowledge of data warehousing concepts and ETL/ELT processes to design and implement data pipelines (see the brief sketch after this list).

  • Cloud Platforms: Familiarity with cloud platforms (AWS) for deploying and managing data infrastructure. Toku's broader architecture leverages containerized services, serverless computing, and modern deployment DevOps practices for scalability and resilience.

  • Database Systems: Understanding of both relational (SQL) and NoSQL databases.

  • Data Modeling: Ability to design efficient data models to support business needs.
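
To make the Python, SQL, and ETL/ELT expectations above a little more concrete, here is a deliberately minimal, illustrative sketch of the extract-transform-load pattern using only the Python standard library. The file name, column names, and table schema are hypothetical placeholders, not part of Toku's actual stack; a pipeline at the scale this role describes would run on the Databricks/Spark/Kafka and AWS tooling named above rather than CSV files and SQLite.

# Minimal, illustrative ETL sketch: extract rows from a CSV, apply a small
# transformation with a basic data-quality rule, and load the result into a
# local SQLite table. All names below (raw_usage.csv, usage_facts, the
# columns) are hypothetical placeholders for illustration only.

import csv
import sqlite3


def extract(csv_path: str) -> list[dict]:
    """Read raw records from a CSV file."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[tuple]:
    """Normalize fields and drop records missing a customer_id."""
    cleaned = []
    for row in rows:
        customer_id = row.get("customer_id", "").strip()
        if not customer_id:
            continue  # basic data-quality rule: skip incomplete records
        country = row.get("country", "").upper()
        minutes = float(row.get("call_minutes") or 0)
        cleaned.append((customer_id, country, minutes))
    return cleaned


def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Write transformed rows into a destination table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS usage_facts ("
            "customer_id TEXT, country TEXT, call_minutes REAL)"
        )
        conn.executemany("INSERT INTO usage_facts VALUES (?, ?, ?)", rows)


if __name__ == "__main__":
    load(transform(extract("raw_usage.csv")))

The extract/transform/load separation shown here carries over directly to Spark or Databricks jobs; only the I/O connectors and execution engine change.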

Expected Collaborations

  • Work closely with the Data Manager to align on data strategies and goals for Toku.

  • Collaborate with the BI team on data initiatives and ensure consistent, optimal data delivery throughout ongoing projects.

  • Provide technical input to data stakeholders and assist them in building and optimizing pipelines for their data needs.

  • Partner with the Infra team on provisioning, capacity planning, monitoring, and maintenance.

  • Work with the Security team to implement security policies and address privacy concerns.

  • Share knowledge and best practices related to Data Engineering tools and techniques with fellow team members.

Job requirements

We would love to hear from you if you have:

  • At least a Bachelor’s degree in Data Science / Information Technology or a relevant field.

  • 3+ years of relevant experience in Data Engineering.

  • Significant experience with Databricks, SQL, Python, ETL processes, and data engineering best practices.

  • Proficiency in languages such as Python and SQL for data manipulation, analysis, and automation.

  • Working knowledge and familiarity with a variety of databases (SQL and NoSQL).

  • Exposure to or experience with building and optimizing big data pipelines, architectures, and datasets is a plus.

  • Experience with tools such as Databricks, Spark, and Kafka for handling large and complex datasets.

  • Knowledge of data warehousing concepts and ETL/ELT processes to design and implement data pipelines.

  • Familiarity with cloud platforms (AWS).

  • Strong analytical and problem-solving skills to resolve data-related challenges.

  • Ability to work collaboratively in cross-functional teams.

  • Ability to think critically and innovate to improve data processes.

  • Effective communication skills to collaborate with business stakeholders.

  • Knowledge of Agile methodologies and experience working in Agile environments.

If you would love to experience working in a fast-paced, growing company and believe you meet most of the requirements, come join us!
