Insighture is a leading technology consultancy that drives digital transformation for businesses worldwide. With a team of over 85 expert consultants, the company delivers tailored, high-impact strategies and solutions, enabling scalable product engineering. As an AWS partner, Insighture excels in delivering integrated cloud services. It has collaborated with more than 50 clients globally, guiding them through cloud adoption, DevOps transformation, enterprise modernisation, and more.
The team’s expertise spans Cloud-Native Development, Solutions Architecture, UI/UX, Quality Engineering, Data Engineering, AI/ML, and DevSecOps. These capabilities empower businesses to achieve impactful and innovative outcomes.
In 2024, Insighture achieved ISO certification and was recognised as a Great Place to Work, earning three prestigious awards: Best Workplace in Sri Lanka, Best Workplace for Technology, and Best Workplace for Young People. Insighture's technology and expertise are embedded in the work of internationally recognised care providers, global freight operations, child protection systems, and health tech platforms across Australia, the UK, and Singapore.
We are seeking a motivated and detail-oriented Data Engineer to join our growing team.
Key Responsibilities:
- Build and maintain data pipelines (ELT and ETL) for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP data warehousing technologies (a minimal sketch of this kind of work appears after this list).
- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, redesigning infrastructure for greater scalability, and more.
- Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- Apply strong project management and organisational skills to keep deliverables on track.
- Support and collaborate with cross-functional teams in a dynamic environment.
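
As an illustration of the pipeline work described above, here is a minimal, hypothetical sketch of an ELT step on GCP: raw files are loaded from Cloud Storage into a BigQuery staging table, then transformed in-warehouse with SQL (the "T" in ELT). The project, bucket, dataset, and table names are placeholders for illustration only, not Insighture or client systems; the sketch assumes the google-cloud-bigquery client library.

```python
"""Minimal ELT sketch: load raw CSVs into BigQuery, then transform with SQL.

All resource names below (my-project, my-bucket, analytics.*) are hypothetical.
"""
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Extract + Load: ingest raw CSV files from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders/*.csv",      # hypothetical source path
    "my-project.analytics.stg_orders",      # hypothetical staging table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    ),
)
load_job.result()  # block until the load job finishes

# Transform: reshape the staged data inside the warehouse with SQL.
transform_sql = """
CREATE OR REPLACE TABLE `my-project.analytics.fct_daily_orders` AS
SELECT DATE(order_ts) AS order_date,
       COUNT(*)       AS orders,
       SUM(amount)    AS revenue
FROM `my-project.analytics.stg_orders`
GROUP BY order_date
"""
client.query(transform_sql).result()
```

Keeping the transformation in SQL inside BigQuery, rather than in application code, is the usual ELT pattern this role works with.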
Qualifications:
- Experience with SQL and Python, including performance optimisation in both.
- Experience building and optimising data warehouse pipelines (ELT and ETL), architectures, and data sets.
- Minimum of 3 years' experience building pipelines with GCP BigQuery.
- Cloud experience as a GCP Data Engineer.
- 2–3 years of experience working on large-scale data warehouses such as Teradata.
- Advanced working knowledge of SQL, including query authoring, experience with relational databases, and familiarity with a variety of database systems.
- Experience with CI/CD pipelines and workflow management tools: GitHub Enterprise, Cloud Build, etc.
- Performance tuning on BigQuery or other large-scale warehouses.
- Basic experience with object-oriented or functional scripting languages: Python, Java, etc.
- Experience with GCP cloud services: Kubernetes, Cloud Functions, Cloud Composer, Pub/Sub, etc.
- Basic knowledge of Airflow/Google Cloud Composer or another job orchestration tool (a minimal DAG sketch appears after this list).
- Experience with data pipeline and workflow management tools: dbt (Data Build Tool) or any other ETL/ELT tool, Dataflow, big data tooling, etc.
- Basic knowledge of shell scripting.
- Experience with data analytics and visualisation tools: Tableau (on-prem and SaaS), Data Analytics Workbench (DAW), Visual Data Studio, etc.
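
For the orchestration skills listed above (Airflow/Google Cloud Composer), the following is a minimal sketch of a daily DAG. It assumes Airflow 2 with the Google provider package installed; the DAG ID, project, and table names are hypothetical placeholders, and a real pipeline would parameterise them per environment.

```python
"""Minimal daily orchestration DAG for Cloud Composer (Airflow 2).

The DAG ID and BigQuery resource names below are hypothetical.
"""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_elt",          # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,                      # do not backfill past runs
) as dag:
    # Run the in-warehouse transformation as a scheduled BigQuery job.
    build_daily_orders = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE `my-project.analytics.fct_daily_orders` AS "
                    "SELECT DATE(order_ts) AS order_date, COUNT(*) AS orders "
                    "FROM `my-project.analytics.stg_orders` GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```

In practice, a DAG like this would chain several such tasks (ingestion, transformation, quality checks) with dependencies, which is the workload and dependency management this role covers.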