
Job Description

  • Exposure to international customers and willingness to work in 24x7 shifts
  • Very strong technical skills, process compliance, and professional English and general communication skills.
  • Should be an expert in data lake technical components (e.g., data modeling, ETL, and reporting).
  • Should have hands-on experience with AWS-based environments.
  • Should have a deep understanding of the architecture of enterprise-level data warehouse solutions.
  • Should be passionate about working with huge datasets and love bringing datasets together to answer business questions and drive growth.


  • Analyze and solve problems at their root, stepping back to understand the broader context.
  • Interface with customers, understanding their requirements and delivering complete data solutions.
  • Provide access to large datasets.
  • Learn and understand a broad range of Amazon’s data resources, knowing when, how, and which to use — and which not to use.
  • Tune application and query performance using profiling tools and SQL.
  • Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.
  • Model data and metadata to support ad hoc and pre-built reporting.


  • Bachelor's degree in Computer Science, Information Systems or related field
  • 3+ years of SQL development experience
  • 3+ years of experience in data modeling, ETL, and Data Warehousing
  • 3+ years of experience architecting, designing, developing and implementing cloud solutions on AWS platforms
  • Demonstrated experience designing and implementing solutions using the AWS platform and tools, including EC2, the AWS console, CloudWatch, S3, Presto, Lambda functions, Python programming, Redshift SQL, Linux, DynamoDB, CloudFormation, RDS, VPC, IAM, and security.
  • Experience with Big Data technologies such as Hive/Spark.
  • Server management and administration, including basic Linux scripting
  • An ability to work in a fast-paced environment where continuous innovation is occurring and ambiguity is the norm.
  • SQL scripting on Presto and Redshift.
  • Strong business communication skills
  • 3+ years of Python scripting (or other platform-agnostic language)
  • AWS Certifications (such as AWS solutions architect or other specialty certifications) are a plus
  • Experience with BI reporting tools such as Tableau is a plus.
  • Redshift administration skills are a plus.
  • Exposure to Informatica Cloud or Informatica PowerCenter is a plus.
  • Strong organizational and multi-tasking skills with ability to balance competing priorities.