Data Engineer


Date: Mar 8, 2023

Location: İstanbul, TR

Company: Coca Cola İçecek

Job Objective

CCI is looking for a Data Engineer to join our Enterprise Data and Analytics team, which is responsible for managing and developing our strategic data assets, empowering all other teams and business partners to build consistent organizational knowledge and insights. This means we are responsible for bridging core data engineering workloads and robust, repeatable analytical solutions. As a Data Engineer, you will partner with engineering managers, product managers, and analysts across our company to ensure data is produced and delivered in a secure and predictable manner throughout the organization. We put data engineering and software development best practices at the core of our product and strive to deliver an intuitive product foundation that analysts and business stakeholders can rely on.

Main Responsibilities

  • To be able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them;
  • To be able to clearly articulate the pros and cons of various technologies and platforms;
  • To be able to document use cases, solutions, and recommendations;
  • To have excellent written and verbal communication skills;
  • To be able to explain the work in plain language;
  • To be able to help program and product managers in the design, planning, and governance of projects of any kind;
  • To be able to perform detailed analysis of business problems and technical environments and use this in designing the solution;
  • To be able to work creatively and analytically in a problem-solving environment;
  • To be a self-starter;
  • To be able to work in teams, as a big data environment is built by a team of employees from different disciplines;
  • To be able to work in a fast-paced agile development environment;
  • To be able to design ETL structures when needed.

Capabilities

  • Bachelor's or Master's degree in Computer Engineering or Computer Science.
  • 5+ years of experience in developing conceptual, logical and physical data models supporting transactional, operational and analytical based solutions.
  • Experience with stream processing (e.g., Kafka, Kinesis, Pub/Sub, Spark Streaming, Dataflow, Flink).
  • Experience with at least one NoSQL database such as DynamoDB, HBase, Cassandra, Bigtable, or Spanner is required.
  • Passionate about learning new technologies and working on massive-scale data with data warehouses like Redshift, BigQuery, or Athena.
  • Experience with at least one SQL database such as MySQL, SQL Server, PostgreSQL, or Oracle is required.
  • Familiarity with serverless architectures.
  • Experience gathering and analyzing system requirements
  • Experience in Cloud Technologies
  • Experience in Airflow or Google Cloud Composer
  • Experience in GCP and its services: Cloud Storage, Cloud Run, Cloud SQL, Cloud Dataflow, Cloud Machine Learning Engine, Vertex AI, Cloud Scheduler, API Gateway, Cloud Build, Cloud Functions, Data Catalog.
  • Experience in SAP ecosystem (SAP DataServices, BO, BW)
  • Designs and sets standards for ETL processes.
  • Willingness to learn more about new languages and frameworks.
  • Creative and innovative problem-solving skills.
  • Good team player with a results-oriented attitude and an analytical mind.

 

"We're an equal opportunity employer. All applicants will be considered for employment without attention to race, sex, color, national or social origin, ethnicity, religion, age, pregnancy, disability, sexual orientation, gender expression or political opinion."