【Job Responsibility】
An ETL Engineer for cloud integration is a professional who designs, develops, tests, and maintains the Extract, Transform, Load (data integration) workflows that enable the smooth and efficient transfer of data on the cloud, or between the cloud and on-premise systems.
The role involves understanding the source data, its format, and its quality; mapping the data requirements to the target system; and ensuring data quality, accuracy, and consistency.
The ETL Engineer uses various tools and technologies such as SQL, ETL frameworks, and data modeling, and works closely with local as well as remote peer IT teams and the Data Governance team.
【Key Tasks & Accountabilities】
The ETL Engineer is expected to design, set up, develop, test, and maintain the ETL pipeline from ingestion to destination.
He/she is expected to own data integration at the project level on an as-needed basis.
【Relationships】
Internal: Project Managers, Business Analysts, QA, Infrastructure team, IT Architecture, Business team members,
group / global team architecture and technology peers
External: IT development vendor teams, software product vendors
【Qualifications / Experience】
[Technical]
- Hands-on experience designing and developing data extract, transform, and load pipelines on the cloud using AWS Glue. Knowledge of and experience with Informatica PowerCenter would be an added advantage
- Understand the effects of volume, variety, and velocity on data ingestion, transformation, modeling, security, governance, privacy, schema design, and optimal data store design for AWS cloud-based solutions.
- Hands-on data modeling and data storage on the cloud (AWS)
- SQL programming and tuning (optimization)
- Core AWS concepts such as AWS Lambda, serverless, monitoring, and cost calculation
- Automation of data pipelines and of cloud infrastructure creation and maintenance using Terraform
- Understanding of Infrastructure as code (IaC) for repeatable deployments (AWS CloudFormation or similar)
- Understanding of how to analyze data, verify data quality, and ensure data consistency by using AWS services
- Ability to drill down into database issues, pipeline issues, and data model design as needed.
- Ability to design ETL pipelines as per business requirements, while aligning with standards, roadmap and strategic direction.
- Source code management using standard tools such as Git / GitHub
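As an illustration only (not a requirement of the role), the extract-transform-load flow described above can be sketched in plain Python. All names here (`extract`, `transform`, `load`, the sample CSV) are hypothetical; a production pipeline of the kind this role describes would typically run as an AWS Glue job instead:

```python
import csv
import io

# Hypothetical sample source data; a real pipeline would read from S3 or a database.
SOURCE_CSV = """policy_id,premium,currency
P-001,1200,JPY
P-002,950,JPY
P-003,,JPY
"""

def extract(raw: str) -> list:
    """Extract: parse raw CSV text into a list of row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: enforce data quality by dropping rows with a missing
    premium and casting the remaining values to numeric types."""
    clean = []
    for row in rows:
        if row["premium"]:  # data-quality check: skip incomplete rows
            clean.append({"policy_id": row["policy_id"],
                          "premium": int(row["premium"]),
                          "currency": row["currency"]})
    return clean

def load(rows: list, target: dict) -> None:
    """Load: write cleaned rows into the target store, keyed by policy_id."""
    for row in rows:
        target[row["policy_id"]] = row

warehouse = {}
load(transform(extract(SOURCE_CSV)), warehouse)
print(sorted(warehouse))  # → ['P-001', 'P-002']  (incomplete row P-003 filtered out)
```

The same extract/transform/load separation carries over directly to Glue, where the extract and load steps become Glue DynamicFrame reads/writes and the transform step holds the data-quality logic.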
[Nice to have]
- Bilingual (English + Japanese) is preferred; however, for strong technical candidates, English-only or Japanese-only skills can be considered
- AWS Certified Data Engineer
- Familiarity or experience with DWH, Data Lake
- Insurance business knowledge
[Experience]
- 5+ years in ETL design and development roles; at least 3+ years in cloud-related ETL development, including AWS
- Hands-on experience with cloud technologies
- Work experience in waterfall and/or agile implementations
- Experience in working with internal as well as vendor teams
[Personal]
- Proactive, self-driven, motivated to learn and share new technical skills
- Strong analytical, conceptual, and problem-solving abilities
- Strong written and oral communication skills
- Strong presentation and interpersonal skills
- Ability to effectively prioritize and execute tasks in a high-pressure environment
- Experience working in a team-oriented, collaborative environment as a team player
[Education]
- A degree in Software Engineering would be ideal
3-14-20) or hybrid work with work from home
7 hours)
Flextime from 6:00 to 22:00, no core time; minimum working hours per day: 4 hours
60-minute break (when actual working hours exceed 4 hours in a day)
6 weeks of special paid leave; available to employees regardless of gender), condolence leave, pre- and post-natal maternity leave, volunteer leave, Re-Creation leave
EAP: Employee Assistance Program, childcare leave, childcare time, caregiver leave, flextime system, reduced working hours, staggered working hours, early return-to-work support allowance for employees on childcare leave, childcare support allowance for shift work, in-house club activities, no dress code (free choice of attire), online self-development programs, side and secondary jobs (notification required)