Job Snapshot

Location:
Hong Kong Central
Employment Type:
Permanent
Job ID:
522016

Job Summary

Key Responsibilities

  • Coordinate and collaborate with business SMEs and CoEs to understand the different forms of data available from various systems.
  • Build and manage data warehouse and data lake strategies across the group.
  • Define the scope of EDW projects, select the right technology tools to support the requirements, and ensure all data needs are met.
  • Collect, analyze, and mine data, and help the business leverage the information stored in data warehouses and data lakes.
  • Provision data connectivity to the business whenever required.
  • Train and support the business in making use of data from the EDW.
  • Devise plans for sunsetting legacy databases and provision users with access to historical information and data through reports or analysis dashboards.
  • Develop and execute database queries and conduct analyses.
  • Create visualizations and reports for requested projects.
  • Develop and update technical documentation.
  • Ensure project deliverables are on time and meet quality standards.
  • Introduce tools for effective monitoring, alerting, and insights to prevent service interruptions.
  • Participate in vendor selection and manage vendors during the project.

Skills and Professional Requirements

  • Bachelor's degree in Computer Science, or a related discipline required.
  • Ability to articulate and translate technological concepts into business terms.
  • Experience working with various market-standard relational databases (Oracle PL/SQL, MS Access, PostgreSQL, MS SQL Server, etc.).
  • Knowledgeable about various data formats (structured: RDBMS; semi-structured: JSON, XML, CSV; unstructured: flat files and documents).
  • Well-versed in SQL Server BI Suite (SSIS, SSAS, SSRS).
  • Knowledge of NoSQL schema-free databases (MongoDB, HBase, Cassandra, etc.) is an advantage.
  • Working knowledge of various API interfaces (Web Services, RESTful APIs, etc.).
  • Broad understanding of cloud-based technologies (e.g. Azure SQL, AWS, and Google Cloud services).
  • Knowledge of Hadoop cluster storage and related execution frameworks (e.g. Spark, Hadoop MapReduce, Hive).
  • Working experience with data visualization tools (e.g. Microsoft Power BI, QlikView, Tableau).
  • Strong knowledge of data warehouse design (e.g. dimensional modeling) and data mining, as well as other EDW techniques (e.g. data mart models, data lake catalogues, data ingestion, and job processing).
  • In-depth understanding of database management systems, online analytical processing (OLAP), online transactional processing (OLTP), and extract, transform, and load (ETL) frameworks.
  • Experience in running Python scripts and machine learning modelling on Anaconda or Google Colab is a plus.
  • Shipping industry experience is a plus.
  • Excellent written and verbal communication skills, interpersonal and collaborative skills.
  • Poise and ability to act calmly and competently in high pressure, high stress situations.
  • Must be a critical thinker, with strong problem-solving skills.
  • Excellent analytical skills and the ability to manage multiple projects under strict timelines, as well as the ability to work well in a demanding, dynamic environment and meet overall objectives.
  • High level of personal integrity, ability to professionally handle confidential matters, and an appropriate level of judgment and maturity.