Job Description
The Company
Through the power of AI and Big Data, this company has developed a number of data-driven tools and platforms aimed at delivering a transparent financial solution, focusing purely on sustainable investment. If you want to leave the planet clean and beautiful for the next generation, then you share the ethos of this company.
This group has 4 arms:
- A Data Provider: Assessing the performance and sustainability of companies worldwide.
- AI: An advisory and technology company with an AI engine for forecasting investment opportunities.
- Asset Management: Investment management company investing in sustainable equities.
- R&D: Focused on AI, ML, Data, Finance, and sustainability.
The Job
Take charge of the data processes, from ingestion to calculation and delivery. Responsibilities include:
- Working with modern tech to maintain and develop the data pipelines that extract, transform, and load data from a variety of sources into the company's systems (a minimal sketch of such a pipeline follows this list).
- Monitoring and ensuring a high level of data quality and consistency.
- Maintaining the processes for obtaining and importing data for immediate use/storage.
- Guiding and mentoring junior colleagues.
- Solving problems related to data processing and collection.
- Experimenting with data.
- Staying up to date with the latest data tools and technologies.
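To give a concrete flavour of the pipeline work described above, here is a minimal ETL sketch in Python, the language the posting prefers. The API URL, field names, table name, and connection string are hypothetical placeholders for illustration, not details taken from the posting.

```python
"""Minimal ETL sketch: pull records from a REST API, clean them,
and load them into PostgreSQL. All endpoints, field names, and
connection settings are illustrative placeholders."""

import psycopg2
import requests


def extract(url: str) -> list[dict]:
    # Fetch raw records from a (hypothetical) source API.
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def transform(records: list[dict]) -> list[tuple]:
    # Keep only rows that carry the fields we need, normalising types.
    rows = []
    for rec in records:
        if rec.get("company_id") and rec.get("esg_score") is not None:
            rows.append((rec["company_id"], float(rec["esg_score"])))
    return rows


def load(rows: list[tuple], dsn: str) -> None:
    # Upsert the cleaned rows into a PostgreSQL table
    # (assumes a unique constraint on company_id).
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.executemany(
                """
                INSERT INTO esg_scores (company_id, esg_score)
                VALUES (%s, %s)
                ON CONFLICT (company_id) DO UPDATE
                    SET esg_score = EXCLUDED.esg_score
                """,
                rows,
            )


if __name__ == "__main__":
    data = extract("https://example.com/api/esg")        # placeholder URL
    load(transform(data), "dbname=analytics user=etl")   # placeholder DSN
```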
Minimum qualifications include:
- 5+ years of Data Engineering experience.
- Strong programming skills (ideally Python; Go is a plus).
- Experience with cloud technologies (e.g., GCP, AWS).
- Experience with Elasticsearch.
- Proficiency in SQL and experience with relational (e.g., PostgreSQL), NoSQL, and distributed databases.
- Knowledge of ETL and data warehousing (DWH) concepts, tools, architectures, and technologies.
- Familiarity with Kafka, Pub/Sub, or similar streaming technologies (a brief consumer sketch follows this list).
- Analytical and driven with enthusiasm for modern data and Big Data technology.
- Experience and confidence in taking the lead on projects.
- Professional English proficiency.
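Since the stack above names Kafka/Pub/Sub and Elasticsearch, the short sketch below shows one common pattern: consuming JSON messages from a Kafka topic and indexing them into Elasticsearch. It assumes the kafka-python and elasticsearch (8.x) Python clients; the topic, index, broker, and cluster addresses are hypothetical.

```python
"""Sketch: stream JSON messages from a Kafka topic into Elasticsearch.
Topic, index, broker, and cluster addresses are illustrative placeholders."""

import json

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "company-events",                                # hypothetical topic
    bootstrap_servers="localhost:9092",              # placeholder broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

es = Elasticsearch("http://localhost:9200")          # placeholder cluster

for message in consumer:
    doc = message.value
    # Index each event; reusing a stable id field (if the payload has one)
    # keeps reprocessing the topic from creating duplicate documents.
    es.index(index="company-events", id=doc.get("event_id"), document=doc)
```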
The Package
Offers include:
- Company share program.
- The opportunity to work with a team of experts in their field.
- Control over your own tasks and activities.
- Work in areas of R&D and cutting-edge tech.
- Relocation and visa support.
Please provide the following information:
- Name
- Contact number
- Message
- Upload your CV (PDF or DOC only).