AI Summary
Design, develop, and maintain scalable infrastructure for processing large-scale batch and real-time data workloads. Collaborate with AI and data engineering teams to monitor data pipelines and ensure data quality and integrity.
Key Highlights
Design and develop scalable infrastructure
Collaborate with AI and data engineering teams
Monitor data pipelines and ensure data quality
Technical Skills Required
Python, Node.js, and/or Java
ETL pipelines with Apache Airflow, Spark, or Databricks
NoSQL databases and real-time stream processing (Kafka, Kinesis, Dataflow)
Cloud platforms: AWS, GCP, or Azure
Benefits & Perks
100% remote work environment
Competitive compensation package
Equity or stock options (where applicable)
A culture that promotes autonomy, experimentation, and continuous learning
Opportunity to work on innovative projects with a direct impact on the company's growth and success
Job Description
About The Company
AlphaPoint's AI Labs is at the forefront of technological innovation within the financial technology sector. As a company powering digital asset exchanges and brokerages worldwide, AlphaPoint is committed to transforming the landscape of digital finance through cutting-edge solutions. Our team of engineers and AI scientists works diligently to bridge the gap between groundbreaking AI advancements and the highly competitive markets we serve. By developing and applying the latest generative AI, data, and knowledge modeling technologies, we tackle large-scale, complex business problems, pushing the boundaries of what is possible in AI and data processing. Our mission is to leverage innovative technologies to create scalable, efficient, and secure financial solutions that empower our clients globally.
About The Role
We are seeking a highly skilled and experienced Backend Infrastructure Engineer to join our AI Labs team. In this role, you will be responsible for building and maintaining a scalable, high-performance infrastructure capable of processing both batch and real-time workloads. You will collaborate closely with our AI engineering team and external engineering partners to monitor, extract, and process data from a wide array of sources. Your expertise will be essential in designing and implementing robust ETL data pipelines, architecting backend data solutions to support microservices, and developing third-party integrations with legacy systems. This position offers a unique opportunity to work on innovative projects that directly impact the company's technological capabilities and overall strategic growth.
Qualifications
- Bachelor's degree in computer science, engineering, or a related field
- 7-10 years of experience in software development
- Proficiency in Python, Node.js, and/or Java
- Strong understanding of distributed computing principles and data modeling
- Experience building ETL pipelines using Apache Airflow, Spark, Databricks, or similar tools
- Hands-on experience with NoSQL databases such as MongoDB, Cassandra, DynamoDB, or CosmosDB
- Knowledge of real-time stream processing systems such as Kafka, AWS Kinesis, and GCP Dataflow
- Familiarity with Redis, Elasticsearch, Solr
- Experience with messaging systems including RabbitMQ, AWS SQS, GCP Cloud Tasks
- Ability to extract data from unstructured sources via scraping, model it, and ingest it into semantic databases and knowledge graphs
- Knowledge of Delta Lake and Parquet file formats
- Experience working with cloud providers such as AWS, GCP, or Azure
- Proficiency in Test Driven Development (TDD)
- Experience with version control systems such as Git (GitHub, Bitbucket)
Responsibilities
- Design, develop, and maintain scalable infrastructure for processing large-scale batch and real-time data workloads
- Collaborate with AI and data engineering teams to monitor data pipelines and ensure data quality and integrity
- Implement and optimize ETL data pipelines using industry-standard tools and frameworks
- Architect backend data solutions to support various microservices and application components
- Develop integrations with third-party systems, including legacy platforms, to facilitate seamless data exchange
- Ensure system reliability, scalability, and security across all infrastructure components
- Identify and implement innovative solutions for harvesting and ingesting unstructured data sources
- Participate in code reviews, testing, and documentation to maintain high-quality standards
- Stay updated with emerging technologies and best practices in data engineering and infrastructure development
Benefits & Perks
- 100% remote work environment providing flexibility and work-life balance
- Competitive compensation package
- Equity or stock options (where applicable)
- A culture that promotes autonomy, experimentation, and continuous learning
- Opportunity to work on innovative projects with a direct impact on the company's growth and success
- Collaborative and inclusive work environment that encourages professional development
AlphaPoint is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate based on race, ethnicity, gender, age, religion, disability, sexual orientation, or any other protected characteristic. We believe that a diverse team fosters innovation and drives better business outcomes, and we are dedicated to providing equal employment opportunities to all applicants and employees.