- Develop ETL pipelines to integrate and test large alternative datasets for the Commodities desk, collaborating with quant researchers and data engineering teams.
- Architect, deploy, and manage cloud-based systems for storing and exploring large alternative datasets, working with the AWS infrastructure team.
- Monitor, support, debug, and extend existing Commodities trading and research infrastructure alongside researchers and support engineers.
Your Present Skillset
- Proficient in Python, especially its numerical libraries (e.g., numpy, pandas, matplotlib)
- Basic knowledge of AWS and databases (e.g., SQL)
- Familiar with development practices such as version control (Git) and unit testing
- Quantitative mindset
- Team player with a collaborative attitude
Nice to Have
- Experience creating dashboards or using data visualization software (e.g., Tableau, Dash)
- Advanced AWS experience (e.g., DynamoDB, RDS, S3, Lambda, AWS CDK)
- Advanced database knowledge (query optimization, relational vs. non-relational databases, etc.)
- Parallel computation experience
- Experience with geospatial data (e.g., geopandas, xarray)
- Financial knowledge (a plus, but not required)
This organization is an equal opportunity employer and values diversity as essential to its success. Employees are empowered to work openly and respectfully toward shared goals. Beyond professional achievement, the organization offers initiatives and programs that help employees maintain a healthy work-life balance.