As a Data Platform Cloud/DevOps Engineer on the Data Engineering team, you will:
Work alongside our systems engineers and UI developers to help design and build scalable, automated CI/CD pipelines.
Help prove out and productionize infrastructure and tooling to support scalable cloud-based applications
Unlock myriad generative AI/ML use cases for Aladdin Data, and thus for BlackRock
Have fun as part of an awesome team
Specific Responsibilities:
Work as part of a multi-disciplinary squad to establish our next generation of data pipelines and tools
Be involved from inception of projects, understanding requirements, designing & developing solutions, and incorporating them into the designs of our platforms
Mentor team members on technology and best practices
Build and maintain strong relationships between DataOps Engineering and our Technology teams
Contribute to the open source community and maintain excellent knowledge of the technical landscape for data & cloud tooling
Assist in troubleshooting issues and support the operation of production software
Write technical documentation
Desirable Skills:
Data Operations and Engineering
Comfortable reading and writing Python code for data acquisition and ETL/ELT
Experience orchestrating data pipelines with Apache Airflow and/or Argo Workflows
Experience implementing and operating telemetry-based monitoring, alerting, and incident response systems. We aim to follow Site Reliability Engineering (SRE) best practices.
Experience supporting databases and datastores such as MongoDB, Redis, Cassandra, Ignite, Hadoop, S3, and Azure Blob Storage, as well as messaging and streaming platforms such as NATS or Kafka
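To illustrate the kind of Python ETL work this role involves, here is a minimal, dependency-free sketch; the field names and pipeline steps are illustrative assumptions, and in practice each function would typically run as a task in an orchestrator such as Airflow or Argo Workflows:

```python
# A minimal extract -> transform -> load sketch in plain Python.
# Field names ("id", "symbol") and the in-memory sink are illustrative
# stand-ins for real sources and datastores.
import json

def extract(raw_records):
    """Parse newline-delimited JSON into dicts, skipping malformed rows."""
    rows = []
    for line in raw_records.splitlines():
        try:
            rows.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # in production: log and route to a dead-letter store
    return rows

def transform(rows):
    """Normalise field values and drop rows missing the primary key."""
    return [
        {"id": r["id"], "symbol": r.get("symbol", "").upper()}
        for r in rows
        if "id" in r
    ]

def load(rows, sink):
    """Append transformed rows to a sink (stand-in for a real datastore)."""
    sink.extend(rows)
    return len(rows)

raw = '{"id": 1, "symbol": "blk"}\nnot-json\n{"symbol": "orphan"}'
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded, sink)  # 1 [{'id': 1, 'symbol': 'BLK'}]
```

Keeping each stage a pure function of its inputs is what makes a pipeline like this easy to retry and parallelise under an orchestrator.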
Cloud Native DevOps Platform Engineering
Knowledge of the Kubernetes (K8s) APIs, with a strong focus on stateful workloads
Experience templating and deploying with Helm, Argo CD, Ansible, and Terraform
Understanding of the K8s Operator pattern, with the comfort and courage to wade into (predominantly Go-based) operator codebases
Comfortable building atop K8s-native frameworks, including service mesh (Istio), secrets management (cert-manager, HashiCorp Vault), log management (Splunk), and observability (Prometheus, Grafana, Alertmanager)
Experience creating and evolving CI/CD pipelines with GitLab or GitHub, following GitOps principles
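The essence of the Operator pattern mentioned above is a level-triggered reconcile loop that drives actual state toward desired state. Real operators are usually written in Go against the Kubernetes API; the toy Python model below is purely conceptual, and the replica names and specs are made up for illustration:

```python
# A toy reconcile loop: compute the actions needed to converge
# actual state toward desired state, then apply them. A real
# controller would read/write these states via the Kubernetes API.

def reconcile(desired, actual):
    """Return the apply/delete actions needed to converge actual -> desired.

    Both arguments map a replica name to its spec (here, an image tag).
    """
    actions = []
    for name, spec in desired.items():
        if actual.get(name) != spec:
            actions.append(("apply", name, spec))
    for name in actual:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

def apply_actions(actual, actions):
    """Mutate actual state; a real operator would call the K8s API here."""
    for op, name, spec in actions:
        if op == "apply":
            actual[name] = spec
        else:
            actual.pop(name, None)
    return actual

desired = {"db-0": "v2", "db-1": "v2"}
actual = {"db-0": "v1", "db-2": "v1"}
actual = apply_actions(actual, reconcile(desired, actual))
print(actual)  # {'db-0': 'v2', 'db-1': 'v2'}
```

Because the loop compares whole states rather than reacting to individual events, running it again after convergence produces no actions, which is the idempotence that makes operators robust to missed or duplicated events.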
Natural Language Processing / Large Language Models (good to have)
Experience with NLP tasks such as tokenization, chunking, tagging, embedding, and indexing in support of downstream retrieval and enrichment
Experience with basic prompt engineering, LLM fine-tuning, and chatbot implementations using modern Python SDKs such as LangChain and/or Hugging Face transformers
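As one example of the chunking work mentioned above, here is a minimal sliding-window chunker with overlap, a common pre-processing step before embedding and indexing for retrieval. Whitespace splitting stands in for a real tokenizer (e.g. one from the Hugging Face transformers library), and the chunk size and overlap values are illustrative:

```python
# A minimal sliding-window text chunker with overlapping windows.
# Whitespace tokenization is a simplifying assumption; production
# pipelines would use a model-specific tokenizer.

def chunk_tokens(text, chunk_size=4, overlap=1):
    """Split text into word windows of `chunk_size`, sharing `overlap` words."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    tokens = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(" ".join(tokens[start:start + chunk_size]))
        if start + chunk_size >= len(tokens):
            break
    return chunks

text = "the quick brown fox jumps over the lazy dog"
print(chunk_tokens(text))
# ['the quick brown fox', 'fox jumps over the', 'the lazy dog']
```

Overlapping windows trade a little index size for better recall: a phrase that straddles a chunk boundary still appears whole in at least one chunk.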