Overview

Getty Images is dedicated to moving the world with images. Our mission is to empower our customers by providing fresh and distinctive content. The Enterprise Data Platform (EDP) team plays a critical role in building and maintaining Getty’s data, AI, and engineering platform. The EDP team collaborates with partner teams across the company, including Customer Data Science, Search, and Business Intelligence (BI), to deliver scalable, reliable, and innovative data solutions.

About the Team:

The EDP team is responsible for developing and maintaining the data infrastructure that supports these partner teams. Our work ensures that Getty Images continues to deliver high-quality services and products to our customers by leveraging cutting-edge technologies and innovative data solutions. Join the EDP team and contribute to building the data and AI platform that empowers our customers to move the world with images.

What You Will Do:

    • Develop and Maintain: Build and operate the backend microservices that serve as access points to Getty Images’ data and AI platform.
    • Build Scalable Solutions: Create scalable software solutions to enhance data processing and analytics capabilities.
    • Cross-Functional Collaboration: Form strong cross-functional relationships with engineering, data science, and product teams to deliver high-quality solutions.
    • Focus on Reliability and Security: Ensure the reliability, resiliency, and security of data services.
    • Ownership and Quality: Take ownership of changes and ensure high code quality from concept to production.
    • Automate Processes: Lead efforts to automate manual processes and build efficient data pipelines.

What We Are Looking For:

    • Minimum 5 years of experience in data engineering.
    • Proficient in Python and Spark.
    • Proficient in database management and administration, particularly with MySQL.
    • Extensive experience with AWS cloud engineering, including MWAA (Amazon Managed Workflows for Apache Airflow), MSK (Amazon Managed Streaming for Apache Kafka), and Hive on EMR.
    • Expertise in Databricks, especially experience migrating workloads from EMR to Databricks.
    • Strong knowledge of Snowflake data warehousing and data modeling.
    • Proficient in DevOps practices, using infrastructure-as-code tools such as Terraform.
    • Experience with monitoring tools such as Grafana, Prometheus, and Splunk.
    • Strong execution skills and the ability to lead with empathy.
    • Proven track record of building and leading software engineering teams.
    • Tenacity and the ability to balance competing priorities.
    • Orientation towards long-term impact while balancing short-term goals.
    • Exceptional written and verbal communication skills.
    • Curiosity and pragmatism in problem-solving.

Qualifications & Experience:

    • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
    • Proven experience in building and operating complex distributed systems at scale.
    • Empathy for users and their business goals, and a commitment to inclusion and diversity within teams.

Nice to Have:

    • Experience with containers and container orchestration (e.g., Docker, Amazon ECS).
    • Familiarity with search algorithms and their practical applications.
    • Experience with CI/CD pipelines and GitLab.
    • Knowledge or experience with LLMs, vector search, or NLP.