
Equifax, Inc.


Global Platform Big Data Architect (Finance)

We are seeking a highly experienced and visionary Global Platform Big Data Architect to spearhead the design, development, and evolution of our next-generation Data Fabric platform. This pivotal role will be responsible for defining the architectural roadmap, establishing best practices, and providing expert guidance to engineering teams building scalable, reliable, and secure data solutions across both Google Cloud Platform (GCP) and Amazon Web Services (AWS). The ideal candidate will possess deep technical expertise in big data technologies and cloud-native data services, along with a proven track record of delivering complex data platforms.

Equifax has a hybrid work schedule that allows for 2 days of remote work (Monday and Friday), with 3 required onsite days (Tuesday, Wednesday, Thursday) every week.

This role will work the required onsite days at our Equifax office in Alpharetta, Georgia.

This position does not offer immigration sponsorship (current or future), including F-1 STEM OPT extension support.

This position is not open to third-party vendors or corp-to-corp (C2C) arrangements.

What you will do

  • Data Fabric Vision & Strategy: Define and champion the architectural vision and strategy for our enterprise-wide Data Fabric platform, enabling seamless data discovery, access, integration, and governance across disparate data sources.
  • Architectural Leadership: Lead the design and architecture of highly scalable, resilient, and cost-effective data solutions leveraging a diverse set of big data and cloud-native services in GCP and AWS.
  • Technical Guidance & Mentorship: Provide expert architectural guidance, technical leadership, and mentorship to multiple engineering teams, ensuring adherence to architectural principles, best practices, and design patterns.
  • Platform Development & Evolution: Drive the selection, implementation, and continuous improvement of core data platform components, tools, and frameworks.
  • Cloud-Native Expertise: Leverage deep understanding of GCP and AWS data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, S3, EMR, Kinesis, Redshift, Glue, Athena) to design optimal solutions.
  • Data Governance & Security: Architect and implement robust data governance, security, privacy, and compliance measures within the data platform, ensuring data integrity and regulatory adherence.
  • Performance & Optimization: Identify and address performance bottlenecks, optimize data pipelines, and ensure efficient resource utilization across cloud environments.
  • Innovation & Research: Stay abreast of emerging big data and cloud technologies, evaluate their potential impact, and recommend their adoption where appropriate.
  • Cross-Functional Collaboration: Collaborate closely with data scientists, data engineers, analytics teams, product managers, and other stakeholders to understand data requirements and translate them into architectural designs.
  • Documentation & Standards: Develop and maintain comprehensive architectural documentation, standards, and guidelines for data platform development.
  • Proof-of-Concepts (POCs): Lead and execute proof-of-concepts for new technologies and architectural patterns to validate their feasibility and value.

What experience you will need
  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
  • 10+ years of progressive experience in data architecture, big data engineering, or cloud platform engineering.
  • 5+ years of hands-on experience specifically designing and building large-scale data platforms in a cloud environment.
  • Expertise in designing and implementing data lakes, data warehouses, and data marts in cloud environments.
  • Proficiency in at least one major programming language for data processing (e.g., Python, Scala, Java).
  • Deep understanding of distributed data processing frameworks (e.g., Apache Spark, Flink).
  • Experience with various data modeling techniques (dimensional, relational, NoSQL).
  • Solid understanding of DevOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation).
  • Experience with real-time data streaming technologies (e.g., Kafka, Kinesis, Pub/Sub).
  • Strong understanding of data governance, data quality, and metadata management concepts.
  • Excellent communication, presentation, and interpersonal skills with the ability to articulate complex technical concepts to both technical and non-technical audiences.
  • Proven ability to lead and influence technical teams without direct authority.

What could set you apart
  • Strong, demonstrable experience with Google Cloud Platform (GCP) big data services (e.g., BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer, Cloud Functions). GCP certifications (e.g., Professional Data Engineer, Professional Cloud Architect) are a plus.
  • Strong, demonstrable experience with Amazon Web Services (AWS) big data services (e.g., S3, EMR, Kinesis, Redshift, Glue, Athena, Lambda). AWS certifications (e.g., Solutions Architect Professional, Big Data Specialty) are a plus.
  • Experience with data mesh principles and implementing domain-oriented data architectures.
  • Familiarity with other cloud platforms (e.g., Azure) or on-premise data technologies.
  • Experience with containerization technologies (e.g., Docker, Kubernetes).
  • Knowledge of machine learning operationalization (MLOps) principles and platforms.
  • Contributions to open-source big data projects.

#LI-Hybrid

#LI-KD1


© 2025 African-American Jobs