Eucloid is a leader in AI and Data Science, creating solutions for Hi-tech, D2C, Healthcare, and SaaS industries. They partner with companies like Databricks, Google Cloud, and Adobe, focusing on next-gen technology and delivering transformative results for Fortune 100 clients.
About Our Leadership
Anuj Gupta – Former Amazon leader with over 22 years of experience in building and managing large engineering teams. (B.Tech, IIT Delhi; MBA, ISB Hyderabad).
Raghvendra Kushwah – Business consulting expert with 21+ years at Accenture and Cognizant (B.Tech, IIT Delhi; MBA, IIM Lucknow).
Data Solutions Architect
Work location: Warsaw
What You’ll Do
As a Data Solutions Architect, you will play a key role in designing, governing, and scaling enterprise data platforms for global BFSI clients. You will be responsible for:
- Design and own enterprise-scale data architectures aligned with business, regulatory, and scalability requirements
- Lead the migration of legacy data warehouse platforms (on-prem or cloud DWH) to Databricks-based lakehouse architectures, ensuring architectural consistency and minimal business disruption
- Architect end-to-end solutions across ingestion, transformation, storage, and consumption layers
- Define and enforce non-functional requirements (NFRs), including high availability (HA), disaster recovery (DR), and clearly defined RTO/RPO objectives
- Establish and maintain architecture governance, including reference architectures, design standards, and technical review processes
- Guide data engineering teams on implementation approaches, best practices, and architectural trade-offs
- Ensure data platforms adhere to security, governance, access control, and auditability requirements typical of regulated banking environments
- Review platform performance, reliability, scalability, and cost efficiency, driving optimisation initiatives
- Create and maintain clear architecture documentation, design artefacts, and decision logs
- Collaborate closely with senior client stakeholders, delivery leaders, and cross-functional teams
What Makes You a Fit
Academic Background:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related discipline
- 7–10+ years of experience designing and delivering large-scale data platforms
- Strong expertise in data architecture and data modelling (conceptual, logical, and physical models)
- Deep hands-on experience with Databricks Lakehouse architecture, including Apache Spark and Delta Lake
- Strong experience with Delta Lake optimisations, including file sizing, compaction, schema evolution, and performance tuning
- Hands-on experience with Databricks Unity Catalog for enterprise data governance, access control, and lineage management
- Experience defining and enforcing Databricks cost governance and optimisation strategies in large, multi-workspace environments
- Strong experience with at least one major cloud platform (AWS, Azure, or GCP)
- Strong proficiency in SQL and at least one programming language such as Python or Scala
- Proven experience delivering enterprise data platforms in regulated BFSI / banking environments, aligned with requirements such as BCBS 239, GDPR, and internal risk and compliance standards
- Experience working with risk, finance, AML, or regulatory reporting data domains
- Strong understanding of data lineage, auditability, reconciliation, and data quality controls required for banking-grade platforms
- Experience operating and governing data platforms with strict SLAs and regulatory-driven reliability expectations
- Experience translating business and regulatory requirements into secure, scalable technical architectures
- Prior experience working in client-facing, delivery-oriented roles within consulting or large enterprise environments
- Experience mentoring senior and mid-level engineers and guiding architecture decisions across multiple delivery teams
- Strong stakeholder communication skills, with the ability to explain complex architectural concepts to both technical and non-technical audiences
- Experience contributing to architecture governance, technical standards, and design reviews
- Strong decision-making skills, balancing architectural rigor with pragmatic delivery in large transformation programs
- Exposure to Data Mesh or domain-oriented data architecture paradigms
- Experience integrating data platforms with AI/ML or advanced analytics use cases
- Prior exposure to UK or European banking environments
- Experience contributing to multi-year, enterprise-scale data transformation programs
Engagement Details:
- Employment Type: Full-time, fixed-term (12 months)
- Location: Poland (EU-based candidates; hybrid/on-site based on client requirements)
- Start Date: Early February 2026
- Extension: Possible based on performance and program continuity