Offer criteria
Roles:
- Data architect
Sector:
- All companies
Locations:
- Minneapolis
Conditions:
- Permanent contract (CDI)
- Fixed-term contract (CDD)
- Full-time
- Partial remote work
The company: Expleo
A global player in engineering, technology, and consulting, Expleo supports leading companies in their innovation efforts to accelerate their success.
We draw on more than 40 years of experience in developing complex products, optimizing manufacturing processes, and improving the performance of information systems. Our sector experience allows us to bring clients deep expertise that drives innovation at every stage of the value chain. The group generates annual revenue of one billion euros.
Expleo is a responsible group committed to placing ethics and diversity at the heart of its practices, and to working toward a more sustainable and safer society.
At Expleo, thrive within a community of 19,000 highly qualified employees delivering high-value solutions in 30 countries.
Our recruitment policy is committed to the integration and continued employment of people with disabilities.
Job description
Location: Remote
Employment Type: Full-Time
Join Trissential and Help Shape a Cloud-Ready Future for Operational Technology Data
If you're a seasoned Data Architect who thrives at the intersection of industrial data, cloud architecture, and secure data movement, this is your opportunity. At Trissential, we partner with forward-thinking organizations that are modernizing how operational technology (OT) data is leveraged for analytics, automation, and AI. You'll join our client's team as the technical leader responsible for turning complex OT environments into governed, AI-ready cloud platforms built on Databricks.
What's in It for You?
- High-impact architecture ownership across OT, cloud, and enterprise data domains
- A role where your expertise shapes reference patterns, data governance, and long-term platform strategy
- Opportunity to work across multiple senior stakeholder groups: Security, Networking, Operations, and Data Architecture
- A collaborative project environment backed by Trissential's culture of growth, transparency, and support
- A chance to help an enterprise build an industrial-grade data fabric that scales for analytics and AI
Your Role & Responsibilities
- Partner with business and technology leaders to translate requirements into secure, scalable cloud architectures
- Define target-state designs for safe and governed data movement from on-prem OT networks into Databricks
- Evaluate and select approaches for ingesting/virtualizing historian data (especially OSI PI and AVEVA Connect)
- Architect streaming, micro-batch, and batch data pipelines from edge to lakehouse
- Design data layers (landing, curated, serving) aligned with Databricks lakehouse and Unity Catalog governance
- Define AWS network and cloud security controls: VPC patterns, subnet designs, routing, encryption, private endpoints
- Ensure Databricks E2 control plane and data plane security standards are followed, with compensating controls documented
- Develop canonical time-series and asset-centric data models to support analytics and AI
- Establish data quality SLAs, lineage standards, and AI data readiness frameworks
- Produce ADRs, architecture blueprints, and engineering playbooks
- Coach engineering teams and participate in architecture reviews
- Collaborate with Security, Networking, and Compliance to validate controls and guide remediation
- Measure and optimize cost, performance, and reliability across data pipelines and platforms
Skills & Experience You Should Possess
- Extensive background architecting data solutions within operational technology (OT) environments
- Expertise designing solutions for industrial/asset-centric data domains
- Deep experience with OSI PI, AVEVA, or similar historian platforms
- Strong hands-on knowledge of Databricks, Spark, Delta Lake, and Unity Catalog
- Proven mastery of data pipeline architecture: batch, micro-batch, streaming, CDC, and edge-to-cloud patterns
- Advanced data modeling experience with time-series data and asset hierarchies
- Strong AWS networking and security knowledge: VPCs, subnets, routing, IAM, KMS, private connectivity
- Ability to interpret and implement enterprise Databricks security guidance (E2 architecture)
- Excellent communication and negotiation skills with senior business and technical leaders
- Familiarity with ML/AI platform requirements such as feature stores, lineage, and observability
Bonus Points If You Have
- Experience integrating AVEVA PI AF or AVEVA Data Views with Databricks
- Prior contributions to enterprise data fabric or Databricks governance standards
- Experience in regulated industrial or utility environments
- Background working with safety, reliability, or compliance-heavy data ecosystems
Education & Certifications You Need
- Bachelor's degree in Computer Science, Engineering, Information Systems, or related field
- Cloud or Databricks certifications beneficial but not…

