The Senior Information Architect will be responsible for designing, developing, and maintaining scalable, efficient, and well-documented data models that support analytical, operational, and reporting needs across CIM's $35B+ portfolio spanning real estate equity, infrastructure, and private credit investments.
This role will establish data modeling standards and techniques from the ground up, developing and managing detailed data models deployed across Databricks Lakehouse as the primary platform, with integration to Snowflake and MongoDB for specific use cases. The Senior Information Architect will play a foundational role in building enterprise data architecture capabilities where mature standards do not yet exist.
This highly collaborative role requires extensive partnership with business stakeholders across Fund Accounting, FP&A, Investor Relations, Sales, and Investments teams to understand data requirements and translate them into robust, high-performing data models. The Senior Information Architect will shape the enterprise data landscape, supporting analytics, governance, and operational data needs across Azure and modern data ecosystems.
RESPONSIBILITIES:
Business Partnership & Requirements Gathering
- Partner extensively with business stakeholders across Fund Accounting, FP&A, Global Client Group, and Investments teams to understand data requirements, pain points, and use cases.
- Translate complex business requirements into robust data models, asking questions first before proposing solutions to ensure alignment with actual business needs.
- Collaborate with data analysts, data scientists, MLOps engineers, and application developers to understand technical requirements and ensure models support downstream use cases.
- Build trust and influence across teams by demonstrating business value and explaining technical constraints in accessible terms.
Data Model Design & Development
- Design and develop conceptual, logical, and physical data models for various data initiatives, including data warehouses, data lakes, operational data stores, and transactional systems.
- Specialize in designing highly optimized dimensional models (star schemas, snowflake schemas) for analytical reporting and business intelligence applications (see the sketch at the end of this subsection).
- Apply various data modeling techniques as appropriate: Dimensional Modeling (Kimball), 3NF (Inmon), Data Vault, and NoSQL modeling patterns.
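To make the dimensional-modeling bullets above concrete, here is a minimal sketch of a star schema expressed as Delta DDL from PySpark. All names (gold.dim_property, gold.fact_fund_valuation, and their columns) are hypothetical placeholders rather than CIM's actual schema, and it assumes a Databricks runtime where Delta identity columns and NOT NULL constraints are available.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already defined as `spark` on Databricks

# Dimension: one row per property, keyed by a surrogate identity column.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.dim_property (
        property_key  BIGINT GENERATED ALWAYS AS IDENTITY,
        property_id   STRING NOT NULL,  -- natural key from the source system
        property_name STRING,
        asset_class   STRING,           -- e.g., real estate equity, infrastructure
        region        STRING
    ) USING DELTA
""")

# Fact: one row per fund/property/date, carrying foreign keys to the dimensions
# plus additive measures that roll up cleanly in BI tools.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.fact_fund_valuation (
        date_key          INT    NOT NULL,
        fund_key          BIGINT NOT NULL,
        property_key      BIGINT NOT NULL,
        nav_amount        DECIMAL(18, 2),
        committed_capital DECIMAL(18, 2)
    ) USING DELTA
""")
```

BI queries aggregate the fact table and join out to the dimensions on the surrogate keys, which is what keeps star-schema reporting fast and intuitive.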
Databricks Lakehouse Architecture
- Design and implement medallion architecture (bronze/silver/gold) patterns within the Databricks Lakehouse, establishing standards where none currently exist (see the sketch at the end of this subsection).
- Optimize data models leveraging Delta Lake features including ACID transactions, time travel, schema evolution, Z-ordering, and liquid clustering.
- Design partitioning strategies that balance query performance with file management, avoiding over-partitioning while enabling partition pruning.
- Implement Unity Catalog namespace hierarchy (catalog, schema, table) for multi-domain, multi-environment data organization and governance.
- Collaborate with MLOps engineers on data models that support ML feature stores and GenAI/RAG applications.
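To ground the bullets above, a minimal sketch of the bronze/silver/gold flow in PySpark, assuming a Databricks workspace with Unity Catalog enabled; the finance catalog, the landing path, and every column name are hypothetical. The three-level catalog.schema.table names are also how the Unity Catalog namespace hierarchy mentioned above keeps layers and domains organized.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # already defined as `spark` on Databricks

# Bronze: land raw records as-is, stamped with ingestion metadata for lineage.
bronze = (
    spark.read.format("json")
    .load("/mnt/raw/fund_transactions/")  # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("finance.bronze.fund_transactions")

# Silver: validated, deduplicated, consistently typed records.
silver = (
    spark.read.table("finance.bronze.fund_transactions")
    .dropDuplicates(["transaction_id"])
    .filter(F.col("amount").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("finance.silver.fund_transactions")

# Gold: a business-level aggregate shaped for reporting.
gold = silver.groupBy("fund_id").agg(F.sum("amount").alias("total_amount"))
gold.write.format("delta").mode("overwrite").saveAsTable("finance.gold.fund_totals")
```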
Multi-Platform Data Architecture
- Design and maintain relational database schemas for both operational and analytical workloads within Snowflake and other RDBMS, ensuring integration with Databricks Lakehouse.
- Design and optimize data structures for NoSQL databases (e.g., MongoDB), considering document structures, indexing, and query patterns for specific application needs.
- Develop frameworks for deciding when data belongs in Databricks Lakehouse vs. Snowflake vs. MongoDB based on workload characteristics and use cases.
Performance Optimization
- Provide input and recommendations on query optimization, indexing strategies, and data partitioning based on data model design.
- Diagnose and resolve performance issues including data skew, small files problems, and inefficient join strategies in Spark/Databricks environments.
- Collaborate with database administrators and data engineers on OPTIMIZE, VACUUM, and ANALYZE strategies for Delta tables, as sketched below.
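As an illustration of the maintenance routines in this subsection, a sketch of Delta table upkeep from PySpark; the table name is hypothetical. Note that Z-ordering and liquid clustering are alternatives, not companions: a given table uses one or the other.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already defined as `spark` on Databricks

# Compact small files and co-locate rows that are frequently filtered together.
spark.sql("OPTIMIZE finance.silver.fund_transactions ZORDER BY (fund_id, txn_date)")

# Alternative to Z-ordering: declare liquid clustering on the table instead,
# after which plain OPTIMIZE incrementally reclusters it.
# spark.sql("ALTER TABLE finance.silver.fund_transactions CLUSTER BY (fund_id)")

# Drop data files no longer referenced by the table (7-day retention by default).
spark.sql("VACUUM finance.silver.fund_transactions")

# Refresh the statistics the query optimizer uses for join planning.
spark.sql("ANALYZE TABLE finance.silver.fund_transactions COMPUTE STATISTICS")
```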
ETL/ELT Collaboration & Data Pipeline Design
- Work closely with Data Engineers to ensure data models are efficiently implemented and align with ETL/ELT processes using Auto Loader, Delta Live Tables, or traditional Spark jobs.
- Provide guidance on data mapping, transformation rules, schema evolution handling, and data loading strategies.
- Design slowly changing dimension (SCD) patterns using Delta Lake MERGE operations and Change Data Feed for downstream propagation, as sketched below.
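A minimal sketch of an SCD Type 2 flow with the Delta Lake MERGE API, as referenced above. The table names (gold.dim_investor, silver.investor_changes) and columns are hypothetical, the change set is assumed to contain only rows that actually changed, and the expire/append steps run as two separate transactions, which a production design would need to harden.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # already defined as `spark` on Databricks

dim = DeltaTable.forName(spark, "gold.dim_investor")   # hypothetical dimension
changes = spark.read.table("silver.investor_changes")  # assumed: changed rows only

# Step 1: expire the current version of every investor whose attributes changed.
(
    dim.alias("d")
    .merge(changes.alias("c"), "d.investor_id = c.investor_id AND d.is_current = true")
    .whenMatchedUpdate(set={
        "is_current": "false",
        "valid_to": "current_timestamp()",
    })
    .execute()
)

# Step 2: append a fresh current row for each change. The columns must line up
# with the dimension's schema; surrogate keys are omitted here for brevity.
(
    changes
    .withColumn("valid_from", F.current_timestamp())
    .withColumn("valid_to", F.lit(None).cast("timestamp"))
    .withColumn("is_current", F.lit(True))
    .write.format("delta").mode("append").saveAsTable("gold.dim_investor")
)
```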
Data Governance & Standards
- Establish and enforce data modeling standards, naming conventions, metadata management, and data governance policies, building these foundations where they do not currently exist.
- Implement row-level and column-level security patterns using Unity Catalog for sensitive fund and investor data (see the sketch at the end of this subsection).
- Design and maintain data lineage tracking from source systems through bronze/silver/gold layers to final reports.
- Contribute to the development and maintenance of a comprehensive data dictionary and metadata repository.
- Operate within the enterprise compliance framework to ensure ethical data handling, regulatory adherence, and consistency across the enterprise.
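A minimal sketch of the Unity Catalog row-filter and column-mask patterns referenced in the security bullet above, issued from PySpark; all table, function, and group names are hypothetical, and it assumes a Unity Catalog-enabled workspace.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already defined as `spark` on Databricks

# Row filter: data stewards see every row; other users see only funds whose
# reader group they belong to. Names below are illustrative placeholders.
spark.sql("""
    CREATE OR REPLACE FUNCTION finance.gold.fund_row_filter(fund_id STRING)
    RETURN is_account_group_member('fund_data_stewards')
        OR is_account_group_member(concat('fund_', fund_id, '_readers'))
""")
spark.sql("""
    ALTER TABLE finance.gold.investor_positions
    SET ROW FILTER finance.gold.fund_row_filter ON (fund_id)
""")

# Column mask: redact tax identifiers for everyone outside the stewards group.
spark.sql("""
    CREATE OR REPLACE FUNCTION finance.gold.mask_tax_id(tax_id STRING)
    RETURN CASE WHEN is_account_group_member('fund_data_stewards')
                THEN tax_id ELSE '***-**-****' END
""")
spark.sql("""
    ALTER TABLE finance.gold.investor_positions
    ALTER COLUMN tax_id SET MASK finance.gold.mask_tax_id
""")
```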
Documentation & Knowledge Management
- Create and maintain detailed data model documentation, including data dictionaries, entity-relationship diagrams (ERDs), data flow diagrams, and data lineage.
- Document Lakehouse design patterns, medallion architecture implementations, and platform-specific best practices for team knowledge sharing.
- Contribute to data quality initiatives by identifying potential data quality issues at the modeling stage and collaborating on solutions.
EDUCATION/EXPERIENCE REQUIREMENTS: (including certification, licenses, etc.)
Required:
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field.
- 10+ years of dedicated experience as a Data Modeler or Data Architect.
- Extensive experience with various data modeling techniques: Dimensional Modeling (Kimball), 3NF (Inmon), Data Vault, and NoSQL modeling.
- Strong expertise in SQL and experience with advanced SQL concepts for data analysis and modeling validation.
- Proficiency with data modeling tools (e.g., ER/Studio, Erwin, DataGrip, SQL Developer, or similar).
Databricks Platform Requirements:
- Hands-on experience designing and implementing medallion architecture (bronze/silver/gold) in Databricks Lakehouse environments.
- Deep understanding of Delta Lake features: ACID transactions, time travel, schema evolution, Z-ordering, liquid clustering, and table maintenance.
- Experience with Unity Catalog for data governance, access control, and lineage tracking.
- Understanding of Spark optimization, partitioning strategies, and performance tuning for large-scale data processing.
- Familiarity with Databricks SQL Warehouses, Delta Live Tables, and data pipeline patterns.
- Proven experience with Databricks and Delta Lake technologies, including modeling data within a lakehouse environment (required).
Multi-Platform Experience:
- Solid understanding of the Azure cloud platform: Azure Data Lake Storage, Azure Databricks.
- Experience with data ingestion concepts (Kafka, Azure Event Hubs, CDC, APIs) and their impact on data structure and modeling.
Preferred:
- Prior experience within financial services, private equity, or alternative investments sectors.
- Experience building data architecture standards and practices from scratch in entrepreneurial environments.
- Understanding of investment data structures: fund hierarchies, investor allocations, NAV calculations, and capital calls/distributions.
- Experience supporting ML/AI use cases including feature engineering and data models for GenAI/RAG applications.
Desirable Certifications:
- Databricks Certified Data Engineer Associate or Professional.
- Microsoft Certified: Azure Data Engineer Associate.
- Microsoft Certified: Azure Enterprise Data Analyst Associate.
- TOGAF Certification (for enterprise architecture alignment).
ABOUT YOU:
The ideal candidate thrives in an entrepreneurial environment where mature processes and standards do not yet exist. You are energized by the opportunity to build data architecture foundations from scratch rather than inheriting established frameworks.
Key Competencies:
- Entrepreneurial Mindset: Comfortable building standards, processes, and architecture where none currently exist; thrives with ambiguity and takes initiative without detailed specifications.
- Business Partnership Excellence: Exceptional ability to collaborate with non-technical business stakeholders in Fund Accounting, FP&A, Investor Relations, Sales, and Investments teams—listening to understand problems before proposing solutions.
- Relationship-Focused Collaboration: Proven track record of building trust and influencing without authority across diverse teams; success depends on partnership, not hierarchy.
- Business Value Orientation: Ties technical decisions to business outcomes; prioritizes pragmatic, value-driven solutions over architectural perfection.
- Communication Excellence: Translates complex technical concepts into accessible terms for non-technical audiences; adapts communication style to stakeholder needs.
- Self-Direction & Initiative: Proactively identifies gaps, proposes solutions, and drives progress without waiting for detailed direction or approval on every decision.
- Continuous Learner: Stays current with emerging data modeling techniques, Databricks platform evolution, and database technologies; recommends new approaches where appropriate.
What Success Looks Like:
- Business teams across Fund Accounting, IR, and Investments view you as a trusted partner who understands their needs.
- Data models you design are adopted, performing well, and enabling business value—not just technically elegant.
- Standards and governance frameworks you establish become the foundation for CIM's data architecture practice.
- You effectively navigate ambiguity, making progress without needing complete requirements upfront.
BENEFITS:
- A variety of medical, dental, and vision benefit plans
- Health Savings Account with a generous employer contribution
- Company paid life and disability insurance
- 401(k) savings plan, with company match
- Comprehensive paid time off, including vacation days, 10 designated holidays, sick time, and bereavement leave
- Up to 16 hours of volunteer time off
- Up to 16 weeks of Paid Parental Leave
- Ongoing professional development programs
- Wellness program, including monthly and quarterly prizes
- And more!
CIM Group Los Angeles Office: 4700 Wilshire Blvd., Los Angeles, CA 90010, USA