
Analytics8

HQ: Chicago
Total Offices: 3
Total Employees: 200
Year Founded: 2002

Analytics8 Innovation & Technology Culture

Analytics8 Employee Perspectives

What’s your rule for fast, safe releases — and what KPI proves it works?

I always compare an agent to an intern when talking to a client about using AI in production. It can take a lot of work off your plate and even surprise you with how well it handles something complex, but you must validate everything before approving its output.

I use Visual Studio Code to review changes made by the agent directly inside edited files. The agent’s updates appear inline, so I can keep, change or discard each one. It’s like reviewing a pull request with each prompt. Validating one piece of work at a time keeps accuracy high.

Model Context Protocol tools add a layer of safety by giving agents clearly defined tools for running specific actions. This cuts down on unpredictable behavior. Sometimes, however, the agent skips the tool, and I need to adjust the prompt to ensure it uses the tool to complete the requested action.
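To illustrate why a tool layer constrains agent behavior, here is a minimal sketch in plain Python (not the actual MCP SDK; the registry, tool name, and handler below are hypothetical): every action the agent requests is routed through a registry of declared tools, and anything undeclared is rejected outright.

```python
from typing import Callable, Dict


class ToolRegistry:
    """Registry of the only actions an agent is allowed to run (illustrative)."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[..., str]] = {}

    def register(self, name: str, handler: Callable[..., str]) -> None:
        # Declare a tool up front, before the agent ever runs.
        self._tools[name] = handler

    def dispatch(self, name: str, **kwargs: str) -> str:
        # Reject any action that was not explicitly declared.
        if name not in self._tools:
            raise PermissionError(f"undeclared tool: {name}")
        return self._tools[name](**kwargs)


registry = ToolRegistry()
registry.register("run_query", lambda sql: f"ran: {sql}")

print(registry.dispatch("run_query", sql="SELECT 1"))  # declared tool: allowed
```

A call like `registry.dispatch("drop_table", table="orders")` raises `PermissionError`, which is the property the prompt adjustments above are protecting: the agent can only act through the declared surface.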

What proves it works is time to value. For large tasks, it’s far quicker for engineers to review migrated SQL that the agent creates and documents than if they built it manually. Additionally, since adding MCP tools to our workflow, I’ve seen a drop in rework and re-prompting. We’re able to maintain accelerated AI speed without sacrificing quality.


What standard or metric defines “quality” in your stack?

Quality in my stack means the agent understands my environment and builds accurate work without forcing me into repeated re-prompts or rework. Quality also means multiple engineers get consistent results when they ask the agent for help. Hallucinations and invented logic don’t meet the standard, so I actively reduce them.


Name one AI/automation that shipped recently and its impact on your team.

During a recent onsite hackathon focused on clearing our toughest backlog items, we built and deployed an MCP-powered agent inside Visual Studio Code. It automates one of the most time-consuming parts of a modernization project: analyzing legacy transformation logic and generating initial dbt models. 

The agent connects to Snowflake through managed MCP servers and to Collibra metadata through a custom MCP server we built. It pulls metadata from Collibra, interprets legacy logic, and then uses dbt and Snowflake servers to recreate that logic, declare sources, build staging layers, draft first-pass intermediate models, create tests, and generate documentation.
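To make the shape of that workflow concrete, here is a heavily simplified sketch; the function name, stubbed metadata, and generated artifacts are hypothetical stand-ins for illustration, not the agent's actual code:

```python
# Hedged sketch of the migration flow described above. Every name and
# value here is a hypothetical stand-in, not the team's real agent code.

def migrate_table(table: str) -> dict:
    # 1. Pull metadata (stubbed; the real agent reads it from Collibra).
    columns = ["id", "amount"]
    # 2. Recreate the legacy logic as a first-pass dbt staging model.
    staging_sql = (
        f"-- models/staging/stg_{table}.sql\n"
        f"select {', '.join(columns)} "
        f"from {{{{ source('legacy', '{table}') }}}}"
    )
    # 3. Draft tests and documentation alongside the model.
    tests = [f"not_null: {col}" for col in columns]
    docs = f"First-pass staging model for '{table}', generated from legacy logic."
    return {"staging_sql": staging_sql, "tests": tests, "docs": docs}


artifact = migrate_table("orders")
print(artifact["staging_sql"])
```

The point of the sketch is the output shape: each legacy table yields a draft staging model, a set of tests, and documentation, which an engineer then reviews rather than writes from scratch.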

The process used to take days of manual untangling. Now, with a few sharp prompts, the agent delivers solid first passes in minutes. In the hackathon alone, we completed what would normally be a two-week sprint in just a couple of days. The automation removed most of the repetitive translation work, produced surprisingly strong early versions of complex models, accelerated our client’s modernization timeline, and allowed them to focus on higher-value warehouse design and governance decisions instead of reviewing logic by hand.

Chris Domain, Managing Consultant

What is the unique story that you feel your company has with AI? If you were writing about it, what would the title of your blog be?

The blog title would be “Clean Data In, Real Results Out: Our Ground-Level Advantage in AI.” Our story with AI starts where we’ve always been strong — data quality. Everyone talks about model performance, but we focus on the part that’s often overlooked: clean, consistent and well-defined data. That’s what makes AI actually work.

We’ve helped clients across industries wrangle messy, siloed data into usable formats — whether it’s structured, semi-structured or unstructured — and those same skills are critical for AI success. The principles we’ve used for decades still apply; if the data isn’t clear and governed, the insights won’t be either. We’re not chasing shiny tools — we’re applying real data expertise to real AI use cases.


What are you most excited about in the field of AI right now?

I’m excited by how quickly AI is becoming usable. Until recently, if you wanted to build something with generative AI, you needed full-scale development — custom UIs, infrastructure and engineering support. Now, that’s changing. The frameworks are maturing and the vendors we already partner with are embedding AI into their tools in a way that lets us build fast, useful solutions for clients. Whether it’s applying RAG techniques to unstructured data or using semantic models to add context to structured data, we can now turn proprietary information into actionable insight — without reinventing the wheel. That shift is a game-changer.


AI is a constantly evolving field. Very few people coming into these roles have years of experience to pull from. Explain what continuous learning looks like on your team. How do you learn from one another and collaborate?

We treat AI like any evolving toolset — we stay grounded in core data principles while exploring what’s new. Everyone on the team keeps an eye on the latest model releases and vendor updates, but our real value comes from translating that into client impact. We test tools in private preview, share lessons through internal whitepapers and video walkthroughs, and collaborate to figure out where new frameworks can plug into existing analytics work. It’s not about chasing trends — it’s about asking, “What’s worth trying, and how do we apply it responsibly to real data problems?”

How does innovation show up in your company culture?

Our consultants work across a wide variety of client environments and challenges. That breadth of experience builds something valuable: the ability to recognize patterns and separate signals from noise when new technologies emerge.

Based on our experience, the best innovations come from practitioners solving real problems. When a consultant discovers a more effective approach on a project, we create space for them to share it with our broader team. That includes a bi-weekly internal series where consultants spotlight unique client and development work from the field, so practical ideas inform how others approach similar challenges.

We make a deliberate effort to foster an environment where curiosity and calculated experimentation are supported, and team members are empowered to contribute ideas openly. Innovation at Analytics8 isn’t a top-down initiative. It emerges from experienced people doing the work, testing ideas and talking about what they’re learning from their client work. That practitioner-driven approach is a key part of what makes us effective.


What’s one recent innovation that improved user or employee experience?

We’ve developed a Model Context Protocol and agentic framework that allows our consultants to securely connect to client data sources and orchestrate complex analytical workflows using AI.

In practice, this means a consultant can configure an intelligent assistant that understands the client’s data landscape. It can quickly query databases, analyze schemas and generate documentation — work that previously required hours of manual exploration. The result is that our teams spend less time on discovery and more time on higher-value activities: architecture decisions, stakeholder collaboration and solution design.

Our agentic framework also creates project efficiency without sacrificing quality. The question we now ask is: “Can we build this once and adapt it across multiple situations?” That mindset of reusable, configurable solutions lets us invest more in valuable data insights and less in infrastructure.


How do you balance experimentation with stability?

It’s a balance we think about deliberately. The primary filter we apply is scalability: Is this useful beyond a single engagement, or is it a one-off solution? Can other consultants contribute to it and build on it?

Experimentation is valuable, but stability requires real investment: either time to document and maintain something properly, or resources to build it the right way. We take that investment seriously. When something proves to be scalable, practical and accessible enough that other team members can contribute, that’s when we commit. We formalize the tooling, create shared ownership, and build it into how we work.

Not every experiment needs to become a platform. The ones that don’t scale still teach us something. But when an approach proves its value across multiple client situations, we make the investment to ensure it is durable and well-supported.

John Bemenderfer, Principal Consultant

Analytics8's Tech Stack

DATABASES: Access, AWS Redshift, BigQuery, Cassandra, Cloudera, Databricks, DB2, Hive, Microsoft SQL Server, MongoDB, MySQL, Oracle, PostgreSQL, SAP HANA, Snowflake, SQLite, Teradata

FRAMEWORKS: Amazon Web Services, ASP.NET, dbt, Fivetran, Hadoop, Microsoft Azure, Spark, TensorFlow

SERVICES: AWS (Amazon Web Services), Google Cloud, Microsoft Azure

LANGUAGES: Java, JavaScript, Python, R, SQL

LIBRARIES: jQuery

ANALYTICS: Alteryx, AWS, Databricks, dbt, Google Cloud, Looker, Microsoft Azure, Microsoft PowerBI, Qlik, Sisense, Sisu, Snowflake, Tableau

PROJECT MANAGEMENT: Aha!, Airtable, Asana, Basecamp, Float, Google Docs, Google Drive, JIRA, Podbean, Recruiterbox, Trakstar, Udemy

DESIGN: Canva, Illustrator, Photoshop

CRM: HubSpot, LinkedIn SalesNavigator, Microsoft Dynamics, Outreach, Salesforce

LEAD GEN: Marketo

CMS: WordPress

COLLABORATION: Microsoft Teams, Slack, Zoom