The Aspen Group (TAG) is one of the largest and most trusted retail healthcare business support organizations in the U.S., supporting over 23,000 healthcare professionals and team members at more than 1,150 locations across 48 states. Our five supported healthcare practices operate under the brands Aspen Dental, ClearChoice, WellNow, Chapter Aesthetic Studio, and Lovet. We’re committed to enabling healthcare professionals to focus on patient care while we handle the business operations that support them.
The People Analytics team delivers data-driven insights that shape workforce strategy across TAG’s five brands. We build the data pipelines, dashboards, and automated reports that help leaders make better decisions about our 23,000+ employees.
As an Analyst, People Analytics, you will build and maintain the data infrastructure that powers our analytics work, and use AI-assisted development tools to do it faster than a team this size normally could. You’ll extract data from source systems like Workday and Culture Amp, transform and load it into BigQuery, and ensure the datasets behind our dashboards and reports are accurate, fresh, and well-documented. You’ll also support analysis and reporting across the team.
This is a technical, hands-on role. On a typical day, you might use an AI coding assistant to build a new data pipeline in the morning, debug a data quality issue after lunch, and write a SQL query to answer an executive’s question before end of day. We work in an AI-assisted development workflow where agentic tools are part of the everyday toolkit - not an experiment, but how we ship.
This role is ideal for someone who is genuinely curious, likes building things that others depend on, and is excited to work at the intersection of data engineering and AI-augmented development.
Key Responsibilities
Data Pipeline Development & Maintenance
- Build and maintain Python-based ETL processes that extract data from Workday APIs and other source systems
- Load and transform data into BigQuery with appropriate schemas and structure
- Monitor pipeline runs and resolve failures or data quality issues
- Handle edge cases like API changes, schema drift, and missing data gracefully
Data Quality & Validation
- Implement validation checks to ensure accuracy between source systems and warehouse
- Investigate and resolve data discrepancies surfaced by reports or dashboard users
- Document data lineage, transformations, and known quality issues
- Build monitoring to track data freshness and pipeline health
Analysis & Reporting
- Write SQL queries and build datasets that power dashboards and executive reporting
- Support ad-hoc data requests from HR and business leaders
- Automate recurring reports and manual data processes
- Validate and QA analytical outputs before delivery
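For a sense of the SQL behind this kind of reporting, here is a small, self-contained example: a CTE plus a window function computing department headcount and share of total, run against an in-memory SQLite table for illustration. The table and column names are hypothetical, not an actual TAG schema, and production queries would run in BigQuery.

```python
import sqlite3

# A headcount-by-department query: CTE for the aggregation, then a
# window function (SUM(...) OVER ()) to express each row as a share of total.
QUERY = """
WITH dept_counts AS (
    SELECT department, COUNT(*) AS headcount
    FROM employees
    WHERE status = 'active'
    GROUP BY department
)
SELECT department,
       headcount,
       ROUND(100.0 * headcount / SUM(headcount) OVER (), 1) AS pct_of_total
FROM dept_counts
ORDER BY headcount DESC;
"""

def headcount_report(rows: list[tuple]) -> list[tuple]:
    """Load (id, department, status) rows into a scratch table and run the query."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (id TEXT, department TEXT, status TEXT)")
    conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", rows)
    result = conn.execute(QUERY).fetchall()
    conn.close()
    return result
```

The same shape of query (CTE, aggregate, window function) translates directly to BigQuery Standard SQL.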
Documentation & Collaboration
- Write clear documentation for data sources, schemas, and transformation logic
- Partner with team members to understand data requirements for new projects
- Contribute to team coding standards and code reviews
- Support compliance and audit requests with accurate data documentation
How We Work
Our team uses AI-assisted development tools as a core part of how we build. That means:
- Writing code with AI assistants. We use AI to draft pipelines, debug issues, and iterate on solutions. You’ll spend more time directing and reviewing code than typing every line from scratch.
- Automating the repetitive. If something can be automated - a report, a data check, a deployment step - we automate it. We look for people who instinctively think “how do I make this run itself?”
- Shipping over perfecting. We prefer working solutions delivered quickly over polished solutions delivered slowly. We iterate in the open and improve as we go.
You don’t need to already be an expert in these tools. You need to be the kind of person who picks them up quickly, experiments on your own, and is excited by the idea that a small team can punch well above its weight with the right approach.
Skills & Qualifications
Required
- 2-4 years of experience in analytics, data engineering, or a related technical role
- Strong SQL skills (complex queries, CTEs, window functions, optimization)
- Proficiency in Python for data processing (pandas, API integrations, scripting)
- Experience building or maintaining data pipelines or automated data processes
- Familiarity with cloud data warehouses (BigQuery, Snowflake, Redshift)
- Strong troubleshooting skills and attention to detail
- Ability to communicate technical work to non-technical stakeholders
- Bachelor’s degree in a quantitative or technical field, or equivalent experience
Preferred
- Experience with BigQuery specifically
- Familiarity with Workday or other HRIS data (employee records, job history, compensation)
- Experience with GCP services (Cloud Functions, Cloud Scheduler, Cloud Run)
- Background in HR, People Analytics, or workforce data
- Hands-on experience with AI-assisted development tools (Claude Code, GitHub Copilot, Cursor, or similar)
What We Offer
- Opportunity to build the data foundation for a 23,000+ employee organization
- A growing team investing in modern analytics practices and AI-assisted workflows
- Hybrid work environment with flexibility
- Comprehensive benefits including health, dental, vision, and 401(k) savings plan with match
- Paid time off and company holidays
- Professional development opportunities
About the Team
You’ll join a growing People Analytics team that values curiosity, reliability, and simplicity. We believe good data infrastructure should be predictable, well-documented, and easy to maintain. We use AI-assisted tools daily to move faster and take on more than our headcount would suggest — and we’re looking for someone who’s excited by that way of working. If you like building things that people depend on and you’re always looking for a better way to do it, you’ll fit right in.
*This role is onsite 4 days/week in our Chicago office (Fulton Market District).