How to Conduct Campus Climate Surveys That Create Inclusive Learning Environments
A comprehensive guide to designing campus climate surveys that measure belonging, assess inclusion across identities, evaluate institutional trust, and inform evidence-based DEI strategies in higher education.
Campus climate shapes everything — academic performance, mental health, retention, graduation rates, and the fundamental quality of the educational experience. When students, faculty, and staff feel they belong, they thrive. When they don't, the consequences ripple through every institutional metric.
Yet many institutions struggle to move beyond anecdotal evidence about their climate. They know something is wrong from incident reports and student complaints, but lack the systematic data to understand the scope, identify root causes, and measure whether their interventions are working.
A well-designed campus climate survey provides that systematic evidence. It transforms vague concerns into measurable dimensions, identifies disparities across identity groups, and creates accountability for change. This guide covers how to build one that generates genuine insight and drives meaningful action.
What Is Campus Climate and Why Survey It?
Campus climate refers to the current attitudes, behaviors, standards, and practices of employees and students at an institution. The Association of American Universities (AAU) Campus Climate Survey — conducted across 33 major research universities — established that climate directly affects educational outcomes, persistence, and well-being.
NASPA (Student Affairs Administrators in Higher Education) defines campus climate through multiple overlapping dimensions:
- Behavioral dimension: The actual experiences people have — discrimination, harassment, microaggressions, or conversely, welcoming interactions and inclusive practices
- Psychological dimension: Perceptions of belonging, safety, respect, and fairness
- Structural/compositional dimension: The demographic representation in the student body, faculty, staff, and leadership
- Historical dimension: The institution's legacy and its current relationship with that legacy
A comprehensive climate survey measures all four dimensions, recognizing that they interact with and reinforce each other.
Legal and Compliance Context
Title IX Climate Assessment
Title IX of the Education Amendments of 1972 prohibits sex-based discrimination in federally funded education programs. The 2020 Title IX regulations and subsequent guidance emphasize that institutions should proactively assess their climate around sexual harassment and gender-based discrimination.
A campus climate survey supports Title IX compliance by:
- Measuring the prevalence of sexual harassment and assault
- Assessing awareness of reporting procedures and support resources
- Evaluating perceptions of institutional response to reports
- Identifying locations, contexts, and populations at highest risk
The AAU Campus Climate Survey on Sexual Assault and Misconduct provides a validated instrument that many institutions adapt.
Clery Act Considerations
The Jeanne Clery Disclosure of Campus Security Policy and Campus Crime Statistics Act requires institutions to report campus crime statistics. Climate surveys supplement Clery data by capturing the unreported experiences that don't appear in crime statistics — research consistently shows that the vast majority of campus sexual assault and harassment goes unreported.
Core Survey Dimensions
1. Belonging and Inclusion
Belonging is the psychological experience of feeling accepted, valued, and included. Research by Gregory Walton and colleagues at Stanford demonstrates that belonging uncertainty — doubting whether one belongs — has significant negative effects on academic performance, especially for underrepresented students.
Survey questions:
- Scale (1-7): "How much do you feel you belong at this institution?" (Not at all to Very much)
- Scale (1-7): "I feel valued as an individual at this institution" (Strongly disagree to Strongly agree)
- Scale (1-5): "How comfortable are you being yourself on campus?" (Very uncomfortable to Very comfortable)
- Single choice: "Where do you feel the strongest sense of belonging on campus?" (Academic classes / Student organizations / Residence halls / Cultural centers / Athletic teams / Religious or spiritual groups / Off-campus communities / Nowhere on campus)
- Scale (1-7): "People of my background are respected and valued at this institution" (Strongly disagree to Strongly agree)
- Yes/No: "Have you ever considered leaving this institution because you felt you didn't belong?"
- Open-ended: "Describe a moment when you felt most included — or most excluded — at this institution."
2. Experiences with Discrimination and Bias
This dimension captures direct experiences of unfair treatment based on identity. The Diverse Learning Environments (DLE) Survey from UCLA's Higher Education Research Institute provides validated item banks for measuring discrimination experiences.
Survey questions:
- Yes/No: "In the past 12 months, have you personally experienced discrimination or unfair treatment at this institution based on any aspect of your identity?"
- Multiple choice: "If yes, which aspect(s) of your identity were involved? Select all that apply." (Race/ethnicity / Gender / Sexual orientation / Gender identity or expression / Religion / Disability / Socioeconomic background / National origin / Age / Political views / Immigration status / Veteran status / Other / Prefer not to answer)
- Single choice: "In what context did the discrimination primarily occur?" (In a classroom or academic setting / In campus housing / At a campus event / In interactions with staff or administration / In interactions with other students / In interactions with campus police or security / Online or on social media / Other)
- Single choice: "Who was primarily responsible?" (Faculty member / Other students / Staff or administrator / Campus police / Someone not affiliated with the institution / Prefer not to answer)
- Yes/No: "Did you report this experience to anyone at the institution?"
- Single choice: "If you did not report it, what was the primary reason?" (Didn't think it was serious enough / Didn't think anything would be done / Fear of retaliation / Didn't know how or where to report / Concerned about confidentiality / Didn't want to cause problems / Other)
- Open-ended: "If you are comfortable sharing, please describe the experience and its impact on you."
3. Microaggressions and Everyday Experiences
Derald Wing Sue's research on microaggressions at Columbia University defines them as brief, commonplace exchanges that communicate hostile, derogatory, or negative messages to members of marginalized groups. While individual microaggressions may seem minor, their cumulative effect is significant.
Survey questions:
- Scale (1-5): "How often have you experienced the following at this institution?" (Never / Rarely / Sometimes / Often / Very often)
  - "Being treated as a representative or spokesperson for your identity group"
  - "Having your comments or contributions ignored or dismissed in class"
  - "Being assumed to be less qualified or competent based on your identity"
  - "Hearing insensitive or disparaging remarks about your identity group"
  - "Being made to feel unwelcome in a campus space"
  - "Having your identity questioned or invalidated"
- Scale (1-7): "How much do these experiences affect your academic performance?" (Not at all to A great deal)
- Scale (1-7): "How much do these experiences affect your mental health and well-being?" (Not at all to A great deal)
- Open-ended: "Describe an everyday interaction on campus that communicated to you that you did or did not belong."
4. Institutional Trust and Response
Do members of the campus community trust the institution to respond effectively to climate concerns? This dimension is crucial because low institutional trust suppresses reporting and engagement with DEI initiatives.
Survey questions:
- Scale (1-7): "How much do you trust this institution to take reports of discrimination or harassment seriously?" (Not at all to Completely)
- Scale (1-7): "This institution's leadership is genuinely committed to diversity, equity, and inclusion" (Strongly disagree to Strongly agree)
- Scale (1-5): "How effective is this institution at addressing incidents of bias or discrimination when they occur?" (Very ineffective to Very effective)
- Single choice: "How would you characterize this institution's approach to diversity and inclusion?" (Genuinely committed and taking meaningful action / Well-intentioned but mostly performative / Doing the minimum required / Indifferent / Actively resistant)
- Scale (1-7): "If I reported a bias incident, I am confident the institution would respond fairly and effectively" (Strongly disagree to Strongly agree)
- Yes/No: "Are you aware of the institution's process for reporting bias incidents or discrimination?"
- Scale (1-5): "How well does this institution communicate about its DEI efforts and progress?" (Very poorly to Very well)
- Open-ended: "What would increase your trust in this institution's commitment to an inclusive campus climate?"
5. Classroom and Academic Climate
The classroom is where students spend the majority of their campus time, and classroom climate is a significant predictor of academic outcomes for all students.
Survey questions:
- Scale (1-7): "I feel comfortable expressing my views in class, even when they differ from others'" (Strongly disagree to Strongly agree)
- Scale (1-7): "Faculty members create an inclusive learning environment that respects diverse perspectives" (Strongly disagree to Strongly agree)
- Scale (1-5): "How often do your courses include diverse perspectives, authors, and examples?" (Never / Rarely / Sometimes / Often / Very often)
- Single choice: "Have you ever altered your academic behavior (changed courses, avoided speaking in class, switched majors) due to a hostile or unwelcoming climate?" (Yes, multiple times / Yes, once / No, but I considered it / No)
- Scale (1-7): "I believe faculty evaluate my work fairly, regardless of my identity" (Strongly disagree to Strongly agree)
- Open-ended: "Describe a classroom experience where the instructor either created a particularly inclusive environment or failed to do so."
6. Identity-Based Experiences
Different identity groups have different climate experiences. Design question modules for specific populations:
Racial and ethnic minority students:
- Scale (1-7): "How well does this institution support students of your racial/ethnic background?" (Not at all well to Extremely well)
- Yes/No: "Have you experienced racial profiling by campus police or security?"
- Scale (1-5): "How adequately is your racial/ethnic identity reflected in the curriculum?"
LGBTQ+ students:
- Scale (1-5): "How comfortable are you being open about your sexual orientation and/or gender identity on campus?" (Very uncomfortable to Very comfortable)
- Yes/No: "Are your chosen name and pronouns consistently respected by faculty and staff?"
- Scale (1-7): "This institution provides adequate support and resources for LGBTQ+ students" (Strongly disagree to Strongly agree)
Students with disabilities:
- Scale (1-5): "How effective is the accommodations process at this institution?" (Very ineffective to Very effective)
- Yes/No: "Have you ever been unable to fully participate in an academic or campus activity due to accessibility barriers?"
- Scale (1-7): "Faculty are responsive and supportive when I request accommodations" (Strongly disagree to Strongly agree)
First-generation and low-income students:
- Scale (1-7): "I feel comfortable discussing my socioeconomic background on campus" (Strongly disagree to Strongly agree)
- Yes/No: "Have you been unable to participate in a campus activity or academic opportunity due to cost?"
- Scale (1-5): "How well does this institution support students from low-income backgrounds?" (Not at all well to Extremely well)
International students:
- Scale (1-5): "How welcoming is the campus environment for international students?" (Very unwelcoming to Very welcoming)
- Multiple choice: "What challenges have you faced as an international student? Select all that apply." (Language barriers / Cultural adjustment / Discrimination or bias / Immigration and visa stress / Social isolation / Financial challenges / Academic adjustment / None of these)
Survey Design Best Practices for Campus Climate
Demographic Data Collection
Demographic questions are essential for identifying disparities but must be handled with care. Follow Williams Institute and NASPA guidelines:
- Make all demographic questions optional
- Explain why demographic data is being collected and how anonymity will be protected
- Use inclusive categories that reflect current best practices for gender identity, sexual orientation, race/ethnicity, and disability status
- Allow "select all that apply" for racial/ethnic identity
- Include "prefer not to answer" and "prefer to self-describe" options
- Never report results for demographic groups smaller than a threshold (typically 10-15 respondents) to protect anonymity
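The small-cell suppression rule above can be automated in the analysis pipeline so that under-threshold groups are never reported by accident. The sketch below is a minimal illustration in pandas; the column names and sample data are hypothetical, and the threshold should match your institution's policy.

```python
import pandas as pd

# Hypothetical response data: one row per respondent, with an optional
# demographic field and a 1-7 belonging score.
responses = pd.DataFrame({
    "race_ethnicity": ["A", "A", "B", "B", "B", "C"] * 4,
    "belonging": [5, 6, 4, 3, 4, 6] * 4,
})

MIN_CELL = 10  # suppression threshold; many institutions use 10-15

def suppressed_means(df, group_col, value_col, min_n=MIN_CELL):
    """Mean scores by group, suppressing cells below the anonymity threshold."""
    stats = df.groupby(group_col)[value_col].agg(["count", "mean"])
    # Blank out the mean for any group too small to report safely.
    stats.loc[stats["count"] < min_n, "mean"] = None
    stats["reported"] = stats["count"] >= min_n
    return stats

print(suppressed_means(responses, "race_ethnicity", "belonging"))
```

In this toy data, groups A and C fall below the threshold and their means are suppressed, while group B is large enough to report. Applying the rule in code rather than manually makes it consistent across every table in the final report.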
Trauma-Informed Survey Design
Climate surveys ask about potentially traumatic experiences. The Substance Abuse and Mental Health Services Administration (SAMHSA) trauma-informed care principles should guide your design:
- Provide advance notice: Tell participants what topics will be covered before they begin
- Offer opt-out options: Allow participants to skip any question without penalty
- Provide resources: Include links to counseling services, crisis hotlines, and support resources throughout the survey (not just at the end)
- Use careful language: Avoid graphic descriptions in question stems. Focus on impact rather than incident details
- Control and choice: Let participants control the pace and take breaks
Ensuring Representative Participation
Campus climate surveys are only valid if they represent the campus community. Strategies for representative participation:
- Institutional endorsement: Have the president or chancellor publicly support participation
- Targeted outreach: Use identity-based student organizations, cultural centers, and affinity groups to encourage participation from underrepresented groups
- Multiple access points: Online, mobile, and in-person options
- Adequate time: Keep the survey open for 3-4 weeks
- Incentives: Consider entry into a raffle or small gift cards, ensuring incentives don't coerce participation
- Reminders: Send 2-3 reminders, personalized when possible
- Response rate targets: Aim for 30%+ overall, with adequate representation from each demographic group
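Monitoring response rates by group during the field period lets you trigger targeted outreach before the survey closes, rather than discovering gaps afterward. A minimal sketch, assuming you have population counts from institutional records; the group names and figures are illustrative only.

```python
# Hypothetical population counts (from institutional records) and
# completed-response counts partway through the field period.
population = {"First-gen": 1200, "Continuing-gen": 3800}
completed = {"First-gen": 260, "Continuing-gen": 1450}

TARGET = 0.30  # the 30% target noted above

# Flag any group tracking below target so outreach can be focused there.
for group, n_pop in population.items():
    rate = completed.get(group, 0) / n_pop
    flag = "" if rate >= TARGET else "  <- needs targeted outreach"
    print(f"{group}: {rate:.0%}{flag}")
```

Here first-generation students are tracking at about 22%, below target, while continuing-generation students are above it; that gap is the cue to activate the identity-based organizations and affinity groups mentioned above.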
Analyzing Campus Climate Data
Disaggregated Analysis
The most important analysis is disaggregated by identity. Overall averages can mask significant disparities. For example, an overall belonging score of 5.2/7.0 might represent 5.8 for white students and 3.9 for Black students. The overall score suggests a mildly positive climate; the disaggregated data reveals a crisis.
Always analyze by:
- Race/ethnicity
- Gender identity
- Sexual orientation
- Disability status
- First-generation status
- Socioeconomic background
- Student vs. faculty vs. staff
- Undergraduate vs. graduate
Intersectional Analysis
Identities don't exist in isolation. Kimberlé Crenshaw's intersectionality framework demonstrates that people with multiple marginalized identities may have experiences that differ from any single identity group. Where sample sizes allow, analyze intersections (e.g., Black women, LGBTQ+ students of color, first-generation students with disabilities).
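Both the single-identity and intersectional breakdowns reduce to grouped means with a sample-size filter. The sketch below shows one way to compute them in pandas; the column names and scores are hypothetical, and real data would carry many more identity fields.

```python
import pandas as pd

# Illustrative data: each row is a respondent with two identity fields
# and a 1-7 belonging score. Column names are hypothetical.
df = pd.DataFrame({
    "race": ["Black", "Black", "White", "White"] * 12,
    "gender": ["Woman", "Man"] * 24,
    "belonging": [3.5, 4.0, 5.5, 6.0] * 12,
})

MIN_CELL = 10  # same reporting threshold used elsewhere in the analysis

def disaggregate(df, keys, value="belonging"):
    """Group means with counts, dropping cells below the reporting threshold."""
    g = df.groupby(keys)[value].agg(n="count", mean="mean").reset_index()
    return g[g["n"] >= MIN_CELL]

overall = df["belonging"].mean()                      # can mask disparities
by_race = disaggregate(df, ["race"])                  # single identity
by_intersection = disaggregate(df, ["race", "gender"])  # intersectional
```

In this toy data the overall mean is 4.75, which hides a 3.75 vs. 5.75 gap by race, echoing the averaging problem described above. Passing more keys to `disaggregate` yields intersectional cells, and the same threshold filter drops any intersection too small to report.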
Benchmarking
Compare your results against:
- HERI/UCLA CIRP Freshman Survey and College Senior Survey national norms
- AAU Campus Climate Survey benchmarks for peer institutions
- National Survey of Student Engagement (NSSE) inclusion indicators
- Your own longitudinal data — tracking change over time is more actionable than static comparison
From Data to Action: Closing the Loop
The biggest risk in campus climate surveying is generating data that sits in a report and changes nothing. This erodes trust and makes future survey participation unlikely.
Action framework:
- Transparent reporting: Share results publicly with the entire campus community, including uncomfortable findings
- Disaggregated findings: Show results by identity group so disparities are visible
- Community forums: Host open forums where community members can discuss findings and propose solutions
- Priority setting: Identify 3-5 specific, measurable action items based on findings
- Accountability structure: Assign responsibility, timelines, and resources to each action item
- Progress reporting: Provide regular updates on implementation
- Re-survey: Conduct follow-up assessment to measure change (typically every 2-3 years for comprehensive surveys, annually for targeted pulse surveys)
How Koji Transforms Campus Climate Research
Traditional campus climate surveys face critical limitations: low response rates from marginalized populations (the very groups whose experiences matter most), superficial quantitative data that reveals what but not why, and survey fatigue that undermines longitudinal tracking.
Koji addresses each of these limitations:
- Conversational depth on sensitive topics: Climate surveys ask about discrimination, harassment, and belonging — topics where participants need space to share their experiences fully. Koji's AI interviewer creates a private, non-judgmental conversational space where participants share more openly than they would on a form. The structured questions capture the quantitative data (scales, frequencies, yes/no) while the AI's natural follow-ups capture the stories and context that give that data meaning.
- Higher participation from underrepresented groups: Students from marginalized communities often have the lowest response rates on traditional surveys — due to survey fatigue, distrust of institutional data collection, or difficulty expressing complex identity-based experiences in checkbox formats. Koji's conversational approach is more accessible, more engaging, and feels less institutional.
- Multilingual accessibility: International students and non-native English speakers can participate in their preferred language, ensuring their experiences are captured with full nuance rather than filtered through language barriers.
- Voice-based participation: For students with visual impairments, learning disabilities that affect reading, or simply those who express themselves better verbally, Koji's voice interview option removes barriers that written surveys create.
- Trauma-informed by design: Koji's conversational format naturally allows participants to control pacing, skip topics they're not comfortable with, and share at whatever depth feels right. The AI responds empathetically without being voyeuristic — it doesn't push for graphic details but does create space for full expression.
- Thematic analysis at scale: With hundreds or thousands of qualitative responses, manual analysis is prohibitive. Koji identifies themes, patterns, and disparities across identity groups automatically, while preserving individual voices for narrative reporting.
- Anonymity with nuance: Participants can share detailed identity-based experiences without fear of identification, because the AI interview format doesn't require linking to institutional records, and aggregate analysis protects individual anonymity.
Implementation Timeline
| Phase | Timeline | Activities |
|---|---|---|
| Planning | 3-4 months before launch | Form advisory committee, review institutional data, select/adapt instrument, secure IRB approval |
| Design | 2-3 months before launch | Draft survey in Koji, pilot test with representative students, refine questions and flow |
| Communication | 1 month before launch | Campus-wide announcement; secure endorsements from student government, identity organizations, and academic leaders |
| Administration | 3-4 weeks | Launch survey, send reminders, monitor response rates by demographic group |
| Analysis | 1-2 months after close | Disaggregated analysis, intersectional analysis, qualitative theme identification |
| Reporting | 1 month after analysis | Public report, community forums, media summary |
| Action planning | 1-2 months after reporting | Priority setting, resource allocation, accountability assignments |
| Implementation | Ongoing | Execute action items with regular progress reporting |
| Follow-up | 12-18 months | Targeted pulse surveys on specific action areas |
A campus where every person — regardless of race, gender, sexual orientation, disability status, socioeconomic background, or any other identity — feels they genuinely belong is not just a moral aspiration. It's a measurable outcome. Your campus climate survey is the instrument that makes it measurable, and the catalyst that makes it achievable.
Related Articles
How to Build an Employee Engagement Survey That People Actually Answer Honestly
The definitive guide to employee engagement surveys that surface real sentiment. Learn why traditional surveys fail, how conversational AI eliminates social desirability bias, and how to design studies that drive meaningful organizational change.
How to Build DEI Surveys That Drive Meaningful Change
The complete guide to Diversity, Equity, and Inclusion surveys. Learn how to measure belonging, identify systemic barriers, and create safe spaces for honest feedback using conversational AI that reduces social desirability bias.
How to Build Course Evaluation Surveys That Actually Improve Teaching
The complete guide to course evaluations for universities and training programs. Learn how conversational AI produces 2x response rates and 10x richer feedback compared to traditional end-of-course surveys.
How to Measure Student Satisfaction and Improve Institutional Outcomes
A comprehensive guide to designing student satisfaction surveys that capture meaningful feedback across academic, social, and administrative dimensions to drive institutional improvement.
How to Build an NPS Survey That Actually Drives Action
A comprehensive guide to designing, deploying, and acting on Net Promoter Score surveys. Learn the best practices that separate vanity metrics from actionable insights, and how Koji's conversational approach unlocks the "why" behind every score.
How to Build a CSAT Survey That Improves Customer Satisfaction
The complete guide to Customer Satisfaction Score surveys. Learn when to measure CSAT vs NPS, how to design questions that reveal improvement opportunities, and how Koji turns satisfaction data into actionable insights.