Survey Logic and Display Conditions - Complete Guide
What are Display Conditions?
Display conditions allow you to show or hide questions based on how participants answer previous questions. Instead of showing every question to every participant, you can create more relevant, personalized surveys that adapt to each person's responses.
Benefits:
Shorter, more relevant surveys for each participant
Higher completion rates
Better quality data
Improved participant experience
Reduced survey fatigue
When to use display conditions:
Follow-up questions for negative ratings (e.g., ask "Why?" only if dissatisfied)
Role-specific questions (show manager questions only to managers)
Department-specific sections
Demographic-based questions
Optional deep-dive questions based on interest
How Display Conditions Work
Basic Concept
Simple example:
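For example (question wording illustrative): if a participant answers "Dissatisfied" to "How satisfied are you with your role?", a follow-up asking "What would improve your experience?" appears; participants who answer "Satisfied" never see it.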
The Participant Experience
Without conditions:
Every participant sees every question
Many questions feel irrelevant
Survey feels longer than necessary
With conditions:
Participants only see relevant questions
Survey adapts to their situation
Feels personalized and efficient
Setting Up Display Conditions
Step 1: Identify Which Questions Need Conditions
Good candidates for conditions:
Follow-up questions asking "Why?" or "What would improve...?"
Questions only relevant to specific roles (managers, specific departments)
Deep-dive questions for specific response patterns
Optional questions based on previous interest/experience
Questions that should NOT have conditions:
Core engagement or satisfaction items needed from everyone
Demographic questions used for reporting
Critical data points required for all participants
Questions at the start of the survey (nothing to condition on yet)
Step 2: Access the Conditions Panel
To add a condition to a question:
Select the question in the designer
Look at the right sidebar
Find the "Conditions" section (orange icon)
Click "Make the question visible if" with the pencil icon
💡 For easy reference, the top of the Conditions panel shows the text of the question you're adding conditions to.
Step 3: Build Your Condition
The condition builder has three parts (a sketch of how they combine into a single rule follows this list):
1. If [Source Question]
Click "Select..." to choose which previous question to reference
Dropdown shows all questions that appear before the current one
You can only reference questions that participants will already have had the chance to answer
2. [Condition Type]
Equals: Show if answer exactly matches a specific value
Does not equal: Show if answer is anything except a specific value
Any of: Show if answer matches any item in a list (useful for multiple options)
Empty: Show if question was left blank/not answered
Not empty: Show if question has any answer
Greater than / Less than: For numeric comparisons
Greater than or equal to / Less than or equal to: For numeric ranges
3. [Value]
Select the specific answer(s) that trigger the condition
For "Equals" and "Does not equal": Choose one option
For "Any of": Select multiple options
For numeric conditions: Enter the number to compare against
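Conceptually, these three parts combine into a single rule that is checked against the participant's earlier answers. The sketch below illustrates that logic in Python; the class, field names, and operator strings are hypothetical, not the platform's actual schema or code.

```python
# Illustrative sketch only: these names are hypothetical, not the platform's schema.
from dataclasses import dataclass
from typing import Any


@dataclass
class Condition:
    source_question: str  # 1. If [Source Question]: which earlier question to check
    operator: str         # 2. [Condition Type]: "equals", "any_of", "empty", ...
    value: Any = None     # 3. [Value]: the answer(s) that trigger the condition


def is_met(cond: Condition, answers: dict) -> bool:
    """Return True if the participant's answers satisfy the condition."""
    answer = answers.get(cond.source_question)  # None if not (yet) answered
    if cond.operator == "equals":
        return answer == cond.value
    if cond.operator == "does_not_equal":
        return answer is not None and answer != cond.value
    if cond.operator == "any_of":
        return answer in cond.value
    if cond.operator == "empty":
        return answer is None
    if cond.operator == "not_empty":
        return answer is not None
    if cond.operator == "greater_than":
        return answer is not None and answer > cond.value
    if cond.operator == "less_than":
        return answer is not None and answer < cond.value
    raise ValueError(f"unknown operator: {cond.operator}")


# Show the follow-up only if the satisfaction question was answered "Dissatisfied".
condition = Condition("satisfaction", "equals", "Dissatisfied")
print(is_met(condition, {"satisfaction": "Dissatisfied"}))  # True: follow-up is shown
print(is_met(condition, {"satisfaction": "Satisfied"}))     # False: follow-up stays hidden
```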
Step 4: Apply the Condition
After configuring your condition, click "Apply"
The condition is now active
You'll see a summary in the Conditions panel
Edit icon allows you to modify the condition anytime
Common Condition Patterns
Pattern 1: Show Follow-up for Negative Ratings
Use case: Ask for details only when someone rates something poorly
Setup:
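Illustrative setup (question wording is an example): make "What would most improve your experience?" visible if [How satisfied are you with your current role?] Any of [Dissatisfied, Very dissatisfied].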
Why this works:
Satisfied people don't need to answer improvement questions
Captures specific feedback from those who need it
Shorter survey for satisfied respondents
Pattern 2: Role-Based Questions
Use case: Show different questions to managers vs. individual contributors
Setup:
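Illustrative setup: make the manager questions visible if [What is your role?] Equals [People manager]; individual contributors skip them automatically because the condition isn't met.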
Why this works:
Each participant only sees relevant questions
Maintains survey focus for each audience
Prevents frustration from irrelevant questions
Pattern 3: Department-Specific Sections
Use case: Ask department-specific questions only to relevant employees
Setup:
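Illustrative setup: make the Engineering questions visible if [Which department do you work in?] Equals [Engineering], with a separate condition (or page) for each department's section.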
Why this works:
Tailored questions for each department
Doesn't overwhelm participants with irrelevant sections
Better quality responses from targeted questions
Pattern 4: Interest-Based Deep Dives
Use case: Offer optional detailed questions based on indicated interest
Setup:
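Illustrative setup: make the optional deep-dive questions visible if [Would you like to share more detailed feedback on career development?] Equals [Yes].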
Why this works:
Respects participant time and interest
Gathers rich data from engaged respondents
Keeps survey concise for those not interested
Pattern 5: Experience-Based Questions
Use case: Ask questions relevant to tenure or experience level
Setup:
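Illustrative setup: make the onboarding questions visible if [How long have you worked here?] Any of [Less than 6 months, 6 to 12 months]; longer-tenured employees skip them.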
Why this works:
Questions match participant's stage in employee lifecycle
Avoids asking about distant past experiences
Focuses on currently relevant topics
Multiple Conditions
Adding Multiple Conditions to One Question
You can add multiple conditions to make questions show only when several criteria are met.
Click "Add Condition" to add additional rules:
Each condition is evaluated independently
Question shows when ANY condition is true (OR logic)
All conditions appear in the Conditions panel
Example with multiple conditions:
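A minimal sketch of that behaviour (hypothetical question names and answers, not the platform's code):

```python
# The question becomes visible if ANY of its conditions is met (OR logic).
answers = {"overall_satisfaction": "Satisfied", "manager_rating": "Dissatisfied"}

condition_results = [
    answers.get("overall_satisfaction") == "Dissatisfied",  # condition 1: not met
    answers.get("manager_rating") == "Dissatisfied",        # condition 2: met
]

print(any(condition_results))  # True: one matching condition is enough to show the question
# AND logic would be all(condition_results), False here; this builder does not
# combine conditions that way (see "Multiple conditions aren't working together" below).
```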
When to Use Multiple Conditions
Good uses:
Catching multiple types of negative responses
Including multiple demographic groups
Several pathways to the same follow-up question
Example scenarios:
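For instance (illustrative wording): show "What would improve your experience?" if overall satisfaction is "Dissatisfied" OR if workload is rated "Poor"; or show a shared follow-up to anyone whose department equals "Finance" OR "HR".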
Page-Level Display Conditions
Hiding Entire Pages
Instead of adding conditions to individual questions, you can hide entire pages based on responses.
When to use page-level conditions:
Multiple related questions for a specific segment (5+ questions)
Cleaner than conditioning each question individually
Entire section only relevant to one group
How to set page-level conditions:
Click on the page in the left sidebar (not an individual question)
Find the Conditions section in the right panel
Add conditions the same way as question-level
Example:
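Illustrative example: a "Manager Feedback" page containing eight leadership questions, with a single page-level condition, make the page visible if [What is your role?] Equals [People manager]. Non-managers never see the page.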
Benefits of page-level conditions:
Easier to manage than 10 separate question conditions
Clearer intent (this whole section is for managers)
Easier to test (one condition to verify)
Better participant experience (page doesn't appear at all, not even in page count)
Condition Types Reference
Equals
When to use: The most common condition type - show the question when the answer exactly matches one specific option
Example:
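Illustrative example: show "What would help you manage your team more effectively?" if [Do you manage other people?] Equals [Yes].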
Best for: Radio buttons, dropdowns, single-select questions
Does Not Equal
When to use: Show question when answer is anything EXCEPT a specific option
Example:
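Illustrative example: show "What would make you more likely to recommend us as an employer?" if [How likely are you to recommend us as an employer?] Does not equal [Very likely].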
Best for: Excluding one specific option while including all others
Any Of
When to use: Show question when answer matches ANY option from a list (multiple possibilities)
Example:
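Illustrative example: show a follow-up if [How satisfied are you with internal communication?] Any of [Dissatisfied, Very dissatisfied].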
Best for:
Multiple options that should trigger the same follow-up
Grouping similar responses (all negative ratings, all tech departments)
Empty / Not Empty
When to use: Rarely used - typically for optional questions
Empty: Show if question was not answered at all
Not empty: Show if question has any answer
Best for:
Prompting participants who skipped important questions
Follow-ups only if someone provided initial information
Greater Than / Less Than
When to use: For numeric comparisons (rare in engagement surveys)
Example:
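Illustrative example: show "What support do you need as a people leader?" if [How many people report directly to you?] Greater than [5].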
Best for:
Tenure-based questions
Team size-based questions
Score threshold questions
Note: Most engagement surveys use text scales, not numeric, so these conditions are less common.
Testing Display Conditions
Preview Testing is Critical
Display conditions must be tested thoroughly before launch: a broken condition means participants either see irrelevant questions or miss important ones.
To test your conditions:
Click Preview tab
Complete the survey multiple times with different answer patterns
Verify questions appear and hide correctly
Test every condition you created
Creating Test Scenarios
Before testing, document what should happen:
Example test plan:
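An illustrative test plan (question wording hypothetical):
Answer "Dissatisfied" to the satisfaction question: the "Why?" follow-up should appear
Answer "Satisfied": the follow-up should stay hidden
Select "People manager" as your role: the Manager Feedback page should appear
Select "Individual contributor": the Manager Feedback page should be skipped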
What to Verify During Testing
For each condition:
✅ Question appears when it should
✅ Question is hidden when it should be
✅ Condition triggers on the correct answer(s)
✅ Multiple conditions work together correctly
✅ Page-level conditions hide entire sections
Common test cases:
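Trigger answer selected: the conditional question appears
Non-trigger answer selected: the conditional question stays hidden
Source question skipped (if optional): confirm what happens to the conditional question
Trigger answer changed via the back button: conditions re-evaluate as expected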
Testing Tips
Test extreme cases:
All positive responses (shortest possible survey)
All negative responses (longest possible survey)
Mixed responses (typical participant experience)
Test multiple times:
First pass: Follow your own expected path
Second pass: Choose opposite answers
Third pass: Mix it up randomly
Ask a colleague to test without seeing your test plan
Document results:
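Note which answer paths you tested, what appeared or stayed hidden, and anything that needs fixing, so you can retest quickly after making changes.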
Troubleshooting Common Issues
"Question isn't showing when it should"
Checklist:
☐ Is there a condition on the question? (Check Conditions panel)
☐ Is the condition referencing the correct source question?
☐ Is the condition set to the correct answer value?
☐ Did I test with the exact answer that should trigger it?
☐ Is the page containing the question also hidden by a page-level condition?
☐ Is the source question appearing and being answered?
Debug process:
Remove the condition temporarily
Verify question appears without condition
Re-add condition step by step
Test after each change
Example fix:
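For instance, a condition set to the value "Dissatisfied" will never trigger if the source question's option actually reads "Somewhat dissatisfied"; copying the exact option text into the condition value fixes it.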
"Question is showing when it shouldn't"
Checklist:
☐ Is there a condition on the question at all?
☐ Is the condition using "equals" when it should use "does not equal"?
☐ Is there a second condition that's triggering it?
☐ Is the condition value typed exactly as it appears in the source question?
Common mistakes:
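Using "Equals" where "Does not equal" was intended
A second, forgotten condition triggering the question (remember the OR logic)
A condition value that doesn't exactly match the source question's option text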
"Multiple conditions aren't working together"
Remember: Multiple conditions use OR logic
Question shows if ANY condition is true
Not ALL conditions need to be true
Example:
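For example, with Condition 1 = [What is your role?] Equals [Manager] and Condition 2 = [Department] Equals [Engineering], the question shows for all managers and for everyone in Engineering, not only for engineering managers.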
If you need AND logic (all must be true):
This requires more complex setup
Contact support for assistance
Often better to restructure survey to avoid needing AND logic
"Condition is set correctly but still not working"
Check these details:
Exact text matching (see the snippet after these examples):
"Yes" ≠ "yes" ≠ "YES"
Extra spaces matter: "Satisfied" ≠ "Satisfied "
Special characters matter: "5+" ≠ "5"
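A quick illustration of why such comparisons fail (plain string comparisons, not the platform's code):

```python
print("Yes" == "yes")               # False: capitalization differs
print("Satisfied" == "Satisfied ")  # False: trailing space
print("5+" == "5")                  # False: special characters count
```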
Source question accessibility:
Is source question appearing in the survey?
Can source question be answered (not hidden itself)?
Is source question required (if not, might be skipped)?
Timing:
Condition references question that comes before, not after
Can't reference a question on a later page
Solution steps:
Copy the exact text from the source question options
Paste into condition value (ensures exact match)
Test in preview mode
Verify source question was answered before conditional question
"Page condition isn't hiding the page"
Check:
☐ Is condition on the page itself (not questions within)?
☐ Is condition set correctly?
☐ Did you test with the specific answer that should hide it?
Verify:
Go to page settings (click page in left sidebar)
Look at Conditions panel
Confirm condition exists and is accurate
Best Practices for Display Conditions
Keep It Simple
Do's:
✅ Use conditions for clear, obvious scenarios
✅ Limit to 1-2 levels of conditional logic
✅ Use page-level conditions for groups of related questions
✅ Document your logic for future reference
Don'ts:
❌ Create complex nested conditions (if A, then show B, which in turn controls whether C appears)
❌ Use conditions on every single question
❌ Make conditions dependent on other conditional questions
❌ Create circular dependencies
Example of too complex:
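For instance: Q8 shows only if Q5 equals "Yes", Q12 shows only if Q8 equals "Dissatisfied", and Q15 shows only if Q12 is not empty. Three levels of nesting like this are hard to test and easy to break.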
Design for All Paths
Every participant must be able to complete the survey:
Don't accidentally lock participants out with broken conditions
Ensure all paths reach the completion page
Test that required questions are accessible on all paths
Example problem:
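For example, if a question you need answered by everyone sits behind a condition (or on a conditionally hidden page), participants who never trigger it can't provide that data; keep questions required from all participants unconditional.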
Use Clear Trigger Questions
Good trigger questions:
Clear yes/no answers
Distinct role categories
Obvious rating scales
Single selection (not multi-select)
Poor trigger questions:
Open-ended text
Optional questions (might be skipped)
Multi-select checkboxes (it's ambiguous which selections should trigger the condition)
Questions later in the survey (creates complex dependencies)
Think About the Participant Experience
Consider survey flow:
Will participants understand why they're seeing certain questions?
Does the survey feel personalized, or does it seem to jump around at random?
Are sections logically organized even with conditions?
Example of good flow:
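For example: core engagement questions for everyone, then a manager-only section, then a department-specific section, then closing questions for everyone. Each participant still experiences a coherent beginning, middle, and end.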
Document Your Logic
Create a logic map for your team:
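An illustrative logic map:
Q10 "What would improve your experience?": shows if Q9 Equals "Dissatisfied"
Page 3 "Manager Feedback": shows if Q2 Equals "People manager"
All other questions: shown to everyone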
Why document:
Easier to troubleshoot issues
Helps when editing survey next year
Onboards new team members
Explains participant experience to stakeholders
Frequently Asked Questions
How many conditions can I add to one question?
Technical limit: No hard limit
Practical recommendation: 3-5 conditions maximum per question
Why limit?
More conditions = more complex testing
Harder to understand which answers trigger which questions
Higher chance of errors
If you need many conditions: Consider if you're using the right approach. Multiple conditions often indicate the need for restructuring.
Can I reference questions from earlier pages?
Yes, you can reference any question that appears before the conditional question in the survey.
Works: Q5 on Page 1 can condition Q20 on Page 3
Doesn't work: Q20 on Page 3 can't condition Q5 on Page 1 (you can't reference a future answer)
What happens if the source question is skipped?
If the source question has a condition and isn't shown (sketched below):
The condition evaluates as if the question wasn't answered
"Equals" conditions won't trigger (no answer to match)
"Empty" conditions WILL trigger
Best practice: Base conditions only on stable, always-visible questions.
Do conditions work with matrix questions?
Yes, you can:
Reference a matrix question in a condition
BUT you reference the entire matrix, not individual rows
Example:
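For example, you can show a follow-up if the matrix question "Rate the following aspects of your role" is Not empty, but you can't show it only when the "Workload" row specifically was rated "Dissatisfied".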
Workaround: Break matrix into separate questions if you need row-level conditioning.
Can participants go back and change answers?
Yes, participants can use the browser back button.
What happens to conditions:
Changed answers re-evaluate conditions immediately
Questions may appear or disappear based on new answers
This is expected behavior
Best practice: Test this scenario - change a trigger answer and verify conditions update correctly.
How do conditions affect survey length estimates?
Different participants see different lengths:
Someone who triggers all conditions sees longer survey
Someone who skips conditional sections sees shorter survey
When communicating survey length:
Estimate based on typical participant (not maximum)
Consider saying "approximately 10-15 minutes" to account for variation
Test both short and long paths to give accurate range
Should I tell participants about conditional logic?
Generally no, but you can if it helps:
"This survey adapts to your responses, so you'll only see relevant questions"
"Some questions only apply to certain roles"
When NOT to mention:
Follow-up questions on negative ratings (obvious)
Department-specific sections (self-explanatory)
Most role-based logic (feels natural)
Can I add conditions after the survey is live?
Yes, but be cautious:
✅ Adding conditions is generally safe (only future participants are affected)
⚠️ Changing existing conditions may affect data consistency
❌ Don't add conditions to questions that already have responses
Best practice: Get conditions right during testing, avoid changes after launch.
How do I remove a condition?
To remove a condition:
Select the question
Go to Conditions panel
Click edit (pencil icon)
Delete the condition
Apply changes
Or completely remove all conditions:
Conditions panel shows all active conditions
Remove them one by one
What's the impact on reporting?
Important consideration:
Conditional questions have varying response counts
Q10 (shown to all): 500 responses
Q11 (shown to managers only): 150 responses
Q12 (shown if dissatisfied): 75 responses
When analyzing:
Note which questions were conditional
Report percentages based on who saw the question, not total participants
Consider adding explanatory notes in reports
Example report note:
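"Q11 was shown only to participants who selected Manager in Q2 (n = 150); percentages are based on those 150 respondents, not on all 500 participants."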