Analysis code for computing emotional bandwidth and related affective dynamics metrics from experience sampling data.
The main analysis script is emotion_bandwidth_analysis.py. It:
- computes multivariate entropy and normalized emotional bandwidth for positive and negative affect,
- computes granularity (from ICC), instability (time-weighted MSSD), and variability (SD),
- merges questionnaire and demographic variables,
- exports processed datasets,
- runs correlations, descriptive statistics, regressions, and mediation analyses,
- saves figures and analysis tables to a results folder.
Files:
- emotion_bandwidth_analysis.py: Main preprocessing and inferential analysis pipeline.
- figures.py: Script to generate conceptual/illustrative figures for MSSD, SD, ICC, and bandwidth.
- requirements.txt: Python dependencies.
- Python 3.10+ recommended.
- Install dependencies:

```
pip install -r requirements.txt
```

- Some Plotly export operations may require kaleido, depending on your local setup:

```
pip install kaleido
```

The analysis script expects this directory layout under `exp_dir`:

```
<exp_dir>/
  data/
    esm_clean_trimmed.csv
    demographic_data.csv
  results/
```
By default in emotion_bandwidth_analysis.py, `exp_dir` is hardcoded to:

```python
exp_dir = "D:/EmoBand/"
```

Update this path before running so it matches your local environment.
Required columns used by the script:
- `participant`
- `timestamp_response`
- Positive affect items: `pa_joyful_rand`, `pa_cheerful_rand`, `pa_happy_rand`, `pa_content_rand`, `pa_relaxed_rand`, `pa_energetic_rand`
- Negative affect items: `na_tense_rand`, `na_irritable_rand`, `na_worried_rand`, `na_low_rand`, `na_lonely_rand`, `na_abandoned_rand`
- Questionnaire variables: `RUCLA_kw` (loneliness), `CESD_kw` (depressive symptoms)
Notes:
- Affect items are expected on a 1-7 scale.
- `timestamp_response` should be parseable as datetime.
Required merge key and commonly used fields:
- `participant` (merge key)
- `gender` (coded as F/M in source data; converted to 0/1)
- `age`
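The expected input handling can be sketched as follows. This is illustrative, not the script's actual code: the helper name is hypothetical, and which of F/M maps to 0 is an assumption — check emotion_bandwidth_analysis.py for the authoritative coding.

```python
import pandas as pd

def merge_esm_with_demographics(esm, demo):
    """Parse timestamps, recode gender F/M -> 0/1 (mapping direction
    assumed here), and merge on the `participant` key."""
    esm = esm.copy()
    demo = demo.copy()
    esm["timestamp_response"] = pd.to_datetime(esm["timestamp_response"])
    demo["gender"] = demo["gender"].map({"F": 0, "M": 1})
    return esm.merge(demo, on="participant", how="left")
```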
For each participant and affective dimension (positive, negative), the script computes:
- `multivariate_entropy`: Shannon entropy on discretized multivariate affect states.
- `bandwidth`: normalized state-space coverage:

  bandwidth = multivariate_entropy / max_entropy

  where max_entropy is the largest entropy attainable given the discretization (i.e., a uniform distribution over all possible discretized states).

- `icc`: intraclass correlation (ICC3k, pingouin).
- `granularity`: 1 - icc.
- `instability`: time-interval-weighted MSSD (within-participant), summarized by valence.
- `variability`: SD, summarized by valence.
- `mean_affect_level`: mean affect level, by valence.

Participants with granularity greater than 1 in either valence (i.e., with a negative ICC) are excluded.
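A minimal sketch of the two less standard metrics, under two assumptions that may differ from the script: affect items are already discretized into equal-width bins, and squared successive differences are weighted by the elapsed time between prompts.

```python
import numpy as np
from collections import Counter

def normalized_bandwidth(states, n_bins):
    """Shannon entropy over discretized multivariate affect states,
    divided by the maximum attainable entropy so the result lies in
    [0, 1]. `states` is a sequence of equal-length tuples of bin
    indices; the binning scheme itself is assumed here."""
    counts = Counter(tuple(s) for s in states)
    probs = np.array(list(counts.values()), dtype=float) / len(states)
    entropy = -np.sum(probs * np.log2(probs))
    n_items = len(states[0])
    max_entropy = n_items * np.log2(n_bins)  # uniform over all states
    return float(entropy / max_entropy)

def time_weighted_mssd(values, hours):
    """Mean squared successive difference, with each squared difference
    divided by the time elapsed between the two observations (the exact
    weighting used by the script may differ)."""
    dv = np.diff(np.asarray(values, dtype=float))
    dt = np.diff(np.asarray(hours, dtype=float))
    return float(np.mean(dv ** 2 / dt))
```

For example, a participant who always reports the same state gets bandwidth 0, while one whose reports cover all discretized states uniformly gets bandwidth 1.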
With defaults in the script (`save_type = "tsv"`), outputs are written to `<exp_dir>/results/`.
Main exports include:
- emo_band_proc_data.tsv: long-format processed data.
- emo_band_proc_data_wide.tsv: participant-level wide dataset used for analyses.
- emo_band_correlation.png: lower-triangle correlation heatmap with FDR-adjusted significance masking.
- emo_band_correlation_matrix.tsv: correlation matrix table.
- emo_band_descriptive_stats.tsv: descriptive statistics table.
- emo_band_regression_results_depression_no_mean_affect_standardized.txt
- emo_band_regression_results_depression_standardized.txt
- emo_band_regression_results_loneliness_no_mean_affect_standardized.txt
- emo_band_regression_results_loneliness_standardized.txt
- emo_band_mediation_results_negative_affect_standardized.txt
- gender_counts.txt
If `standardize = False`, corresponding non-standardized filenames are produced.
- Install dependencies.
- Edit `exp_dir` in emotion_bandwidth_analysis.py.
- Ensure the expected CSV files and columns exist.
- Run:

```
python emotion_bandwidth_analysis.py
```

Run figures.py to generate standalone conceptual figures for:
- MSSD and SD behavior,
- ICC/granularity illustration,
- 3D state-space bandwidth visualization.
Before running, update any hardcoded output paths in that script.
- Correlations are corrected for multiple testing using FDR (Benjamini-Hochberg).
- Regression models are estimated with OLS (statsmodels).
- Mediation uses bootstrap inference (`pingouin.mediation_analysis`, `n_boot=5000`).
- Standardization is controlled via `standardize = True` in the main script.
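The Benjamini-Hochberg correction used for the correlation results is available directly in statsmodels; the p-values below are hypothetical placeholders:

```python
from statsmodels.stats.multitest import multipletests

# hypothetical uncorrected correlation p-values
pvals = [0.001, 0.02, 0.04, 0.30]

# Benjamini-Hochberg FDR at alpha = 0.05, as used for the
# significance masking in the correlation heatmap
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
```

Note that FDR adjustment can flip marginal results: here an uncorrected p of 0.04 would no longer survive correction.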