# IEMOCAP
This repository contains the framework for training emotion recognition models on IEMOCAP.
### Data preparation
The IEMOCAP dataset must be placed in `data/IEMOCAP`.
**Note:** the scripts were written for 4 emotion categories (neu, ang, sad, hap+exc). If you want to change these categories, you will have to modify the scripts manually.
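Before running the preparation scripts, you can quickly check that the dataset is where they expect it (a minimal sanity check; only the `data/IEMOCAP` path comes from the note above, the `Session*` sub-directories are the standard IEMOCAP release layout and are listed here as an assumption):
```bash
# Check that the IEMOCAP data is in place (path taken from the note above)
ls data/IEMOCAP
# The standard IEMOCAP release ships Session1..Session5 sub-directories (assumption)
ls -d data/IEMOCAP/Session* || echo "No Session* directories found, check the dataset layout"
```
Then run the preparation steps: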
```bash
# Activate the environment
source ../../env.sh
# Install pip and ruamel.yaml (needed to modify the YAML files from within a Python script)
python -m ensurepip --upgrade
python3 -m pip install ruamel.yaml
# Create the CSV files needed for training (they will be saved in /list)
python ./local/dataprep_iemocap.py --make-train-csv
# Sort the CSV files by emotion (speaker_idx)
python ./local/csv_tri.py
```
In order to use data augmentation, also run:
```bash
python ./local/dataprep_aug.py --save-path ./data --download
python ./local/dataprep_aug.py --from ./data/RIRS_NOISES --make-csv-augment-reverb
python ./local/dataprep_aug.py --from ./data/musan_split --make-csv-augment-noise
```
### Train from scratch
Multiple x-vector architectures are implemented; each of them has its own `train_<model_type>.sh` script.
You do not need to modify the YAML files manually, the script will do it automatically (unless you want to change another value).
Example (the -s, -c, -b and -l options are all required):
```bash
./train_iemocap_half_resnet34.sh -s session_test -c nb_categories -b batch_size -l lr
```
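For instance, assuming `-s` takes the test session identifier and reusing the hyperparameters that appear in the example output directory of the Evaluation section (4 categories, batch size 100, learning rate 0.0001), a call could look like this:
```bash
# Hypothetical example call; check train_iemocap_half_resnet34.sh for the exact format expected by -s
./train_iemocap_half_resnet34.sh -s 1 -c 4 -b 100 -l 0.0001
```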
During training, logs will be put under `logs/<model_type>` and checkpoints will be placed under `model_<model_type>/`.
To create a [TorchScript](https://pytorch.org/docs/stable/jit.html) compatible model use:
```bash
create_jit_model.py model_half_resnet34/best_model_half_resnet34.pt  # change the model name accordingly
# Expected output:
# MODEL = half_resnet34
# Saving CPU TorchScript model to model_half_resnet34/best_model_half_resnet34_cpu_JIT.pt
# Saving GPU TorchScript model to model_half_resnet34/best_model_half_resnet34_cuda_JIT.pt
```
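To sanity-check the export, you can try loading the saved TorchScript model back with `torch.jit.load` (a minimal sketch; the file name is the one printed above):
```bash
# Load the exported CPU TorchScript model to make sure it deserializes correctly
python -c "import torch; m = torch.jit.load('model_half_resnet34/best_model_half_resnet34_cpu_JIT.pt'); print(type(m))"
```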
To share models on Hugging Face's Model Hub:
```bash
release_model.sh model_half_resnet34/best_model_half_resnet34_cuda_JIT.pt model_half_resnet34/best_model_half_resnet34_cpu_JIT.pt model_half_resnet34/best_model_half_resnet34.pt
```
### Evaluation
To launch the evaluation of the model, run:
```bash
python ./local/scoring.py <model> <session_test> <nb_categories> <batch> <lr> <epoch> [--emotions ...] [--freeze]
# --emotions: defaults to "neu ang sad hap+exc" (a specific order is needed)
# --freeze: pass it if parts of the model were frozen during training
```
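For example, assuming the first argument is the `<model_type>` string and the remaining positional arguments take plain numeric values, an illustrative call (all values, including the epoch, are only examples) could be:
```bash
# Hypothetical call: model half_resnet34, test session 1, 4 categories, batch 100, lr 0.0001, epoch 100
python ./local/scoring.py half_resnet34 1 4 100 0.0001 100
```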
A confusion matrix and a loss plot will be generated, and all the files will be moved to a dedicated directory (example: `model_half_resnet34/Sess1_test/4emo_100batch_lr-0.0001`).
To launch the evaluation with cross-validation (all sessions must have a model trained with the same hyperparameters), run:
```bash
python ./local/scoring_cross_validation.py <model> <nb_categories> <batch> <lr> [--emotions ...] [--freeze]
# --emotions: defaults to "neu ang sad hap+exc" (a specific order is needed)
# --freeze: pass it if parts of the model were frozen during training
```
Only a confusion matrix will be plotted; it will be saved under a dedicated directory (example: `model_half_resnet34/Sess_all_cross-valid/4emo_100batch_lr-0.0001`).
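Similarly, a hypothetical cross-validation call with illustrative values (same assumptions as above):
```bash
# Hypothetical call: model half_resnet34, 4 categories, batch 100, lr 0.0001
python ./local/scoring_cross_validation.py half_resnet34 4 100 0.0001
```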