diff --git a/README.md b/README.md
index 0a73262f8664dea738aa19274fa1ccd24c85ecf6..7ea8ef76b782d623d6b2c02b6309a971ca1f8d9c 100644
--- a/README.md
+++ b/README.md
@@ -22,7 +22,7 @@ conda env create -f environment.yml
 
 Afterwards, you can use predict_main.py in workflows/TransformerBasedTMPrediction/prediction_model. Simply replace the example FASTA sequence with the path to your .fasta file and run predict_main.py.
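 
 For example, assuming you work from the repository root with the conda environment from environment.yml activated, a prediction run might look like this (a rough sketch, not a fixed command-line interface):
 
 ```bash
 # First edit predict_main.py so the example FASTA entry points to your own .fasta file,
 # then run the prediction script.
 python workflows/TransformerBasedTMPrediction/prediction_model/predict_main.py
 ```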
 
-### Training of your own model
+### Training your own model
 
 In case you want to retrain deepSTABp, simply run the training script located at workflows/TransformerBasedTMPrediction/MLP_training.py. You can also experiment with different architectures by directly editing the model structure defined in MLP_training.py.
 The other file in workflows/TransformerBasedTMPrediction/ is tuning.py; run it after training, using your already pretrained model, to achieve optimal results.
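 
 As a rough sketch, assuming both scripts are run from the repository root and take no additional command-line arguments, a full retraining and tuning pass could look like this:
 
 ```bash
 # Retrain deepSTABp (optionally edit the model structure in MLP_training.py first)
 python workflows/TransformerBasedTMPrediction/MLP_training.py
 
 # Tune using the model produced by the training step
 python workflows/TransformerBasedTMPrediction/tuning.py
 ```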