Predicting Imbalanced Events with GEV-NN

Overview

Predicting rare events within imbalanced datasets is a significant challenge in machine learning. Whether it's fraud detection, medical diagnosis, or risk assessment, the minority class—the events of greatest interest—is often overshadowed by a vast majority of non-events. Traditional models tend to be biased towards the majority, leading to poor performance in detecting these critical occurrences.

In this project, I implement the Generalized Extreme Value Neural Network (GEV-NN) to address this issue. By leveraging the unique properties of the Generalized Extreme Value distribution and combining it with an autoencoder for feature extraction, the model enhances its ability to predict minority class events effectively.

For the full methodology and implementation details, see the accompanying Python script.

Objectives

  • Develop a model capable of accurately predicting rare events in an imbalanced dataset.
  • Implement the GEV-NN architecture to handle class imbalance.
  • Enhance feature extraction using an autoencoder.
  • Evaluate the model using appropriate metrics to assess performance on the minority class.

Data Preprocessing

  • Feature Standardization: All features are standardized using StandardScaler to ensure consistent scaling.
  • Class Weight Calculation: Class weights are computed to address the imbalance between majority and minority classes, influencing the loss function during training.
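These two preprocessing steps can be sketched as follows, assuming scikit-learn's StandardScaler and compute_class_weight (the toy data and 90/10 split are illustrative, not the project's actual dataset):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.utils.class_weight import compute_class_weight

# Toy imbalanced dataset: 90 majority (0) vs. 10 minority (1) samples
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = np.array([0] * 90 + [1] * 10)

# Standardize features to zero mean and unit variance
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# "Balanced" weights: n_samples / (n_classes * count per class),
# so the minority class receives a much larger weight
weights = compute_class_weight("balanced", classes=np.array([0, 1]), y=y)
print(weights)
```

With a 90/10 split the minority weight is 100 / (2 × 10) = 5.0, roughly nine times the majority weight, which is what lets the loss function penalize minority-class mistakes more heavily.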

Model Architecture

1. Generalized Extreme Value Activation Function

The GEV activation function is integrated into the model to focus on extreme values, which are often associated with rare events. This function helps the network prioritize instances that are critical for minority class prediction.
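A minimal sketch of a GEV-style output activation, based on the GEV cumulative distribution function F(z) = exp(-(1 + ξz)^(-1/ξ)), which reduces to the Gumbel CDF exp(-exp(-z)) as ξ → 0. The default shape parameter and the clipping used here are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

def gev_activation(z, xi=0.1):
    """GEV CDF as an output activation, mapping logits to (0, 1).

    For xi != 0:  F(z) = exp(-(1 + xi*z)^(-1/xi)), defined where 1 + xi*z > 0.
    For xi == 0 it reduces to the Gumbel CDF exp(-exp(-z)).
    """
    z = np.asarray(z, dtype=float)
    if abs(xi) < 1e-8:
        return np.exp(-np.exp(-z))
    # Clip to stay inside the distribution's support
    t = np.maximum(1.0 + xi * z, 1e-12)
    return np.exp(-t ** (-1.0 / xi))
```

Unlike the symmetric sigmoid, this function is skewed, which is the property GEV-NN exploits: the asymmetry lets the output layer represent the low base rate of the minority class more naturally.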

2. Autoencoder for Feature Extraction

An autoencoder is employed to learn compressed representations of the data:

  • Encoder: Reduces dimensionality by compressing input features into a lower-dimensional space.
  • Decoder: Reconstructs the input data from the encoded representation, ensuring essential information is retained.
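The encoder/decoder idea can be illustrated with a tiny linear autoencoder trained by plain gradient descent on the MSE reconstruction loss (a NumPy sketch for intuition only; the actual model uses nonlinear layers in a deep-learning framework):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))       # toy data: 200 samples, 8 features

d_in, d_hidden = 8, 3               # compress 8 features down to 3
W_enc = rng.normal(scale=0.1, size=(d_in, d_hidden))
W_dec = rng.normal(scale=0.1, size=(d_hidden, d_in))

lr = 0.05
for _ in range(500):
    H = X @ W_enc                   # encoder: lower-dimensional code
    X_hat = H @ W_dec               # decoder: reconstruction
    err = X_hat - X
    # Gradients of the MSE reconstruction loss (constant factors folded into lr)
    gW_dec = H.T @ err / len(X)
    gW_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * gW_dec
    W_enc -= lr * gW_enc

mse = np.mean((X @ W_enc @ W_dec - X) ** 2)
```

After training, the reconstruction error drops well below that of the untrained network, showing that the 3-dimensional code retains most of the information in the 8 input features.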

3. Self-Organizing Fuzzy Neural Network (SOFNN)

The SOFNN dynamically assigns importance to input features:

  • Evaluates the relevance of each feature.
  • Enhances the model's focus on attributes that are more indicative of the minority class.
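One common way to realize this gating idea is a small network that maps each input to a softmax over its own features and rescales the input by those weights. This is a hypothetical sketch: `feature_weights`, `W`, and `b` are illustrative names, and the actual SOFNN component is more elaborate:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax along the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def feature_weights(x, W, b):
    """Gating sketch: produce per-sample feature-importance weights
    that sum to 1, then rescale the input by them."""
    alpha = softmax(x @ W + b)   # importance of each feature for this sample
    return alpha * x, alpha
```

Because the weights are a function of the input itself, the network can emphasize different features for different samples, rather than learning one fixed importance ranking.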

Training Process

  • Loss Functions:
    • Mean Squared Error (MSE): Used for training the autoencoder to reconstruct inputs accurately.
    • Weighted Binary Cross-Entropy Loss: Employed for the classification task, incorporating class weights to penalize misclassification of the minority class more heavily.
  • Optimization:
    • Adam Optimizer: Efficiently updates model parameters during training.
    • Learning Rate Scheduler: Adjusts the learning rate based on validation loss to fine-tune the training process.
    • Early Stopping: Prevents overfitting by halting training when validation performance stops improving.
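The weighted binary cross-entropy can be sketched as below, where `w_pos` and `w_neg` are the class weights computed during preprocessing (an illustrative implementation, not necessarily the script's exact one):

```python
import numpy as np

def weighted_bce(y_true, y_pred, w_pos, w_neg, eps=1e-7):
    """Binary cross-entropy with per-class weights; the minority (positive)
    class typically receives the larger weight."""
    y_pred = np.clip(y_pred, eps, 1 - eps)   # avoid log(0)
    loss = -(w_pos * y_true * np.log(y_pred)
             + w_neg * (1 - y_true) * np.log(1 - y_pred))
    return loss.mean()
```

With weights like 5.0 for the positive class and 0.5 for the negative class, a confident mistake on a minority sample costs roughly ten times as much as the same mistake on a majority sample, which is exactly the pressure that counteracts the imbalance.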

Evaluation Metrics

To assess the model's performance, several metrics are utilized:

  • AUC-ROC: Measures the model's ability to distinguish between classes across all thresholds.
  • F1 Score: Balances precision and recall, providing insight into the model's accuracy on the minority class.
  • Brier Score: Evaluates the accuracy of probabilistic predictions.
  • Geometric Mean (G-Mean): Reflects the balance between sensitivity and specificity.
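All four metrics can be computed with scikit-learn plus a few lines for G-Mean; the labels and probabilities here are toy numbers for illustration:

```python
import numpy as np
from sklearn.metrics import (roc_auc_score, f1_score,
                             brier_score_loss, confusion_matrix)

y_true = np.array([0, 0, 0, 0, 0, 0, 1, 1])                  # imbalanced toy labels
y_prob = np.array([0.1, 0.2, 0.15, 0.3, 0.55, 0.25, 0.8, 0.45])
y_pred = (y_prob >= 0.5).astype(int)                          # 0.5 decision threshold

auc = roc_auc_score(y_true, y_prob)          # ranking quality across thresholds
f1 = f1_score(y_true, y_pred)                # precision/recall balance on class 1
brier = brier_score_loss(y_true, y_prob)     # calibration of the probabilities

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)                 # recall on the minority class
specificity = tn / (tn + fp)                 # recall on the majority class
g_mean = np.sqrt(sensitivity * specificity)
```

Note that F1 and G-Mean depend on the 0.5 threshold, while AUC-ROC and the Brier score are computed directly from the probabilities, so they are often the more informative pair when the threshold will be tuned later.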

Visualization tools like confusion matrices and ROC curves offer a clear picture of the model's performance.

Results and Insights

The GEV-NN model demonstrates promising results in predicting rare events:

  • Improved Detection of Minority Class: The model achieves higher recall for the minority class, indicating better detection of rare events.
  • Balanced Performance: High G-Mean scores show that the model balances correct identification of both classes rather than sacrificing one for the other.
  • Robustness: The integration of GEV activation and feature extraction through the autoencoder enhances the model's robustness against imbalanced data.

Conclusion

Predicting rare events in imbalanced datasets is challenging but critical. The GEV-NN model presents a powerful approach to tackle this issue, combining advanced neural network techniques with specialized activation functions. This implementation demonstrates the potential for improved detection of rare events, which can be invaluable in various domains such as finance, healthcare, and security.

By leveraging these techniques, practitioners can develop models that are more sensitive to the minority class, leading to better decision-making and outcomes.

Citation

Munkhdalai, L., Munkhdalai, T., & Ryu, K. H. (2020). GEV-NN: A deep neural network architecture for class imbalance problem in binary classification. Knowledge-Based Systems, 194, 105534. https://doi.org/10.1016/j.knosys.2020.105534
