# Explainable machine learning for precise fatigue crack tip detection


This repository contains the code used to generate the results of the research article:

> D. Melching, T. Strohmann, G. Requena, E. Breitbarth (2022). Explainable machine learning for precise fatigue crack tip detection. *Scientific Reports*. DOI: [10.1038/s41598-022-13275-1](https://doi.org/10.1038/s41598-022-13275-1)

The article is open access and available via the DOI above.

## Dependencies

All additional, version-specific modules required are listed in `requirements.txt` and can be installed with:

```shell
pip install -r requirements.txt
```

## Usage

The code can be used to produce attention heatmaps of trained neural networks by following the instructions below.

### 1) Data

To run the scripts, the nodal displacement data of the fatigue crack propagation experiments S950,1.6 and S160,2.0 as well as the nodemap and ground truth data of S160,4.7 are needed. The data is available on Zenodo under the DOI [10.5281/zenodo.5740216](https://doi.org/10.5281/zenodo.5740216).

Download the data and place it in a folder named `data`.
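For example (the folder name `data` in the repository root and the exact archive file names are assumptions; check the Zenodo record page for the actual files):

```shell
# Create the folder the scripts are assumed to expect (./data in the repo root)
mkdir -p data

# Download the archive from the Zenodo record (DOI 10.5281/zenodo.5740216)
# and extract it into data/, e.g. via the record page in a browser, or:
# curl -L "https://zenodo.org/record/5740216/files/<archive-name>" -o data/<archive-name>

echo "data folder ready"
```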

### 2) Preparation

Create training and validation data by interpolating the raw nodal displacement data to arrays of size 2x256x256, where the first channel contains the x-displacements and the second the y-displacements:

```shell
python make_data.py
```
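The interpolation step can be sketched as follows. `interpolate_nodemap` is a hypothetical helper (not taken from the repository) that maps scattered nodal displacements onto a regular grid using SciPy's linear interpolation:

```python
import numpy as np
from scipy.interpolate import griddata

def interpolate_nodemap(x, y, ux, uy, size=256):
    """Interpolate scattered nodal displacements onto a regular grid.

    Returns an array of shape (2, size, size): channel 0 holds the
    x-displacements, channel 1 the y-displacements, as in the README.
    """
    xi = np.linspace(x.min(), x.max(), size)
    yi = np.linspace(y.min(), y.max(), size)
    grid_x, grid_y = np.meshgrid(xi, yi)
    points = np.column_stack([x, y])
    # linear interpolation inside the convex hull, zeros outside
    ux_grid = griddata(points, ux, (grid_x, grid_y), method="linear", fill_value=0.0)
    uy_grid = griddata(points, uy, (grid_x, grid_y), method="linear", fill_value=0.0)
    return np.stack([ux_grid, uy_grid])

# demo with synthetic nodal data in place of a real nodemap file
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
y = rng.uniform(0, 10, 500)
arr = interpolate_nodemap(x, y, np.sin(x), np.cos(y))
print(arr.shape)  # (2, 256, 256)
```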

### 3) Training, validation, and tests

To train a model with the ParallelNets architecture, run

```shell
python ParallelNets_train.py
```

After training, test the model's performance by running

```shell
python ParallelNets_test.py
```
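The training step can be sketched, very loosely, as a standard PyTorch segmentation loop. This assumes the repository uses PyTorch; the small stand-in model below is hypothetical and does not reproduce the ParallelNets architecture, only the input/output convention (2-channel displacement maps in, crack-path segmentation out):

```python
import torch
from torch import nn

# Hypothetical stand-in for ParallelNets: a tiny segmentation head that
# takes 2-channel displacement maps (ux, uy) and predicts a 1-channel mask.
model = nn.Sequential(
    nn.Conv2d(2, 8, 3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 1, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# dummy batch: 4 displacement maps with random binary masks as targets
inputs = torch.randn(4, 2, 64, 64)
targets = (torch.rand(4, 1, 64, 64) > 0.5).float()

for epoch in range(3):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)  # pixel-wise segmentation loss
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```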

### 4) Explainability and visualization

You can plot the segmentation and crack tip predictions using

```shell
python ParallelNets_plot.py
```

*(figure: prediction plot)*

and visualize network and layer-wise attention by running

```shell
python ParallelNets_visualize.py
```

*(figures: network attention plot, layer-wise attention plot)*

The explainability method uses a variant of the Grad-CAM algorithm [1].
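The core Grad-CAM idea can be sketched as follows. This is a generic minimal implementation (not the repository's variant): the feature maps of a chosen convolutional layer are weighted by the spatially averaged gradients of the output score, summed, and passed through a ReLU. The small model and layer choice below are illustrative assumptions:

```python
import torch
from torch import nn

class GradCAM:
    """Minimal Grad-CAM sketch: hook one conv layer, weight its feature
    maps by the channel-wise mean of the output gradients, ReLU the sum."""

    def __init__(self, model, target_layer):
        self.model = model
        self.activations = None
        self.gradients = None
        target_layer.register_forward_hook(self._save_activations)
        target_layer.register_full_backward_hook(self._save_gradients)

    def _save_activations(self, module, inputs, output):
        self.activations = output.detach()

    def _save_gradients(self, module, grad_input, grad_output):
        self.gradients = grad_output[0].detach()

    def __call__(self, x):
        score = self.model(x).sum()
        self.model.zero_grad()
        score.backward()
        # global-average-pool the gradients to get one weight per channel
        weights = self.gradients.mean(dim=(2, 3), keepdim=True)
        cam = torch.relu((weights * self.activations).sum(dim=1))
        return cam / (cam.max() + 1e-8)  # normalize heatmap to [0, 1]

# illustrative toy model on 2-channel displacement-map input
conv = nn.Conv2d(2, 4, 3, padding=1)
model = nn.Sequential(conv, nn.ReLU(), nn.AdaptiveAvgPool2d(1),
                      nn.Flatten(), nn.Linear(4, 1))
cam = GradCAM(model, conv)(torch.randn(1, 2, 32, 32))
print(cam.shape)  # torch.Size([1, 32, 32])
```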

## References

[1] Selvaraju et al. (2020). Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization. *Int. J. Comput. Vis.* 128, 336–359.