I. Introduction
Since the dawn of computing, machines have been designed to execute deterministic sequences of operations. In recent decades, the availability of Big Data and advances in machine learning (ML) have ushered in a data-driven era of artificial intelligence (AI). On one hand, abundant data makes it possible to estimate model parameters; on the other, the vast majority of the resulting models are black boxes. In some settings an explanation of what was learned is not essential, and the prediction itself is all that is required. In other settings, however, it may be vital to understand why a decision was reached. This need is a driving factor behind explainable artificial intelligence (XAI). A further benefit of XAI is that explanations can expose gaps in current AI systems and thereby help accelerate progress in the field.