Detecting influential features in non-linear and/or high-dimensional data is an increasingly important task in machine learning. However, inference on the selected features can be significantly flawed when the selection procedure is not accounted for. We propose a post-selection inference procedure for the model-free "HSIC-Lasso", based on the framework of truncated Gaussians combined with the polyhedral lemma. We then develop an algorithm that has low computational cost and also provides a way to select the regularisation parameter. The performance of the proposed method is illustrated through experiments on both artificial and real-world data, which show tight control of the type-I error even for small sample sizes.