Improvements in our ability to model runoff from glaciers remain an important scientific goal. This paper describes a new temperature-radiation-index glacier melt model specifically enhanced for use in High-Arctic environments, utilising high temporal and spatial resolution datasets while retaining relatively modest data requirements. The model employs several physically constrained parameters and was tuned using a lidar-derived surface elevation model of Midtre Lovénbreen, meteorological data from sites spanning ~70% of the glacier's area-altitude distribution, and periodic ablation surveys during the 2005 melt season. The model explained 80% of the variance in observed ablation across the glacier, an improvement of ~40% on a simplified energy balance model (EBM) and equivalent to the performance of a full EBM employed at the same location. Model performance was assessed further by comparing potential and measured runoff from the catchment and through application to an earlier (2004) melt season. The additive model form and the use of a priori parameters suited to the Arctic locality were shown to be beneficial, with a planimetry correction eliminating systematic errors in potential runoff. Further parameterisations defining modelled incident radiation failed to yield significant improvements to model output. Our results suggest that such enhanced melt models may perform well for individual melt seasons, yet they are highly sensitive to the choice of lapse rates, and their transferability to different locations and seasons may be limited. While modelling ablation requires detailed consideration of the transition between snow and ice melt, our study suggests that describing the ratio between radiative and turbulent heat fluxes may provide a useful step towards dynamic parameterisation of melt factors in temperature-index models.
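To make the additive model form concrete, the following is a minimal sketch of a generic temperature-radiation-index melt scheme of the kind the abstract describes. The function name, parameter values, and exact formulation are illustrative assumptions, not the paper's calibrated model: the temperature term stands as a proxy for the turbulent heat fluxes, and an absorbed-shortwave term is added to it, with albedo carrying the snow-to-ice transition.

```python
# Hedged sketch of an additive temperature-radiation-index melt model.
# All parameter values below are illustrative assumptions, not the
# calibrated values from the study.

def melt_rate(temp_c, sw_in, albedo, melt_factor=2.0, rad_factor=0.01):
    """Estimate daily melt (mm w.e. day^-1).

    temp_c      : near-surface air temperature (deg C)
    sw_in       : modelled incident shortwave radiation (W m^-2)
    albedo      : surface albedo (e.g. ~0.7 for snow, ~0.35 for ice);
                  the drop at the snow-to-ice transition raises melt
    melt_factor : temperature melt factor (mm w.e. day^-1 degC^-1), assumed
    rad_factor  : radiation melt factor (mm w.e. day^-1 per W m^-2), assumed
    """
    if temp_c <= 0.0:
        return 0.0  # no melt below freezing
    # Additive form: temperature (turbulent-flux proxy) term
    # plus absorbed-shortwave (radiative) term.
    return melt_factor * temp_c + rad_factor * (1.0 - albedo) * sw_in
```

Under these assumed parameters, the same air temperature yields more melt over ice than over snow because the lower albedo increases the absorbed-radiation term, which is one reason the snow-to-ice transition requires careful treatment.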