A Gated Recurrent Unit (GRU) is a component used in recurrent neural networks (RNNs) to add memory. This is useful for [vocal-synthesis] and other tasks that involve repeated time-steps, since periodic data like waveforms depends heavily on previous time-steps.
Here's a decent explanation: https://towardsdatascience.com/illustrated-guide-to-lstms-and-gru-s-a-step-by-step-explanation-44e9eb85bf21
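For reference, here's a minimal sketch of a single GRU cell in NumPy (the parameter names, hidden size, and toy signal are just illustrative, and gate conventions vary slightly between write-ups). It shows how the update gate `z` and reset gate `r` blend the previous hidden state with a new candidate state, which is what gives the unit its memory:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, params):
    """One GRU time-step: returns the new hidden state h_t."""
    Wz, Uz, bz = params["Wz"], params["Uz"], params["bz"]
    Wr, Ur, br = params["Wr"], params["Ur"], params["br"]
    Wh, Uh, bh = params["Wh"], params["Uh"], params["bh"]

    z = sigmoid(Wz @ x_t + Uz @ h_prev + bz)               # update gate: how much to overwrite
    r = sigmoid(Wr @ x_t + Ur @ h_prev + br)               # reset gate: how much of the past to use
    h_tilde = np.tanh(Wh @ x_t + Uh @ (r * h_prev) + bh)   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde                # blend old state and candidate

# Toy usage: feed a sine "waveform" through the cell one sample at a time.
input_size, hidden_size, steps = 1, 8, 100
rng = np.random.default_rng(0)
params = {
    "Wz": rng.standard_normal((hidden_size, input_size)) * 0.1,
    "Uz": rng.standard_normal((hidden_size, hidden_size)) * 0.1,
    "bz": np.zeros(hidden_size),
    "Wr": rng.standard_normal((hidden_size, input_size)) * 0.1,
    "Ur": rng.standard_normal((hidden_size, hidden_size)) * 0.1,
    "br": np.zeros(hidden_size),
    "Wh": rng.standard_normal((hidden_size, input_size)) * 0.1,
    "Uh": rng.standard_normal((hidden_size, hidden_size)) * 0.1,
    "bh": np.zeros(hidden_size),
}
h = np.zeros(hidden_size)
signal = np.sin(np.linspace(0, 4 * np.pi, steps))
for sample in signal:
    h = gru_cell(np.array([sample]), h, params)
print(h)  # final hidden state summarizes the whole sequence
```

In practice you'd use a library implementation (e.g. `torch.nn.GRU`) rather than rolling your own; the sketch is only meant to make the gating mechanics concrete.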