Organizer: Jin-Peng Liu 刘锦鹏
Speaker: Weiliang Wang
Time: Thur., 13:30-14:30, Dec. 26, 2024
Venue: B626, Shuangqing Complex Building A
Accurate and efficient quantum Gibbs state preparation
Abstract: Preparing Gibbs states on a quantum computer is an essential task for quantum algorithms and is believed to be one of the natural candidates for quantum advantage. Efforts to generalize classical Gibbs samplers...
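For reference, the Gibbs (thermal) state the talk is concerned with can be written in the standard form below; this is a textbook definition, and the symbols H and β are our notation, not taken from the truncated abstract:

\[
\rho_\beta \;=\; \frac{e^{-\beta H}}{\operatorname{Tr}\!\left(e^{-\beta H}\right)},
\]

where \(H\) is the system Hamiltonian and \(\beta\) is the inverse temperature. Preparing \(\rho_\beta\) on a quantum computer is the task the abstract refers to.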
Abstract: The Transformer is a powerful architecture that achieves superior performance on various sequence learning tasks, including neural machine translation, language understanding, and so on. At the core of the architecture, the self-attention mechanism is a kind of kernel smoothing method, or a "local model" in the speaker's words. The whole architecture can also be seen as a sequence model of me...
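To make the kernel-smoothing reading of self-attention concrete, one common way to write it is sketched below; the notation \(q_i, k_j, v_j, d\) is ours and is not taken from the abstract:

\[
\operatorname{Attn}(q_i) \;=\; \sum_{j} \frac{\kappa(q_i, k_j)}{\sum_{j'} \kappa(q_i, k_{j'})}\, v_j,
\qquad
\kappa(q_i, k_j) \;=\; \exp\!\left(\frac{q_i^\top k_j}{\sqrt{d}}\right).
\]

This has the form of a Nadaraya-Watson kernel-smoothing estimator with an exponential (softmax) kernel: each output is a locally weighted average of the values \(v_j\), with weights set by the similarity of the query \(q_i\) to the keys \(k_j\).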