Abstract
In this talk, we will introduce gradient recovery schemes for data defined on discretized manifolds. The proposed method, parametric polynomial preserving recovery (PPPR), does not require the tangent spaces of the exact manifolds, which have been assumed by several significant gradient recovery methods in the literature. Another advantage of PPPR is that superconvergence is guaranteed without the symmetry condition required by existing techniques. We will also discuss several applications.
Speaker
Hailong Guo (郭海龙) received his Ph.D. from Wayne State University (USA) in 2015. From 2015 to 2018, he was a Visiting Assistant Professor at the University of California, Santa Barbara. In 2018, he joined the University of Melbourne, Australia, where he has served as Lecturer and then Senior Lecturer. His main research interests include superconvergent post-processing for finite element methods, numerical methods for interface problems, and machine learning algorithms. He has published more than 20 papers in leading journals such as Comput. Methods Appl. Mech. Engrg., J. Comput. Phys., Math. Models Methods Appl. Sci., Math. Comp., Numer. Math., SIAM J. Numer. Anal., and SIAM J. Sci. Comput.