Following *Self-Attention with Relative Position Representations*, I'm trying to replace the Transformer's sinusoidal positional embeddings with relative position representations, but I've run into some problems. Paid guidance appreciated.
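Since the post doesn't show the code in question, here is a minimal single-head NumPy sketch of the mechanism from the paper for comparison: relative distances are clipped to a window `max_dist`, and learned embeddings `a^K`, `a^V` are added to the keys and values respectively. All names here (`relative_attention`, `rel_k`, `rel_v`, the weight matrices) are my own illustration, not from the original post.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def relative_attention(x, Wq, Wk, Wv, rel_k, rel_v, max_dist):
    """Single-head self-attention with relative position representations.
    rel_k, rel_v: (2*max_dist+1, d) learned tables for a^K, a^V,
    indexed by the clipped relative distance j - i."""
    n, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # idx[i, j] = clip(j - i, -max_dist, max_dist) + max_dist
    idx = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None],
                  -max_dist, max_dist) + max_dist
    aK = rel_k[idx]                     # (n, n, d)
    aV = rel_v[idx]                     # (n, n, d)
    # e_ij = q_i . (k_j + a^K_ij) / sqrt(d)
    logits = (q @ k.T + np.einsum('id,ijd->ij', q, aK)) / np.sqrt(d)
    w = softmax(logits, axis=-1)
    # z_i = sum_j w_ij * (v_j + a^V_ij)
    return w @ v + np.einsum('ij,ijd->id', w, aV)

rng = np.random.default_rng(0)
n, d, max_dist = 5, 8, 2
x = rng.standard_normal((n, d))
W = lambda: rng.standard_normal((d, d)) / np.sqrt(d)
out = relative_attention(x, W(), W(), W(),
                         rng.standard_normal((2 * max_dist + 1, d)),
                         rng.standard_normal((2 * max_dist + 1, d)),
                         max_dist)
print(out.shape)  # (5, 8)
```

Note that the relative tables replace the sinusoidal embedding entirely: nothing is added to `x` at the input, and the position signal enters only inside the attention logits and the value aggregation. A common bug when converting an existing model is leaving the absolute embedding addition in place as well.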