Redis loses data during frequent batch inserts

Why is data lost when batch-inserting into Redis in quick succession?

int pageSize = 100;
while (curPage <= totalPage) {
    int startPos = (curPage - 1) * pageSize;

    // fetch one page of score records from the DB and update the running rank
    scoreEntities = misasScoreMapper.doCalculateLastResult(examId, startPos, pageSize);
    rank = calculatRankAndTotal(scoreEntities, rank, examId);
    curPage++;

    // key the page by studentId and write it into a Redis hash
    // Map<Long, MisasScore> collect1 = scoreEntities.stream().collect(Collectors.toMap(MisasScore::getStudentId, v -> v));
    Map<Object, MisasScore> collect = scoreEntities.stream()
            .collect(Collectors.toMap(MisasScore::getStudentId, v -> v));
    // redisCache.setCacheMapLong("tmp:" + gradeId, collect);
    redisCache.setCacheMap2("tmp:" + gradeId, collect);
    // misasScoreMapper.saveLastRanks(scoreEntities);
}
There are 3,761 records in total. Each iteration fetches 100 records from the DB, processes them, and writes them into Redis. The first batch is inserted completely, but every batch after that is missing 1 or 2 entries. If I fetch 3,000 records per batch instead, no data is lost. I am certain that the keys in the map never repeat. How can I fix this?
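
For reference, here is a minimal diagnostic sketch, assuming the redisCache wrapper sits on top of something like Spring Data Redis's RedisTemplate (this is not confirmed by the question) and treating the values as plain Objects so the snippet is self-contained. Right after each page is written with putAll (HSET), it reads the same fields back with multiGet (HMGET) and prints any studentId that did not make it into the hash; that should show whether entries disappear at Redis-write time or are already absent from the page returned by the DB query.

import org.springframework.data.redis.core.HashOperations;
import org.springframework.data.redis.core.RedisTemplate;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class BatchWriteCheck {

    // Hypothetical: assumes a RedisTemplate comparable to whatever backs redisCache
    private final RedisTemplate<String, Object> redisTemplate;

    public BatchWriteCheck(RedisTemplate<String, Object> redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    // Writes one page into the hash and reports which fields (studentIds)
    // are missing from Redis immediately after the write.
    public void writePageAndVerify(String key, Map<Object, ?> page) {
        HashOperations<String, Object, Object> ops = redisTemplate.opsForHash();

        ops.putAll(key, page);  // HSET the whole page in one round trip

        List<Object> fields = new ArrayList<>(page.keySet());
        List<Object> values = ops.multiGet(key, fields);  // HMGET: null for absent fields

        for (int i = 0; i < fields.size(); i++) {
            if (values.get(i) == null) {
                System.out.println("missing right after write: studentId=" + fields.get(i));
            }
        }
        System.out.println("page size=" + page.size() + ", hash size now=" + ops.size(key));
    }
}

If every field from the page is present immediately after the write but the final hash size still falls short of 3,761, the discrepancy would point at the paginated DB query rather than the Redis write itself.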