How to iterate over N files concurrently to count occurrences of unique words

This is my code to count the occurrences of all the unique words in a file:

package main

import (
    "bufio"
    "fmt"
    "log"
    "os"
)

func main() {
    file, err := os.Open("file1.txt")
    if err != nil {
        log.Fatal(err)
    }
    words := make(map[string]int)
    /*asking scanner to split into words*/
    scanner := bufio.NewScanner(file)
    scanner.Split(bufio.ScanWords)
    count := 0
    // scan the input
    for scanner.Scan() {
        // get the next token - in our case a word - and update its frequency
        words[scanner.Text()]++
        count++
    }
    if err := scanner.Err(); err != nil {
        fmt.Fprintln(os.Stderr, "reading input:", err)
    }
    for k, v := range words {
        fmt.Printf("%s:%d\n", k, v)
    }
}

I need to run this over N files concurrently and combine the counts of all the unique words into one map.

You could use an errgroup.Group to make it concurrent.

Keep in mind that Go maps are not safe for concurrent use: if all goroutines write to a shared map, you must synchronize access (for example with a `sync.Mutex`), or you will get a data race.