"Invalid memory address" when using ioutil.ReadAll() in Go

I'm currently learning Go (and so far I love it). Unfortunately, I've been stuck for a couple of hours and can't seem to find any solution to my problem on Google.

So here's my problem. I have this piece of code (from a tutorial):

func main() {
    var s SitemapIndex

    resp, _ := http.Get("https://www.washingtonpost.com/news-sitemaps/index.xml")
    bytes, _ := ioutil.ReadAll(resp.Body)
    resp.Body.Close()

    xml.Unmarshal(bytes, &s)

    for _, Location := range s.Locations {
        resp, _ := http.Get(Location)
        ioutil.ReadAll(resp.Body)

    }

}

I know my code is incomplete, but that's because I deleted the parts that weren't causing the problem to make it more readable on Stack Overflow.

So when I get the content of Location and try to process the data with ioutil.ReadAll(), I get this error mentioning a pointer:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x40 pc=0x1210a69]

goroutine 1 [running]:
main.main()
    /Users/tom/Developer/Go/src/news/index.go:23 +0x159
exit status 2

I really don't understand this error, no matter how much I look into it. I tried to pull the error from ioutil.ReadAll(resp.Body) by doing _, e := ioutil.ReadAll(resp.Body) and then printing e, but doing that throws another error...

I read somewhere that it can be because the body returned to me has errors, but it's working fine in the tutorial.

Hopefully you guys will have a solution for me. Thanks.

EDIT: Here are the structs I have defined:

type SitemapIndex struct {
    Locations []string `xml:"sitemap>loc"`
}

type News struct {
    Titles []string `xml:"url>news>title"`
    Keywords []string `xml:"url>news>keywords"`
    Locations []string `xml:"url>loc"`
}

type NewsMap struct {
    Keyword string
    Location string
}

As Mostafa mentioned, you have to handle errors properly. There is no try/catch in Go. Here is something along those lines that I tried; at least it captures the URL error in LiteIDE:

package main

import (
    "encoding/xml"
    "fmt"
    "io/ioutil"
    "net/http"
    "os"
    "runtime"
)

type SitemapIndex struct {
    Locations []string `xml:"sitemap>loc"`
}

func main() {
    var s SitemapIndex

    resp, err := http.Get("https://www.washingtonpost.com/news-sitemaps/index.xml")
    if err != nil {
        fmt.Println("Unable to get the url ")
        os.Exit(1)
    }
    bytes, _ := ioutil.ReadAll(resp.Body)
    defer resp.Body.Close()
    //fmt.Println(string(bytes))
    xml.Unmarshal(bytes, &s)
    //fmt.Println(len(s.Locations))

    for _, Location := range s.Locations {
        //fmt.Println(Location)
        go func(loc string) { // pass the loop variable in so each goroutine gets its own copy
            r, err := http.Get(loc)
            if err != nil {
                fmt.Println("Error occurred ", err)
                return
            }
            bytes, err := ioutil.ReadAll(r.Body) // read this response's body, not the outer resp
            if err != nil {
                fmt.Println("Error in reading:", err)
                return
            }
            fmt.Println(string(bytes))
            r.Body.Close()
        }(Location)
    }
    runtime.Gosched() // only yields the processor once; see the sync.WaitGroup sketch below
}
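
One caveat with the code above: runtime.Gosched() only yields the processor once and does not wait for the goroutines to finish, so the program may exit before anything is printed. A sync.WaitGroup is the usual way to wait for them. Here is a minimal, self-contained sketch of that approach; the hard-coded URL list only stands in for s.Locations and is illustrative:

package main

import (
    "fmt"
    "io/ioutil"
    "net/http"
    "sync"
)

func main() {
    // Hypothetical list standing in for the locations parsed out of the sitemap index.
    locations := []string{
        "https://www.washingtonpost.com/news-sitemaps/politics.xml",
    }

    var wg sync.WaitGroup
    for _, loc := range locations {
        wg.Add(1)
        go func(loc string) { // pass loc in so each goroutine gets its own copy
            defer wg.Done()
            r, err := http.Get(loc)
            if err != nil {
                fmt.Println("Error occurred:", err)
                return
            }
            defer r.Body.Close()
            b, err := ioutil.ReadAll(r.Body)
            if err != nil {
                fmt.Println("Error in reading:", err)
                return
            }
            fmt.Println(len(b))
        }(loc)
    }
    wg.Wait() // block until every fetch has finished
}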

The First Rule of Go: Check for errors.


When a function call returns an error, it’s the caller’s responsibility to check it and take appropriate action.

Usually when a function returns a non-nil error, its other results are undefined and should be ignored.

The Go Programming Language, Alan A. A. Donovan and Brian W. Kernighan


For example,

if err != nil {
    fmt.Printf("%q
", Location) // debug error
    fmt.Println(resp)            // debug error
    fmt.Println(err)
    return
}

Output:

"
https://www.washingtonpost.com/news-sitemaps/politics.xml
"
<nil>
parse 
https://www.washingtonpost.com/news-sitemaps/politics.xml
: first path segment in URL cannot contain colon

If you don't catch this error and keep going with resp == nil then

bytes, err := ioutil.ReadAll(resp.Body)

Output:

panic: runtime error: invalid memory address or nil pointer dereference
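
The panic above is easy to reproduce in isolation. This hypothetical two-line sketch dereferences the Body field of a nil *http.Response, which is exactly what happens when the error from http.Get is discarded and resp is nil:

package main

import (
    "io/ioutil"
    "net/http"
)

func main() {
    // resp stands in for the nil response you are left with after an ignored http.Get error;
    // accessing resp.Body dereferences a nil pointer and the program panics.
    var resp *http.Response
    _, _ = ioutil.ReadAll(resp.Body)
}

Checking every error, as the full program below does, avoids this: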

package main

import (
    "encoding/xml"
    "fmt"
    "io/ioutil"
    "net/http"
    "strings"
)

type SitemapIndex struct {
    Locations []string `xml:"sitemap>loc"`
}

func main() {
    var s SitemapIndex

    resp, err := http.Get("https://www.washingtonpost.com/news-sitemaps/index.xml")
    if err != nil {
        fmt.Println(err)
        return
    }
    bytes, err := ioutil.ReadAll(resp.Body)
    if err != nil {
        fmt.Println(err)
        return
    }
    err = resp.Body.Close()
    if err != nil {
        fmt.Println(err)
        return
    }

    err = xml.Unmarshal(bytes, &s)
    if err != nil {
        fmt.Println(err)
        return
    }

    for _, Location := range s.Locations {
        resp, err := http.Get(Location)
        if err != nil {
            fmt.Printf("%q
", Location) // debug error
            fmt.Println(resp)            // debug error
            fmt.Println(err)
            return
        }
        bytes, err := ioutil.ReadAll(resp.Body)
        if err != nil {
            fmt.Println(err)
            return
        }
        fmt.Println(len(bytes))
        err = resp.Body.Close()
        if err != nil {
            fmt.Println(err)
            return
        }
    }
}

Okay, so I've found the reason for my problem and how to solve it: the URL had some newlines around it that I didn't see.

So instead of doing this:

resp, err := http.Get(Location)

I did this:

resp, err := http.Get(strings.TrimSpace(Location))

That solved it.
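
For completeness, here is a minimal sketch of the fixed loop in context, with the trimming applied before the second http.Get (same SitemapIndex struct as above; error handling is kept short for readability):

package main

import (
    "encoding/xml"
    "fmt"
    "io/ioutil"
    "net/http"
    "strings"
)

type SitemapIndex struct {
    Locations []string `xml:"sitemap>loc"`
}

func main() {
    var s SitemapIndex

    resp, err := http.Get("https://www.washingtonpost.com/news-sitemaps/index.xml")
    if err != nil {
        fmt.Println(err)
        return
    }
    bytes, err := ioutil.ReadAll(resp.Body)
    resp.Body.Close()
    if err != nil {
        fmt.Println(err)
        return
    }

    if err := xml.Unmarshal(bytes, &s); err != nil {
        fmt.Println(err)
        return
    }

    for _, Location := range s.Locations {
        // The <loc> text comes back with surrounding newlines, so trim it before fetching.
        resp, err := http.Get(strings.TrimSpace(Location))
        if err != nil {
            fmt.Println(err)
            continue
        }
        body, err := ioutil.ReadAll(resp.Body)
        resp.Body.Close()
        if err != nil {
            fmt.Println(err)
            continue
        }
        fmt.Println(len(body))
    }
}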