I'm writing a web service that is supposed to receive an XML file from a user, read it, and save the data to a database.
The file is gzipped and encoded in UTF-16, so I have to gunzip it and save the XML to a file (for future purposes). Next I have to read the file into a string, decode it to UTF-8, and do something like xml.Unmarshal([]byte(xmlString), &report).
Currently I'm not saving anything to the database.
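The receiving handler isn't shown below; schematically it does something like this (a rough sketch only — the handler name, the error handling, and the report.xml path are simplified placeholders):

import (
	"compress/gzip"
	"io"
	"net/http"
	"os"
)

// handleUpload gunzips the request body and saves the raw (still
// UTF-16) XML to disk; Parse picks it up from there.
func handleUpload(w http.ResponseWriter, r *http.Request) {
	gz, err := gzip.NewReader(r.Body)
	if err != nil {
		http.Error(w, "bad gzip body", http.StatusBadRequest)
		return
	}
	defer gz.Close()

	f, err := os.Create("report.xml") // simplified: the real name is per-request
	if err != nil {
		http.Error(w, "cannot create file", http.StatusInternalServerError)
		return
	}
	defer f.Close()

	if _, err := io.Copy(f, gz); err != nil {
		http.Error(w, "cannot save file", http.StatusInternalServerError)
		return
	}
}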
On my local machine I've noticed that processing one request takes about 30% of my CPU and about 300 ms. For a single request that looks okay, but I wrote a script that fires 100 requests simultaneously (via curl), and CPU usage went up to 100% while the time per request grew to about 2 seconds.
What I wanted to ask is: should I worry about this, or will things be fine on a real web server? Or maybe I'm doing something wrong. Here is the code:
import (
	"encoding/xml"
	"fmt"
	"io/ioutil"
	"strings"
	"unicode/utf16"
)

// Parse reads the saved file, converts it to UTF-8 and unmarshals it into a Report.
func Parse(filename string) Report {
	xmlString := getXml(filename)

	report := Report{}
	if err := xml.Unmarshal([]byte(xmlString), &report); err != nil {
		fmt.Println("Error unmarshalling XML:", err)
	}
	return report
}
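(Report itself is just a struct with xml tags; its real fields don't matter for the question, so here is a made-up stand-in:)

// Stand-in for the real struct; the field names here are invented.
type Report struct {
	XMLName xml.Name `xml:"report"`
	ID      int      `xml:"id"`
	Items   []string `xml:"items>item"`
}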
func getXml(filename string) string {
	b, err := ioutil.ReadFile(filename)
	if err != nil {
		fmt.Println("Error opening file:", err)
		return ""
	}
	s, err := decodeUTF16(b)
	if err != nil {
		panic(err)
	}
	// xml.Unmarshal refuses a document whose declaration says
	// encoding="UTF-16" once the bytes are already UTF-8, so strip it.
	pattern := `<?xml version="1.0" encoding="UTF-16"?>`
	return strings.Replace(s, pattern, "", 1)
}
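As an aside, I'm aware the manual decode-and-strip dance could probably be avoided by converting the stream with golang.org/x/text and giving xml.Decoder a pass-through CharsetReader, roughly like this (an untested sketch, not what I'm running now):

import (
	"encoding/xml"
	"io"
	"os"

	"golang.org/x/text/encoding/unicode"
	"golang.org/x/text/transform"
)

// parseUTF16 streams the UTF-16 file straight into the XML decoder,
// without building the intermediate string or stripping the declaration.
func parseUTF16(filename string) (Report, error) {
	f, err := os.Open(filename)
	if err != nil {
		return Report{}, err
	}
	defer f.Close()

	// Convert UTF-16 (little-endian, BOM honoured if present) to UTF-8
	// on the fly, matching what decodeUTF16 assumes.
	utf8Reader := transform.NewReader(f,
		unicode.UTF16(unicode.LittleEndian, unicode.UseBOM).NewDecoder())

	dec := xml.NewDecoder(utf8Reader)
	// The declaration still says encoding="UTF-16", so hand the decoder
	// a pass-through CharsetReader instead of rewriting the document.
	dec.CharsetReader = func(charset string, input io.Reader) (io.Reader, error) {
		return input, nil
	}

	var report Report
	err = dec.Decode(&report)
	return report, err
}

This would also avoid holding two full copies of the document in memory, which might matter under the 100-request load.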
func decodeUTF16(b []byte) (string, error) {
	if len(b)%2 != 0 {
		return "", fmt.Errorf("must have even length byte slice")
	}
	// Collect the little-endian code units first and decode them in one
	// call, so that surrogate pairs are handled correctly.
	u16s := make([]uint16, 0, len(b)/2)
	for i := 0; i < len(b); i += 2 {
		u16s = append(u16s, uint16(b[i])|uint16(b[i+1])<<8)
	}
	return string(utf16.Decode(u16s)), nil
}
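To get numbers that are easier to interpret than curl timings, I'm planning to benchmark Parse on a sample file with the testing package (testdata/report.xml is a placeholder path):

import "testing"

func BenchmarkParse(b *testing.B) {
	for i := 0; i < b.N; i++ {
		Parse("testdata/report.xml") // placeholder sample file
	}
}

and then run go test -bench=. -cpuprofile cpu.out followed by go tool pprof cpu.out to see where the 300 ms actually goes.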
Please ask if I forgot something important.