Compress HTTP request payload in Go

December 27, 2015
golang http

My current project uses Go for the API server. There are multiple types of consumers for these API endpoints, including mobile, web, and (in the future) Smart TV. One of the APIs receives telemetry data from the client in JSON format. Because of the offline feature, there are cases where the payload can be huge (more than 1000 telemetry events in one request).

We measured the time it takes to sync 1000 events in one request under various network conditions, and the results are:

To reduce this time, we tried a couple of binary serialization techniques like MessagePack and Protocol Buffers. But due to the variety of platforms, we need to support plain JSON format in some cases. That led us to look at the standard compression support that any modern web server offers for HTTP responses, applied here to the request instead.

This let us keep the same API endpoint and the same JSON format; with the help of the 'Content-Encoding' header, the endpoint determines whether the content is compressed or not.

The power of Go's io.Reader interface can be seen here: json.Decoder does not care whether it reads from the raw request body or from a gzip reader wrapped around it.


import (
	"compress/gzip"
	"encoding/json"
	"net/http"
)

func getData(r *http.Request) (*TelemetryEvents, error) {
	var data TelemetryEvents
	var decoder *json.Decoder
	switch r.Header.Get("Content-Encoding") {
	case "gzip":
		// Wrap the body in a gzip reader; the JSON decoder then
		// streams directly from the decompressed bytes.
		gz, err := gzip.NewReader(r.Body)
		if err != nil {
			return nil, err
		}
		defer gz.Close()
		decoder = json.NewDecoder(gz)
	default:
		decoder = json.NewDecoder(r.Body)
	}
	if err := decoder.Decode(&data); err != nil {
		return nil, err
	}
	return &data, nil
}

And we achieved the numbers below without any extra memory allocation overhead, since both the gzip reader and the JSON decoder stream from the request body:
