Go's standard library handles CSV-to-JSON cleanly without third-party deps. The choice is mostly about whether you want to parse into typed structs (cleaner, gocsv helps) or generic maps (more flexible, stdlib only). For one-off conversions of unpredictable input, calling the API is one net/http call.

Method 1: encoding/csv + encoding/json (stdlib only)

Both packages are in the standard library, so no go get required. The pattern: read CSV rows, build a slice of map[string]string using the header row as keys, marshal to JSON.

package main

import (
    "encoding/csv"
    "encoding/json"
    "fmt"
    "io"
    "os"
)

func csvToJSON(inPath, outPath string) error {
    in, err := os.Open(inPath)
    if err != nil {
        return fmt.Errorf("open: %w", err)
    }
    defer in.Close()

    r := csv.NewReader(in)
    headers, err := r.Read()
    if err != nil {
        return fmt.Errorf("read header: %w", err)
    }

    var rows []map[string]string
    for {
        record, err := r.Read()
        if err == io.EOF {
            break
        }
        if err != nil {
            return fmt.Errorf("read row: %w", err)
        }
        row := make(map[string]string, len(headers))
        for i, h := range headers {
            if i < len(record) {
                row[h] = record[i]
            }
        }
        rows = append(rows, row)
    }

    out, err := os.Create(outPath)
    if err != nil {
        return fmt.Errorf("create: %w", err)
    }
    defer out.Close()

    enc := json.NewEncoder(out)
    enc.SetIndent("", "  ")
    return enc.Encode(rows)
}

func main() {
    if err := csvToJSON("users.csv", "users.json"); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
}

Three things worth knowing about encoding/csv:

  • Default delimiter is comma. For semicolon-separated CSVs (common in EU exports), set r.Comma = ';'.
  • FieldsPerRecord defaults to 0, which locks the reader to the field count of the first record it reads (here, the header). Rows with a different count fail with csv.ErrFieldCount; set r.FieldsPerRecord = -1 to accept variable-width rows.
  • BOM is not stripped. Excel exports prefix the file with \ufeff, which silently becomes part of the first header name. Strip it before parsing, as in the sketch below.
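
All three settings compose. A minimal sketch, assuming in is the opened file from the example above (add bufio and bytes to the imports; the semicolon delimiter is an illustrative choice, not a requirement):

br := bufio.NewReader(in)
if b, _ := br.Peek(3); bytes.Equal(b, []byte{0xEF, 0xBB, 0xBF}) {
    br.Discard(3) // drop the UTF-8 BOM so the first header name parses clean
}
r := csv.NewReader(br)
r.Comma = ';'          // semicolon-delimited input
r.FieldsPerRecord = -1 // accept variable-width rows; validate widths yourself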

Method 2: gocarina/gocsv (struct tags for typed output)

If your CSVs have a known schema, struct-based parsing is cleaner. gocarina/gocsv mirrors the encoding/json tag convention.

go get github.com/gocarina/gocsv

package main

import (
    "encoding/json"
    "fmt"
    "os"

    "github.com/gocarina/gocsv"
)

type User struct {
    ID    int    `csv:"id" json:"id"`
    Name  string `csv:"name" json:"name"`
    Email string `csv:"email" json:"email"`
    Age   int    `csv:"age" json:"age"`
}

func csvToJSON(inPath, outPath string) error {
    in, err := os.Open(inPath)
    if err != nil {
        return err
    }
    defer in.Close()

    var users []User
    if err := gocsv.UnmarshalFile(in, &users); err != nil {
        return err
    }

    out, err := os.Create(outPath)
    if err != nil {
        return err
    }
    defer out.Close()

    enc := json.NewEncoder(out)
    enc.SetIndent("", "  ")
    return enc.Encode(users)
}

func main() {
    if err := csvToJSON("users.csv", "users.json"); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
}

The struct approach gives you type coercion for free — int columns become Go ints, no strconv.Atoi needed. The tradeoff is you need to know the schema upfront, so this doesn't work for arbitrary CSV input.
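
For illustration, a hypothetical users.csv row and the JSON the program above writes for it. Note that id and age come out as JSON numbers; Method 1 would emit the strings "1" and "36":

id,name,email,age
1,Ada,ada@example.com,36

[
  {
    "id": 1,
    "name": "Ada",
    "email": "ada@example.com",
    "age": 36
  }
]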

Method 3: ChangeThisFile API (no parser, handles messy input)

If you're processing third-party CSV exports — varying delimiters, encodings, BOM, inconsistent rows — the API absorbs the variability. Get a free API key (1,000 conversions/month, no card).

package main

import (
    "bytes"
    "fmt"
    "io"
    "mime/multipart"
    "net/http"
    "os"
)

const apiKey = "ctf_sk_your_key_here"

func csvToJSON(inPath, outPath string) error {
    body := &bytes.Buffer{}
    w := multipart.NewWriter(body)

    f, err := os.Open(inPath)
    if err != nil {
        return fmt.Errorf("open: %w", err)
    }
    defer f.Close()

    fw, err := w.CreateFormFile("file", "input.csv")
    if err != nil {
        return err
    }
    if _, err := io.Copy(fw, f); err != nil {
        return err
    }
    _ = w.WriteField("source", "csv")
    _ = w.WriteField("target", "json")
    if err := w.Close(); err != nil {
        return err
    }

    req, err := http.NewRequest("POST", "https://changethisfile.com/v1/convert", body)
    if err != nil {
        return err
    }
    req.Header.Set("Authorization", "Bearer "+apiKey)
    req.Header.Set("Content-Type", w.FormDataContentType())

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return fmt.Errorf("http: %w", err)
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        msg, _ := io.ReadAll(resp.Body)
        return fmt.Errorf("api %d: %s", resp.StatusCode, msg)
    }

    out, err := os.Create(outPath)
    if err != nil {
        return err
    }
    defer out.Close()
    _, err = io.Copy(out, resp.Body)
    return err
}

func main() {
    if err := csvToJSON("messy_export.csv", "clean.json"); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
}

The API auto-detects delimiters (comma, semicolon, tab, pipe), normalizes encodings (UTF-8, Windows-1252, Latin-1), and strips BOM. Output is a JSON array of objects.

When to use each

Approach              | Best for                                          | Tradeoff
encoding/csv (stdlib) | Most projects — no deps, full control             | Manual type coercion, no struct tags
gocarina/gocsv        | Known schema, want struct tags + types            | Third-party dep, schema must be defined
ChangeThisFile API    | Unknown input, edge cases, no dep on encoding/csv | Network call, per-call cost above free tier

CLI alternatives: csvkit, jq, miller

Outside Go, several CLIs do CSV-to-JSON cleanly. Use them when you don't need a long-running Go binary:

# csvkit (Python-based, but the canonical CLI)
pip install csvkit
csvjson users.csv > users.json

# miller (single binary, fast for big files)
mlr --c2j cat users.csv > users.json

# jq alone is awkward for CSV; use mlr or csvkit instead.

From Go, you can shell out via os/exec:

cmd := exec.Command("mlr", "--c2j", "cat", "users.csv")
out, err := cmd.Output() // out holds the JSON on success; err covers a missing binary or non-zero exit

For one-off conversions in a script, this is the shortest path. For a long-running service, use the stdlib approach.

Production tips

  • Strip BOM before parsing Excel exports. Read the first three bytes; if they're 0xEF 0xBB 0xBF, advance the reader past them (the bufio sketch in Method 1 does exactly this). Otherwise the first header name silently gets a hidden prefix.
  • Use ReuseRecord = true for hot paths. csv.Reader allocates a fresh []string per row by default; setting r.ReuseRecord = true reuses the slice and is noticeably faster on large files. The returned slice is only valid until the next Read, so copy it if you keep it across iterations.
  • Stream to JSONL for huge files. Building one big JSON array buffers every row in memory. For multi-GB CSVs, write each row as a JSON object on its own line (JSONL / NDJSON) and stream both ends; see the sketch after this list.
  • Validate row width. CSV doesn't enforce field counts. With FieldsPerRecord = -1, check len(record) == len(headers) per row, or indexing record by header position will panic on short rows.
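
A minimal streaming sketch combining the last two tips, using the same imports as Method 1 (csvToJSONL is a name invented here). Memory use stays constant regardless of file size, since json.Encoder.Encode appends a newline after each value and NDJSON comes for free:

func csvToJSONL(in io.Reader, out io.Writer) error {
    r := csv.NewReader(in)
    r.FieldsPerRecord = -1 // tolerate ragged rows; we validate widths below
    headers, err := r.Read()
    if err != nil {
        return fmt.Errorf("read header: %w", err)
    }
    enc := json.NewEncoder(out) // each Encode writes one object on its own line
    for {
        record, err := r.Read()
        if err == io.EOF {
            return nil
        }
        if err != nil {
            return fmt.Errorf("read row: %w", err)
        }
        if len(record) != len(headers) {
            return fmt.Errorf("row has %d fields, want %d", len(record), len(headers))
        }
        row := make(map[string]string, len(headers))
        for i, h := range headers {
            row[h] = record[i]
        }
        if err := enc.Encode(row); err != nil {
            return err
        }
    }
}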

For most Go services, the standard library is the right answer — no dependencies, no surprises. Add gocsv when you have a stable schema. Use the API when you don't want to handle the input zoo. Free tier covers 1,000 conversions/month at 25MB max per file.