How to Read JSON Files in Python: A Step-by-Step Guide
Python's built-in json module makes reading JSON files simple and reliable. This guide covers every common scenario — local files, APIs, and error handling.
Reading a Local JSON File
The standard pattern for reading a JSON file in Python is: import json; with open("data.json", "r", encoding="utf-8") as f: data = json.load(f). The with statement ensures the file is properly closed even if an error occurs. Always specify encoding="utf-8" to handle Unicode correctly.
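The pattern above as a small, self-contained sketch (read_json is an illustrative helper name, not part of the json module):

```python
import json

def read_json(path):
    """Read a JSON file and return the parsed Python objects."""
    # encoding="utf-8" matches the encoding RFC 8259 specifies for JSON
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)
```

Depending on the file's top-level value, the result is a dict, list, str, number, bool, or None.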
json.load() (note: no "s") reads from a file object, while json.loads() parses a string. Internally, json.load(f) is equivalent to json.loads(f.read()): it reads the entire file into memory before parsing, so the two have essentially the same memory footprint. Prefer json.load() when reading files simply because it is shorter and handles the read for you.
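A quick check of that equivalence, using io.StringIO to stand in for a file object:

```python
import io
import json

text = '{"id": 7, "tags": ["a", "b"]}'

# json.load(fp) and json.loads(s) produce identical results;
# load() simply reads the file object for you first.
assert json.load(io.StringIO(text)) == json.loads(text)
```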
Reading JSON from an API Response
With the requests library (the de facto standard for HTTP in Python): import requests; response = requests.get(url); data = response.json(). The .json() method parses the response body with json.loads() and raises an exception if the body is not valid JSON. Note that it does not check the Content-Type header, so a non-JSON response only fails when parsing is attempted.
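A hedged sketch of that pattern (fetch_json and the 10-second timeout are illustrative choices; requests itself sets no default timeout):

```python
import requests  # third-party: pip install requests

def fetch_json(url):
    """GET a URL and return the parsed JSON body."""
    # Always set a timeout: requests will otherwise wait indefinitely.
    response = requests.get(url, timeout=10)
    return response.json()  # raises if the body is not valid JSON
```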
Using urllib from the standard library (no additional installation): import json, urllib.request; with urllib.request.urlopen(url) as response: data = json.load(response). This works for simple GET requests without authentication. Use requests for anything more complex.
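The same idea with only the standard library (fetch_json_stdlib is an illustrative name):

```python
import json
import urllib.request

def fetch_json_stdlib(url):
    """GET a URL with the standard library and parse the JSON body."""
    # urlopen returns a binary file-like object; json.load accepts it
    # directly, since json auto-detects UTF-8/16/32 in bytes input.
    with urllib.request.urlopen(url) as response:
        return json.load(response)
```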
Handling Large JSON Files
For JSON files that are too large to fit comfortably in memory, use the ijson library for streaming parsing: import ijson; with open("large.json", "rb") as f: for item in ijson.items(f, "items.item"): process(item). The "items.item" prefix selects each element of a top-level "items" array, so elements are parsed one at a time without loading the entire file.
For JSONL (JSON Lines) files — one JSON object per line — use standard file reading with per-line parsing: for line in f: data = json.loads(line.strip()). This is memory-efficient and faster than loading a large JSON array, which is why JSONL is preferred for data exports and log files.
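The per-line pattern as a small generator (read_jsonl is an illustrative name; only the standard library is used):

```python
import json

def read_jsonl(path):
    """Yield one parsed object per line of a JSON Lines file."""
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                yield json.loads(line)
```

Because it is a generator, each object can be processed and discarded before the next line is read.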
Error Handling When Reading JSON
Wrap the whole open-and-parse sequence in a try/except block when reading from untrusted sources. Catch json.JSONDecodeError for parse failures and FileNotFoundError for missing files: try: with open(path, "r", encoding="utf-8") as f: data = json.load(f) except json.JSONDecodeError as e: print(f"Invalid JSON: {e}") except FileNotFoundError: print("File not found"). Note that FileNotFoundError is raised by open(), not json.load(), so the open() call must sit inside the try block for the handler to catch it.
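A sketch of that error handling as a reusable helper (load_json_safe and the None-on-failure convention are illustrative choices):

```python
import json

def load_json_safe(path):
    """Return parsed JSON, or None if the file is missing or malformed."""
    try:
        # open() sits inside the try block so FileNotFoundError is caught too
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f)
    except FileNotFoundError:
        print(f"File not found: {path}")
    except json.JSONDecodeError as e:
        # e.lineno and e.msg pinpoint where and why parsing failed
        print(f"Invalid JSON in {path} at line {e.lineno}: {e.msg}")
    return None
```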
For API responses, also check the HTTP status before parsing: if response.status_code != 200: raise ValueError(f"API error: {response.status_code}"). Never assume an API response is valid JSON — error responses often return HTML or plain text even when the success response is JSON.
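One way to combine the status check with parsing (fetch_json_checked is an illustrative name; raise_for_status() is the requests built-in that raises on 4xx/5xx, and requests.exceptions.JSONDecodeError assumes requests 2.27+):

```python
import requests  # third-party: pip install requests

def fetch_json_checked(url):
    """GET a URL, fail loudly on HTTP errors, then parse the body as JSON."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()  # raises requests.HTTPError on 4xx/5xx
    try:
        return response.json()
    except requests.exceptions.JSONDecodeError:
        # Even a 200 response can carry HTML or plain text
        raise ValueError(f"Expected JSON from {url}, got: {response.text[:80]!r}")
```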
Try JSON Reader Free Online
No sign-up required. 100% client-side — your data never leaves your browser.
Frequently Asked Questions
What encoding should I use when opening a JSON file in Python?
Always use encoding="utf-8". JSON is specified to use UTF-8 encoding (RFC 8259 requires it for data transported over networks). Using the wrong encoding causes UnicodeDecodeError for any non-ASCII characters.
How do I read a JSON file into a pandas DataFrame?
Use pandas.read_json("file.json") for tabular JSON or pandas.json_normalize(data) for nested JSON. The read_json function handles common cases automatically.
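A brief sketch of both calls (the sample records are made up for illustration; io.StringIO wraps the literal string because read_json expects a path or file-like object):

```python
import io
import pandas as pd

# Tabular JSON (an array of flat records) maps directly to a DataFrame.
df = pd.read_json(io.StringIO('[{"name": "Ada", "age": 36}, {"name": "Grace", "age": 45}]'))

# Nested JSON flattens into dotted column names with json_normalize.
records = [{"name": "Ada", "address": {"city": "London"}}]
flat = pd.json_normalize(records)  # columns: "name", "address.city"
```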
Can Python read JSON directly from a compressed file?
Yes. For gzip-compressed JSON: import gzip, json; with gzip.open("data.json.gz", "rt", encoding="utf-8") as f: data = json.load(f). The "rt" mode reads as text (not binary) after decompression.
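As a small helper (read_json_gz is an illustrative name; both modules are in the standard library):

```python
import gzip
import json

def read_json_gz(path):
    """Read a gzip-compressed JSON file without decompressing it on disk."""
    # "rt" = read text: gzip decompresses, then decodes the bytes as UTF-8
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return json.load(f)
```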