What makes a good REST API?

Opinionated best practices for building user-friendly and robust REST APIs.

Today, anyone with basic programming skills can build an API. Frameworks like FastAPI, which provide an intuitive interface and are well documented, make it really easy. But what does it take to ship and maintain a robust REST API that other developers love using, that always works as expected and scales well?

This article offers an opinionated overview of REST API best practices covering:

  • API design
  • OpenAPI
  • Validation
  • Rate limiting
  • Asynchronous processing
  • Monitoring

I won’t go into too much detail in each section; there are plenty of dedicated resources on each topic for those who want to go deeper.

API design: Follow best practices

A good API has a well thought out and logical design of endpoints, returns data in a manner that is easy to consume, and provides detailed feedback in case of errors.

There are established best practices for designing API endpoints. These typically include:

  • Use plural nouns for resource collections in URIs, e.g. /v1/posts/123 rather than /v1/post/123
  • Avoid verbs in endpoint URIs; use an appropriate HTTP method instead
  • Keep all characters in the URI lowercase, using hyphens to separate words
  • Logically nest endpoints to show relationships. For example, a comment on a blog post should be /v1/posts/{postId}/comments/{commentId}.
  • Include a version in the endpoint URIs, like /v1/posts, to allow for future updates without breaking existing clients
  • Ensure that naming conventions, response formats, and behavior are consistent across all API endpoints
  • Use appropriate HTTP status codes in responses, e.g. 201 Created if a new resource was created or 403 Forbidden for an authorization error
  • Allow users to filter, sort and paginate through datasets using query parameters, and avoid returning excessively large API responses
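
As a rough sketch of several of the points above, the FastAPI routes below use a versioned, plural, lowercase URI scheme, nest comments under posts, return 201 Created on creation, and expose pagination and sorting through query parameters (the resource names and fields are made up for illustration):

from fastapi import FastAPI, Query, status

app = FastAPI()

# Versioned, lowercase, plural collection; pagination and sorting via query parameters
@app.get("/v1/posts")
async def list_posts(
    page: int = Query(1, ge=1),
    page_size: int = Query(20, ge=1, le=100),
    sort: str = "created_at",
):
    return {"items": [], "page": page, "page_size": page_size}

# POST to the collection (no verb in the URI); returns 201 Created
@app.post("/v1/posts", status_code=status.HTTP_201_CREATED)
async def create_post(payload: dict):
    return {"id": 123, **payload}

# Nested path expresses the relationship between posts and comments
@app.get("/v1/posts/{post_id}/comments/{comment_id}")
async def get_comment(post_id: int, comment_id: int):
    return {"id": comment_id, "post_id": post_id}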

OpenAPI: Auto-generate documentation and SDKs

A good API has a complete OpenAPI specification that acts as a contract between the API provider and consumers. It is also used to create comprehensive documentation and SDKs.

Many web frameworks offer support for auto-generating an OpenAPI specification from endpoint definitions in the API’s codebase, either out of the box or via third-party libraries. You can use annotations to add additional details such as descriptions and examples. Keeping a single source of truth through a tight integration with the web framework helps keep the specification up-to-date and version-controlled alongside the codebase.
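
In FastAPI, for instance, the specification is generated from the endpoint definitions themselves, and metadata such as summaries, descriptions and examples can be attached via annotations. A minimal sketch (the model and endpoint names are illustrative; pydantic v2 style):

from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI(title="Blog API", version="1.0.0")

class Post(BaseModel):
    title: str = Field(description="Title of the blog post", examples=["Hello world"])
    body: str = Field(description="Content of the blog post")

@app.post(
    "/v1/posts",
    status_code=201,
    summary="Create a blog post",
    description="Creates a new blog post and returns the created resource.",
    tags=["posts"],
)
async def create_post(post: Post) -> Post:
    return post

# FastAPI serves the generated specification at /openapi.json and
# interactive documentation at /docs and /redoc.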

There are many tools that use the OpenAPI specification to:

  • Generate interactive documentation
  • Generate client libraries / SDKs in various languages
  • Generate mock servers for testing
  • Automatically test your API endpoints against the specification
  • And more …

Validation

A good API meticulously validates user input and provides detailed and structured error messages when invalid input is received.

As a fundamental task, input validation is well supported by many web frameworks. FastAPI, for example, uses pydantic for this purpose and automatically generates responses with well-structured details about validation errors (see the example below).

{
  "detail": [
    {
      "type": "missing",
      "loc": ["body", "name"],
      "msg": "Field required",
      "input": { ... }
    }
  ]
}
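
For reference, the error above is what FastAPI returns for a request body that is missing a required field, given an endpoint along these lines (the model and field names are illustrative):

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class CreateUser(BaseModel):
    name: str  # required field; omitting it produces the "missing" error above

@app.post("/v1/users", status_code=201)
async def create_user(user: CreateUser):
    return user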

When creating validation rules, consider the following criteria:

  • Type: Ensure input has the correct data type.
  • Length: Check that the input has the expected length and define an upper limit (e.g. forbid extremely long string input, or arrays with millions of elements).
  • Format: Check string inputs against expected patterns to ensure they are formatted correctly (e.g. dates).
  • Range: Ensure that numeric values fall within the accepted range.
  • Business logic: Check input against your own business rules (do this last).
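
A minimal pydantic sketch covering these criteria could look like this (the booking model and its constraints are made up for illustration; pydantic v2 syntax):

from datetime import date
from pydantic import BaseModel, Field, model_validator

class CreateBooking(BaseModel):
    # Type and format: strings are parsed into dates and rejected if malformed
    start_date: date
    end_date: date
    # Length: bounded string input
    guest_name: str = Field(min_length=1, max_length=200)
    # Range: numeric value within the accepted bounds
    guests: int = Field(ge=1, le=10)

    # Business logic: checked last, after the basic constraints have passed
    @model_validator(mode="after")
    def check_dates(self):
        if self.end_date <= self.start_date:
            raise ValueError("end_date must be after start_date")
        return self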

Input validation should occur as early as possible and fail fast when handling invalid requests. Use an appropriate HTTP status code in the 4xx range for validation errors (e.g. 400 Bad Request).

Some web frameworks can also perform validation of API responses (e.g. FastAPI with pydantic), guaranteeing that the returned data always has the correct structure and type. Alternatively, the API endpoints should be covered by automated tests that ensure the response data is formatted correctly (according to the API’s specification) under all circumstances.
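
In FastAPI this is done by declaring a response model, which the returned data is validated and filtered against. A small sketch (the model is illustrative):

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PostOut(BaseModel):
    id: int
    title: str

# The returned dict is validated against PostOut; fields not declared on the
# model (like internal_notes) are stripped from the response.
@app.get("/v1/posts/{post_id}", response_model=PostOut)
async def get_post(post_id: int):
    return {"id": post_id, "title": "Hello world", "internal_notes": "not exposed"}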

Rate limiting

A good API employs rate limiting to prevent overloading and ensure quality of service for all clients.

Rate limiting can be implemented at different levels of your stack, including the network level, the web server, your application code, or a combination of these.

As a first layer of protection, most web servers can be easily configured to rate limit requests by IP address. For more flexibility and better control, you might want to implement (additional) rate limiting logic in the application layer. For example, you could apply dynamic limits based on a user’s pricing tier.

When rejecting requests due to rate limiting, it is good practice to use the 429 Too Many Requests response status code and include headers such as Retry-After to give clients feedback they can use to handle the rate limit gracefully.
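
A naive sketch of what this could look like at the application layer, with per-tier limits, a 429 response and a Retry-After header (the in-memory fixed-window counter is per-process and only for illustration; the tiers and limits are made up):

import time

from fastapi import FastAPI, HTTPException, Request

app = FastAPI()

# Requests per minute by pricing tier (illustrative values)
TIER_LIMITS = {"free": 60, "pro": 600}

# In-memory fixed-window counters: (client_id, minute) -> request count
_counters: dict[tuple[str, int], int] = {}

def check_rate_limit(client_id: str, tier: str) -> None:
    limit = TIER_LIMITS.get(tier, TIER_LIMITS["free"])
    window = int(time.time() // 60)  # current minute
    key = (client_id, window)
    _counters[key] = _counters.get(key, 0) + 1
    if _counters[key] > limit:
        retry_after = 60 - int(time.time() % 60)  # seconds until the window resets
        raise HTTPException(
            status_code=429,
            detail="Rate limit exceeded",
            headers={"Retry-After": str(retry_after)},
        )

@app.get("/v1/posts")
async def list_posts(request: Request):
    # In a real API, the client id and tier would come from authentication
    check_rate_limit(client_id=request.client.host, tier="free")
    return []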

Ensure your API documentation clearly states rate limiting rules and describes the rate limiting headers used.

Asynchronous processing

A good API performs longer-running tasks in the background using a task queue and worker system, and avoids keeping client connections open for extended periods.

This typically involves accepting a task from a client, adding it to the task queue, and then immediately sending a response back acknowledging that the task was received and is being processed. The 202 Accepted response status code is often used in this scenario.

This approach improves user experience and helps prevent timeouts as the client isn’t left waiting for a response. It also allows the server to handle multiple requests concurrently and remain responsive while processing long-running tasks.

APIs should also provide a way for clients to check the status of their tasks and fetch the result, if applicable. This could be implemented as separate endpoints which the client can poll at intervals.
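
A minimal sketch of this pattern using FastAPI’s BackgroundTasks and an in-memory job store (a production setup would typically use a dedicated task queue such as Celery or RQ and persistent storage; the report endpoints are made up for illustration):

import uuid

from fastapi import BackgroundTasks, FastAPI, HTTPException

app = FastAPI()

# In-memory job store; a real deployment would persist this (e.g. in Redis or a database)
jobs: dict[str, dict] = {}

def generate_report(job_id: str) -> None:
    # Placeholder for the actual long-running work
    jobs[job_id] = {"status": "completed", "result": {"rows": 42}}

# Accept the task and respond immediately with 202 Accepted
@app.post("/v1/reports", status_code=202)
async def create_report(background_tasks: BackgroundTasks):
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "processing"}
    background_tasks.add_task(generate_report, job_id)
    return {"id": job_id, "status": "processing", "status_url": f"/v1/reports/{job_id}"}

# Clients poll this endpoint to check the status and fetch the result
@app.get("/v1/reports/{job_id}")
async def get_report(job_id: str):
    job = jobs.get(job_id)
    if job is None:
        raise HTTPException(status_code=404, detail="Report not found")
    return job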

Monitoring

A good API is proactively monitored to ensure its consistent availability, reliability and performance.

Having monitoring tools in place allows you to detect and respond to issues quickly, ideally before they impact users. It also helps you understand how the API is being used and empowers you to make data-driven product and engineering decisions.

Monitoring should cover at least the following key aspects:

  • Traffic: Number of requests (per minute)
  • Errors: Failed requests due to client or server errors
  • Response times / latencies: How long it takes API endpoints to return a response
  • Uptime: Whether the API is available to consumers
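
As a rough illustration, a simple middleware sketch that captures these signals per request might look like this (it only logs; a monitoring tool would aggregate the data and alert on it):

import logging
import time

from fastapi import FastAPI, Request

logger = logging.getLogger("api.metrics")
app = FastAPI()

@app.middleware("http")
async def record_metrics(request: Request, call_next):
    start = time.perf_counter()
    response = await call_next(request)
    duration_ms = (time.perf_counter() - start) * 1000
    # Traffic, errors and response time per request; aggregated externally
    logger.info(
        "%s %s -> %d (%.1f ms)",
        request.method,
        request.url.path,
        response.status_code,
        duration_ms,
    )
    return response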

Recommended tools:

  • Apitally: Easy-to-use API monitoring for Python and Node.js