Making HTTP Requests in Node.js: Mastering Fetch & Proxies

Nathan Reynolds

Last edited on May 3, 2025

Coding Tutorials

Getting Started with HTTP Requests in Node.js: The Fetch API and Proxies

The backbone of the modern web? It's largely built on HTTP requests. Your browser uses them constantly to grab the web pages you see, and behind the scenes, servers chatter using them too. Understanding how to make these requests programmatically is a fundamental skill for any developer working with web technologies.

This guide dives into one of the most straightforward ways to handle HTTP requests within Node.js: the Fetch API. We'll explore how to use it effectively, handle common scenarios, and even integrate proxies for more advanced use cases.

What Exactly is the Fetch API?

Think of the Fetch API as a modern, built-in toolkit for JavaScript environments (like your browser and Node.js) designed specifically for network requests. It provides a promise-based mechanism, meaning it handles asynchronous operations gracefully. When you make a request, Fetch returns a Promise that will eventually resolve with the server's response.

The beauty is its native availability – no extra libraries needed for basic requests if you're using a recent version of Node.js (specifically, version 18 or newer) or any modern web browser.

Using the Fetch API in Your Node.js Projects

Let's roll up our sleeves and see how Fetch works in practice.

Setting Up Your Environment

First things first: ensure your Node.js version is up to par. Fetch is built-in starting from Node.js v18. You can check your version by running node -v in your terminal. If you're running an older version, head over to the official Node.js website to update.

Next, let's create a small project. Make a new directory (e.g., my-fetch-project), navigate into it using your terminal, and run npm init -y to create a package.json file with default settings.

To enable the modern JavaScript module syntax (import/export), which we'll use throughout, add the "type": "module" entry inside the main curly braces {} of your package.json file, so it looks like this:

<code>{
  "name": "my-fetch-project",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "type": "module"
}</code>

With that set up, you're ready to start fetching!

The Core: The fetch() Function

The heart of the Fetch API is the global fetch() function. At its simplest, you provide it with the URL you want to request. It returns a Promise that resolves to a Response object. Since network requests take time, we use the .then() method to handle the response once it arrives.

Let's grab some product data from the DummyJSON products API as an example. Create an index.js file in your project directory.

In the code below, we request product data, parse the JSON response using the .json() method (which also returns a promise), and then log the title and price of each product.

<code>fetch('https://dummyjson.com/products')
  .then(response => response.json()) // Parse the response body as JSON
  .then(data => {
    console.log("Fetched Products:");
    data.products.forEach(product => {
      console.log(`- ${product.title} ($${product.price})`);
    });
  })
  .catch(error => console.error("Fetch error:", error)); // Basic error catching</code>

Run this using node index.js in your terminal. You should see output like this:

<code>Fetched Products:
- iPhone 9 ($549)
- iPhone X ($899)
- Samsung Universe 9 ($1249)</code>

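If you prefer async/await over .then() chains, the same request can be written as follows. This is a minimal equivalent sketch of the example above; the listProducts function name is just illustrative.

<code>// Equivalent of the example above, using async/await syntax
async function listProducts() {
  try {
    const response = await fetch('https://dummyjson.com/products');
    const data = await response.json(); // Parse the response body as JSON
    console.log("Fetched Products:");
    data.products.forEach(product => {
      console.log(`- ${product.title} ($${product.price})`);
    });
  } catch (error) {
    console.error("Fetch error:", error); // Basic error catching
  }
}

listProducts();</code>
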
You can also use fetch() to retrieve raw HTML content from web pages. However, Fetch itself doesn't include tools for parsing HTML. For that, you'd typically pair it with libraries like Cheerio or node-html-parser to extract specific data from the HTML structure.
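
For instance, here's a rough sketch of that approach with node-html-parser. It assumes you've installed the package (npm install node-html-parser), and the target URL and the title selector are only placeholders.

<code>import { parse } from 'node-html-parser';

// Fetch a page as raw HTML text, then parse it to pull out the <title> tag
fetch('https://example.com/')
  .then(response => response.text()) // Raw HTML as a string, not JSON
  .then(html => {
    const root = parse(html);
    console.log('Page title:', root.querySelector('title').text);
  })
  .catch(error => console.error('HTML fetch failed:', error.message));</code>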

Handling Responses and Errors (Beyond Network Failures)

What happens if you try fetching a URL that doesn't exist, maybe due to a typo?

<code>fetch('https://dummyjson.com/productss') // Intentional typo
  .then(response => response.json())
  .then(data => {
    console.log("Fetched Products:");
    data.products.forEach(product => {
      console.log(`- ${product.title} ($${product.price})`);
    });
  })
  .catch(error => console.error("Fetch error:", error));</code>

Running this might give you an error, but perhaps not the one you expect:

Fetch error: SyntaxError: Unexpected token '<', "

Why? The server likely returned an HTML "404 Not Found" page. The fetch() call itself didn't fail (it successfully received *a* response), but the attempt to parse the HTML 404 page as JSON using response.json() threw the error. The basic .catch() only triggers for network errors (like DNS resolution failures or connection timeouts), not for HTTP error statuses like 404 or 500.

To handle these HTTP-level errors gracefully, you need to check the response status. The Response object has properties for this. A handy one is response.ok, which is true if the HTTP status code is in the successful range (200-299) and false otherwise. You can also check response.status directly for the specific code (e.g., 200, 404, 503).

Here's an improved version that checks the status before attempting to parse JSON:

<code>fetch('https://dummyjson.com/productss') // Intentional typo
  .then(response => {
    if (!response.ok) {
      // Throw an error to be caught by .catch()
      throw new Error(`HTTP error! Status: ${response.status}`);
    }
    return response.json(); // Only parse if response is OK
  })
  .then(data => {
    console.log("Fetched Products:");
    data.products.forEach(product => {
      console.log(`- ${product.title} ($${product.price})`);
    });
  })
  .catch(error => {
    console.error("Fetch operation failed:", error.message);
  });</code>

Now, running this with the incorrect URL gives a much clearer message:

<code>Fetch operation failed: HTTP error! Status: 404</code>

This kind of status checking is crucial when building robust applications, especially things like web scrapers that might encounter broken links.
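
For example, here's a small sketch of a link checker built on that idea. The list of URLs is purely illustrative, and top-level await is available here because our package.json sets "type": "module".

<code>// Hypothetical example: probe a few URLs and report any that are broken
const urlsToCheck = [
  'https://dummyjson.com/products',
  'https://dummyjson.com/productss', // Broken on purpose
];

for (const url of urlsToCheck) {
  try {
    const response = await fetch(url); // Top-level await works in ES modules
    console.log(`${url} -> ${response.status} ${response.ok ? 'OK' : 'broken'}`);
  } catch (error) {
    console.log(`${url} -> network error (${error.message})`);
  }
}</code>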

Sending Data with POST Requests

So far, we've only used GET requests, which are primarily for retrieving data. But often, you need to send data *to* a server, for instance, to submit a form, create a new resource, or trigger an action. This is commonly done using POST requests (though other methods like PUT, PATCH, and DELETE exist for different purposes - see HTTP request methods).

Fetch handles these methods too. You just need to pass a second argument to fetch(): an options object where you configure the request details.

Let's try adding a new "todo" item using the DummyJSON todo API endpoint.

<code>const newTodo = {
  todo: 'Master Node.js Fetch API',
  completed: false,
  userId: 5, // Example user ID
};

fetch('https://dummyjson.com/todos/add', {
  method: 'POST', // Specify the HTTP method
  headers: {
    'Content-Type': 'application/json', // Indicate we're sending JSON data
  },
  body: JSON.stringify(newTodo), // Convert the JS object to a JSON string
})
  .then(response => {
    if (!response.ok) {
      throw new Error(`HTTP error! Status: ${response.status}`);
    }
    return response.json(); // Parse the server's response (often the created item)
  })
  .then(data => {
    console.log('Successfully added todo:', data);
  })
  .catch(error => {
    console.error('POST request failed:', error.message);
  });</code>

Key parts of the options object:

  • method: 'POST': Tells Fetch to use the POST method.

  • headers: { 'Content-Type': 'application/json' }: Informs the server that the data in the request body is formatted as JSON.

  • body: JSON.stringify(newTodo): The actual data being sent. It needs to be converted to a string (usually JSON) before sending.

When you run this, the DummyJSON server will respond (likely echoing back the item you sent, possibly with an added ID), demonstrating a successful POST request. While this mock API doesn't permanently store the data, the principle applies to real-world APIs.
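
If you end up POSTing JSON frequently, you might wrap this pattern in a small helper. Here's a minimal sketch; the postJson name is hypothetical, and it simply reuses the same DummyJSON endpoint.

<code>// Hypothetical helper that wraps the POST-JSON pattern shown above
async function postJson(url, payload) {
  const response = await fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  if (!response.ok) {
    throw new Error(`HTTP error! Status: ${response.status}`);
  }
  return response.json();
}

// Usage: the same request as above, now one line at the call site
postJson('https://dummyjson.com/todos/add', { todo: 'Refactor with a helper', completed: false, userId: 5 })
  .then(data => console.log('Successfully added todo:', data))
  .catch(error => console.error('POST request failed:', error.message));</code>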

Integrating Proxies with Fetch in Node.js

Sometimes, you need to route your HTTP requests through a proxy server. This is common in web scraping with Node.js to avoid IP blocks, or simply to mask your application's origin IP address for privacy or testing geo-specific content.

Proxies act as intermediaries: your request goes to the proxy, and the proxy forwards it to the target server, making the request appear to originate from the proxy's IP.

Here's a slight complication: the standard Fetch API built into Node.js (and browsers) doesn't have direct, built-in support for configuring HTTP/S proxies. Bummer.

But fear not, there's a common workaround using a couple of popular Node.js packages: node-fetch (a library that provided the Fetch API to Node.js before it was built in, and still offers a few extras, including the agent option we need here) and https-proxy-agent (which routes requests through an HTTP/HTTPS proxy).

First, install these dependencies:
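
<code>npm install node-fetch https-proxy-agent</code>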

Now, import them in your script:

<code>import fetch from 'node-fetch'; // Use node-fetch instead of the global fetch
import { HttpsProxyAgent } from 'https-proxy-agent';</code>

Next, you'll need your proxy connection details. For services like Evomi, this typically involves an endpoint address, port, and authentication credentials. For example, using Evomi's residential proxies might look something like http://your_username:your_password@rp.evomi.com:1000. (Evomi also offers datacenter, mobile, and static ISP proxies with different endpoints like dc.evomi.com or mp.evomi.com).

Important: Never hardcode sensitive credentials directly in your source code. Use environment variables instead. Set your proxy URL in your terminal before running the script:

<code># Example for Linux/macOS
export HTTPS_PROXY="http://your_username:your_password@rp.evomi.com:1000"

# Example for Windows (Command Prompt) - no quotes, or they become part of the value
set HTTPS_PROXY=http://your_username:your_password@rp.evomi.com:1000

# Example for Windows (PowerShell)
$env:HTTPS_PROXY="http://your_username:your_password@rp.evomi.com:1000"</code>

Now, you can create a proxy agent using this environment variable and pass it to the fetch call within the options object:

<code>const proxyUrl = process.env.HTTPS_PROXY;

if (!proxyUrl) {
  console.error("Error: HTTPS_PROXY environment variable not set.");
  process.exit(1); // Exit if the proxy URL isn't configured
}

const proxyAgent = new HttpsProxyAgent(proxyUrl);

// Let's fetch our IP address through the proxy to verify
fetch('https://api.ipify.org?format=json', { agent: proxyAgent })
  .then(response => {
    if (!response.ok) {
      throw new Error(`HTTP error! Status: ${response.status}`);
    }
    return response.json();
  })
  .then(data => {
    console.log('IP address seen by the server (via proxy):', data.ip);
  })
  .catch(error => {
    console.error('Fetch request via proxy failed:', error.message);
  });</code>

When you run this code, the request to api.ipify.org (a simple service that returns your public IP) will be routed through the proxy specified in your environment variable. The output should show the proxy's IP address, not your own. Using reliable proxy providers like Evomi, known for ethical sourcing and high-quality Swiss infrastructure, ensures smooth and dependable connections for these tasks.

Fetch API vs. Axios: A Quick Comparison

Another very popular library for making HTTP requests in the Node.js ecosystem is Axios. It also offers a promise-based interface and is widely used.

So, how do they stack up?

  • Native vs. Install: Fetch API is built into Node.js (v18+) and modern browsers, requiring no installation for basic use. Axios always needs to be installed (npm install axios).

  • JSON Handling: Fetch requires an explicit step (response.json()) to parse JSON responses. Axios automatically parses JSON responses, making the data directly accessible via response.data.

  • Error Handling: As discussed, native Fetch only rejects promises on network errors, not HTTP errors (like 404). Axios rejects promises for both network errors *and* unsuccessful HTTP statuses (4xx, 5xx), which some developers find more intuitive.

  • Features: Both cover most common use cases. Axios adds some built-in extras like automatic request/response transformation, interceptors, and client-side XSRF protection. For proxies, Axios has a proxy config option and also accepts a custom httpsAgent, much like the node-fetch workaround above (a short sketch follows the examples below).

Here's how the earlier GET and POST examples might look using Axios:

Axios GET Example:

<code>import axios from 'axios';

axios.get('https://dummyjson.com/products')
  .then(response => {
    // No need for response.json()! Data is directly in response.data
    console.log("Fetched Products (Axios):");
    response.data.products.forEach(product => {
      console.log(`- ${product.title} ($${product.price})`);
    });
  })
  .catch(error => {
    // Axios .catch handles both network and HTTP errors (4xx/5xx)
    if (error.response) {
      // The request was made and the server responded with a status code
      // that falls out of the range of 2xx
      console.error("Axios HTTP Error:", error.response.status, error.response.data);
    } else if (error.request) {
      // The request was made but no response was received
      console.error("Axios Network Error:", error.request);
    } else {
      // Something happened in setting up the request that triggered an Error
      console.error('Axios Error:', error.message);
    }
  });</code>

Axios POST Example:

<code>import axios from 'axios';

const newTodo = {
  todo: 'Explore Axios features',
  completed: false,
  userId: 5,
};

// Axios automatically stringifies the object and sets Content-Type header for JSON
axios.post('https://dummyjson.com/todos/add', newTodo)
  .then(response => {
    console.log('Successfully added todo (Axios):', response.data);
  })
  .catch(error => {
    if (error.response) {
      console.error("Axios POST HTTP Error:", error.response.status, error.response.data);
    } else {
      console.error('Axios POST Error:', error.message);
    }
  });</code>
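
And for completeness, here's a rough sketch of routing an Axios request through a proxy, mirroring the node-fetch approach from earlier. It assumes the same HTTPS_PROXY environment variable is set; passing a custom httpsAgent and disabling Axios's own proxy handling is one common way to do this.

<code>import axios from 'axios';
import { HttpsProxyAgent } from 'https-proxy-agent';

const proxyUrl = process.env.HTTPS_PROXY; // Same environment variable as before
const proxyAgent = new HttpsProxyAgent(proxyUrl);

axios.get('https://api.ipify.org?format=json', {
  httpsAgent: proxyAgent, // Route the HTTPS request through the proxy agent
  proxy: false, // Disable Axios's built-in proxy handling so the agent is used
})
  .then(response => console.log('IP address seen by the server (via proxy, Axios):', response.data.ip))
  .catch(error => console.error('Axios proxy request failed:', error.message));</code>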

Which to choose? If you need a simple, built-in solution with no extra dependencies, Fetch is excellent. If you prefer automatic JSON parsing, more conventional error handling for HTTP statuses, or some of Axios's extra features, and don't mind adding a dependency, Axios is a solid choice. Both are powerful tools for network requests.

Wrapping Up

The Fetch API provides a powerful and standardized way to interact with web resources directly from your Node.js code. We've covered making basic GET requests, handling JSON responses, checking HTTP status codes for robust error handling, sending data using POST, and even routing requests through proxies using the node-fetch and https-proxy-agent workaround.

Understanding how to make HTTP requests is essential for interacting with APIs, building web scrapers, or connecting different backend services. Now that you have a handle on Fetch, try applying it!

Consider exploring some of the vast public APIs available online to practice fetching different kinds of data. Or, challenge yourself by building a simple scraper that fetches HTML and uses a library like node-html-parser to extract specific information.

Author

Nathan Reynolds

Web Scraping & Automation Specialist

About Author

Nathan specializes in web scraping techniques, automation tools, and data-driven decision-making. He helps businesses extract valuable insights from the web using ethical and efficient scraping methods powered by advanced proxies. His expertise covers overcoming anti-bot mechanisms, optimizing proxy rotation, and ensuring compliance with data privacy regulations.
