I'm not seeking a step-by-step tutorial on how to set this up. I'm genuinely curious about what's achievable and what's not.
Within my Angular 6 application, I am exploring server-side content loading so that search engine bots can index my pages properly. A lot of my data arrives via API requests, and I want that data to be crawlable as well. For instance, the snippet below retrieves product data for the current page from my ecommerce platform's API:
In My Angular Component
getProductById(product_id: string) {
  const data = { product_id: product_id };
  // HttpClient returns an observable that the component subscribes to
  return this.http.get<any>(api_url + '/getProductById', { params: data });
}
This function calls my Express API, which fetches the data from BigCommerce:
My Express API
getProductById = (req, res, next) => {
  const product_id = req.query.product_id;
  return bc_v3.get(`/catalog/products/${product_id}`)
    .then(data => {
      res.json(data); // send the BigCommerce data back to the client async
    })
    .catch(next); // forward errors to Express error-handling middleware
};
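For completeness, here is a self-contained sketch of the same handler pattern with `bc_v3` stubbed out, since the real BigCommerce client isn't shown here; the stub, its payload, and the fake `req`/`res` objects are all illustrative assumptions:

```javascript
// Stub standing in for the real BigCommerce v3 client (assumption).
const bc_v3 = {
  get: (path) => Promise.resolve({ id: 123, name: 'Sample Product', path }),
};

const getProductById = (req, res, next) => {
  const product_id = req.query.product_id;
  return bc_v3.get(`/catalog/products/${product_id}`)
    .then((data) => {
      // res.json() is what actually delivers the payload to the client;
      // returning `data` from the .then() alone sends nothing over HTTP.
      res.json(data);
    })
    .catch(next); // hand errors to Express error middleware
};

// Exercising the handler outside Express with minimal fake objects:
const req = { query: { product_id: '123' } };
const res = { json: (body) => console.log('sent:', body.name) };
getProductById(req, res, (err) => { throw err; });
```

The important detail is that the response is only sent when `res.json()` (or `res.send()`) is called inside the promise chain.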
Is It Feasible to Index API Data?
Can data that arrives asynchronously from API calls be crawled and indexed effectively by search engine bots?