MaxContentLength – How to solve related issues

Opster Team

Feb-20, Version: 1.7-8.0


Briefly, this error is not specific to Elasticsearch: it refers to the maximum size of the content that can be received in a single HTTP request. In Elasticsearch, the limit is controlled by the http.max_content_length setting, which defaults to 100mb. To resolve this issue, you can increase the maximum content length or reduce the size of the requests being sent.
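If legitimate requests genuinely need to exceed the limit, the setting can be raised in elasticsearch.yml. A minimal sketch (200mb is an arbitrary example value, not a recommendation):

```yaml
# elasticsearch.yml -- example only; tune to your workload.
# Larger values increase heap pressure per request, so keep this modest.
http.max_content_length: 200mb
```

A restart of the node is required for this static setting to take effect.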


This guide will help you check for common problems that cause the log "MaxContentLength" to appear. To understand the issues related to this log, read the explanation below about the following Elasticsearch concept: netty.

Log Context

The log "maxContentLength[" originates in the class NettyHttpServerTransport.java.
We extracted the following from the Elasticsearch source code for those seeking in-depth context:

        this.pipeliningMaxEvents = settings.getAsInt(SETTING_PIPELINING_MAX_EVENTS, DEFAULT_SETTING_PIPELINING_MAX_EVENTS);
        this.corsConfig = buildCorsConfig(settings);

        // validate max content length
        if (maxContentLength.bytes() > Integer.MAX_VALUE) {
            logger.warn("maxContentLength[" + maxContentLength + "] set to high value, resetting it to [100mb]");
            maxContentLength = new ByteSizeValue(100, ByteSizeUnit.MB);
        }
        this.maxContentLength = maxContentLength;

        logger.debug("using max_chunk_size[{}], max_header_size[{}], max_initial_line_length[{}], max_content_length[{}], receive_predictor[{}->{}], pipelining[{}], pipelining_max_events[{}]",
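The validation branch above explains the warning: because Netty tracks content length as an int, any configured value above Integer.MAX_VALUE bytes (about 2gb) is silently reset to 100mb. A standalone sketch of that clamp logic (hypothetical class and method names, not the actual Elasticsearch code):

```java
// Hypothetical sketch of the clamp performed in NettyHttpServerTransport:
// values above Integer.MAX_VALUE bytes are reset to the 100mb default.
public class MaxContentLengthClamp {
    static final long MB = 1024L * 1024L;

    // Returns the effective max content length in bytes.
    static long clamp(long requestedBytes) {
        if (requestedBytes > Integer.MAX_VALUE) {
            // Mirrors the warn-and-reset branch in the excerpt above.
            return 100 * MB;
        }
        return requestedBytes;
    }

    public static void main(String[] args) {
        System.out.println(clamp(3L * 1024 * MB)); // 3gb exceeds the int range -> prints 104857600 (100mb)
        System.out.println(clamp(200 * MB));       // 200mb fits -> prints 209715200
    }
}
```

In practice this means setting http.max_content_length above 2gb does not raise an error; the node starts with the 100mb default and logs the warning instead.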




 
