- This is non-spec behavior, but it appears that most HTTP servers
implicitly support non-ASCII characters when parsing path components.
Extend http-parser to allow this.
- Fill out slots [128, 256) in normal_url_char[] with 1 so that these
high octets are accepted in path components (see the sketch below).
- Add unit test for paths that include such non-ASCII characters.
Fixes #37.
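A minimal, self-contained sketch of the table change, not the actual
http_parser.c code: a 256-entry lookup table whose high half is filled
with 1 so that any octet >= 0x80 counts as an acceptable path
character. The table name, the init_table() helper, and the sample path
are illustrative only.

    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative stand-in for http_parser's normal_url_char[]:
     * entries [0, 128) mark which ASCII bytes are acceptable in a
     * path, and the change above fills entries [128, 256) with 1 so
     * that high (non-ASCII) octets are accepted as well. */
    static uint8_t normal_url_char_sketch[256];

    static void init_table(void)
    {
      int i;
      /* Accept printable ASCII except the query/fragment delimiters;
       * the real table is more precise than this sketch. */
      for (i = 0x21; i < 0x7f; i++) {
        if (i != '?' && i != '#') normal_url_char_sketch[i] = 1;
      }
      /* The change in question: accept every high octet. */
      for (i = 128; i < 256; i++) normal_url_char_sketch[i] = 1;
    }

    int main(void)
    {
      /* UTF-8 encoded path "/café"; 0xc3 0xa9 are high octets. */
      const unsigned char path[] = "/caf\xc3\xa9";
      size_t n;

      init_table();
      for (n = 0; path[n] != '\0'; n++) {
        printf("byte 0x%02x accepted: %d\n",
               (unsigned)path[n], normal_url_char_sketch[path[n]]);
      }
      return 0;
    }

A full unit test would presumably feed a request line containing such
bytes through http_parser_execute and check what the on_url callback
reports; the sketch above only demonstrates the table idea.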
Check for overflow during the chunk trailer by removing an unnecessary
check in the PARSING_HEADER macro. This forces the parser to abort if
the chunk trailer contains more than HTTP_MAX_HEADER_SIZE bytes of
data.
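The effect described is that bytes consumed while parsing the chunk
trailer are bounded by the same HTTP_MAX_HEADER_SIZE budget as ordinary
headers. A simplified sketch of that accounting follows; the state
names, the PARSING_HEADER_SKETCH macro, and sketch_consume() are
illustrative, not the parser's actual internals.

    #include <stddef.h>
    #include <stdint.h>

    #define HTTP_MAX_HEADER_SIZE (80 * 1024)  /* http-parser's limit */

    /* Both ordinary headers and the chunk trailer count as "parsing a
     * header", so both are bounded by the same budget. */
    enum sketch_state {
      S_HEADER_FIELD, S_CHUNK_DATA, S_CHUNK_TRAILER, S_DONE
    };

    #define PARSING_HEADER_SKETCH(s) \
      ((s) == S_HEADER_FIELD || (s) == S_CHUNK_TRAILER)

    struct sketch_parser {
      enum sketch_state state;
      uint32_t nread;  /* bytes consumed in a header-like state */
    };

    /* Returns 0 on success, -1 once the budget is exceeded. */
    static int sketch_consume(struct sketch_parser *p, size_t len)
    {
      size_t i;
      for (i = 0; i < len; i++) {
        if (PARSING_HEADER_SKETCH(p->state)) {
          if (++p->nread > HTTP_MAX_HEADER_SIZE)
            return -1;  /* abort: trailer (or headers) too large */
        }
        /* ... the real parser would examine the byte here ... */
      }
      return 0;
    }

    int main(void)
    {
      struct sketch_parser p = { S_CHUNK_TRAILER, 0 };
      /* Overlong trailer data must be rejected. */
      return sketch_consume(&p, HTTP_MAX_HEADER_SIZE + 1) == -1 ? 0 : 1;
    }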
Without this change, it is possible to trigger an assertion failure by
continuing to call http_parser_execute after it has returned an error.
Specifically, the parser could be called with parser->state ==
s_chunk_size_almost_done and parser->flags & F_CHUNKED set. During that
call, F_CHUNKED could be cleared and an error hit, so the parser would
return with F_CHUNKED clear but parser->state still equal to
s_chunk_size_almost_done, causing an assertion failure on the next
call.
There are alternate solutions possible, including just saving all of
the fields (state included) on error.
I didn't add a test case because this is a bit annoying to test, but I
can add one if necessary.
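As a rough illustration of what such a test might look like (this is
not a test from test.c, and the malformed input below is only a
hypothetical stand-in for whatever actually reproduces the failure),
the pattern is: drive http_parser_execute into an error, then call it
again and check that it keeps reporting the error instead of tripping
the internal assertion. This assumes the settings-based http_parser
API.

    #include <assert.h>
    #include <stdio.h>
    #include <string.h>
    #include "http_parser.h"

    /* Hypothetical malformed chunked request; "zz" is not a valid
     * chunk size, so parsing should stop with an error. */
    static const char *bad_req =
      "POST / HTTP/1.1\r\n"
      "Transfer-Encoding: chunked\r\n"
      "\r\n"
      "zz\r\n";

    int main(void)
    {
      http_parser parser;
      http_parser_settings settings;
      size_t len = strlen(bad_req);
      size_t nparsed;

      memset(&settings, 0, sizeof(settings));
      http_parser_init(&parser, HTTP_REQUEST);

      /* First call stops early with an error. */
      nparsed = http_parser_execute(&parser, &settings, bad_req, len);
      assert(nparsed < len);

      /* Calling again after the error must still report an error;
       * the point of the fix is that this second call no longer
       * trips an internal assertion. */
      nparsed = http_parser_execute(&parser, &settings, bad_req, len);
      assert(nparsed < len);

      printf("parser kept reporting the error\n");
      return 0;
    }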
acceptable_header[x] is always assigned to a variable of type char, so
the 'unsigned' is unnecessary.
The other arrays can be of type int8_t/uint8_t to save space.
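For illustration only (these are not the actual declarations from
http_parser.c): narrowing a 256-entry lookup table's element type from
int to uint8_t cuts it from four bytes per slot to one on typical
platforms.

    #include <stdint.h>
    #include <stdio.h>

    /* Small flag/translation values fit in one byte per entry. */
    static const int     wide_table[256]   = { 0 };
    static const uint8_t narrow_table[256] = { 0 };

    int main(void)
    {
      printf("int table:     %zu bytes\n", sizeof(wide_table));   /* typically 1024 */
      printf("uint8_t table: %zu bytes\n", sizeof(narrow_table)); /* 256 */
      return 0;
    }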
Yay valgrind testing
I don't believe this actually mattered, because state was initialized
correctly and flags would be set to 0 almost immediately anyway.