For tokenizing a string, standard Lua recommends using regexes.
The following example splits words:
   local example = "split these words"  -- illustrative input string
   for i in string.gmatch(example, "%S+") do
       print(i)
   end
This is a little bit overkill for simply splitting words. This patch
adds a tokenize function which is quick and does not use regexes.
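
A minimal usage sketch, assuming the new function is exposed to Lua as
core.tokenize(str, separators) and returns an array-like table of
tokens (the exact name and signature here are assumptions for
illustration):

   -- split on spaces and commas without any pattern matching;
   -- each character in the second argument acts as a separator
   local tokens = core.tokenize("split, these words", ", ")
   for i, tok in ipairs(tokens) do
       print(i, tok)
   end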