Damn, I apologise for the noise now, but I did manage to run into one problem:
Subject: 日本語らららららららららららららららららららららららららららららららら ららららららららららららららららららららら
Subject: =?UTF-8?B?5pel5pys6Kqe44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ?=
 =?UTF-8?B?44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ?=
 =?UTF-8?B?44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ?=
 =?UTF-8?B?44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ44KJ?=
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Testing against:
if header :contains "Subject" "らららららららららららららららら" { fileinto "nihongo"; }
=> OK
if header :contains "Subject" "ららららららららららららららららら" { fileinto "nihongo"; }
=> NG
So it seems the longest word that can be tested is 16 UTF-8 chars (or 48 bytes?). As long as we use small words, it should be ok.
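
For what it's worth, here is a quick Python sketch of the arithmetic. This is only my guess at the mechanism, not the actual Sieve/Dovecot code: each ら is 3 bytes in UTF-8, so 16 characters is exactly 48 bytes, and if the decoded Subject were searched in pieces of at most 48 bytes rather than as one string, a 16-character needle could still land inside a single piece, while a 17-character needle never could:

# Sketch only: each ら is 3 bytes in UTF-8, so 16 chars == 48 bytes.
ra = "ら"
assert len(ra.encode("utf-8")) == 3
assert len((ra * 16).encode("utf-8")) == 48

# Hypothetical model (not the real implementation): search the decoded
# subject in pieces of at most 48 bytes instead of as a single string.
def contains_in_pieces(haystack, needle, piece_bytes=48):
    data = haystack.encode("utf-8")
    pieces = [data[i:i + piece_bytes] for i in range(0, len(data), piece_bytes)]
    return any(needle.encode("utf-8") in p for p in pieces)

subject = "日本語" + ra * 50   # a long run of ら, like the test Subject above
print(contains_in_pieces(subject, ra * 16))   # True  -> matches the "OK" case
print(contains_in_pieces(subject, ra * 17))   # False -> matches the "NG" case

The 48-byte figure is purely a guess on my part, but 16 x 3 bytes lining up so neatly is what makes me suspect a fixed-size buffer or word limit somewhere.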
Out of curiosity, can we increase this limit?
-- 
Jorgen Lundman       | lundman@lundman.net
Unix Administrator   | +81 (0)3 -5456-2687 ext 1017 (work)
Shibuya-ku, Tokyo    | +81 (0)90-5578-8500 (cell)
Japan                | +81 (0)3 -3375-1767 (home)