CodeEditorComponent and own tokenizer problems

I wrote a small tokenizer to colorize URLs (words starting with http:// get a blue color). It works fine in DEBUG builds, but in RELEASE builds the characters get mangled and repeated many times; here is an example below. I tried it with DirectWrite both enabled and disabled, with the same result in both cases.

mangled characters:
[attachment=1]code_editor_bug.png[/attachment]

same document but in debug build:
[attachment=0]code_editor_nobug_debug_build.png[/attachment]

And if you remove your highlighter code, does it all work ok…?

Yep, if I remove the tokenizer code it’s all good.

Does that mean my tokenizer is at fault?

Well, obviously I can’t say with 100% certainty that it’s your fault, but if things break when you add your tokeniser, then I’d say your tokeniser is probably the main suspect in the case!

Well yeah, but why the DEBUG/RELEASE difference? I wrote the tokenizer in a debug build and it’s all good there; in release it’s not. It takes a long time to link my app (I’ve got Boost and luabind added), so checking it in Release mode is impractical (it would take me a few days to verify a few simple functions). I’ll paste the code here; maybe someone will see an obvious mistake. For now I removed that code, so the URLs are black but still clickable (which was the most important part).

CtrlrLuaMethodTokeniser::CtrlrLuaMethodTokeniser() : scheme("http://"), inComment(false)
{
}

CtrlrLuaMethodTokeniser::~CtrlrLuaMethodTokeniser()
{
}

int CtrlrLuaMethodTokeniser::readNextToken (CodeDocument::Iterator &source)
{
	int result = tokenType_error;

	source.skipWhitespace();

	juce_wchar firstChar = source.peekNextChar();

	switch (firstChar)
	{
		case '-':
			if (source.peekNextChar() == '-')
			{
				result = tokenType_comment;
				source.skip();
			}
			break;

		case 'h':
			result = parseUrl(source);
			break;

		default:
			source.skip();
	}
	
	return (result);
}

StringArray CtrlrLuaMethodTokeniser::getTokenTypes()
{
	const char* const types[] =
	{
		"Error",
		"Comment",
		"URL",
		0
	};

	return StringArray (types);
}

Colour CtrlrLuaMethodTokeniser::getDefaultColour (int tokenType)
{
	switch (tokenType)
	{
		case 1:
			return (Colours::darkgrey);

		case 2:
			return (Colours::blue);
		default:
			break;
	}
	return (Colours::black);
}
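For what it's worth, keeping the token constants and the colour switch tied together avoids the magic numbers 1 and 2 drifting out of sync with the token list. A minimal sketch, assuming the enum values are tokenType_error = 0, tokenType_comment = 1, tokenType_url = 2 (colour names stand in for the juce::Colour objects, so this compiles without JUCE):

```cpp
#include <string>

// Hypothetical stand-ins for the tokeniser's token constants.
enum TokenType
{
    tokenType_error   = 0,
    tokenType_comment = 1,
    tokenType_url     = 2
};

// Map each token type to a colour name; unknown types fall back to
// black, mirroring the default branch of getDefaultColour() above.
std::string defaultColourFor (int tokenType)
{
    switch (tokenType)
    {
        case tokenType_comment: return "darkgrey";
        case tokenType_url:     return "blue";
        default:                return "black";
    }
}
```

Using the enum names in the switch means the compiler can warn if a new token type is added but never given a colour.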

int CtrlrLuaMethodTokeniser::parseUrl(CodeDocument::Iterator &source)
{
	if (isUrlScheme(source))
	{
		while (!CharacterFunctions::isWhitespace (source.nextChar())) {}

		return (tokenType_url);
	}

	return (tokenType_error);
}

const bool CtrlrLuaMethodTokeniser::isUrlScheme(CodeDocument::Iterator &source)
{
	for (int i = 0; i<7; i++)
	{
		if (source.nextChar() == scheme[i])
		{
			continue;
		}
		else
		{
			return (false);
		}
	}

	return (true);
}

while (!CharacterFunctions::isWhitespace (source.nextChar())) {}

Looks pretty dodgy to me. What if there is no whitespace?
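One way to bound that loop is to stop at the end of the input as well as at whitespace. A rough sketch, using a std::string and an index as a stand-in for CodeDocument::Iterator (these names are illustrative, not JUCE API):

```cpp
#include <cctype>
#include <cstddef>
#include <string>

// Advance `pos` past non-whitespace characters, but never past the end
// of `text`. The missing end-of-input check is what makes the original
// loop risky when a URL is the last thing in the document.
void skipToWhitespaceOrEnd (const std::string& text, std::size_t& pos)
{
    while (pos < text.size()
           && ! std::isspace (static_cast<unsigned char> (text[pos])))
        ++pos;
}
```

With the real iterator the equivalent guard would be checking the character returned by nextChar() before looping on it, so a URL at the very end of the document can't spin past the end.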

Well, that definitely wasn’t the case here, but all right: I simplified it as much as I could and removed all the code, and it’s the same problem. The debug build is all fine; the release build is all messed up.

Is there something else I should do in this method?

int CtrlrLuaMethodTokeniser::readNextToken (CodeDocument::Iterator &source)
{
	int result = tokenType_error;

	source.skip();
	
	return (result);
}
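In case it helps anyone reading later: the one hard requirement on readNextToken() is that every call advances the iterator by at least one character, otherwise the tokenising loop never terminates. A stand-in sketch of that contract over a plain string (hypothetical names, not JUCE API), equivalent to the simplified readNextToken() above:

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical token record: type plus the length consumed by one call.
struct Token { int type; std::size_t length; };

// Minimal "everything is an error token, one char at a time" pass.
// Because each step consumes exactly one character, the loop is
// guaranteed to finish, which is the property the real tokeniser
// must also preserve in every branch of its switch.
std::vector<Token> tokeniseAllAsError (const std::string& text)
{
    std::vector<Token> tokens;
    std::size_t pos = 0;
    while (pos < text.size())
    {
        tokens.push_back ({ 0 /* tokenType_error */, 1 });
        ++pos; // the source.skip() equivalent: always move forward
    }
    return tokens;
}
```

Any branch that can return without consuming a character (for example, an error path that only peeks) breaks that guarantee.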