Re: Token filters and simplification
- Subject: Re: Token filters and simplification
- From: Asko Kauppi <askok@...>
- Date: Thu, 9 Nov 2006 22:30:15 +0200
I did. And it got me thinking about something else: oh boy, the
Brazilians will be able to reduce the core soooo much.. ;)
- object syntax (see the sketch below)
- anything marked "syntax sugar" in the reference ;)
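To make the first item concrete: Lua defines the call o:m(x) as sugar
for o.m(o, x), so a token filter that took this job over from the core
would have to perform exactly that rewrite at load time. A minimal
illustration in plain Lua (the names here are made up for the example;
no filter is involved):

local account = { balance = 0 }
function account.deposit(self, n) self.balance = self.balance + n end

account:deposit(10)            -- the sugared method call
account.deposit(account, 10)   -- what the sugar expands to
print(account.balance)         --> 20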
But actually, I don't think the telescope should be held that way.
The reason is that token filtering always adds to loading time, even
if the script never uses the modified features. That is bad, and the
Lua authors value loading speed highly, which is a good thing. So in
my opinion the cost would overshadow the benefits of a smaller core,
and smallness is not an end goal per se.
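To see where that cost comes from, here is a purely conceptual sketch
(not the actual token-filter patch API, which is not shown in this
thread): a filter sits between the lexer and the parser, so even a
filter that changes nothing is handed every token of every chunk at
load time. The work grows with the token count, not with how often the
new syntax is actually used.

-- Hypothetical stand-in for a token stream: a list of {type, value}
-- pairs. A real filter could rewrite tokens here; this one never
-- does, yet it still visits every one of them.
local function identity_filter(tokens)
  local out = {}
  for i, tok in ipairs(tokens) do
    out[i] = tok
  end
  return out
end

-- e.g. the tokens of: print("hi")
local tokens = {
  { type = "name",   value = "print" },
  { type = "(",      value = "("     },
  { type = "string", value = "hi"    },
  { type = ")",      value = ")"     },
}
print(#identity_filter(tokens))  --> 4 (all tokens visited, nothing changed)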
In fact, I would expect the token features that prove themselves to
be adopted into the language some day, once their use and syntax have
stabilized (and more than 50% of Lua users are indeed using such
token filters). Maybe that day will come?
-asko
Gavin Wraith wrote on 9.11.2006 at 16.50:
Apologies if this email is short on detail. Token
filters are being seen as a way of modifying Lua's
syntax without interfering with the Lua core. Has
anybody thought about looking at them through the
other end of the telescope? That is to say, what
elements of standard Lua can be replaced by a
token-filter built on top of an even smaller core?
--
Gavin Wraith (gavin@wra1th.plus.com)
Home page: http://www.wra1th.plus.com/