I don't think there is a limit if you do it correctly (e.g. one of the options launches more options).
Thanks a lot! Any idea which is more efficient (computation-wise), if it matters at all?
i.e., a single long event or many events?
My gut tells me that it won't matter, as the event has to be called either way. At best, it may (imperceptibly) improve loading times, because there would be fewer events to load.
That seems like it would apply to is_triggered_only events, but for triggered events it seems more efficient to run long chains of events from one single master event than to evaluate a whole host of events every 20 days. This is just my gut feeling, though.
Also, I'm working on a lot of events for my mod and am wondering whether it would be better to create some kind of master event:
random_list = { # only one event, or no event, fired every twenty game days, with varying 'days =' delays
    1 = { # roughly a 1-in-9 chance: fire one of my events
        character_event = { id = mymod.1 days = 5 } # hypothetical event id and delay
    }
    8 = { } # nothing happens this cycle
}
The advantage would be that all my events are is_triggered_only and handled by this master event, so only one or two of my events need to be calculated at any given time.
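For what it's worth, here is a rough sketch of how that master event might look in CK2-style script. The event ids, the hide_window flag, and the self-repeating 20-day loop are all my own assumptions, not tested:

character_event = {
    id = mymod.0 # hypothetical id for the master event
    hide_window = yes # no popup; this event only dispatches others
    is_triggered_only = yes # fired only by itself (or once at startup), never pulsed

    immediate = {
        random_list = {
            1 = { character_event = { id = mymod.1 days = 5 } } # one of my events
            8 = { } # nothing this cycle
        }
        character_event = { id = mymod.0 days = 20 } # re-fire to keep the loop going
    }
}

The trade-off is that the loop has to be kicked off once somehow (e.g. from a startup event), and if it ever fails to re-fire, the whole chain stops.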
Or would it be better to use the systems already in place (mean_time_to_happen and weights), which sound like they are evaluated for every event, every twenty days?
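For comparison, the in-place system looks something like this per event (again just a sketch; the id, the MTTH values, and the modifier are made up):

character_event = {
    id = mymod.2 # hypothetical id
    desc = EVTDESC_mymod_2 # placeholder localization key
    trigger = {
        ai = no # illustrative trigger only
    }
    mean_time_to_happen = {
        months = 12 # average wait once the trigger is true
        modifier = {
            factor = 0.5 # fires twice as fast for lunatics (illustrative)
            trait = lunatic
        }
    }
    option = { name = OK }
}

As I understand it, the engine evaluates the trigger and MTTH modifiers of every such event on its pulse, which is exactly the per-event cost the master-event approach tries to avoid.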
I could test this, if I were so motivated (I'm not), by taking vanilla events and reorganizing them this way to see whether I could
A) produce effects similar to the vanilla experience, and
B) observe a change in performance (increase, no change, or decrease).
The question, then, is whether it is more efficient for the game to calculate a single random_list or to calculate weights and mean_time_to_happen for every event in the game.