Forums » Community Projects
Chocolateer, I must say I'm very conflicted on this. I understand what you're after, and I can see there being a use for it, but we have very different views on how it should be implemented.
What I think is the simplest way to handle it is, as I've already said, to note in the library's API documentation that authors should wait until after PLUGINS_LOADED to create a dialog. I'm not the most educated when it comes to what libraries should do, but I currently believe they should only do what they are designed to accomplish and not include extra logic to control loading and usage.
Barring this as an acceptable solution, the next thing I would personally do is create some sort of delayed-load handler that a plugin gets registered with to handle exactly what you describe. This is because I believe that LibStub should do what it does without adding any additional complexity to its own feature set.
The only other reason I can think of that LibStub should not be the place to implement a LIBRARIES_LOADED event is that it starts to blur the line between where libraries end and plugins begin, which is why I suggested the second solution.
That said, I wish there were others with an opinion on the matter to lend weight one way or the other. Regardless, I suppose I'll make the addition, though my reservations stand. Like you've said, it doesn't have to be used by a plugin if the author doesn't want it.
One thing I disagree with in your suggested implementation of LIBRARIES_LOADED is the use of isLoaded, and requiring libraries to check that the variable is true before actually embedding themselves.
The entire embedding/mixin paradigm is such that if a library uses embedding and gets updated, it should update the embedded function references that have already been created. In the case of a library that doesn't mesh well with embedding, for whatever reason, there is no need for updates to happen anyway. The flag is basically an over-complication of what LibStub is already supposed to do. Plugin authors should be capable of reading API docs and understanding what is expected of them, assuming the library author wrote decent documentation. And I suppose this is a large part of why I continue to think this change is unnecessary.
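To illustrate what I mean by the embedding paradigm keeping references current, here's a minimal sketch (not actual LibStub code; all names are made up for the example). The `install` function stands in for a newer minor version of the library arriving: it replaces the API functions and re-copies them into every table that previously embedded the library.

```lua
-- Sketch of the embed/upgrade paradigm: upgrading the library refreshes
-- the function references that were already embedded into plugins.
local lib = { embeds = {} }        -- embeds: set of tables we embedded into

local function install(minor)      -- simulates (re)loading a library version
    lib.minor = minor
    lib.GetVersion = function() return minor end  -- captures this version
    for target in pairs(lib.embeds) do
        target.GetVersion = lib.GetVersion        -- replace stale references
    end
end

function lib:Embed(target)
    target.GetVersion = lib.GetVersion
    self.embeds[target] = true     -- remember the target for future upgrades
end

install(1)
local plugin = {}
lib:Embed(plugin)
assert(plugin.GetVersion() == 1)
install(2)                         -- a newer minor version loads later
assert(plugin.GetVersion() == 2)   -- the embedded reference was updated
```

The point being: the plugin never has to re-embed or wait on a flag; the library's own upgrade path keeps the embedded functions current.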
Without the isLoaded flag, having the LIBRARIES_LOADED event is completely useless (registering LIBRARIES_LOADED would be identical to registering PLUGINS_LOADED, and the race condition I described earlier would still be present).
At first, I was unsure if that was the case, so I made the following plugins to test how events get handled:
-- Plugin Bravo --
Bravo = {}
function Bravo:PLUGINS_LOADED(event, data)
    console_print("Bravo sees PLUGINS_LOADED")
    ProcessEvent("BRAVO_LOADED")
end
RegisterEvent(Bravo, "PLUGINS_LOADED")

-- Plugin Delta --
Delta = {}
function Delta:PLUGINS_LOADED(event, data)
    console_print("Delta sees PLUGINS_LOADED")
    ProcessEvent("DELTA_LOADED")
end
RegisterEvent(Delta, "PLUGINS_LOADED")

-- Plugin Echo --
Echo = {}
function Echo:BRAVO_LOADED(event, data)
    console_print("BRAVO_LOADED event generated")
end
function Echo:DELTA_LOADED(event, data)
    console_print("DELTA_LOADED event generated")
end
RegisterEvent(Echo, "BRAVO_LOADED")
RegisterEvent(Echo, "DELTA_LOADED")

-- Console Log --
Bravo sees PLUGINS_LOADED
BRAVO_LOADED event generated
Delta sees PLUGINS_LOADED
DELTA_LOADED event generated
-- End Test --
So as you can see, LIBRARIES_LOADED can still be triggered before some plugins see PLUGINS_LOADED.
Now, I could have each individual library do basically the same thing I asked LibStub to do, but then each library would generate its own event (e.g. LIBRARY_A_LOADED, LIBRARY_B_LOADED). That gets to be complicated: plugins that use multiple libraries would have to register multiple events, and since the events could be generated in any order, the burden falls on the plugin to carry extra logic that waits for all of the events before proceeding with embedding.
And writing a delayed-load handler isn't as trivial as you make it sound. As I indicated earlier, my libraries have dependencies on other libraries, which dictates that the embedding be done in a particular order. I had a delayed-loading concept that was close to working, where each library had an initialization function that got triggered later, but it ended up not being feasible because the embedding would have been done in the wrong order (and it still was nowhere near as elegant as the isLoaded flag I proposed for LibStub).
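To make the isLoaded proposal concrete, here is a sketch of the idea against a simplified LibStub stand-in (the method names `OnLibrariesLoaded` and `FireLibrariesLoaded` are mine, invented for the example; they are not an existing API). Embedding callbacks queue up until the loader flips the flag, so all embedding happens in one well-defined phase after every file has loaded:

```lua
-- Simplified LibStub stand-in with the proposed isLoaded flag.
local LibStub = { libs = {}, isLoaded = false, pending = {} }

function LibStub:NewLibrary(name, minor)
    local lib = self.libs[name]
    if lib and lib.minor >= minor then return nil end  -- older copy: skip
    lib = lib or {}
    lib.minor = minor
    self.libs[name] = lib
    return lib
end

function LibStub:OnLibrariesLoaded(fn)       -- defer work until loading ends
    if self.isLoaded then fn() else table.insert(self.pending, fn) end
end

function LibStub:FireLibrariesLoaded()       -- called once by the loader
    self.isLoaded = true
    for _, fn in ipairs(self.pending) do fn() end
    self.pending = {}
end

-- A library registers itself but defers its embedding:
local MyLib = LibStub:NewLibrary("MyLib-1.0", 1)
function MyLib.Hello() return "hello" end

local plugin = {}
LibStub:OnLibrariesLoaded(function()
    plugin.Hello = MyLib.Hello               -- embed after everything loaded
end)

assert(plugin.Hello == nil)                  -- isLoaded is still false
LibStub:FireLibrariesLoaded()                -- the LIBRARIES_LOADED moment
assert(plugin.Hello() == "hello")
```

Because the callbacks run after every library file has been read, dependency-ordered embedding becomes the loader's problem rather than each plugin's.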
"The only other reason I can think of that LibStub should not be the place to implement LIBRARIES_LOADED event is that it starts to blur the lines a little where libraries end and the plugin begins"
The reason I proposed the LIBRARIES_LOADED event was to make those lines more clear-cut. What I am looking to accomplish is a means to ensure the following flow:
1) Load all libraries.
2) Each plugin embeds the libraries it uses.
3) Each plugin executes its main code.
Except what you're forgetting is that by the time the PLUGINS_LOADED event has fired, if your load sequence was constructed properly, all of the libraries will have been registered, embedded, and updated as necessary, and the plugins are in the clear to start doing what they need to do. There is no race condition. Stop making the libraries try to do something they don't need to. Leave the heavy lifting to the plugins: describe their dependencies and leave it to the plugin author to use them properly.
To further substantiate things, here's my own test. http://pastebin.com/KQQRH7ei
The files can be downloaded here: http://bespin.org/~draugath/vo/libtest.zip
In this test I created three dummy libraries, each with two functions that only print the major and minor version of the library at the time the functions were defined. I then created three dummy projects that instantiate the libraries in the exact same order, which could be said to simulate dependencies between the libraries, though order wasn't necessary for this test. Each dummy plugin executes all of the functions at load time and again after PLUGINS_LOADED. The second project increments the minor version of the second library, and the third project increments the minor version of the third library; otherwise they all ship the same minor versions. I then ran the test three times with the load order changed for each run, as indicated in the test data.
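The behavior the test exercises boils down to newest-copy-wins registration. A sketch of that mechanism (a stand-in `NewLibrary`, not the actual LibStub source; names are illustrative): the registry keeps the highest minor version no matter which project's copy happens to load first, so by PLUGINS_LOADED every plugin sees the newest definitions.

```lua
-- Newest-copy-wins registration, independent of load order.
local registry = {}

local function NewLibrary(major, minor)
    local entry = registry[major]
    if entry and entry.minor >= minor then
        return nil                 -- an equal-or-newer copy already loaded
    end
    entry = entry or {}
    entry.minor = minor
    registry[major] = entry
    return entry                   -- caller (re)defines functions on this
end

-- Two projects ship the same library; here the newer copy loads first.
for _, minor in ipairs({ 2, 1 }) do
    local lib = NewLibrary("Lib-1.0", minor)
    if lib then
        lib.Minor = function() return minor end
    end
end

assert(registry["Lib-1.0"].Minor() == 2)  -- newest definitions survive
```

Reversing the loop order gives the same result: the minor-version check means the older copy either never registers or gets overwritten.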
The bottom line is that there is no race condition, no need for a further event beyond PLUGINS_LOADED, and no need to complicate things any more than they were initially.
If you have a library that creates dialogs or other elements that don't work well with embedding or don't update easily, then just tell the plugin author to wait until after PLUGINS_LOADED to create them. I still don't see why these extra levels of complication are needed.
I want this to be a community effort, and I want it to be as simple as possible, but I'm at a loss when the only two people who have so far chimed in have two radically different views on what is necessary and how to approach it.
Because I still find the current state of plugins in Vendetta Online unsatisfying (different plugins concurrently loading multiple, possibly differing, versions of a library), I have taken a different approach.
While it is nice to have shared libraries, I don't believe they are a very good solution unless they are part of the main game. And even then, when the API of a library changes, it breaks every plugin that depends on a previous version of that library.
Instead, I propose shipping a plugin with all of its required libraries included. I know this is a little harder because only a subset of Lua is available in VO. It does work, however, if the library is contained in a single file, because that allows you to do the following:
local lib = dofile("somelib.lua")
And if you need it in multiple files:
myplugin = {}
myplugin.lib = dofile("somelib.lua")
Because it is tedious to restrict development to a single file when writing a library, I wrote a simple script that scans a Lua file in a depth-first manner and constructs a single Lua file. In addition, it changes the global namespace to a local one and returns it to, e.g., dofile. You can find this script here:
https://gist.github.com/fhirschmann/4755009
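What the bundled output boils down to is a library whose entire namespace is local and is returned to the caller. A sketch of the pattern (`somelib` and `double` are made-up names; the "file" is simulated with `loadstring`/`load` so the example is self-contained, where a plugin would use `dofile("somelib.lua")`):

```lua
-- The single-file library pattern: everything local, namespace returned.
local somelib_source = [[
    local somelib = {}
    function somelib.double(x)
        return x * 2
    end
    return somelib
]]

-- In a plugin this would be: local lib = dofile("somelib.lua")
local lib = (loadstring or load)(somelib_source)()

assert(lib.double(21) == 42)
assert(somelib == nil)   -- nothing leaked into the global environment
```

Because the library never touches globals, two plugins can bundle different versions of the same library without stepping on each other.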
OK. First, I have not read the entire thread, and I do not think I completely understand what I have read. That being said, here is my opinion:
main.lua: I am of two minds on this:
Mind 1) main.lua is the plugin, and all other files, if any, are auxiliary. Put all the code here you want.
Mind 2) main.lua could be converted to little more than a generic loader, to the point that every plugin could use the exact same copy of main.lua.
Example:
-- main.lua --
dofile("mypluginmain.lua")
f = loadstring(dofile_result)
f()
dofile_result, f = nil, nil
-- mypluginmain.lua --
declare("dofile_result", [[
... plugin code goes here ...
]])
End example.
I have not tested that code; it is just an example. The point is that a plugin could then be executed, or read like a file. Reading it like a file would allow you to use the actual code as documentation if you had an in-game text reader.
Enough on that point. Next point.
I am working on a plugin, and am curious how people here think my plugin should be structured.
The plugin is to be called something like "plugin file system", abbreviated "pfs". The purpose is to allow one or more plugins to add file mounts, such that each mount offers a function to open a file, which returns a "file" object, which in turn supports file:read, write, del, close, etc.
I would like to include several predefined mounts with this system: one that works with config.ini, another with SystemNotes, a third that uses dofile to read files, and perhaps one or more virtual mounts that use other mounts to offer new services.
As an example, a plugin might have an extensive set of help files that it might not wish to load until the user requests them. On user request it could open a file handle to the help file(s), display them, and, when done, send the handle to garbage collection.
Another example is using the SystemNotes mount to store files in dofile format, using the config.ini mount to store the pathnames, and then presenting a virtual mount that gives you full-time read-only access to those files.
On top of all this, I am considering including one or more primitive applications, like a file explorer, a text viewer, and perhaps even a text editor, mostly as example implementations, but also for debugging and to provide some minimum of useful functionality out of the box.
Currently I am looking at dividing this project into a large number of files: a file for the primitive applications, which loads pfs, which in turn loads the default mounts, each mount type being in its own file, with extensive documentation in separate dofile-formatted files, etc.
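As I picture the interface described above, it might look something like this sketch (every name here is hypothetical; the real pfs API may differ). A mount registers an `open` function; `pfs.open` routes a path like `"mem:/notes.txt"` to the right mount, which returns a file object supporting read/write/close:

```lua
-- Hypothetical sketch of a pfs-style mount registry.
local pfs = { mounts = {} }

function pfs.register_mount(name, mount)
    pfs.mounts[name] = mount
end

function pfs.open(path)                     -- path like "mem:/notes.txt"
    local mount_name, file_path = path:match("^(%w+):(.+)$")
    local mount = assert(pfs.mounts[mount_name], "no such mount")
    return mount.open(file_path)
end

-- An in-memory mount as the simplest possible backend; a config.ini or
-- SystemNotes mount would implement the same open/read/write/close shape.
local store = {}
pfs.register_mount("mem", {
    open = function(path)
        local file = { path = path }
        function file:write(data) store[self.path] = data end
        function file:read() return store[self.path] end
        function file:close() end
        return file
    end,
})

local f = pfs.open("mem:/notes.txt")
f:write("hello")
assert(f:read() == "hello")
f:close()
```

The value of the shared shape is that a plugin written against the file-object interface wouldn't care which backend the user picked.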
So does what you are working on in this thread have any bearing on what I am working on?
tinbot:
I think it does apply if you intend your plugin to be used by other plugins. Your first option is pretty much what I proposed; the script I provided does exactly that: pack all the code into a single Lua file. The others have proposed an approach involving complex dependency management, including shared libraries.
I don't mean to criticize your plugin idea, but keep in mind that Vendetta Online is sandboxed on purpose. A lot of people, myself included, don't want in-game plugins to have access to the filesystem.
meridian, draugath:
I would also like to note that hooking onto LIBRARIES_LOADED/PLUGINS_LOADED is problematic for transitive dependencies, i.e. Plugin A depends on Lib 2, which depends on Lib 1.
firsm:
This plugin is not intended to break the sandbox.
It is intended to provide a more standardized interface to the various available options and to make those options more accessible.
I also intend to use a recyclable key system, so that when files are deleted, all resources used by those files can be recycled.
I also wish to present as many options as possible, so that users can choose the best method of storage. I would rather a plugin not dump all its data in config.ini when that data only needs to be written while a character is online; the SystemNotes method would be better. The system is meant to encourage using the most appropriate kind of storage.
tinbot:
I am sorry, I thought you wanted to provide a method for accessing the real file system. Having a filesystem-like structure or a key-value store that uses the mission notes as backend sounds like a good idea.
This thread has a high probability of remaining relevant. I have nothing to add, but the devs insist on autolocking threads when they haven't had a reply in a while, so I'm replying.