First of all, to get the idea of what I am trying to achieve here, please have a look at my idea thread, entitled Idea For Using A Cron Job To Grab Daily Travian Map.sql Updates. This gives a prelude to the project, what it is for, what it is supposed to do, etc.
Now, what I need is specific advice on setting up the MySQL database(s) to implement this idea. There is a useful FAQ on the help.travian.com website about how to make use of the map.sql data, which also gives an example of the CREATE TABLE statement that matches the data in the map.sql files. The problem with this FAQ, and the reason for my making this thread, is that the information only covers the situation where the user is playing on a single Travian server. As you will know if you have read my idea thread, I am planning to do this for each server that my clan members are playing on.
So, here is the question. Do I need a new MySQL database (knowing that Xisto provides 99 of them on a hosting account) for each Travian server that I will be downloading map.sql files for? Should I instead create multiple tables in a single MySQL database on my Xisto account, one table for each Travian server? Or, third option, should I put the data for all the Travian servers I'm downloading map.sql files for into a single table in a single database, and add a new field on the fly to identify which server each row came from?
Also, I notice that the example CREATE TABLE statement on the FAQ page does not include a timestamp field, so how will I track the daily changes?
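For what it's worth, one way to cover both the per-server question and the timestamp question at once (the third option above) is to extend the FAQ's table with two extra columns. This is only a sketch: the base column list below is my recollection of the usual map.sql layout and should be checked against the actual CREATE TABLE example in the FAQ.

```sql
-- Sketch only: base columns assumed from the usual map.sql layout; verify
-- against the FAQ's CREATE TABLE example before using.
CREATE TABLE x_world (
  id INT UNSIGNED NOT NULL,          -- field id from map.sql
  x SMALLINT NOT NULL,               -- village x coordinate
  y SMALLINT NOT NULL,               -- village y coordinate
  tid TINYINT UNSIGNED NOT NULL,     -- tribe id
  vid INT UNSIGNED NOT NULL,         -- village id
  village VARCHAR(64) NOT NULL,
  uid INT NOT NULL,                  -- player id
  player VARCHAR(64) NOT NULL,
  aid INT UNSIGNED NOT NULL,         -- alliance id
  alliance VARCHAR(16) NOT NULL,
  population SMALLINT UNSIGNED NOT NULL,
  -- extra columns, NOT in the FAQ schema:
  server VARCHAR(32) NOT NULL,       -- which Travian server this row is from
  snapshot_date DATE NOT NULL,       -- which daily map.sql this row came from
  PRIMARY KEY (server, snapshot_date, id)
);
```

The downside is that the raw INSERT statements in map.sql don't supply those last two values, so you would have to rewrite each line (or load into a staging table and copy across) rather than piping the file straight into mysql.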
Obviously, the information that I am looking at storing is going to result in a huge database, however it is sliced up: one huge table in one database, multiple tables in one database, or one table each in multiple databases. I may even need to look at multiple tables in multiple databases. But the fact remains that the amount of data I'm gathering is going to be large, and might even be beyond the scope of a free hosting account. They even mention this on the FAQ page: "To be more explicit: Don't even consider using a free hoster if you really consider writing this mapping tool." So maybe I am biting off more than I can chew; I'd love to get some input on this. I am not planning on building yet another mapping tool, as there are plenty of those around. I am not planning to build a world analyzer, or a signature image generator, as there are plenty of those too. What I want my website to be is elements of all of these and more, taking advantage of the data provided in the map.sql files. I should be able to query the data stored in the MySQL database(s) of my hosting account and display the results as a single instance map in response to a specific query. I should be able to query and present information about a specific alliance, player, or village. And I should be able to create auto-updating signature images based on the data in the MySQL databases, but only for registered members of my clan.
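As a concrete illustration of the kind of query involved, listing a player's villages might look something like the following. The column names here are assumptions based on the usual map.sql layout, not taken from the FAQ itself.

```sql
-- Hypothetical: assumes an x_world-style table with player, village,
-- x, y and population columns, as in the typical map.sql dump.
SELECT village, x, y, population
FROM x_world
WHERE player = 'SomePlayer'
ORDER BY population DESC;
```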
Again, any help or advice with specifics of how best to implement this would really be appreciated.
Personally I would go with one database per server and then one table per update. Nice and simple. As for the timestamp, you can manually create some SQL for that and fill the field with the current server timestamp. I've posted in your other thread too.
Thank you for your advice, shadowx. I have further questions though. If, as you suggest, I were to have one table per update, I would need to mass edit the map.sql files each time I update, wouldn't I? The file contains a whole load of INSERT statements, in the form...
INSERT INTO `x_world` VALUES (data1,data2,data3,...);
...and therefore they would automatically be inserted into a table called x_world, unless I altered the map.sql file before running the script. I guess I would also need a new CREATE TABLE instance for each table, right? Please excuse my n00b questions, I have not done much with SQL before. Is there a simple way to do this non-interactively, so that it could be called from the CRON job that grabs the latest update?
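For the non-interactive part, a cron-friendly sketch might look like the following: rewrite the table name in the dump with sed so each day's snapshot gets its own table, then pipe the result into mysql. Everything here is a placeholder (the dump is faked locally, and the wget/mysql steps are shown as comments), and it assumes the file only ever references x_world, as in the FAQ's example.

```shell
#!/bin/sh
# Sketch of the daily rename step (placeholders throughout).
# In the real cron script you would first fetch the dump, e.g.:
#   wget -q -O /tmp/map.sql "http://s1.travian.com/map.sql"   # placeholder URL
# Here we fake a one-line dump so the transformation can be shown end to end.
printf 'INSERT INTO `x_world` VALUES (1,24,-12,3,101,"MyVillage",7,"SomePlayer",1,"ALLY",97);\n' > /tmp/map.sql

STAMP=$(date +%Y%m%d)   # e.g. 20080705 - digits only, so valid in a table name

# Give each day's snapshot its own table by rewriting the table name.
sed "s/x_world/x_world_$STAMP/g" /tmp/map.sql > "/tmp/map_$STAMP.sql"
cat "/tmp/map_$STAMP.sql"

# You would also need a matching CREATE TABLE first; then load the data:
#   mysql -u dbuser -pSECRET travian_s1 < "/tmp/map_$STAMP.sql"   # placeholder credentials
```

The same sed call could rewrite a template CREATE TABLE statement, so the cron script can create each day's table before loading into it.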
That's a fair point. I've just seen the example SQL, and this is what I would do: first do what I said above and get the data into the variable, then run a str_replace() on that variable to swap the "INSERT table_x" for "INSERT table_timestamp" or whatever you wanted. E.g.:
$data; // this is our data variable that contains the SQL file we got from my first lot of code
$data = str_replace("table_x", "table_20080705", $data); // search for "table_x" and replace it with a table name of your choice (note the YYYYMMDD format - slashes, as in "5/07/08", aren't valid in a table name)
Of course it all depends on exactly what the name of the table is within their SQL file, but you get the idea.
And cron jobs can't really help with this part; all cron does is run a script. So first you have to write the script, then tell cron where it is and when to run it. It's basically like the Task Scheduler in Windows.
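For completeness, the crontab entry itself would look something like this (added with `crontab -e`; the paths are placeholders):

```shell
# Run the map-grabbing script every day at 04:00, logging its output.
0 4 * * * /home/user/scripts/grab_map.sh >> /home/user/logs/grab_map.log 2>&1
```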
I think I understand where you are coming from with this idea, and I would agree with shadowx that you need to have separate servers for each instance of the system you want to update.
Setting up multiple servers sounds fairly easy; I have been working on doing the same thing for a backup of my work's data onto an external disk. However, it sounds complicated from the official manual, and I haven't really found any good places that explain all the syntax for the my.ini (on Windows) or my.cnf (Unix/Linux) file.
I would highly recommend the reference manual from O'Reilly, MySQL in a Nutshell: so far I have gradually been working my way through it, and it seems concise and reasonably easy to follow.
Here's a web reference that may be of some help.
However, after an extensive search, and a final return to the book mentioned above, I came across this command:
mysqld_multi --example
This will return a useful sample of how your my.ini (or my.cnf) file should look. One thing to remember with this file is that you can't put in any comments - you will need to put those into another file elsewhere!
Your next problem is the files that you want to import.
I would actually recommend setting up your servers to act as slaves to the primary server on the Travian website. All you will then need to do is link into their server logs (which may or may not be public!) and then they will automagically do the required updating etc.
You may even find that you could set your local system up as a cluster; in that case it looks as though there may be good info on the official MySQL site.
This isn't something I had thought of setting up for a while, but I may find myself doing so in the not-too-distant future!
Good luck with your programming.