My knowledge of MySQL is very poor...
Since Run BASIC only supports SQLite, I'll stick to discussing that. Granted, since MySQL is a managed server, it has its advantages.
Enlighten me and let me know the limitations....
The biggest limitation of SQLite as implemented in RB is that it has a tendency to lock up when you have very complex queries that update. That is, when you have two cursors, one reading and one writing back to the same file. However, most databases simply do adds, changes, deletes, lookups, and reports.
I would assume that MySQL would have to sort by some algorithm.
SQLite and almost all other DBs do not sort in the traditional way. As you add records, it maintains a very efficient tree structure and inserts the keys, with pointers to the data, into the tree. It's very fast.
But sometimes I want a copy of the MySQL data for special statistical work, so I must download it, or run from RB and get the data from the SQL listing.
Not sure what stats you are talking about, but normally you create a query to retrieve the data and do the stats on the fly.
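For instance, here is a minimal sketch of doing stats on the fly with SQLite (using Python's built-in sqlite3 module; the table and column names here are made up for illustration):

```python
import sqlite3

# In-memory database for illustration; a real app would open a file.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE list (customerNum INTEGER, amount REAL)")
con.executemany("INSERT INTO list VALUES (?, ?)",
                [(1, 10.0), (1, 20.0), (2, 5.0)])

# The query itself computes the stats -- no separate export step needed.
row = con.execute(
    "SELECT COUNT(*), AVG(amount), MAX(amount) FROM list WHERE customerNum = 1"
).fetchone()
print(row)  # (2, 15.0, 20.0)
```

The point is that aggregates like COUNT, AVG, SUM, and MAX run inside the database engine, so there is no need to pull the whole table out just to crunch numbers.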
Maybe I can generate that in real time too .... hot d**n.. that would be "Fantastic"
OOPS... Just answered that above.
Adds, deletes, and changes to the SQL database entries need to be done in real time.
That's usually the standard way of maintaining and keeping a DB up to date.
Some accounting and emailing info to customers as their accounts need to be renewed or
updated.
Done all the time..
This SQL database will be dynamic and change as time goes by.
That's the primary reason most people use an SQL database.
But must always render the listings for any user.
Standard stuff.
I am not sure that is really a good thing, as repeated searches of the SQL database are CPU intensive.
If you are constantly hammering a DB in a particular sequence, then put an index on that data. For example, assume you had a database with the following file called list:
customerNum int(5),
entryDate date,
itemId varchar(10),
notes text
If you constantly need to get a customer's information in date and item sequence, then you could place an index on customerNum, entryDate, itemId.
And if you also constantly need item and customer information, you could place an index on itemId, customerNum.
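In SQLite the schema and both indexes above could be created like this (a sketch via Python's sqlite3 module; note SQLite treats int(5) and varchar(10) loosely, as INTEGER and TEXT affinity, and the index names are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE list (
        customerNum INTEGER,
        entryDate   DATE,
        itemId      VARCHAR(10),
        notes       TEXT
    );
    -- Index for customer + date + item lookups.
    CREATE INDEX idx_cust_date_item ON list (customerNum, entryDate, itemId);
    -- Index for item + customer lookups.
    CREATE INDEX idx_item_cust ON list (itemId, customerNum);
""")

# Confirm both indexes were created.
names = sorted(r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='index'"))
print(names)  # ['idx_cust_date_item', 'idx_item_cust']
```

Once those indexes exist, queries that filter or sort on those column prefixes use the index tree instead of scanning the whole file.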
Of course, you would also want to have tables for customers and items so you can join those files to the list for customer info and item info.
And Joins are simple.
I don't think linear searches are practical given the large number of SQL entries in the system.
The system would have to search, and any subsearch (or some way to break up the file, or an index of selections stored in another SQL file) would have to be updated on any sort, again in real time.
Well, let's assume you have a fairly small file of only a couple million records. If you knew that items belonged to categories, you could put a category in the item file. Then again, you may want a category file to break that out further. In any event, if you wanted, for example, category = 1 from the list, you could simply do a SQL query with the list and item joined as follows:
SELECT * FROM list JOIN item ON item.itemId = list.itemId WHERE item.category = 1 ORDER BY customerNum (or whatever)
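Here is that join running end to end in SQLite (a sketch with made-up sample rows, again via Python's sqlite3 module):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE item (itemId TEXT, category INTEGER);
    CREATE TABLE list (customerNum INTEGER, itemId TEXT);
    INSERT INTO item VALUES ('A1', 1), ('B2', 2);
    INSERT INTO list VALUES (7, 'A1'), (3, 'A1'), (5, 'B2');
""")

# Join list to item, keep only category 1, order by customer.
rows = con.execute("""
    SELECT list.customerNum, item.itemId
    FROM list JOIN item ON item.itemId = list.itemId
    WHERE item.category = 1
    ORDER BY list.customerNum
""").fetchall()
print(rows)  # [(3, 'A1'), (7, 'A1')]
```

Customer 5's row drops out because its item is in category 2; the database does all the filtering and ordering in one statement.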
Assuming many people will come (I hope) to my site and look through the data, being able to quickly find a listing is important, and so is not wasting the resource time and time again for the thousands of searches.
Either by creating a subset, so if someone searches, that creates a subset of the selection... as long as the SQL data doesn't change, that subsearch becomes available for the next user looking for the same thing someone else already searched for.
If you do a web search for a selection, you find a few results that are quickly listed, but you might go deeper, and I think Google adds to this as searchers add more criteria, probably with statistics on the most-searched-for websites. Or some SEO so that a website appears most often, and when you click it Google makes revenue. These are called featured websites or ad links...
This further refines the search and avoids going through the entire file.
You do not really do searches of large files if you index properly. For example, you should see an instant response to find all items for a certain customer, in certain item categories, on dates that fall on a Friday.
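That customer-plus-Friday lookup might look like this in SQLite (a sketch with invented rows; the index narrows the scan to one customer, and SQLite's strftime('%w', ...) gives the day of week, with 0 = Sunday, so Friday is '5'):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE list (customerNum INTEGER, entryDate DATE, itemId TEXT);
    CREATE INDEX idx_cust_date ON list (customerNum, entryDate);
    INSERT INTO list VALUES
        (1, '2013-03-01', 'A1'),   -- a Friday
        (1, '2013-03-02', 'B2'),   -- a Saturday
        (2, '2013-03-08', 'A1');   -- a Friday, different customer
""")

# The index finds customer 1's rows; strftime keeps only Fridays.
rows = con.execute("""
    SELECT entryDate, itemId FROM list
    WHERE customerNum = 1 AND strftime('%w', entryDate) = '5'
""").fetchall()
print(rows)  # [('2013-03-01', 'A1')]
```

The equality test on customerNum is what the index accelerates; the day-of-week filter is then applied to that small slice of rows rather than the whole file.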
I hate to waste the server's time continually sorting each time a new element is added, changed, or deleted.
Again, almost no time is needed; it only takes a couple of disk seeks (which are normally already cached anyway) to find the proper place to add the key in the tree.
I would expect the database to grow rather large over time, and of course it must purge as the selections come and go...
Normally there is an interval for purging old stuff from the file.
For example, you could create a query to delete all dates older than some cutoff, run once a day/week/month/year...
DELETE FROM list WHERE entryDate < '2013-03-01' or something like that.
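A minimal sketch of such a purge job in SQLite (sample rows and the cutoff date are made up; a real job would compute the cutoff from today's date):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE list (customerNum INTEGER, entryDate DATE)")
con.executemany("INSERT INTO list VALUES (?, ?)",
                [(1, '2012-11-30'), (2, '2013-02-15'), (3, '2013-06-01')])

# Purge everything older than the cutoff. ISO dates stored as text
# compare correctly with plain string comparison.
cur = con.execute("DELETE FROM list WHERE entryDate < '2013-03-01'")
con.commit()
print(cur.rowcount)  # 2 rows purged
remaining = con.execute("SELECT COUNT(*) FROM list").fetchone()[0]
print(remaining)     # 1 row left
```

This works because dates stored as 'YYYY-MM-DD' text sort chronologically, so the comparison in the WHERE clause does the right thing without any date parsing.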
Actually, there is a program that will generate the code to do adds, changes, deletes, queries, and sorts. You simply give it the db path/name and tell it what file to work with, in this case the list file. Then tell it you want to join the customer and item tables. It generates 90% of what you want. You simply tweak the code to show the listings exactly how you want them, and what exactly they can do lookups (drill-downs) on, sorts, etc.
Go to
www.kneware.com/rbp/index.html and click on the rbGen program and download it and run it.
Hope this helps.