Last modified: 2012-01-03 19:56:45 UTC
Database::factory includes a hardcoded list of database drivers: $canonicalDBTypes = array( 'mysql', 'postgres', 'sqlite', 'oracle', 'mssql', 'ibm_db2' ). I don't see how an extension could append to this list.
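A minimal sketch of the pattern being described (not the actual MediaWiki source; the helper name databaseClassForType is invented for illustration): the requested type is checked against a fixed whitelist before the driver class name is built, so an out-of-tree driver can never pass the check.

```php
<?php
// Illustrative only: a simplified stand-in for the whitelist check inside
// Database::factory. The function name is invented for this sketch.
function databaseClassForType( $dbType ) {
    // The hardcoded list from the report; extensions cannot append to it.
    $canonicalDBTypes = array(
        'mysql', 'postgres', 'sqlite', 'oracle', 'mssql', 'ibm_db2'
    );
    $dbType = strtolower( $dbType );
    if ( !in_array( $dbType, $canonicalDBTypes ) ) {
        return null; // unknown driver: the factory refuses to construct it
    }
    // Core maps the type to a class name such as DatabaseMysql.
    return 'Database' . ucfirst( $dbType );
}
```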
What's the use case for it - do you know a DB backend that we refused to add to the core? Note that database access is a very sensitive area where one core change can ruin everything, so core developers fiddling with it MUST be able to see the implications of these changes for every DB supported.
(In reply to comment #1)
> What's the use case for it - do you know a DB backend that we refused to add to
> the core? Note that database access is a very sensitive area where one core
> change can ruin everything, so core developers fiddling with it MUST be able to
> see the implications of these changes for every DB supported.

I had to do a core hack to make the extension "MSSLBackCompat" for someone using it with some Semantic* extension. They wanted to use the PHP libraries, not the Microsoft-written ones (those don't work on *nix, right?). I ended up having to put a value into $canonicalDBTypes to get this to work. Not the worst core hack, but the intention is still there.

I'd be interested to see what the bug author has in mind.
One use case is Extension:Offline, but that begs the question of why MediaWiki cannot read the .xml.bz2 dump files it generates. This will eventually become core functionality, one can hope. What do you think the best approach would be if I were to write a test suite for new database drivers?
(In reply to comment #3)
> One use case is Extension:Offline, but that begs the question of why MediaWiki
> cannot read the .xml.bz2 dump files it generates. This will eventually become
> core functionality, one can hope.

MediaWiki can import the file (unzipped, at least) via Special:Import. Compressed files can be imported using the importDump.php maintenance script.
Good point, Reedy. I am trying to eat my cake without opening the box... but I could not adapt the import script to do fast lookups of a partial data set, for example retrieving an article and the templates it depends on.
(In reply to comment #5)
> Good point, Reedy. I am trying to eat my cake without opening the box... but I
> could not adapt the import script to do fast lookups of a partial data set, for
> example retrieving an article and the templates it depends on.

Sure, but you've got a seemingly usable case. Many people won't have the room to store the dump, nor the room to load it into a database for it to be usable. I can see where you'd go about creating a faux database object and having it be read-only.

I'm surprised someone hasn't wanted to do something like this for NoSQL/MongoDB etc.
Essentially, a hook in factory() after $canonicalDBTypes is defined, passing the array by reference, should be enough, and possibly (?) with $dbType copied in for reference. Would the parameters be of any use as well?
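Sketched in PHP, under the assumption of a hook point like the one proposed above (the hook name 'DatabaseFactoryTypes' and the simplified wfRunHooks stand-in are hypothetical, not the committed patch):

```php
<?php
// Minimal stand-in for MediaWiki's hook dispatcher so the sketch is
// self-contained; real code would use the global $wgHooks and wfRunHooks().
$wgHooks = array();
function wfRunHooks( $name, $args ) {
    global $wgHooks;
    if ( isset( $wgHooks[$name] ) ) {
        foreach ( $wgHooks[$name] as $handler ) {
            call_user_func_array( $handler, $args );
        }
    }
    return true;
}

// An extension (e.g. Extension:Offline) registers a handler appending its
// driver type. 'DatabaseFactoryTypes' is a hypothetical hook name.
$wgHooks['DatabaseFactoryTypes'][] = function ( &$canonicalDBTypes, $dbType ) {
    $canonicalDBTypes[] = 'offline';
    return true;
};

// Inside factory(): run the hook right after the hardcoded list is defined,
// passing the array by reference and $dbType for context.
function resolveDBTypes( $dbType ) {
    $canonicalDBTypes = array(
        'mysql', 'postgres', 'sqlite', 'oracle', 'mssql', 'ibm_db2'
    );
    wfRunHooks( 'DatabaseFactoryTypes', array( &$canonicalDBTypes, $dbType ) );
    return $canonicalDBTypes;
}
```

Because the array element is a reference, the handler's modification is visible back in factory(), and the core list itself stays untouched for installs with no such extension.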
Created attachment 9785 [details]
proposed change to Database::factory

These are the semantics I would like to see. A more conservative patch would be to call new $class() with all the positional arguments, then pass the params array as an additional parameter.
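A sketch of the conservative variant just described (the DatabaseDummy class and parameter names below are invented for illustration; the actual attachment may differ): keep the existing positional constructor arguments and append the whole $params array, so a new-style driver can read any extra options from the trailing argument while old drivers simply ignore it.

```php
<?php
// Invented driver class used only to demonstrate the calling convention.
class DatabaseDummy {
    public $extraParams;
    public function __construct( $server, $user, $password, $dbName,
        $flags, $params = array()
    ) {
        // A new-style driver can pull non-standard options out of $params.
        $this->extraParams = $params;
    }
}

// The conservative factory change: positional arguments first, then the
// full parameter array appended as one extra argument.
function makeDatabase( $class, array $params ) {
    return new $class(
        $params['host'], $params['user'], $params['password'],
        $params['dbname'], $params['flags'], $params
    );
}
```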
Added "patch" and "need-review" keywords to indicate that Adam submitted a patch that awaits review. Thanks, Adam! Next time you submit a patch, please add those keywords to get review faster.
r107932