Commit graph

5 commits

Author SHA1 Message Date
Eevee
a89a616203 Allow passing engine arguments to connect(). 2010-03-17 00:44:33 -07:00
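A minimal sketch of what this commit enables, assuming connect() is importable
from pokedex.db and returns a SQLAlchemy session; the import path, the database
URI, and the way engine options (echo, pool_recycle) are forwarded are
illustrative assumptions, not details taken from the commit:

```python
# Hedged sketch: connect() is assumed to live in pokedex.db, to return a
# SQLAlchemy session, and to forward engine options on to create_engine().
# The URI and the echo/pool_recycle options are purely illustrative.
from pokedex.db import connect

session = connect('sqlite:///pokedex.sqlite', echo=True, pool_recycle=3600)
```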
Eevee
1a7d046fbc Vastly improved the pokedex import/export UI.
csvimport is now load; csvexport is now dump.

Both take an optional -e switch to specify an engine, but will happily
use a default SQLite database in the pokedex package directory.

Additionally, the CSV directory is now controlled by the optional -d
switch, and defaults to Doing The Right Thing.

So `pokedex load` now does exactly what you'd expect: loads the data
from the right files into a consistently-located database.
2009-08-18 18:02:53 -07:00
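For reference, the kind of invocations the renamed commands support, driven here
from Python for the sake of a runnable example; the engine URL and CSV directory
values are made up, and both switches are optional per the commit above:

```python
# Hedged example: calling the renamed `pokedex load` / `pokedex dump` commands
# via subprocess.  The -e and -d switches come from the commit message; the
# engine URL and directory paths below are illustrative placeholders.
import subprocess

subprocess.check_call(['pokedex', 'load',
                       '-e', 'sqlite:///pokedex.sqlite',
                       '-d', './pokedex/data/csv'])
subprocess.check_call(['pokedex', 'dump', '-d', '/tmp/pokedex-csv'])
```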
Eevee
634ef3ed1e Fixed a slew of foreign key import problems. #29
Curse's type_id was 0, which is bogus; this has been fixed by creating a
real ????? type.
Fourth-gen moves all had zero as a contest effect id, which was also
bogus.
Pokémon 494 and 495 were junk and have been scrapped entirely.
The description column of pokemon_form_groups was too short.

pokedex's connect() now takes kwargs passed to sessionmaker().

A more substantial change: some tables, like pokemon, are
self-referential and contain rows that refer to rows later in the table
(for example, Pikachu evolves from Pichu, which has a higher id).  At
the moment such a row is loaded, its foreign key points at a row that
doesn't exist yet.  I solved this by turning on autocommit and wrapping
add() in a try block, then re-adding every failed row once the rest of
the table is finished.  This slows the import down a bit, but makes it
work perfectly with foreign key checks on.
2009-07-03 23:12:13 -04:00
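A minimal sketch of the retry strategy described in the commit above, not the
project's actual loader: rows whose foreign keys point at rows that haven't been
inserted yet fail on the first pass and are retried once the rest of the table
is in.  The session, the row source, and the make_instance() helper are assumed
placeholders, and commit-per-row stands in for the autocommit behaviour the
commit mentions:

```python
# Hedged sketch of "try each row, defer FK failures, retry at the end".
from sqlalchemy.exc import IntegrityError

def load_table(session, rows, make_instance):
    """Insert rows one at a time, deferring any whose FK target isn't loaded yet."""
    deferred = []
    for row in rows:
        try:
            session.add(make_instance(row))
            session.commit()        # stand-in for the autocommit behaviour
        except IntegrityError:
            session.rollback()      # FK target probably appears later in the table
            deferred.append(row)

    # Second pass: by now every referenced row should exist.
    for row in deferred:
        session.add(make_instance(row))
        session.commit()
```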
Eevee
20c9c23f51 Fixed some MySQL import problems.
Tables weren't being defined as UTF-8 if that wasn't the server default.

A lot of tables were trying to create erroneous auto_increment columns.

Foreign key checks were pretty much fucking everything up.
2009-03-07 18:54:01 -08:00
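A hedged sketch of the kinds of fixes this commit describes, not the project's
actual schema code: per-table UTF-8 via SQLAlchemy's mysql_charset table option,
suppressing the implicit AUTO_INCREMENT with autoincrement=False, and suspending
MySQL's foreign key checks for the duration of a bulk create.  The table,
columns, and connection URL are made-up examples:

```python
from sqlalchemy import MetaData, Table, Column, Integer, Unicode, create_engine, text

metadata = MetaData()

# Force UTF-8 per table instead of relying on the server default; ids come
# from the CSV files, so don't let MySQL make the primary key AUTO_INCREMENT.
pokemon = Table(
    'pokemon', metadata,
    Column('id', Integer, primary_key=True, autoincrement=False),
    Column('name', Unicode(20)),
    mysql_charset='utf8',
)

engine = create_engine('mysql://user:pass@localhost/pokedex')  # illustrative URL

# Foreign key checks can be switched off for the bulk operation and back on after.
with engine.begin() as conn:
    conn.execute(text('SET FOREIGN_KEY_CHECKS = 0'))
    metadata.create_all(conn)
    conn.execute(text('SET FOREIGN_KEY_CHECKS = 1'))
```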
Eevee
bad044d1d8 Initial commit, with much of the data imported.
Includes a wrapper script 'pokedex' that can, so far, read data from a
db and spit out CSVs or deploy CSVs to a db.
2009-02-05 00:05:42 -08:00