https://gahistoricnewspapers.galileo.usg.edu
Building on baseline ChronAm, we added:
- Calendar support (thanks Nebraska)
- Browsing by Newspaper Type, Region, City
- Static Pages for Help, Participation, etc.
- SolrCloud support
- place batch files in `data/dlg_batches`
- ensure that an XML file containing MARC data for the Titles contained in the batch is present in the `data/nonlccn` directory
- run the chronam load_batch job like this: `manage.py load_batch path_to_batch_folder`
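If several batches are staged, the same command can be driven from a short script. A minimal sketch, assuming the `data/dlg_batches` layout above and an initialized Django environment (the loop itself is not part of GHNP):

```python
# Sketch only: load every batch folder under data/dlg_batches by invoking
# chronam's load_batch management command from Python.
import os
from django.core.management import call_command

BATCH_ROOT = "data/dlg_batches"  # assumed staging directory from the steps above

for entry in sorted(os.listdir(BATCH_ROOT)):
    batch_dir = os.path.join(BATCH_ROOT, entry)
    if os.path.isdir(batch_dir):
        call_command("load_batch", batch_dir)
```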
In order to import batches, they must conform to the NDNP Digital Assets format (examples). But since we aren't NDNP awardees, we have to hack some things:
- clear out any existing loaded batches with `dlg/hacks/clear_loaded_batch_data.sql`
- execute `dlg/hacks/setup_dlg.sql` to set up the DLG as an awardee and institution
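These scripts can also be applied through Django's database connection rather than a database shell. A rough sketch (the `run_sql_file` helper is hypothetical, and the naive split on `;` only suits simple scripts):

```python
# Hypothetical helper: execute a .sql file against Django's configured database.
# Splitting on ";" is naive and will break on statements that embed semicolons.
from django.db import connection

def run_sql_file(path):
    with open(path) as f:
        statements = [s.strip() for s in f.read().split(";")]
    with connection.cursor() as cursor:
        for statement in statements:
            if statement:
                cursor.execute(statement)

# e.g.
# run_sql_file("dlg/hacks/clear_loaded_batch_data.sql")
# run_sql_file("dlg/hacks/setup_dlg.sql")
```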
GHNP enhances the core_place table data to support the browsing features listed above. This includes a new core_region model related to each place entry that is in Georgia. In addition, the county and city values are extracted from the core-loaded data.
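As a rough illustration of that data model (field names and the exact relation are assumptions, not the actual GHNP code):

```python
# Illustrative sketch of a core_region model; not the real GHNP code.
from django.db import models

class Region(models.Model):
    """A Georgia region used by the browse-by-region pages (table core_region)."""
    name = models.CharField(max_length=64)

    class Meta:
        app_label = "core"  # keeps the table name as core_region

# Each Georgia row in core_place would then reference its region, e.g. by adding
# to the forked Place model:
#   region = models.ForeignKey(Region, null=True, blank=True,
#                              on_delete=models.SET_NULL)
```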
- `manage.py loaddata regions`
- `manage.py refine_places`
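The regions fixture presumably populates core_region, while refine_places is a management command that back-fills city and county. A heavily simplified sketch of the idea, assuming core place names look like "City, County, State" (the real command's parsing may differ):

```python
# Sketch in the spirit of refine_places; not the actual GHNP command.
# Assumes chronam's core Place model has name, city, county, and state fields.
from django.core.management.base import BaseCommand
from core.models import Place

class Command(BaseCommand):
    help = "Populate city and county on Georgia places from core-loaded names"

    def handle(self, *args, **options):
        for place in Place.objects.filter(state__iexact="Georgia"):
            parts = [p.strip() for p in place.name.split(",")]
            if len(parts) == 3:  # expected "City, County, State"
                place.city, place.county = parts[0], parts[1]
                place.save()
```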
From the Solr dir:
- Delete the old config: `bin/solr zk rm /configs/ghnp/managed-schema -z localhost:9983`
- Load the new config: `bin/solr zk upconfig -n ghnp -d /opt/chronam/solr/ -z localhost:9983`
- Restart Solr
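The same reload can be scripted. A minimal sketch that shells out to the Solr CLI (the install path is an assumption; the ZooKeeper host and config dir are taken from the commands above):

```python
# Sketch: rerun the ghnp config reload from Python. SOLR_BIN is an assumed
# install location; point it at wherever bin/solr actually lives.
import subprocess

SOLR_BIN = "/opt/solr/bin/solr"
ZK_HOST = "localhost:9983"
CONFIG_DIR = "/opt/chronam/solr/"

subprocess.run([SOLR_BIN, "zk", "rm", "/configs/ghnp/managed-schema", "-z", ZK_HOST], check=True)
subprocess.run([SOLR_BIN, "zk", "upconfig", "-n", "ghnp", "-d", CONFIG_DIR, "-z", ZK_HOST], check=True)
# Restart Solr afterwards (e.g. via your service manager).
```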
This is my first Django project.