You'll need to write a script that takes the data from your own IP database and outputs it in the same CSV format. Make sure you map each of your physical locations to a location listed in the GeoLiteCity-Location.csv file; feel free to add your own entries as required.
The output of your script should match:
"IP start range in INT notation","IP end range in INT notation","Location ID from GeoLiteCity-Location.csv"
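A minimal Python sketch of such a conversion script follows. The input rows, the location ID 900001, and the range values are made-up placeholders; only the integer conversion and the quoted CSV layout are the point here.

```python
import csv
import io
import socket
import struct

def ip_to_int(ip):
    """Convert a dotted-quad IPv4 address to its integer notation."""
    return struct.unpack("!I", socket.inet_aton(ip))[0]

# Hypothetical input: (start_ip, end_ip, location_id) rows from your own
# IP database. Each location ID must exist in GeoLiteCity-Location.csv,
# including any rows you added yourself.
my_ranges = [
    ("10.1.0.0", "10.1.255.255", 900001),
]

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
for start_ip, end_ip, loc_id in my_ranges:
    writer.writerow([ip_to_int(start_ip), ip_to_int(end_ip), loc_id])

# Write buf.getvalue() to a file, then concatenate that file with
# GeoLiteCity-Blocks.csv before running csv2dat.py.
```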
Finally, after you concatenate the original data with your own, run the mmutils tool csv2dat.py.
mmutils does nothing about duplicate IPs. You need to make sure your input data does not overlap the already existing ranges, or itself, in any way. For example: if you have public IP space and want to identify where that space lives, you need to remove the existing MaxMind entries covering it first.
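Since mmutils won't catch overlaps for you, a quick sanity check over the combined rows is worth running before csv2dat.py. This is a sketch; the tuples use the same integer-notation layout as the block format above:

```python
def find_overlaps(ranges):
    """Return pairs of (start, end, loc_id) ranges whose spans overlap.

    Feed it the combined list of MaxMind rows and your own rows,
    already converted to integer notation.
    """
    overlaps = []
    widest = None  # the range with the highest end seen so far
    for current in sorted(ranges):
        if widest is not None and current[0] <= widest[1]:
            overlaps.append((widest, current))
        if widest is None or current[1] > widest[1]:
            widest = current
    return overlaps
```

An empty result means the combined file is safe to feed to csv2dat.py; any pairs it returns need to be trimmed or removed first.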
Here, I'm going to write about building an enterprise OSSIM/LogStash deployment. This should include, but not be limited to:
To begin, I've decided on the following data flow:
Client End Point -> Local Geo Collection Point -> Central Collection Point -> Split to OSSIM and LogStash
    -> LogStash -> ElasticSearch (Cluster) -> LogStash Front End
    -> OSSIM Cluster
My build-out will begin with the ElasticSearch cluster and work backwards to the client end points. I will then tack on the LogStash front end and add OSSIM to the Central Collection Point.
Since LogStash can push data cleanly, the collection points will be based on LogStash; the alternative would be a syslog forwarder.
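As a rough sketch, a collection point's LogStash config could be as small as this. The hostname and ports are placeholders, and plugin options vary between LogStash versions, so treat it as an outline rather than a working config:

```
input {
  syslog {
    port => 5514
  }
}

output {
  tcp {
    host => "central-collector.example.com"
    port => 5000
  }
}
```

The idea is simply that each collection point listens for syslog locally and relays everything upstream; the central collection point would carry the split to OSSIM and ElasticSearch in its own output section.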