We have processes that produce logs with log4j, and some of those log entries are supposed to be loaded into a database for analysis and reporting (right now everything goes to files). The problem is that some of the processes don't have access to the database. So the idea is that each process writes a file, which is then sent to/read by another process that does have DB access.

The preferred format for this file is the standard log4j text format, so that the same file can be used both by the process that loads it into the DB and by real people reading it. So the question is: is there an existing log file parser (ideally a Java library)? We don't want to invest time in writing a parser.

Another solution would be to generate two files: one for humans to read, and another containing, for instance, serialized log4j logging events that could easily be deserialized. For now, though, my management is not buying this...

There may also be other solutions I'm not seeing, so any suggestion is welcome.
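If the log files are written with a fixed, known `PatternLayout`, a small regex-based reader may be enough to avoid a full parser. The sketch below assumes a layout like `%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c - %m%n`; the class name and pattern are illustrative assumptions, not an existing library API.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: parse lines produced by an assumed log4j PatternLayout of
// "%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %c - %m%n".
// The class name and pattern are hypothetical, chosen for this example.
public class Log4jLineParser {

    // Capture groups: 1 = timestamp, 2 = level, 3 = logger name, 4 = message
    private static final Pattern LINE = Pattern.compile(
        "^(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2},\\d{3}) "
        + "(\\w+)\\s+"      // level, possibly padded by %-5p
        + "(\\S+) - (.*)$"); // logger name, then the message

    /**
     * Returns {timestamp, level, logger, message}, or null for lines that
     * don't start a log entry (e.g. stack-trace or multi-line-message
     * continuation lines, which the caller can append to the previous entry).
     */
    public static String[] parse(String line) {
        Matcher m = LINE.matcher(line);
        if (!m.matches()) {
            return null;
        }
        return new String[] { m.group(1), m.group(2), m.group(3), m.group(4) };
    }

    public static void main(String[] args) {
        String[] f = parse(
            "2010-02-23 14:05:02,123 INFO  com.example.Foo - user logged in");
        System.out.println(f[1] + "|" + f[2] + "|" + f[3]);
    }
}
```

The null return for non-matching lines is the important part: stack traces and multi-line messages don't carry a timestamp prefix, so the loader can treat them as continuations of the previous entry before inserting into the DB.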
Here is a link to a similar question: stackoverflow.com/questions/2327073/parse-a-log4j-log-file
I saw that one, but the answers refer to either another appender (which I want to avoid), UI tools (Sawmill, Chainsaw), or manual parsing, and I want to avoid all of these.
That's not really what I'm looking for. From what I understand, Splunk and Sawmill load the data into their own structure/DB, and then you use their reporting/analysis tools. Here, I want to load the data into our own custom log tables, so we can use internal, pre-existing tools that already work against those tables.
If you have a proprietary format/DB to adhere to, I think you're stuck with writing your own parser/DB implementation.
I want a single file that both serves as the data to be parsed and remains human-readable. SQL statements are somewhat human-readable, but not really user-friendly when it comes to reading logs.