Any enterprise-level scripting environment needs a solid logging system. Detailed, easily accessible logs are invaluable for troubleshooting scripts and for building basic or advanced monitoring on top of your automation jobs. Proper logging also lets you quickly and concisely demonstrate your value to the organization through reports and dashboards built from the information collected in your logs.
It has to be Easy
When weighing your logging options, keep in mind that it has to be easy both to write a log event and to retrieve it later. No part of the logging in your scripts should be a chore. Making a log entry should be as simple as calling Write-Verbose. One approach is to use a PowerShell module to write a wrapper around your logging calls, e.g.:

New-fbLogMessage -Message "Did something"

Super simple. Let the cmdlet take care of the database connection, the time stamp, and other metadata.
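A wrapper like that might look roughly like the following sketch. The function body, parameter names, and the Write-LogToStore call are all hypothetical placeholders, not a prescribed implementation:

```powershell
function New-fbLogMessage {
    <#
        Hypothetical logging wrapper: accepts a message, stamps it with
        metadata, and hands the write off to a central store so script
        authors never have to think about it.
    #>
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [string]$Message,

        [ValidateSet('Information', 'Warning', 'Error')]
        [string]$Severity = 'Information'
    )

    # Gather the metadata automatically
    $entry = [PSCustomObject]@{
        TimeStamp = (Get-Date).ToUniversalTime()
        Computer  = $env:COMPUTERNAME
        Script    = $MyInvocation.PSCommandPath
        Severity  = $Severity
        Message   = $Message
    }

    # Placeholder: swap in your central store here
    # (a SQL insert, a REST call, an event hub, etc.)
    Write-LogToStore -Entry $entry
}
```

Because the callers only ever see `New-fbLogMessage -Message "..."`, you can later change where and how the entries are stored without touching a single script.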
Use a Database
If you are writing to static log files, you should seriously consider migrating to a database, and if you don't have a centralized logging system yet, start with a database. (SharePoint is a database.) Having .log files scattered all over the place can defeat the purpose of logging at all, since the data is dispersed and hard to retrieve in a timely fashion. Static log files may seem like a simple solution, but they are not; they will end up overcomplicating things. Besides, it will be very difficult, potentially impossible, to do any reporting on your automation solutions if the data is scattered. Having clear reporting available is critical.
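As one illustration of how little code a database write can be, here is a sketch that inserts a log row into SQL Server using the SqlServer module's Invoke-Sqlcmd. The server, database, and table names are assumptions for the example; production code should parameterize the values rather than interpolate them:

```powershell
# Assumes the SqlServer module is installed and a dbo.LogEvents table
# already exists. All names here are illustrative, not a prescribed schema.
Import-Module SqlServer

$query = @"
INSERT INTO dbo.LogEvents (TimeStamp, Computer, Severity, Message)
VALUES (SYSUTCDATETIME(), '$env:COMPUTERNAME', 'Information', 'Runbook completed')
"@

# In real code, pass values safely instead of building the string inline
Invoke-Sqlcmd -ServerInstance 'LogSql01' -Database 'Automation' -Query $query
```

Once every script writes to the same table, retrieving and reporting on the data is a single query instead of a scavenger hunt across file shares.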
The ability to tell your automation story in terms of cost savings is vital. With properly recorded metrics from a centralized logging environment, you can start looking at your automation in terms of hours saved, which correlates directly with real savings and added efficiency. This is the bread and butter of proving that the time you put into the automation initiative was well spent and is worth future investment. When implementing any new automation system, quantify the amount of time saved by each automation run and log that information to your centralized solution. With that information logged, you can easily show your value in a simple report or dashboard.

Management loves dashboards. Charts and pie graphs with pretty colors are a bonus. Also make these reports available on demand to whoever wants to view them, at any time. Keep access simple; do not make it a chore for someone to see the data. Three clicks or less is always desirable! This reporting can be done with tools like SQL Server Reporting Services, Power BI, OMS, Tableau, and many others. Just make sure it's available and easily accessible, and that every automation solution logs valuable information.
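In practice, quantifying time saved can be as simple as logging an estimate with each run and summing it up for the report. The cmdlet names, the -HoursSaved parameter, and the hourly rate below are all hypothetical, shown only to illustrate the shape of the idea:

```powershell
# Illustrative only: log an estimated time saving alongside each job.
# New-fbLogMessage / Get-fbLogMessage and -HoursSaved are assumptions.
New-fbLogMessage -Message 'Provisioned 10 mailboxes' -HoursSaved 2.5

# Later, for the dashboard: total the hours saved over the last month
$entries = Get-fbLogMessage -After (Get-Date).AddMonths(-1)
$hours   = ($entries | Measure-Object -Property HoursSaved -Sum).Sum

# Convert to dollars at an assumed blended rate of $50/hour
"Hours saved this month: $hours (roughly `$$($hours * 50) at `$50/hr)"
```

Even a rough per-run estimate beats having no number at all when it is time to justify the next round of investment.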
Monitoring

You can investigate your existing logged records to verify that a runbook or automation process has been running as expected. For example, you can check whether new entries in the log indicate normal activity, and alert on anything abnormal. Say a specific runbook job is expected to do approximately 5 things: you can monitor the logs and create an alert for any time, let's say, 10 or more things happen for that job. You can also start doing trend analysis and see what other data points expose themselves. Without a basic central log store, none of this is possible.
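The threshold check described above can be sketched in a few lines. Get-fbLogMessage is the same hypothetical retrieval cmdlet assumed earlier, and the runbook name, addresses, and SMTP server are placeholders:

```powershell
# Sketch of a simple threshold alert against the central log store.
# Get-fbLogMessage and its parameters are assumptions for this example.
$window  = (Get-Date).AddHours(-1)
$entries = Get-fbLogMessage -Runbook 'Sync-MailboxPermissions' -After $window

if ($entries.Count -ge 10) {
    # The job normally logs about 5 events; 10+ suggests something abnormal
    Send-MailMessage -To 'ops@example.com' -From 'automation@example.com' `
        -Subject "Runbook activity threshold exceeded ($($entries.Count) events)" `
        -SmtpServer 'smtp.example.com'
}
```

Schedule a check like this every hour and you have proactive monitoring for free, built entirely on the log data you were already collecting.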
What you can do with good logging:

* Dashboards
* Metrics with real dollars
* Determine hours saved
* Proactive monitoring
* Trend analysis