Shell Script to Continuously Monitor CPU Usage and Memory Usage of a Process and write output to CSV file

Hi all! Recently I had to do some performance tests at work at AdroitLogic, and I wanted to continuously monitor the CPU and memory usage of a process and write the output to a file, so that I could analyze it later with graphs and such.

So I came up with two shell scripts that use the Linux top and free commands to undertake this task for me.

In these scripts I run the top command in a loop and use grep to filter out everything else and keep only the line I need, then do some text processing with tr and cut. The result is then appended to a file using the tee command. Here I am monitoring influxd, a time-series database process. You can monitor any process you require by simply replacing the name in "grep -w influxd". Note that the filter string you use should filter out all other processes and leave exactly one line.
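To see what the text-processing part of the pipeline does, here is a minimal sketch run against a single hard-coded line of top output (the sample line, values, and process name are made up, and field positions vary between top versions, so check your own `top -b -n 1` output first):

```shell
#!/bin/sh
# A hypothetical line as printed by `top -b -n 1` for one process.
line=" 1234 influxdb  20   0  1.2g  150m  12m S  5.3  1.9  0:42.10 influxd"
# tr -s ' ' squeezes runs of spaces into one; because the line starts
# with a space, cut sees an empty first field, so %CPU lands in field 10.
cpu=$(printf '%s\n' "$line" | tr -s ' ' | cut -d ' ' -f 10)
echo "$cpu"   # -> 5.3
```

The same trick (squeeze spaces, then pick a field) works for any whitespace-aligned column output, which is why both scripts reuse it.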

Hope these will be useful. Cheers!

#!/bin/bash
# Resolve the directory this script lives in and cd to its parent.
PRG="$0"
PRGDIR=$(dirname "$PRG")
[ -z "$LT_HOME" ] && LT_HOME=$(cd "$PRGDIR/.." ; pwd)
cd "$LT_HOME" || exit 1

rm -rf data/cpu*
> data/cpu.csv
echo "writing to cpu.csv"
echo "TIME_STAMP, Usage%" | tee -a data/cpu.csv
while :
do
DATE=$(date +"%H:%M:%S:%s%:z")
echo -n "$DATE, " | tee -a data/cpu.csv
# Field 10 is %CPU in this version of top; adjust the field number if
# your top prints columns in a different order. Capturing the result in
# a variable guarantees a newline is written even when grep matches nothing.
USAGE=$(top -b -n 1 | grep -w influxd | tr -s ' ' | cut -d ' ' -f 10)
echo "$USAGE" | tee -a data/cpu.csv
sleep 1
done
#!/bin/bash
# Resolve the directory this script lives in and cd to its parent.
PRG="$0"
PRGDIR=$(dirname "$PRG")
[ -z "$LT_HOME" ] && LT_HOME=$(cd "$PRGDIR/.." ; pwd)
cd "$LT_HOME" || exit 1

rm -rf data/mem*
> data/mem.csv
echo "writing to mem.csv"
echo "TIME_STAMP,Memory Usage (MB)" | tee -a data/mem.csv
# Total physical memory in MB, read once from free.
total=$(free -m | grep Mem | tr -s ' ' | cut -d ' ' -f 2)

while :
do
DATE=$(date +"%H:%M:%S:%s%:z")
echo -n "$DATE, " | tee -a data/mem.csv
# Field 11 is %MEM in this version of top; adjust if yours differs.
var=$(top -b -n 1 | grep -w influxd | tr -s ' ' | cut -d ' ' -f 11)
# Convert the percentage into megabytes, to three decimal places.
echo "scale=3; ($var*$total/100)" | bc | tee -a data/mem.csv
sleep 1
done

~Rajind Ruparathna

6 thoughts on “Shell Script to Continuously Monitor CPU Usage and Memory Usage of a Process and write output to CSV file”

  1. Are these scripts both python scripts?
    I need to run exactly this sort of thing on some Red Hat Linux machines, but I may need to log the information for 2 or more different processes at the same time.
    I have root access to the platforms. Do I just type in the commands as given, or place them in a file and launch (??) the script using python?

    As you can tell, I am really clueless about python at this time.
    Any help would be appreciated as it will help us determine whether all the RH VMs are correctly dimensioned or if we need to increase CPUs/ram allocated to each VM

    Thanks in advance

    Wayne


    1. Hi Wayne,

      These are shell scripts and they do not use python. Sorry for the comment related to python there; it was a mistake, and I will update the post to remove it.

      I haven’t tried these scripts on RH, but since they use common shell commands, ideally you should be able to run them on RH without any issue.

      Let me know if you are facing problems and I might be able to help you.

      Cheers!
      Rajind


  2. Hi Rajind, I finally got to run the CPU script on the target system, which by the way uses MontaVista Linux 4.
    I find the output is not quite what I want. Instead of two nice columns with the time and CPU usage, I get multiple outputs in the same row, over many columns. Looks like I need a newline character inserted at the end of this line: top -b -n 1 | grep -w EC | tr -s ' ' | cut -d ' ' -f 9 | tee -a data/cpu.csv
    I have no idea how to do that.


  3. Could you please explain the script line by line? I need to use it and edit it.

    Actually, I have a Java application and I need to collect CPU and memory usage data for it while it is running, once per second. I need a CSV file recording the CPU and memory usage each second, ending when the application terminates.

