MRTG - how does it calculate the values?

Hello,

Finally I have configured MRTG to work on my server.

It looks like it works fine, but there's something I don't understand.

The script I use as a target for MRTG is written by RutRow:
Code:
#!/bin/bash

# Get transmit and receive bytes.
#
INFO=`grep eth0 /proc/net/dev | tr -s ' ' ' ' | cut -d: -f2`
RECEIVE=`echo $INFO | cut -d" " -f1`
TRANSMIT=`echo $INFO | cut -d" " -f9`

# Get uptime.
#
UPTIME=`uptime | tr -s ' ' ' ' | cut -d" " -f4-`

# Final output to MRTG
#
echo $RECEIVE
echo $TRANSMIT
echo $UPTIME
echo "MY_SERVER"
Its output is in the format MRTG expects, and MRTG is now drawing a really nice graph.
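To double-check my understanding of what the `cut` commands are picking out, I traced the script's parsing on a sample `/proc/net/dev` line (the line below is a made-up example; the field layout is my assumption from reading the file on my box):
Code:
```shell
#!/bin/bash
# Trace of the parsing in RutRow's script on a sample /proc/net/dev line.
# Sample line is fabricated for illustration; on my kernel, after the ':'
# field 1 is cumulative receive bytes and field 9 is transmit bytes.
LINE="  eth0: 3747650633 123456 0 0 0 0 0 0 417532997 98765 0 0 0 0 0 0"

# Same pipeline as the script: squeeze spaces, take everything after ':'
INFO=`echo "$LINE" | tr -s ' ' ' ' | cut -d: -f2`

# echo $INFO is left unquoted on purpose: word splitting collapses the
# leading space, so field 1 is really the first number.
RECEIVE=`echo $INFO | cut -d" " -f1`
TRANSMIT=`echo $INFO | cut -d" " -f9`

echo $RECEIVE    # 3747650633 (receive bytes)
echo $TRANSMIT   # 417532997  (transmit bytes)
```
So the two big numbers are the raw byte counters straight from the kernel.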

But when I run the script by itself, it outputs really big numbers that exceed the MaxBytes value by far, like:
Code:
3747650633
417532997
24 days, 12:13, 1 user, load average: 1.23, 1.32, 1.36
MY_SERVER
Can you tell me how MRTG translates these numbers into its values?
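My rough guess is that MRTG treats them as cumulative counters: it remembers the value from the previous poll and divides the difference by the polling interval to get a bytes-per-second rate, which is what gets compared against MaxBytes. Here's a sketch of that calculation with made-up numbers (this is my understanding, not MRTG's actual code):
Code:
```shell
#!/bin/bash
# Sketch of how MRTG (I believe) turns raw counters into a rate.
# The counter values and interval below are example numbers.
PREV=3747500000      # counter at the previous poll (example value)
CURR=3747650633      # counter now (the big number the script prints)
INTERVAL=300         # default MRTG polling interval, 5 minutes

DELTA=$((CURR - PREV))
# 32-bit counters wrap around at 2^32; a negative delta means a wrap,
# so add 2^32 back to recover the true difference.
if [ "$DELTA" -lt 0 ]; then
  DELTA=$((DELTA + 4294967296))
fi

RATE=$((DELTA / INTERVAL))
echo "$RATE bytes/sec"   # 502 bytes/sec with these example numbers
```
So the absolute size of the counter doesn't matter, only how fast it grows between polls.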

Also, MRTG currently draws the input graph as a solid green area and the output graph as a blue line. How can I instruct MRTG to draw the output as solid green and the input as a blue line instead (like ThePlanet's RTG graphs)?
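One workaround I was considering (untested, just my assumption): since MRTG seems to treat the first value it reads as "input" (solid green) and the second as "output" (blue line), swapping the first two lines of the script's output should swap the colors. Something like this, with placeholder counter values standing in for the real `/proc/net/dev` reads:
Code:
```shell
#!/bin/bash
# Sketch of the swap idea (untested). Placeholder values stand in for
# the RECEIVE/TRANSMIT counters read from /proc/net/dev.
RECEIVE=1000     # placeholder for the receive-bytes counter
TRANSMIT=2000    # placeholder for the transmit-bytes counter

echo $TRANSMIT   # printed first  -> MRTG should draw it as solid green
echo $RECEIVE    # printed second -> MRTG should draw it as a blue line
echo "24 days, 12:13, 1 user"
echo "MY_SERVER"
```
The legends would then be backwards ("In" would really be transmit), so I guess LegendI/LegendO would need adjusting too. Is there a cleaner way to do this in mrtg.cfg?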
