Tonight I ran a test with the Veeam Agent, simulating the 512 MB and 1 GB instances I have on DigitalOcean and Amazon. For the simulation, instead of backing up the whole system, I selected only the /etc and /home directories, since I assumed the smaller number of files would keep resource consumption lower.
When running the backup on an Ubuntu Server 14.04 VM with 512 MB of RAM, the Veeam Agent consumed at least 80% of the available memory, leaving the other services virtually inoperable due to lack of resources.
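For reference, this is roughly how I watched memory pressure while the job ran (a sketch, not Veeam-specific; it sums MemFree + Buffers + Cached as the "available" figure so it also works on older kernels that lack MemAvailable):

```shell
#!/bin/sh
# Snapshot of overall RAM usage from /proc/meminfo; run it in a loop
# or under `watch` while the backup job is active.
total_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
free_kb=$(awk '/^MemFree:|^Buffers:|^Cached:/ {sum += $2} END {print sum}' /proc/meminfo)
used_pct=$(( (total_kb - free_kb) * 100 / total_kb ))
echo "RAM in use: ${used_pct}%"
```

On the 512 MB VM this figure stayed above 80% for the whole run.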
On the same machine with 1 GB of RAM the result was a little better, but memory consumption was still high enough that even serving a single page request was badly compromised.
I know that generating incremental backups, compressing files, and so on consumes a lot of resources, but is there any way to offset this so that using the Agent on small instances becomes feasible?
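One mitigation I am considering, but have not yet validated, is adding a swap file so the agent's working set can page out instead of starving the other services (standard Ubuntu steps, run as root; the 1 GB size is just an example):

```shell
# Create and enable a 1 GB swap file (one-time setup, requires root).
fallocate -l 1G /swapfile
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
# To make it permanent, add to /etc/fstab:
#   /swapfile none swap sw 0 0
```

I do not know yet whether swapping keeps the machine responsive enough during a backup, or whether it just trades memory pressure for disk thrashing.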
Has anyone run the same tests and gotten the same or different results?