
Memory Management in Mac OSX

February 11th, 2013 -- Posted in Tech

I’ve been working on focusing on the things I can do instead of the things I can’t, so I’m posting something entirely different this time.

I use a Mac (10.6.8) for accessibility reasons. I’m fairly satisfied with it, but I have been having trouble with RAM. I have 8 GB of RAM and that’s more than enough for me. Wired memory (memory used by the operating system) is never over 1.5 GB and active memory (memory used by running applications) is usually around 2 GB, never above 4 GB. So 8 GB should be quite enough.

However, there is also something called inactive memory. This is used to store data that applications have released (sometimes incorrectly), so that when you start an app after closing it, it’s still in RAM and loads much faster. A kind of cache. Awesome. But what happens when you run out? You’d think that OSX would simply discard some of the cached data to free memory. Unfortunately, OSX does not do this. Instead it starts swapping (using a portion of the hard disk as memory). This is considerably slower and makes for a sluggish, laggy user experience, something you’d expect to avoid with 8 GB of RAM.
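You can watch these numbers yourself with the vm_stat command, which reports everything in 4096-byte pages (at least on 10.6). A quick sketch of the conversion to MB, using an example page count:

```shell
# vm_stat counts 4096-byte pages, so:
# pages * 4096 / 1048576 = pages / 256 = MB
pages=1310720                  # e.g. "Pages inactive: 1310720."
echo "$((pages / 256)) MB"     # prints "5120 MB", i.e. 5 GB inactive
```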

With ‘normal’ use you won’t run into this problem often. After a while inactive memory does get freed (it seems to take very long, though), leaving you with enough free memory to do whatever you want. But if you are a programmer like me, some other kind of computer nerd, or simply someone who likes to get the most out of their Mac, this is a big issue. My biggest problem is that my torrent clients (no, that’s not a typo, I run three of them) leave me with a lot of inactive memory. And by a lot I mean over 5 GB at times.

There are people claiming that inactive memory is good, that it makes your system faster. In a sense that is true: inactive memory makes restarting apps a lot faster. But when you don’t want to restart anything and simply need some RAM, it sucks. In my experience my system is a lot faster after clearing inactive memory, and I’ve yet to hear from someone with the opposite experience.

So we want to get rid of some inactive memory. If you go to the Apple Store they tell you to get more RAM. In my case that is bullsh*t: 8 GB is more than enough for what I do. It sounds like a cheap sales trick for something Apple should have fixed long ago. So what else is there to do?

A fairly simple solution is to run the purge command whenever you run out of RAM. There are people claiming it does not work. I don’t know what funky way they type in purge, but when I run the command inactive memory goes from 5+ GB to around 50 MB. There is a downside, however: while purge runs, the system freezes for around a minute. So it’s not something you can put in a cron job to run at regular intervals.

After some searching I found a free OSX app called “Free Memory” (it’s in the App Store). Most of these apps simply execute the purge command for you (which we don’t want), but this particular app works by allocating a lot of memory until the swap file gets too big and OSX starts releasing inactive memory. Then it frees all that memory (properly, so it won’t get cached). Understandably, your system gets slow while this app is allocating, but it does not completely freeze like purge does. A step in the right direction.

Free Memory runs from the menu bar, so it’s easy to see how much memory you have left and to click the free-memory button. But being a perfectionist, I’m not satisfied. The app does not run at a set interval; you need the paid version for that. I refuse to pay for a fix, I already paid for the operating system. Aside from that, it’s still a bit too slow and laggy for my taste.

After some searching I found a bit of C++ code that does the trick.

#include <iostream>
#include <stdlib.h>

int main (int argc, char * const argv[]) {
        if (argc < 2) {
                std::cout << "Must specify size in bytes\n";
                return 1;
        }

        long size, i, ints;

        // size in bytes, parsed from the first argument
        size = atoll(argv[1]);
        // calculate the number of unsigned ints this will occupy
        ints = size / sizeof(unsigned int);
        // give a nice value printout
        std::cout << "Allocating " << size << " bytes (" << ints << " ints)\n";

        // allocate the memory
        unsigned int* mem = (unsigned int*) malloc(size);
        if (mem == NULL) {
                std::cout << "Allocation failed\n";
                return 1;
        }

        // in OS X you have to touch the memory for it to count
        for (i = 0; i < ints; i++) {
                mem[i] = rand();
        }

        // free it up (properly, so the pages don't end up as inactive memory)
        free(mem);
        return 0;
}

You can compile this by running:
g++ Filename.cpp -o Binaryfilename

Place the binary in /usr/bin and make it executable (chmod +x filename).
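Before wiring it into a script, you can sanity-check the invocation by hand. The binary takes its size argument in bytes (ClearInactiveMemory is just what I named mine), so allocating 2 GB looks like this:

```shell
# 2 GB expressed in bytes: 2 * 1024^3
bytes=$((2 * 1024 * 1024 * 1024))
echo $bytes                    # prints 2147483648
# then run the binary with that byte count:
# ClearInactiveMemory $bytes
```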

Now we’re going to call this binary with a bash script that I wrote:


#!/bin/bash
#
# This script checks the amount of inactive memory.
# Memory is cleared if the number of inactive MB is
# greater than the following "msize" variable. Attach
# this script to launchd or cron to run it periodically.

log=/path/to/your/logfile     # fill in a log path
msize=1111                    # threshold in MB

# vm_stat reports 4096-byte pages, so pages / 256 = MB
MM=`vm_stat | awk '/Pages inactive:/ {print int($3/256)}'`
MF=`vm_stat | awk '/Pages free:/ {print int($3/256)}'`

echo $(date '+%Y-%m-%d %H:%M:%S')" - Testing status of inactive/free memory..." $MM "MB Inactive," $MF "MB Free" >> $log

if [ "$MM" -gt "$msize" ]; then
        # allocate the sum of inactive and free memory
        ((memtoclear=$MM+$MF))
        echo $(date '+%Y-%m-%d %H:%M:%S')" - You have too much inactive memory: "$MM"MB. Allocating "$memtoclear" MB now..." >> $log
        echo $(date '+%Y-%m-%d %H:%M:%S')" - "$((memtoclear*1024*1024))" bytes to allocate" >> $log
        # the binary takes its argument in bytes
        ClearInactiveMemory $((memtoclear*1024*1024)) >> $log
        echo $(date '+%Y-%m-%d %H:%M:%S')" - Memory Activated." >> $log
else
        echo $(date '+%Y-%m-%d %H:%M:%S')" - Inactive memory amount "$MM"MB does not meet purge threshold." >> $log
fi
exit 0

Don’t forget to fill in a log path and to replace ClearInactiveMemory with the name of your binary.

This script checks how much inactive and free memory there is, and if too much is inactive it allocates the sum of the two with the binary you compiled. The threshold of 1111 MB is something that works well on my system, but this depends on your personal preferences and the total amount of RAM.
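If you want to sanity-check the vm_stat parsing, you can feed the awk line a canned value instead of live output. With 4096-byte pages, 284672 inactive pages works out to 1112 MB, just over the 1111 MB threshold:

```shell
# canned vm_stat line piped through the script's awk conversion
echo "Pages inactive:                 284672." \
  | awk '/Pages inactive:/ {print int($3/256)}'
# prints 1112
```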

In my experience this runs faster and with less lag than Free Memory, but if it slows you down you can decrease the memtoclear variable a little (change it to ((memtoclear=$MM+$MF-500)), for example). This way it won’t clear all the inactive memory, but your system should experience fewer problems.

And now for the truly lazy:

crontab -e
17 * * * * /path/to/your/sh/file

Now the script will run every hour, at 17 minutes past. I did not experience any problems with this, but I did not test it with CPU- and memory-intensive games, for example. I’m curious to hear your experience.

Note: use this code at your own risk. I am not responsible for any damage to your system or lost data.