Fixing Microsoft Outlook’s GPU usage on MacBook Pros

MacBook Pros, like a number of other laptops, come with two GPUs that are used alternately. OS X switches automatically between the two GPUs under certain conditions – an app requesting it, for instance. gfxCardStatus is a sweet tool to get insights into which OS X app makes use of the faster (and more battery-draining) GPU and when the switch happens. I would recommend recompiling gfxCardStatus, because the binary is quite old, although it seems to still hold up fine on OS X 10.9.

Anyway, getting back to the subject of this post, one of the apps that doesn’t seem to really need the faster GPU is MS Outlook, and possibly other MS Office products too; I haven’t tested them all. Sometimes when I send an email, MS Outlook triggers a switch from the less powerful GPU (Intel HD Graphics 4000) to the more powerful one (NVIDIA GeForce GT 650M), and the latter then stays on, using up more battery charge.

If you’ve noticed the same behaviour, making MS Outlook stop doing that requires a tiny bit of settings editing.

Open a Terminal session and paste in the following line. Please note that your path and/or MS Outlook version might be different, so amend the line below accordingly.

nano -c /Applications/Microsoft\ Office\ 2011/Microsoft\

Search (Ctrl+W) for the lines


and place the following two lines right under them.


Save (Ctrl+O) and restart MS Outlook. From this moment on, MS Outlook should no longer request a GPU switch.
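The searched-for and inserted lines didn’t survive this page’s formatting. For what it’s worth, the key commonly used to let an app fall back to the integrated GPU is NSSupportsAutomaticGraphicsSwitching in the app bundle’s Info.plist, so the two added lines likely looked something like this (an assumption on my part, since the originals are lost):

```xml
<key>NSSupportsAutomaticGraphicsSwitching</key>
<true/>
```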

How to build OpenCV with GPU support via MacPorts in OS X 10.8

First things first, install the NVIDIA CUDA libraries from (take notice that last night, OS X 10.9 Mavericks got its update too!)

Second, open Terminal and add the two paths below

export PATH=/Developer/NVIDIA/CUDA-5.5/bin:$PATH
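Only one of the two exports survived above; the second is presumably the CUDA library path, along these lines (the exact path is an assumption, check your install location):

```shell
# Presumed second export: the CUDA 5.5 runtime library path
export DYLD_LIBRARY_PATH=/Developer/NVIDIA/CUDA-5.5/lib:$DYLD_LIBRARY_PATH
```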

If previously installed, uninstall OpenCV:

sudo port uninstall opencv @

If you have a different version installed, MacPorts will tell you, and you’ll need to replace the version above with your own, of course. Now you need to edit OpenCV’s portfile:

sudo port edit --editor nano opencv

nano will open up, scroll down and change the appropriate settings to


and optionally


You could also add


(although I’m not completely sure this is the right syntax for cusparse! Please do let me know if that’s the case)

Now, it’s time to clean the previous installation

sudo port clean opencv

And at last, type

sudo port upgrade -s -n --force opencv

to rebuild with the amended portfile. The -n flag avoids rebuilding the dependencies, and -s builds from source only, with no binaries whatsoever. Now you can go get yourself a cuppa, even two, as this step will take a while (about half an hour on a rMBP).

These links below can be helpful too:

Minimalistic OpenGL setup as PyQt script

Just added a minimalistic setup for PyQt apps using OpenGL. It’s quite compact, a single file that can be plundered here. MIT license, as usual.

Keeping the Cmd key pressed while using the mouse allows changing the pivot’s position, while keeping the Alt key down allows rotating the scene. The whole setup runs on a 20 ms timer interval (i.e. 50 FPS) but can be customised.

How to switch between multiple MacOS X Python framework versions

This is mainly a note-to-self kinda post.

I’ve got multiple Python framework versions installed on my Mac, including MacPorts’ versions.

If I need to switch between MacPorts versions, I can use the usual

sudo port select python python26
sudo port select python python27

but if I need to switch to a MacPython version, then I have to update the PATH manually, like this:

export PATH

At this point, I can type which python in the shell and check whether the path has been changed.

To revert back to the MacPorts version, I will use

export PATH

Again, you can double-check by typing which python in the shell.

Now, what I’ve noticed is that at this point the

sudo port select python python27

won’t take effect anymore. I mean, after the PATH has been manually changed, the MacPorts command above no longer seems to be effective, even though it says it succeeded in making the selection. That means manually changing the PATH may have compromised the port select functionality in some way; please let me know if you’ve got a clue about what might’ve happened.
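One way to see what’s going on is to list every python the shell can find, in PATH resolution order. My guess is that, since port select only rewrites the python symlink under /opt/local/bin, any directory prepended earlier in the PATH silently shadows it:

```shell
# List every `python` on the PATH, in resolution order; the first hit wins.
# (|| true keeps the script going on machines with no python at all.)
which -a python || true
# Print the PATH one directory per line to check the ordering.
echo "$PATH" | tr ':' '\n'
```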

That’s all. Err, not yet. I found this blog post very helpful.

Equivalent ways to register two point clouds with known pairing in PCL

Given the two point clouds:

pcl::PointCloud<pcl::PointXYZ>::Ptr XPm1 (new pcl::PointCloud<pcl::PointXYZ>);
pcl::PointCloud<pcl::PointXYZ>::Ptr XPm2 (new pcl::PointCloud<pcl::PointXYZ>);

the following three ways to register the two point clouds by means of a rigid transformation are equivalent:

1) Singular Value Decomposition-based estimation of the rigid transform

pcl::registration::TransformationEstimationSVD <pcl::PointXYZ, pcl::PointXYZ> te;
Eigen::Matrix4f T;

// estimate the rigid transform T aligning the source XPm2 onto the target XPm1
te.estimateRigidTransformation (*XPm2, *XPm1, T);

2) Explicit Levenberg-Marquardt-based estimation used in the ICP algorithm

pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;

icp.setInputSource (XPm2);
icp.setInputTarget (XPm1);

typedef pcl::registration::TransformationEstimationLM <pcl::PointXYZ, pcl::PointXYZ> te;
boost::shared_ptr<te> teLM (new te);
icp.setTransformationEstimation (teLM);

pcl::PointCloud<pcl::PointXYZ> Final;
icp.align (Final);
std::cout << "has converged:" << icp.hasConverged() << " score: " << icp.getFitnessScore() << std::endl;
std::cout << icp.getFinalTransformation() << std::endl;

T = icp.getFinalTransformation();

3) Implicit Levenberg-Marquardt-based estimation used in the ICP algorithm

pcl::IterativeClosestPointNonLinear<pcl::PointXYZ, pcl::PointXYZ> icp;

icp.setInputSource (XPm2);
icp.setInputTarget (XPm1);

pcl::PointCloud<pcl::PointXYZ> Final;
icp.align (Final);
std::cout << "has converged:" << icp.hasConverged() << " score: " << icp.getFitnessScore() << std::endl;
std::cout << icp.getFinalTransformation() << std::endl;

T = icp.getFinalTransformation();

Installing py2app for Python 2.6.8 (MacPorts) on Mac OS Lion

I needed to install py2app in my main Python (2.6.8), which was installed a while ago through MacPorts and already has all the useful libraries in it: PyQt, OpenCV, NumPy, etc.

I tried installing it the usual way:

sudo easy_install -U py2app

but it just installed py2app into Apple’s Python 2.7, skipping my main Python 2.6.8.

I then tried downloading the egg file for Python 2.6, setuptools-0.6c11-py2.6.egg, and running

sh setuptools-0.6c11-py2.6.egg

from Terminal, but it didn’t work either, as it threw the error “…no matching architecture in universal wrapper”; my Python 2.6.8 is a 64-bit build (so it can work with OpenCV) and the egg was presumably 32-bit.

So, I had to download the source code and install it manually, see below for the Terminal commands:

cd Downloads
tar xfz setuptools-0.6c11.tar.gz
cd setuptools-0.6c11
python setup.py build
sudo python setup.py install

At this point, to install py2app in the right Python (2.6.8), all I needed to do was use the full path to the newly installed easy_install:

sudo /opt/local/Library/Frameworks/Python.framework/Versions/2.6/bin/easy_install -U py2app

That’s it. You can try it out by opening python (2.6.8) in the Terminal and typing

import easy_install
import py2app
import py2applet

and it shouldn’t throw any error at this point.


Two additional commands are required to build a standalone Python app.

sudo /opt/local/Library/Frameworks/Python.framework/Versions/2.6/bin/py2applet --make-setup /Users/macbookpro/git/JADE/src/

this one produced a file in the home folder (mine was /Users/macbookpro/), and at last

sudo python /Users/macbookpro/ py2app -A

creates the standalone app in the dist folder, again under the home folder.

Installing OpenCV on Mac OS Snow Leopard for Python 2.6

As a result of encountering problems installing OpenCV, I thought I’d drop a few lines with the sequence of steps that worked for me in the end. However cool Homebrew looks (and it probably is), I found it more helpful to use MacPorts and do the whole installation with it.

My specific problem was that Python 2.6 (Framework) was 32-bit, which clashed with the Homebrew installation of OpenCV 2.3.1a, which is 64-bit. Homebrew’s instructions say you can customise the installation to 32-bit but… it just didn’t work, nor did installing Python 2.6 64-bit as a Framework.

So I moved on to try the MacPorts option, which actually managed to install Python 2.6 64-bit as a Framework in the first place, and the rest of the libraries followed smoothly after it, too.

So, installing MacPorts is step one. Use the Terminal from now on.

2) sudo port selfupdate
3) install zlib, sudo port install zlib +universal
4) install Python 2.6.8 (as framework and 64-bit), sudo port install python26 +universal
5) make it the first choice,
sudo port select --set python python26 (or 'sudo port select python python26' on Mac OS Lion)
6) install OpenCV,
sudo port -v install opencv +python26
7) (optional) install PyQt4 and OpenGL library,
sudo port install py26-pyqt4

sudo port install py26-opengl

That’s all.

A quick test to see whether your Python is 64-bit: open python in Terminal and type
print hash('a')
If the result is 12416037344, then you’re set up with a 64-bit Python framework.
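A less magic check, in case you don’t want to rely on the hash value, is asking the interpreter for its pointer size directly (this is not from the original post, just a standard-library alternative):

```python
import struct
import sys

# Pointer size in bits: 64 on a 64-bit interpreter, 32 on a 32-bit one.
bits = struct.calcsize("P") * 8
print(bits)

# Equivalent check: sys.maxsize exceeds 2**32 only on 64-bit builds.
print(sys.maxsize > 2**32)
```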

Hope this helps,

Nearly there, Idoru

Could someone ever be able to mess up the Big Bang through observing it ?

I had the chance to watch the episode “What Is Reality?” from the BBC series Horizon yesterday. Very cool stuff! I think it’s still possible to catch up with it on BBC iPlayer.

The episode goes through open questions concerning quantum reality, its bizarre properties – to say the least – and recent years’ developments. The famous two-slit experiment, elegantly described by Prof. Anton Zeilinger, opened the episode, and my attention stayed glued to it throughout. Although I already knew about the experiment (Young, 1801), I think I grasped more profoundly what it entails this time. Apparently, Feynman once said that “if you wish to confront all of the mysteries of quantum mechanics, you have only to study quantum interference in the two-slit experiment”. I couldn’t help going after more details about it on the Internet straight after the end of the episode… because of the appealing, inquisitive way it had been presented, I suppose, a detective-like sort of way. It somehow allowed the idea that there was still room for improvement in unfolding the mystery.

The experiment is conceptually simple. A laser is put in front of a wall with two vertical slits in it, right opposite the laser. There’s another wall (a screen) behind the one with the slits, and when the laser shoots photons at it, one by one, the pattern taking shape on the screen at the back, formed by the photons passing through the slits, differs from what we would expect. Moreover, once the scientists tried to detect which slit each photon was passing through before landing on the back screen, the pattern switched to a ballistic one, which is what we would’ve expected in the first place, before attempting the photon route detection. Check out Wikipedia (double-slit experiment) to get the nitty-gritty.

Quite incredibly, the scientists’ act of determining which route the photons had taken played a role in the produced results. In other words, the attempt to find the route the photon (or electron, or molecule) took before hitting the rear wall would alter the result of the experiment dramatically… to the point that there were actually two experiments: one, with observation, giving the ballistic pattern as a result, and a second one, without observation, returning a diffraction pattern on the back screen, as if the particle had doubled itself while passing through the two slits at once, behaving like a wave.

The next scientist in the programme picked up on that, explaining that the photon doubled the reality it was in instead of doubling itself: an explanation even harder to digest, which implies another frame of knowledge/beliefs wrapping up the one we commonly use and feel at ease with. Coming back to the experiment, it’s as if the particle were spread over the entire laser-to-back-screen distance (covering multiple routes, too) throughout the experiment. This property is called “nonlocality”.

My little research didn’t take much to end up on something even more interesting (Wikipedia). Wheeler’s “delayed choice” conceptual experiment, built on top of the two-slit one and verified in 2007(!), asks: what if we detected the particle after it comes out of one of the slits, to see which slit it has just taken? No way: the experiment would once again collapse into the ballistic situation, even though the decision to detect its route was made after the photon had left the laser. It’s as if the photon acted retroactively. It’s baffling, I know, but that also means the particle was “spread” in time, too, during the experiment. That thing sensed ahead of time, so to speak, that the scientist would try to catch it, and consequently opted for the ballistic fashion before the decision was taken! That means there’s also a property of nontemporality, and I think that’s what teleporting entangled particles is about (another recent quantum gift). Anyway, these nonlocality/nontemporality properties effectively quantum-lock the experiment.

Anyway, this observation thing… it’s quite difficult to swallow. At the end of the wiki page there’s also a mention of an article written by physicists who claim that by observing the stars we could collapse them into a certain state. The next logical, although imaginative, step is to ask: what if we were able to take a clear look at the big bang of the first three minutes?
Would there be the possibility of collapsing it into a certain state solely through observing it? Are we already “observing” the result of someone else’s previous observation?

I now look at the anthropic principle in a new light, and it makes some sort of sense. I can’t believe what I’ve just written.

Love and kisses, Zaphod ;)

A way of inserting new rows using UTC_TIMESTAMP() in a DOUBLE (or BIGINT) as primary key

Recently I needed to implement a sort of queue system in a database. The PHP/MySQL script had to record new user-generated entries with a primary key based on when they were created. The problem was that UTC_TIMESTAMP(), by itself, isn’t granular enough to generate unique keys for calls requested less than a second apart – and, even if it were that granular, there would always be the remote eventuality of an unpredictable primary key overlap (and consequent data loss) hanging over the system. So I thought I’d keep the UTC_TIMESTAMP() value for the overall time in the primary key and append extra figures to it in order to make the key unique no matter how close two requests were. The appended extra figures represent only a correction value for the sampled datetime and are unrelated to any time measurement.

In this example, I use 2 extra digits, which account for a maximum of 100 new rows that can be inserted within the same second. This limit can be extended. The id column (the primary key) is a DOUBLE here.

After trimming the two top digits off UTC_TIMESTAMP() (the “20” of 2011, which leaves the stub “11” standing in for the year 2011), the result gets multiplied by 100 to make room for the 2 extra digits on the right-hand side. Below is how to do that; 3 is just an example, any number between 1 and 99 will do.

SELECT (100*(UTC_TIMESTAMP()-2e13)+3);

Someone might argue that taking the 2 top digits off makes this method work fine only up until the year 2099. Yep, that’s correct.
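The arithmetic above can be sanity-checked with a quick script (the sample timestamp value is an assumed example, not from the original post):

```python
# Worked example of the key arithmetic; 20111025143000 is the numeric
# form of UTC_TIMESTAMP() for 2011-10-25 14:30:00.
ts = 20111025143000
stub = ts - 2 * 10**13        # strip the leading "20" -> 111025143000
key = 100 * stub + 3          # shift left two digits, add correction 3
print(key)                    # 11102514300003
```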

Let’s now go through the details of how to insert new rows with the composite key datetime + adjustment.

I need to insert a first row in the table, since this example wouldn’t work without an entry’s id to compare against, and I didn’t want to slow down the server’s digestion by verifying, over and over, a condition planned to be used only once and never again; at least in what I was working on then.

INSERT INTO tbl (id,data,notes) VALUES (100*(UTC_TIMESTAMP()-2e13),'test', '');

Then follows the snippet that generates appropriate consecutive primary keys.

INSERT INTO tbl (id,data,notes) VALUES (
IF( (100*(UTC_TIMESTAMP()-2e13)) > (SELECT id FROM tbl AS t ORDER BY id DESC LIMIT 1),
    (100*(UTC_TIMESTAMP()-2e13)),
    (SELECT id FROM tbl AS t ORDER BY id DESC LIMIT 1) + 1 ),
'add data',
'add notes');

The calculation of the primary key is carried out through the MySQL function IF(expr1, expr2, expr3).

The function IF(expr1, expr2, expr3) can probably be better grasped by adding some more words to it: IF(test expr, test-passed expr, no-pass expr). Better? By the way, that’s the function IF(*,*,*), not the IF statement, which is a different thing.

The conditional function in the snippet simply says: if the UTC_TIMESTAMP-based key isn’t greater than the last one found in the table, then someone else must’ve already created it, so the new key will be the last one already there plus 1. This way, I’m sure the new primary key will be unique regardless of the number of entries created so far within the same second.
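In plainer terms, the key selection logic reduces to a two-branch function; here is a Python rendering of it (the sample values are made up for illustration):

```python
def next_key(candidate, last_id):
    # IF(candidate > last_id, candidate, last_id + 1) from the snippet above:
    # use the timestamp-based key if it is fresh, otherwise bump the last id.
    return candidate if candidate > last_id else last_id + 1

print(next_key(11102514300000, 11102514299900))  # fresh second: 11102514300000
print(next_key(11102514300000, 11102514300000))  # collision: 11102514300001
```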

You will have noticed that subqueries have also been used in the snippet above, and, as you probably know, subqueries like that can sometimes be a bit fussy to deal with because of the restrictions imposed on them. Selecting the id FROM tbl “AS t” does the trick and makes it all hold together. Without it, a “You can’t specify target table for update in FROM clause” error would pop up.
Check this out for more details about subquery restrictions.