Saturday, November 12, 2011

GNOME3 Fallback mode 'fix'

I switched to Arch Linux recently. I was used to GNOME2, and GNOME3 is quite a disappointment in the sense that it involves re-learning how to do things, when software is supposed to adapt to us. Anyway, I don't plan to start a discussion on that; googling already reveals many 'gnome3 sucks' posts which explain the 'why' in plenty of detail.

In summary, I tried KDE4 but didn't feel comfortable with it either. I finally ended up in GNOME3 fallback mode, whose default theme is disappointing as well. I found this theme, which makes it look friendlier: http://gnome-look.org/content/show.php?content=145210 , but it had some problems with Nautilus and the background font color. So I modified it and uploaded it here.

Now my desktop reminds me of GNOME2, and sometimes I believe it is even friendlier and better looking ;)

Monday, September 26, 2011

ITK and C++11

Dacap made me aware of the new C++11 standard and the functionality supported in GCC 4.6.1. Right after that, I downloaded GCC 4.6.1 and compiled it to try it with the Insight Toolkit (ITK), particularly because of all those super-templated types in ITK that make coding take longer than it should, mostly because of the amount of typedefs and re-typing of class names. One of the 'wonders' of C++11 is the auto keyword, which happens to be a godsend for ITK users.

My main concern was that the C++11 implementation in GCC is still quite new, so incompatibilities may show up almost instantly; and that was the case with a particular GCC extension that -std=c++0x considers invalid. Therefore, when compiling ITK-based code with c++0x support enabled you may find 'constexpr' errors, particularly in the file vnl.h. Fortunately, a patch has already been created and published here, and it can be applied directly to the 4.6.1 release source code.

To compile it on Ubuntu you can follow the instructions given here. In my case I didn't want to overwrite the existing gcc compiler on my system, nor did I have administrative rights, so I used make DESTDIR=/wherever/you/want install . This way I just had to modify the PATH environment variable to point to the gcc 4.6.1 binaries and tell CMake to use g++-4.6 as the compiler rather than the system one.
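Sketched out, those steps look roughly like the following (the install prefix and source path are example placeholders; adjust them to your setup):

```shell
# install the freshly built GCC into a private prefix (example path)
make DESTDIR=$HOME/gcc-4.6.1 install

# put the new binaries first in PATH for this shell
export PATH=$HOME/gcc-4.6.1/usr/bin:$PATH

# tell CMake to use the new compilers when configuring ITK-based code
cmake -DCMAKE_C_COMPILER=gcc-4.6 -DCMAKE_CXX_COMPILER=g++-4.6 /path/to/source
```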

The patch mentioned earlier doesn't silence the constexpr error by itself; for it to take effect one must also use the -fpermissive compiler flag. Some other compatibility problems are described here.

Moreover, you are very likely to run into another error about a missing include file. To obtain a successful build of ITK my CXX flags were set to: -std=c++0x -fpermissive -include cstddef
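In CMake terms (the variable name is standard CMake; the source path is a placeholder), that amounts to:

```shell
cmake -DCMAKE_CXX_FLAGS="-std=c++0x -fpermissive -include cstddef" /path/to/source
```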

UPDATE: a better practice is to compile GMP, MPFR and the other prerequisite libraries as static libraries before compiling GCC, so that they are statically linked and the generated compiler binaries can be executed on other machines that may not have those specific libraries installed. Instructions for compiling these libraries and GCC can be found here. My particular configure line for GCC 4.6.1 is the following:

configure --prefix=/usr \
--enable-languages=c,c++,fortran \
--enable-threads=posix \
--enable-tls \
--enable-libgomp \
--enable-lto \
--disable-nls \
--disable-checking \
--disable-multilib \
--with-gmp=/tmp/gcc \
--with-mpfr=/tmp/gcc \
--with-mpc=/tmp/gcc \
--with-libelf=/tmp/gcc \
--with-fpmath=sse

Thursday, September 8, 2011

Linux: Fixing resolution problem on external monitor

I recently got a Dell L502x laptop. In order to connect it to an external VGA monitor I am using the mini-Displayport (a la Apple).

The problem is that Linux (Ubuntu Natty) does not always figure out the right display modes for this particular monitor. Unfortunately, neither cvt nor gtf generates the correct modelines, so I was stuck with a lower resolution.

The trick was to dual-boot into Windows :S and use PowerStrip to extract the proper modelines, as explained here: http://www.x.org/wiki/FAQVideoModes
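Once you have a modeline, it can be fed to xrandr by hand. The timing numbers and output name below are placeholders: use the values PowerStrip reported and the output name that xrandr -q lists for your monitor.

```shell
# register the modeline (placeholder timings -- use the ones from PowerStrip)
xrandr --newmode "1680x1050_60" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync

# attach it to the external output (name varies, e.g. VGA1 or DP1; see 'xrandr -q')
xrandr --addmode VGA1 "1680x1050_60"

# and switch to it
xrandr --output VGA1 --mode "1680x1050_60"
```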

A bunch of very useful information can also be found here: https://wiki.ubuntu.com/X/Config/Resolution

Sunday, January 23, 2011

Excellent good-looking plots

When doing research one usually doesn't care about the aesthetics of plots or visualized results, as long as they are clear enough to be interpreted. In presentations and publications, however, nice plots have a noticeable impact: even though they do not change how good the results are, they make the work look more professional.

I have been using Matlab for a while now. It is one of those pieces of software that you usually hate, especially if you come from a better-structured programming background. Leaving execution speed out of the equation, Matlab sucks in many ways, but there are a few pros that build up its popularity, such as extremely easy and straightforward debugging and the toolboxes and functions available on Matlab Central. However, it is very easy to find blogs like Abandon Matlab, where some posts really make the case for leaving Matlab forever and finding a better, more appropriate alternative.

Anyway, enough Matlab hate. The point is that I was looking for nice plots and I remembered that amazing piece of software called Mathematica. I will not discuss the differences between Mathematica and Matlab, other than to say that they were made for different purposes. However, take a look at the following plots generated with Matlab and Mathematica respectively, from the same data (click to see the real image, since Blogger automatically introduces some JPEG artifacts):

Matlab:

load data.mat;
stem(x,y); hold on;
xlabel('x','Interpreter','LaTex');
ylabel('f(x)','Interpreter','LaTex');

Mathematica:

data = Import["test.mat", "LabeledData"];
ListPlot[
  Transpose[Flatten[{"x" /. data, "y" /. data}, 1]],
  Filling -> Axis,
  AxesLabel -> {x, f[x]}
]


The difference is easy to grasp just by looking at the plots: Mathematica does a great job, while Matlab looks merely OK. Something I usually do with Matlab plots is add a grid with grid on, but the result still doesn't look as professional as Mathematica's. Obviously there is room for cheating here; looking at how much Mathematica code is needed, you might say I haven't been entirely fair. However, most of the code in the Mathematica snippet is only needed because the data is read from a Matlab MAT file.

In my opinion, the most striking yet simple detail that Mathematica applies and Matlab does not is antialiasing. It is a very subtle detail, but it makes plots look softer and, somehow, easier on the eyes. There have been some attempts to work around this, such as this script; however, it is a clever user-side trick that simulates antialiasing by resizing the plot, which still doesn't look as good as Mathematica's output (images not shown here, but you can try it on your own).

Beyond that, there are some problems when exporting plots from Matlab. First, exporting to PDF generates a file containing a whole page (e.g., A4) with the plot in the middle and huge white margins around it. This is very annoying when working with pdflatex, since the PDF generated by Matlab must be cropped. Moreover, even when exporting to PNG, the final image does not look exactly like what is shown on screen. This can lead to an endless 'fight' with Matlab's exporting options, sometimes without success. None of this seems to happen with Mathematica, at least in what I have tried so far: the PDF files look great, have the exact size of the plot, and can be inserted straight into LaTeX documents.
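One workaround for the white margins, assuming a TeX Live installation, is to crop Matlab's output with pdfcrop before handing it to pdflatex (the file names here are placeholders):

```shell
# pdfcrop ships with TeX Live; it trims the page to the plot's bounding box
pdfcrop figure.pdf figure-cropped.pdf
```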

I don't want to go into much detail in this post, since discussing plot customization options in Matlab and Mathematica could take a long time. My experience is that Matlab works well enough to show data in an interpretable way, but it lacks the quality of tools such as Mathematica or Matplotlib (the latter is another very nice plotting tool, free of charge). For more examples of the plotting power of Mathematica see here, and for Matplotlib here.