A Simpler MRU for GTKMM C++

08/15/2014

I guess I'm a glutton for punishment, or like the little bird who flies against the wind; I really enjoy developing desktop applications. It's becoming something of a lost art; now it's all about apps for phones and web applications. But I still really enjoy using my desktop and writing applications for it.

The complaint I most often hear from people opposed to adopting Linux as their desktop OS is the lack of applications for it. Well, that won't change until people start writing those apps.

As far as desktop application design goes, everything should be as simple as possible, or so my philosophy goes. All components of application design should follow that simple rule where it makes sense, and I see no reason for that principle not to apply to the humble RUF, or Recently Used File list, also known as the MRU, or Most Recently Used list, and so on…

The MRU (as I'll call it from now on) is a tradition; it should be available in any mouse-driven GUI, and it should be easy to implement in any toolkit or framework. Imagine my surprise when I started doing the research on the MRU object as implemented in GTKMM; it's pretty much the most complicated collection of classes I've ever seen. To display only the recent files relevant to your application you need to employ some sort of filter, and then there are all the complaints I read about those classes on the web. After that I stopped worrying about it and wrote my own class.

I have a lot of respect for Murray Cumming and the GTKMM team, and the whole GTK+ project; it's a huge effort to maintain a unified set of APIs and keep them working for a moving platform like GNU/Linux/Gnome, I am surely aware. I'm also aware that there are usually a huge number of underlying reasons why a developer or organization implements a feature set the way they do. But sometimes you just want a thing to work the way you want it to.

When I got a little deeper into how GTKMM's RecentChooser classes work (there's the base widget class, a dialog object, an action object, as well as a menu object, then the filter you need to employ, and on and on), I simply shrugged my shoulders and told myself "I'm not doing that." I get all the variations; obviously the action object is there so you can add a history option to an edit menu, whatever. I just wanted the user to click on a dynamic menu that contained an MRU.

So with the history out of the way, I bring you a simpler method using the STL and GTKMM's own menu API:

My current application for the Gnome desktop is a classic Model-View-Controller implementation with a menu bar, and of course under the File item is my MRU.

An MRU at the simplest level is a FIFO, and std::deque is perfect for that job. My application's data class (a collection of structs, really) holds a reference to a std::deque object.

I started by adding some file menu entries, file 1..4, and binding them to a dummy function. I knew I could change the menu labels and bind them to an actual function later (this comes from my application's Glib::ustring ui_info):
(WordPress is having conniptions with the XML-heavy Gtk UI string, so look in the sample code for it.)
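For reference, here's a rough sketch of what that UI string likely looks like, inferred from the widget paths used further down ("/ui/MenuBar/FileMenu/file1" and so on); the exact string is in the sample code, along with whatever other File menu entries the real application defines:

static const Glib::ustring ui_info =
  "<ui>"
  "  <menubar name='MenuBar'>"
  "    <menu action='FileMenu'>"
  "      <menuitem action='file1'/>"
  "      <menuitem action='file2'/>"
  "      <menuitem action='file3'/>"
  "      <menuitem action='file4'/>"
  "    </menu>"
  "  </menubar>"
  "</ui>";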
(And from my app’s Gtk::ActionGroup object):
m_refActionGroup->add(Gtk::Action::create("file1",
"file1", "Reopen this file"),
sigc::mem_fun(*this, &ExampleWindow::on_dummy));
m_refActionGroup->add(Gtk::Action::create("file2",
"file2", "Reopen this file"),
sigc::mem_fun(*this, &ExampleWindow::on_dummy));
m_refActionGroup->add(Gtk::Action::create("file3",
"file3", "Reopen this file"),
sigc::mem_fun(*this, &ExampleWindow::on_dummy));
m_refActionGroup->add(Gtk::Action::create("file4",
"file4", "Reopen this file"),
sigc::mem_fun(*this, &ExampleWindow::on_dummy));

The "on_dummy" method is just an empty method; we need it because each ActionGroup entry demands a handler be filled in. We'll swap in the real method later.
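It really is just a stub, something like this:

void ExampleWindow::on_dummy()
{
    // intentionally empty; the real handler gets connected dynamically below
}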

My file-open menu item, when activated, opens the file and also takes the path it got from the file picker and sends it to a method that inserts that path into the deque, after checking the deque's current size:

In a header file we have these declarations (more on the signal array later):
std::deque<Glib::ustring> mru; // our deque of recent file paths
sigc::connection mru_sig[4]; // dynamic menu signals

Then in the implementation file, in our "MRU manager" method (app is just a pointer to an "application structure", a struct holding the deque object among other things):

// If the deque already holds four entries, pop the oldest one off the back
if(app->mru.size() >= 4)
    app->mru.pop_back();
// then add the new file to the front
app->mru.push_front(str);
// pad out to four entries (empty strings) so the menu-update loop below always has four to walk
app->mru.resize(4);

Pretty simple stuff. Now, every time a file is opened it'll be placed at the front of our deque, and the oldest entry will rotate off the bottom of the list every time a new file is pushed on. In this case I'm keeping the number of recent files at 4, but it would be simple enough to adjust that number or make it user-configurable by adding an integer class member and using it in place of the "4" constant above.
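For reference, here's a minimal sketch of the kind of "application structure" the snippets above assume; the struct and member names (AppData, max_mru) are hypothetical, and the sample code has its own layout:

#include <cstddef>
#include <deque>
#include <glibmm/ustring.h>
#include <sigc++/sigc++.h>

// A rough sketch only; names are hypothetical.
struct AppData
{
    std::deque<Glib::ustring> mru;  // recently opened file paths, newest at the front
    sigc::connection mru_sig[4];    // connections for the dynamic menu item signals
    std::size_t max_mru = 4;        // use this in place of the literal 4 to make the length configurable
};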

Then comes the re-assignment of the menu signals. Earlier in the method I grab the menu item widgets with a simple array of Gtk::Widget pointers:

Glib::RefPtr<Gtk::UIManager> _refUIManager; // Typical GTKMM stuff
Gtk::Widget* file[4];

file[0] = _refUIManager->get_widget("/ui/MenuBar/FileMenu/file1");
file[1] = _refUIManager->get_widget("/ui/MenuBar/FileMenu/file2");
file[2] = _refUIManager->get_widget("/ui/MenuBar/FileMenu/file3");
file[3] = _refUIManager->get_widget("/ui/MenuBar/FileMenu/file4");

The paths refer to the file menu entries in the Gtk XML UI definition; if you're familiar with desktop programming with GTKMM you should be aware of how that works. We'll need these as references for the dynamic menu items we'll be connecting the activation signals to. Speaking of which, here's how those signals are connected to our menu items:
int n = 0;
for(std::deque<Glib::ustring>::iterator it =
        app->mru.begin(); it != app->mru.end(); ++it) {
    // keep only the file name portion of the path for the menu label
    const Glib::ustring& label =
        (*it).substr((*it).find_last_of('/') + 1, (*it).length());
    dynamic_cast<Gtk::MenuItem*>(file[n])->set_label(label);
    // drop any previous connection so activations don't stack up on the item
    app->mru_sig[n].disconnect();
    app->mru_sig[n] = dynamic_cast<Gtk::MenuItem*>(file[n])->signal_activate().
        connect(sigc::bind(sigc::mem_fun(*this, &ExampleWindow::on_mru), label));

    if(dynamic_cast<Gtk::MenuItem*>(file[n])->get_label().length() > 0)
        file[n++]->show();
}

We iterate through our list of 4 file paths, take the last path component for the name we display in the menu, and then do a generic signal disconnect on the item. If we don't, signals will stack up on the item and we'd have several file paths flying at our "open file" method.

We then connect a new signal bound with the path data we want the menu item to open.

The signal method is simplicity itself:

void ExampleWindow::on_mru(Glib::ustring& label)
{
    std::deque<Glib::ustring>::iterator it = _app->mru.begin();
    for(; it != _app->mru.end(); ++it) {
        // the menu label is just the file name; match it back to the full path
        const Glib::ustring& text =
            (*it).substr((*it).find_last_of('/') + 1, (*it).length());
        if(text.find(label) != Glib::ustring::npos)
            std::cout << (*it).c_str() << std::endl;
    }
}

The bound text from the label is searched for among the entries in the deque; if we have a match, we have our full path to the recently processed file.

There: a functioning MRU for a Gnome desktop application without the hassle of Gtk::RecentChooser. Couple that with a way of serializing the list* between sessions and you have a full-fledged MRU feature for your app.
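One simple way to handle that serialization (this isn't in the sample code; the helper names and file path are hypothetical) is to dump the deque to a plain text file, one path per line, and read it back at startup:

#include <deque>
#include <fstream>
#include <string>
#include <glibmm/ustring.h>

// Hypothetical helpers: one path per line in a plain text file.
static void save_mru(const std::deque<Glib::ustring>& mru, const std::string& path)
{
    std::ofstream out(path.c_str());
    for(std::deque<Glib::ustring>::const_iterator it = mru.begin(); it != mru.end(); ++it)
        if(!it->empty())
            out << it->raw() << '\n';
}

static void load_mru(std::deque<Glib::ustring>& mru, const std::string& path)
{
    std::ifstream in(path.c_str());
    std::string line;
    while(std::getline(in, line) && mru.size() < 4) // same hard-coded limit as above
        mru.push_back(Glib::ustring(line));
}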

There's always room for improvement; traditionally MRUs have key accelerators ("1, 2, 3…") and that would be a nice touch, and simple to add. And ideally this should be a fully encapsulated object, a drop-in, rather than the collection of methods on a main class as presented here.
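Just as a sketch of the accelerator idea (the <control>1 binding and the reuse of on_dummy here are assumptions, not something from the sample code), the Gtk::ActionGroup::add overload that takes a Gtk::AccelKey does the job, provided the window has the UIManager's accel group added as in the standard menu examples:

m_refActionGroup->add(Gtk::Action::create("file1",
"file1", "Reopen this file"),
Gtk::AccelKey("<control>1"),
sigc::mem_fun(*this, &ExampleWindow::on_dummy));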

The sample code shows a simple implementation using GTKMM 2.4, but I don't believe there's anything in the code preventing it from being converted to 3.0. I hope this will help developers create more applications for Linux; we need all we can get!

There's a very simple sample you can grab here; build it with:

g++ -Wall -std=c++11 examplewindow.cc main.cc -o menu `pkg-config --cflags --libs gtkmm-2.4`

*Keep your fingers crossed, I may publish a much simpler alternative to Gconfmm.


I Did It Again

07/09/2013

The Retina 13″ is on top, the previous MacBook Pro I have is in the middle, I think

I'm so ashamed. I swore off Apple products forever and here I am again with a new MacBook Pro Retina 13″.

I love my Acer S3, but it has problems. It's got 4 gigs of RAM, total. The keyboard is prone to spurious typing anomalies (broken words, typing errors, lots of them). The resolution is really low, even for an ultrabook in 2013. The battery lasts 2 hours on a full charge, 2 1/2 if you really pack it. In 2013 those stats are ridiculous. Plus I've had a banner year so far, so I had some spare bux burning a hole in my pocket.

First I went to the nearby Fry's Electronics and took a look. What I look for in an ultrabook is light weight and power. I look for the lightest book with the most GHz I can get. Then I look for RAM; expandability would be nice but that's REALLY hard to find in an ultra. So, given that the RAM will be static in size, I try for the most I can get. That's also hard. It was impossible to find an ultra with more than 4 gigs two years ago; ALL the manufacturers were worried about price plus meeting the minimum specs for running Windows 7, so 4 gigs was the most they were willing to fit the new, hot-selling ultrabook phenomenon with. Now that things are a little more relaxed it's easier to find ultras with 6, and even 8 gigs. Another thing I crave is low weight. I know I ask a lot, but as a consultant I travel a lot and weight is a serious consideration. One thing I really don't need is a book with a light drive (you know, a CD/DVD drive). I needed to use one last year to install Windows XP on an old but tiny PC I wanted to use as a media server, but before and after, rarely. If you feel like you need to use plastic light media for anything, you need to get acquainted with modern SD multimedia memory devices. (Ever breathe on a CD and all of a sudden not be able to read it? I have.) Yet drive-less books were difficult to find, being largely relegated to the Japanese market. Lately, however, that hasn't been as much of an issue and light-drive-less books are easy to come by here in the States.

At the Fry's nearest to my house I wandered about the notebook aisles until I spied a really great number that met all my criteria. In fact it looked a bit smaller than typical ultrabooks, but with 8 gigs of RAM it would have worked quite well, and I wanted it.

Is there anything worse than a retail store that won't sell you something? I don't think so. I found a sales droid and showed her the ultra I wanted to purchase. She spent the usual 10 minutes fumbling about doing who knows what and finally came back and told me she couldn't sell it to me. I asked her for the display model. She said she couldn't sell me that one either. Seeing red, I left the store. I should have looked online for the model and probably would have gotten it cheaper, but I was really pissed off. I was on a mission now.

If you're familiar with Fry's you know it's the one retail brick-and-mortar store that, like Mitt Romney's "binders full of women," has aisles full of notebooks; there's really no other place like it. The help is utterly worthless but the sheer number of models on display can't be beat. The only other place better WAS CompUSA, may that establishment rest in peace. So my only other shot, though I was loath to take it, was another Fry's. So I decided to haul my butt to the next nearest one, which happens to be the Fry's in Palo Alto. THE Fry's. A Fry's in San Jose is certainly near the pulse of Silicon Valley, but the Fry's in PA would be in the Valley's heartbeat. This is near Stanford University and Page Mill Road, the valley's trail of venture capital repositories. THE Fry's did indeed have a number of models available on display, but not the make/model of the one in San Jose that I wanted. But what it did have was a full selection of MacBook Pros with the Retina display. I took a look at the Retinas. Damn, the display was pretty. They had both MacBook Airs and the "classic look" Pro models, the new ones. The smallest one caught my eye; it was just like my older MacBook Pro but considerably smaller, and with that incredible Retina display. I also knew that my keyboard issues with the Acer would be completely gone. The crisp MacBook Pro keyboard design is probably the best in the business. I also knew that I would have problems running the software that *I* wanted to run on it. The latest MacBooks use the new Intel boot process known as the Unified Extensible Firmware Interface, or UEFI, and like anything unknown, the human reaction is to fear it. Which I did, but it's the replacement for BIOS, and not going away. It also complicates Linux installation. Thankfully it doesn't prevent it, which I first feared; it simply complicates it.

In an effort to be both entertaining, relevant, AND useful, let me briefly summarize the process of installing Linux on a Retina. And let me preface the process by explaining that I have absolutely NO use for MacOS, sorry Mac fanboys. And I have a larger MacBook that runs Windows 7 when I need that; I also stuffed 16 gigs of RAM in the thing so I use it for running virtual machines (usually other versions of Ubuntu; the embedded & thin client world is going nuts for Ubuntu for some reason). What I wanted was a small, light, powerful book for traveling, with MORE RAM. Since most of my work is on Linux, that's what I wanted to run.**

First thing you'll want to do is install rEFInd, and use the "binary zip file". Don't get too caught up in the wordy web page that is the rEFInd home page; the author spends WAY too much time explaining the story of rEFInd in tangents. After resizing your disk, execute the install.sh script as root using the "--esp" and "--drivers" options. I'm not sure that the drivers option is absolutely necessary, but the esp one is. If you don't specify it, rEFInd won't get installed on the disk and when you reboot the machine Linux won't boot. I went 'round and 'round on that one. Then reboot with your Linux distribution ISO of choice written to a plugged-in USB dongle. There are some instructions on the net saying you need to write the ISO in a special way for MacOS; I didn't find that to be true. You should see a new boot manager menu with an Apple logo and a generic USB symbol as button selections. This is the rEFInd boot manager. Select the USB option. Your choice of Linux should be fairly recent so as to take advantage of the EFI boot process; if you insist on using an older distribution you're on your own, I have no idea which BIOS-based distributions work on the EFI system of the MacBook Pro Retina. After the dry-run system boots up (if your distro has a test-drive desktop, I think most do now), go ahead and double click the install icon. Installation is the same as always, but be very aware of what you are doing during the disk editing part of the install; you'll be presented with a gparted dialog (or whatever they do with KDE-based distros). Go ahead and partition the main slices however you want; BUT DO NOT DELETE THE EFI PARTITION. If you want to use Linux as the sole OS on the Retina, that's fine as long as you do not touch the ~200 Meg boot partition at sda1, or whatever device node your boot disk is (usually sda1 on Debian systems). This is the partition that should be clearly labeled "EFI" in the gparted partition list. I wanted to use this book solely for Linux, so when I got to this step I blithely deleted all partitions and created a main slice and a swap area, which normally would work fine. I installed Linux (Mint in my case) and when I rebooted: NOTHING. The machine wouldn't load Mint.

After doing some research I learned about the newer EFI boot process, that rEFInd was needed to install a new boot loader, and that you don't want to reconstruct an EFI boot partition from scratch. After messing around with re-creating EFI boot partition structures for 3 days (they have to be a certain size, have a certain directory structure, have certain files…) I finally re-installed MacOS Mountain Goat* or whatever and re-tried my Linux installation, this time without messing with the EFI partition. It worked like a charm; my new Retina was running Mint 15.

Here are some after-install pointers: I had to install and open up the curses-based alsamixer app and unmute all the sound devices; simply upping the volume controls or messing with them in any way using the usual Gnome controls didn't give me my sound. I also edited /etc/modprobe.d/alsa-base.conf and added "options snd-hda-intel model=mbp101" as the last line in that file. The HDMI port on the right side doesn't appear to work, unfortunately, and neither does a mini DisplayPort to HDMI adapter. I was really looking forward to having HDMI out. I don't know if a miniport to VGA or DVI adapter will. Also, this book appears to have two display adapters, one from Intel and one from nVidia; don't install any of the many nVidia driver options available in the repositories, they don't appear to work, while the Intel driver works great. It's kind of weird getting a full 2560×1600 resolution on a 13″ notebook LCD. That resolution is so high that I had to step on it a bit to make everything readable. I re-compiled a Mandelbrot-generating X app I wrote that also prints the execution time in the shell if it's launched from there; running it on the Acer took about 9 seconds, on the Retina it takes 5. I get the sense also that this thing has four full Core i5 @ 2.5 GHz processors, not just two real and two virtual ones. I've also read reports of the Retina running very hot on Linux, but I've not noticed this.

The 13″ Retina is a very powerful ultrabook, a true "Ultra". I love it. It's really the perfect size with the perfect power and RAM. It'll run at least twice as long on a full battery charge as my trusty-but-slower Acer S3. I'm looking forward to doing a lot of work on it. I hope Linux developers down the road get the ports working, but that's not going to hold me back.

UPDATE: I spent the latter half of yesterday building and installing the 3.9 kernel and some Intel support libraries and voilà! The HDMI port works!!! I'm staring into the warm glow of my Vizio 26″ HDTV as I type this. It's funny, the Retina's LCD is STILL higher rez than the Vizio, but it's nice to have a "console" sized display. The MicroSD slot on the right works too! I LOVE THE RETINA!! Pricey, and locked down as far as RAM & SSD go, but I've come to live with that from ultras. If you're looking to run Linux on the 13″ Retina, follow the above directions and then grab the 3.9 kernel and install it. Also grab the Intel graphics stack components here. After installing everything (yes, I went ahead and compiled everything from source, getting missing libraries from the baseline repositories when they popped up) I had control over my HDMI and SD ports.

* I have to say that Apple really saved my ass in this regard; the 13″ Retina (and I assume all the latest Pros) doesn't come with much in the way of paperwork or media, almost none at all in fact. Just the usual worthless warranty "square". There is no Mac OSX install disk, nothing. Just the MacBook and that funky little white power supply. Scary, but in some ways refreshing for a faux minimalist such as myself. Re-installing Mountain Lion was a simple matter of hitting an option-R key combo during the boot process, using the disk utility to re-partition the drive the right way, and then selecting the Mac OS re-install option. Apparently, since I had already configured the book to use my wifi, it simply retrieved that configuration from *wherever* and went to town. After a warning that the re-install process would be slowed by my use of wifi (a hard ethernet connection would obviously be faster, but who cares?) it automagically connected to an Apple server (I assume) and re-installed Mountain Lion. The whole thing was really kind of amazing from a geekly perspective, and very easy.

** The Apple droids will say that MacOS is a version of Linux. No, it's not. It resembles it in better than superficial ways, but it's not.

Let’s Hack XFC

04/10/2012

I was delighted to find an OSS solution for Xfce development with C++ in XFC; it uses GTKMM as its model and development with it is very similar. I was also delighted when Bo Lorentsen, XFC's current chief maintainer, agreed to take me on to assist. I've been wanting to participate in an OSS effort and had a hard time finding one until now. I wanted two things out of an OSS project: one that really needed help, and one that I really believed in. There's no shortage of the first condition, but the second, that was a little tough. Web 2.0 is still raging hard, web technologies are popping up faster than the problems they are designed to remedy, and systems and native programming solutions are taking a back seat to these technologies. Where UI/UX once referred generically to any GUI technology, it now almost exclusively describes web presentation. It's only natural and certainly makes sense in a Web 2.0 world. But there's still a need for native code, at least until the perfect web OS comes along (everyone hold your breath…)

Jeff Franks, the original XFC code master, has been MIA since 2005, and Bo has been stepping up to the plate since then. He's had help here and there, but I think he's been going it alone for the most part. This is a perfect fit for me; I'm really excited to help out, and I hope he feels the same. I've set up a repository on GitHub and made some updates to the web site; eventually I hope to get access to update the site directly, relieving Bo of that responsibility.

The same Google searching that informed me of Jeff Franks' current status also revealed some very positive reviews of the initial releases of the kit. Some commenters even went so far as to say it was superior to GTKMM in a lot of ways.

Due in no small part to Linus Torvalds' recent comment that Gnome 3 had become an 'unholy mess', it seems to me that more Linux users are going to seek out desktop alternatives, and with Xfce especially we should see much more support. Xfce is a natural alternative for users who want a sleeker, more responsive alternative to Gnome. With that we need toolkits that enable developers to write native applications for it. XFC is one answer to that problem. I hope you can support it.

Goodbye Gnome!

04/07/2012

Well, it's not really goodbye, but I will be saying goodbye to GTKMM, and to Gnome as my main desktop. I'm switching to Xfce as my desktop and as my Linux GUI development platform. It's not much of a switch; Xfce depends on GTK+ and DBus and whatnot, hardly a complete 180.

My little experiment of attempting to pull out the current GTKMM 3.0 infrastructure and replace it with the stable 3.3 stuff didn't work. It wasn't a complete disaster, but as I wrote to fellow GTKMM mailing list member 'Phil', replacing gdk-pixbuf (a dependency of Gtk+) current with stable resulted in a broken DBus area, specifically whatever deals with updating the status of the trash. After I recompiled ALL the stuff leading up to that pixbuf library, and after I compiled and installed IT, I rebooted and logged back into my desktop, and the familiar trash icon was replaced with a 'red x', meaning it was broken. Not only was the status icon busted, I couldn't access the trash directory from the desktop.

I thought about carrying on, but not having access to the trash directory was a problem; it meant I had to shell out every time I wanted to recover something, and I wasn't even exactly sure where it was in the file system. Forget that. Experiment over. It was a failure.

Out of the ashes of failure rises a new bird; not all is lost. Seems that Xfce has its own version of GTKMM: XFC. Looks like a winner. The only thing I need to do is install it, and perhaps write a project wizard for Anjuta. That will be easy, and I'll look forward to it. That will give me an easy and useful project I can contribute to the community lickety-split. I still need to re-install Mint/Xfce on that machine. Might as well get to it.

Enough With Gnome!

03/13/2012


Believe it or not, I'm really tired of grousing about Gnome. Or Gnome 3, to be specific. In earlier episodes of my whining (and it wasn't my intent to write a complaint blog) I've gone on at length about my dual-head setup, how important it was to me to have it just so, and blah and blah…

I thought I had it fixed by simply switching to Mint, and then that turned out to be a bust after another update. Well, I never fixed it completely, settling for a dual-mode setup that left the apps & system menu on my netbook. It simply wasn't what I wanted, so I went ahead and made the switch to Xfce.

It was relatively painless; I simply flipped on the Xfce packages in Synaptic & let 'er rip. (I've tried using that other package manager thing but it's slow (I blame Python) and you can only select one package at a time. What is that?) I selected everything I thought I would need for Xfce, like Thunar, and any other related libs. I also read that most apps would probably be compatible with Xfce as well; even gsettings had an Xfce-specific lib. After downloading the necessary components I logged out, logged back in, and had a GUI that was completely recognizable. And y'know what else? The desktop was much more responsive. That counts on an underpowered netbook.

Fallback Mode

03/12/2012

I don't know why I'm so married to the Gnome desktop. I should trip down Torvalds' route and use Xfce. This latest questioning of my desktop philosophy comes from the latest X11 update I received from the update gods at ???; I'm not exactly sure WHO is in charge of updating each of the particular forks of the various components that congeal to create Mint Lisa. At least I assume an update caused my latest hassle.
As I remarked previously, I like to have my desktop a certain way; chief among these preferences is the ability to have my X output on a large Vizio LCD. If I can't have that I have nothing. Well, last night I received another update and blithely accepted it. Today, after I powered back up, my precious display settings were munged. I tried to get back to where I wanted to be but I had the same problem that I had with Oneiric: I could not set the display up on the Vizio and turn off the netbook's LCD, OR make the Vizio my main display. Obviously the X server code itself has gone through some kind of change. Frustrated, I searched for a solution and thought briefly about switching again, hopefully for good, to Kubuntu, or trying out Xfce, or some other desktop.
Then I ran across some mention of Gnome Fallback Mode.
But my settings applet doesn’t have the Forced Fallback Mode switch.
Not to fear: one more duckie search and I found a Gsettings tweak that did the job; in a terminal enter this:
gsettings set org.gnome.desktop.session session-name 'gnome-fallback'
then log out, and when you're back in you should see your applet panel and system menu on whatever you have set as your main display. I'm saved from the pain of being a Gnome refugee one more time. This is probably all I needed to do with Ubuntu Oneiric. Now I need to add this to the lengthy list of post-installation procedures for the next time I need to install/upgrade/shoot my netbook.
This solves my immediate UI problem, for now. But on the development front I'm still having problems building some apps from source. For example, I'm trying to learn how to use DBus, and since my preferred language is C++ I'm trying to learn libdbus-c++. It took a considerable amount of time to figure out that this needed to be my replacement for GConf; at first I thought it was supposed to be XConf. OK, I understand things change. But this is on top of problems I'm having building GTKMM 3.0 example programs, and I'm afraid of tinkering too much with my system for fear of breaking things. I'll need to do my development with the old 2.x kit until things settle down; I'm hopeful that later Mint releases will have all this sorted out. This is a real problem if Gnome and Ubuntu (as separate, but related, concerns) care to compete with RedHat and KDE, Xfce, etc.

Of course situations like this are part of the price of admission for the joy of running open source software. But I'm not alone. It's very easy to find many users who aren't happy with the current state of affairs in Gnomeland and these Debian forks of Linux.

Never Mind

03/06/2012

Gilda Radner as Emily Litella

OK, so apparently application.h has been removed from the kit, and on all distributions. The commit entry I cited previously is a little misleading. I guess they should say something like "we NEED to put application.h back". No word on when it'll be put back in, either. Oh well. We still have the old 2.4 system to play with. Now I can finally get back to coding. I have my own ideas for a password-keeping applet I want to write for Gnome. This was all kind of a waste of time anyway since I'm not sure how GTKMM works with the Gnome panel framework; I think I'm going to need to write it using straight C/GTK+ (and THAT isn't the case either, so that's next on my list of things to do: learn how to write a gnome-applet using GTKMM). Anyway, I think it's going to be useful so it'll be my first submission to GitHub. I'll announce it here so all 0 of you can look for it. Keep reading, Anon.
(An email exchange I had with members of the GTKMM Mailing List):
On Tue, 2012-03-06 at 17:55 +0000, (I Wrote, in reply to another list member)


> Application.h is missing from gtkmm 3.0 right now, I assume it'll be
> put back in in later releases of your distro.
>
> http://osdir.com/ml/commits.gnome/2011-11/msg06140.html

Yes.

> I spent all yesterday infact trying to understand what the situation
> is.

Note that you can at any time see the recent changes:

http://git.gnome.org/browse/gtkmm/log/
And you can see what is in each tarball release:
http://git.gnome.org/browse/gtkmm/tree/NEWS

That's Murray Cumming himself cutting in with the 'Yes.' Well, a one-word reply from the chief wrangler and debutante of the ball is good enough for me.

What I didn't include was WHY it took me all day to understand exactly what was missing and why, when it would be put back, and which distros were affected. Murray's succinct reply doesn't really explain any of this, nor is it exactly what I would call 'plain' from the commit logs. THIS is why I am a better customer support engineer than most code monkeys. They, 100% of the time, miss these simple explanations. I appreciate that this information, or some semblance thereof, has been documented on a publicly available feed/outlet, but I do have something that resembles a life. And it is not my life's mission to be aware of every information outlet that the GTKMM team chooses to use.
I have Google, or DuckDuckGo. I rely on them to take me to the information I need on the net. Perhaps to my downfall, I understand that, but any more than that and I sincerely believe I am being asked for too much as an information consumer.

First I MUST make an unholy bargain with Google to trade my life for the information *I* want, then I am asked by the head meat-gatherer of a technical website to stay on top of the motions his group is making with regard to a technology they develop. THIS IS ON TOP of all the mundane little things that are a LEGAL and functional REQUIREMENT that I MUST HAVE to enable my life to function at as rough a level as I am willing to tolerate (you know, taxes, traffic laws, bills, local taxes, sales taxes, VAT, chewing gum, toilet paper…)

But Murray, if you're reading this (and I know you aren't, so I feel at complete ease in saying this, as much respect as I do have for your efforts): THAT IS NOT GOING TO HAPPEN.

Well, maybe I'm being too cynical. Perhaps part of being a good netizen is keeping a list of the blogs of all the developers whose tools I use and checking them daily. No, hourly. That way, they don't have to be held liable for any damage or mayhem I might cause using their tools due to my ignorance. Then I should be able to include a disclaimer with all my consulting work that I can't be held liable for any mistakes in my code, as I can't know what uses my code will be put to or know all the documentation outlets fed by the maintainers of the myriad collection of tools I need to get the job done… hey, we already do that. As Emily Litella used to say: "Never mind."
