03/28/2012 § Leave a comment
I think I would be good at evangelism. I’ve seen ads from companies looking to fill these roles from time to time, and I’ve ignored them, chiefly because I don’t see myself as a marketer, and product evangelism is really a form of marketing.
But I’ve been re-thinking that logic over the last few days. I’m not so sure that handshaking and deal-making are my forte, but I think being a technology booster might be. I get excited about technology and enjoy talking about it. Could I walk into a Strength Through Technology rally and get people nuts over a particular product? I don’t know; I’ve never done it. I see myself as willing to try it, though. Writing about it is certainly one of my strengths. I’ve done that to a small extent, having written manuals and some marketing literature from a technical perspective for a number of industrial automation products in the past. It’s certainly something I think I’d like to try.
Steve Jobs was the most notable and probably the most successful evangelist ever. It helps to be a founding member of the company, sure, but all that does, I think, is strengthen the belief he had in his product. You have to admire the guy; he brought a subtle sense of style to the Apple product line. Before he re-joined Apple, one of their most innovative products was the Newton PDA. I had a chance to play with one, and it was a pretty nifty machine. One of its more remarkable features was its handwriting recognition, which actually worked for me, although I know it was roundly mocked in the media. The Newton was cancelled in 1998, not long after Jobs rejoined the company, and in 2001 the first iPod was introduced. Although it served a rather different function (audio media playback only), it featured a smaller, sleeker design and a slick ad campaign that has typified Apple’s marketing since. That was Jobs.
Those are some pretty big shoes to fill. I wouldn’t presume I could be a visionary on that level, but it’s certainly a model to aspire to. Every successful technology has some aspect that makes it interesting; otherwise it wouldn’t be successful. The trick is getting everyone excited about it, and a good evangelist finds ways of doing exactly that. It’s a challenging goal, and one that interests me.
03/27/2012 § Leave a comment
I got a job working on a web site leveraging .NET technology, so I really wanted to get my MacBook Pro working. Hoping the power supply problem was some temporary state of wackiness, and not really looking forward to either adding features to a C#-powered .NET/IIS website using Mono or running a Windows 7 virtual machine on my under-powered netbook, I said a small prayer to Alan Turing and fired up the MacBook Pro. HOT DAWG! The machine roared (or actually, more like “blinged”) right up!
I guess the hardware deserves more credit than I’d given it. Evidently the confused state of the power supply had settled during the odd month or so I had the machine powered off, and the only problem left was accessing my Gmail account. It seems SSL certs really like the client machine to have its clock set correctly. Being powered off for so long allowed all power to drain from the machine (probably the critical factor in getting the power supply reset), which set the clock back to the year zero, apparently January 2001 for the Mac. I did a quick update, as Microsoft is fond of pushing out updates about as often as your typical Linux distro, and voilà! I was back online with 64-bit Windows 7 and that awesome 2.2 GHz dual-core i5 processor. Thank you very much, Steve, you magnificent stallion. This does nothing to restore my faith in Apple hardware or software, by the way. My next machine will still be that 1.7 GHz i7 ultrabook, if I can find it when I get ready to buy, or something comparable.
Mono is an interesting .NET runtime for Linux, by the way, and it seems to work fairly well. The only hitch I’ve seen so far: one example program I typed in measured its own processing speed using the .NET Stopwatch class, and that class was not available in the Mono package for Ubuntu, which should be the latest port. No big deal; I simply commented out the relevant code and the rest of the program ran like a champ. I then downloaded an IP-based utility written for Mono, and that ran great as well. I got curious about how that code would run on a Windows machine, but I hadn’t been able to test it, since the Mac was still out of commission at the time. Now I can test it!
03/21/2012 § Leave a comment
I’m writing a Gnome app to hold all my passwords and export its data to an Android app, maybe with a simple sync mechanism that periodically checks over the net that both instances are in sync somehow. The first part is done: I have some code written that I will use to store the data in a simple way. I was going to use a Berkeley DB key/data pair, but I decided that a plain old encrypted file would be better. With that done, I needed the encryption piece. As in everything I write, I try to keep things as simple as possible. I found a dandy little Rijndael encryption class and am adapting it to my needs. One need was portability: I think a Windows version of the app is a good idea, so I need to make the code as portable as possible.
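As a sketch of the “simple way” I mean, here’s roughly the kind of flat record format I have in mind. The names and the tab-delimited layout are illustrative only, not the app’s actual code; the resulting blob is what would get run through the cipher before hitting disk:

```cpp
#include <map>
#include <sstream>
#include <string>

// Illustrative only: flatten site/password pairs into one
// newline-delimited blob (one "site<TAB>password" pair per line).
std::string serialize(const std::map<std::string, std::string>& entries) {
    std::ostringstream out;
    for (std::map<std::string, std::string>::const_iterator it = entries.begin();
         it != entries.end(); ++it) {
        out << it->first << '\t' << it->second << '\n';
    }
    return out.str();
}

// Reverse of serialize(): split the blob back into pairs.
std::map<std::string, std::string> deserialize(const std::string& blob) {
    std::map<std::string, std::string> entries;
    std::istringstream in(blob);
    std::string line;
    while (std::getline(in, line)) {
        std::string::size_type tab = line.find('\t');
        if (tab != std::string::npos) {
            entries[line.substr(0, tab)] = line.substr(tab + 1);
        }
    }
    return entries;
}
```

A format this dumb round-trips cleanly, and there’s nothing in it that won’t compile anywhere C++ does, which matters for the Windows port.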
The Rijndael code was easily ported, but not as I found it. The author of the original code used a Microsoft-specific STL extension, and made no provision for making the relevant portions cross-compilable.
The issue here is that Microsoft’s std::exception contains an internal string to hold an error message, so you don’t need to derive your own class to get that extra capability. Which is nice, although it’s hardly a chore to derive your own exception class. Plus it’s outside the specification, which is clear on this point: we’re meant to derive our own exceptions.
But that’s OK; I’m not exactly sure what the danger is in how Microsoft implemented it, and it’s nice to use what’s there. What is not nice is to write some otherwise very useful code around a non-standard class and then go blithely on your way as though you don’t know or care what anyone else is doing. Unless you live in a dark, Microsoft-licensed cave, it’s difficult not to see that such code isn’t going to port while it leans on these extensions.
And there’s nothing about the code that requires Microsoft-specific technology at all. This is a cryptographic solution, not a Microsoft one; it should be cross-compilable on any C++ platform. The solution is easy enough: add conditional compilation macros, which I’ve done. In the header that defines the Rijndael class I added a new namespace above it (the class itself was placed in the global namespace, also easily fixed), and derived an exception class from std::exception that implements the particular features the author wanted out of exception, and voilà, all done. Wrap that declaration in #ifdef macros keyed on the compiler (MSVC defines _MSC_VER) and you have code that should compile on a standard C++98 compiler and a Microsoft one alike.
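A minimal sketch of what that guarded declaration can look like; the namespace and class names here are mine, not the original author’s:

```cpp
#include <exception>
#include <string>

namespace crypto {

#ifdef _MSC_VER
// Microsoft's std::exception already accepts and stores a message,
// so the class can simply inherit and forward it.
class RijndaelException : public std::exception {
public:
    explicit RijndaelException(const std::string& msg)
        : std::exception(msg.c_str()) {}
};
#else
// Standard C++98: std::exception carries no message, so store one
// ourselves and hand it back through what().
class RijndaelException : public std::exception {
public:
    explicit RijndaelException(const std::string& msg) : msg_(msg) {}
    virtual ~RijndaelException() throw() {}
    virtual const char* what() const throw() { return msg_.c_str(); }
private:
    std::string msg_;
};
#endif

} // namespace crypto
```

Either way, the cipher code just throws crypto::RijndaelException with a message and catches std::exception, and both compilers are happy.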
The takeaway from this: it’s only an effort if you don’t design your code to be portable from the start. Even if you only ever write Windows apps, or Android apps, or whatever, you never know when your code might become the kernel of a brand new technology, especially if it performs a generic function, like encrypting data.
This is the code I ended up with. I’ve tested it on both Windows and Linux, and it compiles as-is on both. I hope that includes MacOS; I promise I’ll test it as soon as I can…
03/16/2012 § Leave a comment
Unless you live under a rock, you’ve no doubt noticed the higher volume of chatter regarding privacy and Google. The deal you make with Google is this: in return for hits that line up better with your search terms, Google gets to track your use of the web and any personal information it can glean from it, including your spending habits, your lifestyle, anything about you it deems marketable.
Although consciously aware of this Faustian bargain I’ve implicitly made with the search giant, I’ve gone ahead and used Google to find the information I needed. It was (and probably still is) the best way to find anything on the net. I’m becoming more and more aware of the eyes over my shoulder, however, and as a consequence I’ve started using DuckDuckGo more and more.
One of the things that has changed in the world of marketing with the tech age is dramatic access to a new wealth of tools for analyzing user habits. These unobtrusive technologies have really put the advertiser in the driver’s seat without putting an actual poll taker anywhere near the subject. I put Google first and foremost atop this cabal of marketing spies; it still stands behind its philosophy, although I think there’s a real perception problem out among the unwashed masses. I feel for a company like Google in some ways; it’s difficult to do something well without getting back some measure of the effect your methods are having on your target. I will not, however, exchange understanding for acceptance. There are alternative methods for gathering the data a marketer needs to understand and improve the process. No, I’m not going to research and list the alternatives; I’m not a marketer, and I don’t do that kind of work. I just know they exist.
With Google and its war chest, it’s plainly obvious that if you want to be a successful online marketer, it would pay to create the tools Google wants. If, like me, you’re not interested in helping the cabal swap your personal information with each other like a used car, there are some things you can do to make life much more difficult for them.
Be sure to install the tools listed here (scroll to the bottom of the page.) Most have plugins for the more popular browsers.
- scorecardresearch.com – Full Circle Studies is a market research company that studies Internet trends and behavior. They work with distributors and content providers to develop an anonymous, census-level analysis of Internet usage. Using data gleaned from its content provider partnerships, Full Circle Studies constructs a census-type view of visits to a website, which it uses to develop an understanding of broader Internet usage patterns.
- effectivemeasure.net – Effective Measure is a web analytics company that provides data about visitors to a website. Their patent-pending Digital Helix technology addresses cookie deletion issues and unique visitor audience calculations. This allows advertisers and publishers to define and measure audience numbers accurately without duplication, and track data points over a specific time period.
- pro-market.net – Datonics (formerly AlmondNet) is an aggregator and distributor of proprietary behavioral purchase-intent, life-stage, and demographic data. Datonics provides custom keyword-based segments to facilitate the delivery of ads to online consumers.
- media6degrees.com – Media6Degrees is a socially targeted advertising company. Powered by social media, they customize audience segments for advertisers using social graph data gathered across social media platforms. Media6Degrees provides marketers with targeted ad delivery inside and outside of social media sites.
- bkrtx.com – BlueKai operates an auction based, online data exchange. Unlike ad networks, BlueKai does not sell ads or impressions but provides on-demand data to networks, ad exchanges, agencies, creative optimizers, demand and sell-side platforms. Marketers and networks utilize BlueKai’s aggregate shopping and research data to improve ad targeting, while publishers can earn revenue as intent data providers. BlueKai partners with data solution companies including datalogix, TARGUSinfo, Experian, Bizo, Nielsen, Acxiom, and Polk to aggregate a large source of high performance intent data.
- exelator.com – eXelate Media provides a marketplace for publishers to sell their anonymous data to advertisers for ad targeting. Aggregate data is sold to advertisers through its exchange and affiliate partnerships with ad networks and publishers.
- cpxinteractive.com – CPX Interactive is a global marketing company. They provide advertisers and publishers reach, content, and premium networks. They “ensure advertiser success while monetizing 100 percent of publisher premium to remnant inventory.” (What does that even MEAN???)
I probably sound like I’m wearing a tinfoil hat. But as an exercise fire up wireshark some time and take a look at the information flowing between you and google, or any other web service. The sheer amount of chatter is astounding. So if this growing concern is eating at you, one option is to simply stop them in their tracks.
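One blunt way to stop them in their tracks, beyond the browser plugins: null-route the tracker domains from the list above in /etc/hosts, so they resolve to an unroutable address system-wide. A sketch:

```
# /etc/hosts - send the tracker domains nowhere (0.0.0.0 is unroutable)
0.0.0.0 scorecardresearch.com
0.0.0.0 effectivemeasure.net
0.0.0.0 pro-market.net
0.0.0.0 media6degrees.com
0.0.0.0 bkrtx.com
0.0.0.0 exelator.com
0.0.0.0 cpxinteractive.com
```

Note that /etc/hosts matches exact hostnames only, so a tracker served from a subdomain would need its own line; the browser plugins handle that wrinkle more gracefully.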
03/13/2012 § Leave a comment
Believe it or not, I’m really tired of grousing about Gnome. Or Gnome 3, to be specific. In earlier episodes of my whining (and it wasn’t my intent to write a complaint blog) I’ve gone on at length about my dual-head setup, how important it was to me that it be just so, and blah and blah…
I thought I had it fixed by simply switching to Mint, and then that turned out to be a bust after another update. I never fixed it completely, settling for a dual-mode setup that left the apps & system menu on my netbook. It simply wasn’t what I wanted, so I went ahead and made the switch to Xfce.
It was relatively painless: I simply flipped on the Xfce packages in Synaptic and let ’er rip. (I’ve tried using that other package manager thing, but it’s slow (I blame Python) and you can only select one package at a time. What is that?) I selected everything I thought I would need for Xfce, like Thunar, and any other related libs. I also read that most apps would probably be compatible with Xfce as well; even gsettings had an Xfce-specific lib. After downloading the necessary components I logged out, logged back in, and had a GUI that was completely recognizable. And y’know what else? The desktop was much more responsive. That counts on an underpowered netbook.
03/12/2012 § Leave a comment
Although I’ve strayed, I don’t use Apple products as a general rule. I broke down last year and bought a MacBook Pro, though, as I needed a machine more powerful than my trusty but slightly underpowered Gateway-branded Asus netbook; I had to run some very heavyweight programming software from Rockwell Automation, plus some games I like, which meant running Windows. So I settled on the Mac. It was the most bang for the buck in the store: a 2.2 GHz dual-core i5 processor with the ability to overclock (or, more like, step up) to almost 3 GHz, and 4 GB of RAM. It was the best I was going to do. I was shown a laptop rated at over 3 GHz, but it was bulky, and I’m a real nut about weight and space; the Mac was much sleeker and lighter.
Well, I regret that decision now. Later in the year the excellent line of Acer S5 ultrabooks was introduced, and I even found a different ultrabook (the make escapes me) that featured an i7 processor running at 1.7 GHz. Plenty of power for what I needed. I thought briefly about a MacBook Air, but I didn’t like that you can’t replace the SSD; apparently it’s some sort of fixed assembly within the Air.
But I needed a machine right then, so I walked off with a BRAND NEW (not a refurb) MacBook Pro. And it worked great, for about six months. Then the power supply self-destructed. I’ve since removed the hard drive and recovered my more important files, but seriously? After six months?
Before that incident capped my first and last little fling with Mac products of any kind, I had steered clear of Apple for a history of reasons. My first issue was the cost. I would have loved to get a Macintosh II when it came out in the late 80s, but I couldn’t afford the steep price. The Atari ST was a much better deal, and one could get a dandy C compiler for it, something called Laser C. I loved that setup. Then the iPod came out, which I ignored for a long time because I already had a PMP, a Creative Nomad II, which worked fine. Then Bluetooth came out, and I knew I had to have it. Earphones without a cord seemed like a revelation to me, but the iPod wasn’t getting Bluetooth anytime soon. Still, to this day, if you try to listen to audio output via A2DP you’ll get nothing; you have to buy an external USB Bluetooth adapter and stack to get hi-fi audio output from your MacBook Pro.
I actually found an inter-company memo on the net somewhere, supposedly from the Steve himself, saying in effect that “we make too much on third-party licensing to build the iPod with Bluetooth,” but I don’t have a URL for that. Later, my live-in girlfriend at the time bought a Macintosh, and I tried to learn the API but didn’t get very far. The tools I found were a Pascal tool chain from the same people who later became CodeWarrior, and it just didn’t feel right, so I stopped bothering with it. I didn’t much care for the “love” I was feeling from Apple with regard to their customers, developers, products, the whole schmear. So I never really bothered with any of their offerings after that, until the MacBook Pro.
I suppose I should simply take it to the nearest Apple Genius Bar and see what they have to say about fixing it. But it’s not like I don’t have a life, so it sits under my desk.
In the article I swiped that pic up top from, Apple’s Sr. VP of Industrial Design Jonathan Ive discusses his design philosophy. It boils down to being a collaborative effort.
Wow, there’s a revelation. I’ve been in many organizations during my career; mavericks who aren’t successful at being mavericks are quickly shown the door. EVERY effort I’ve been involved in where I wasn’t a contractor was certainly a team one. Apple’s enlightened “new” approach to working with design resources isn’t new; it’s simply logical. What they did have was the Steve himself, and whatever I might think about the man, there’s no denying he had a different approach to product design and marketing.
Now that the Steve is gone, after building up a mighty empire by carefully choosing what technology goes into the company’s products, can Apple continue to be dominant? Jobs obviously had very tight control over every aspect of product development, and total control over the direction of the company. It will be interesting to see where Apple goes next.
But I’m still not buying any more of their products.
03/12/2012 § Leave a comment
I don’t know why I’m so married to the Gnome desktop. I should go down Torvalds’ route and use Xfce. This latest questioning of my desktop philosophy comes from the latest X11 update I received from the update gods at ???; I’m not exactly sure WHO is in charge of updating each of the particular forks of the various components that congeal to create Mint Lisa. At least I assume an update caused my latest hassle.
As I remarked previously, I like to have my desktop a certain way; chief among these preferences is the ability to put my X output on a large Vizio LCD. If I can’t have that, I have nothing. Well, last night I received another update and blithely accepted it. Today, after I powered back up, my precious display settings were munged. I tried to get back to where I wanted to be, but I had the same problem I had with Oneiric: I could not set the display up on the Vizio and turn off the netbook’s LCD, OR make the Vizio my main display. Obviously the X server code itself has gone through some kind of change. Frustrated, I searched for a solution and thought briefly about switching again, hopefully for good, to Kubuntu, or trying out Xfce, or some other desktop.
Then I ran across some mention of Gnome Fallback Mode.
But my settings applet doesn’t have the Forced Fallback Mode switch.
Not to fear: one more duckie search and I found a gsettings tweak that did the job. In a terminal, enter this:
gsettings set org.gnome.desktop.session session-name 'gnome-fallback'
Then log out, and when you’re back in you should see your applet panel and system menu on whatever you have set as your main display. (To revert, gsettings reset org.gnome.desktop.session session-name should restore the default.) I’m saved from the pain of being a Gnome refugee one more time. This is probably all I needed to do with Ubuntu Oneiric. Now I need to add this to the lengthy list of post-installation procedures for the next time I need to install/upgrade/shoot my netbook.
This solves my immediate UI problem, for now. But on the development front I’m still having problems building some apps from source. For example, I’m trying to learn how to use DBus, and since my preferred language is C++ I’m trying to learn libdbus-c++. It took a considerable amount of time to figure out that this needed to be my replacement for GConf; first I thought it was supposed to be XConf. OK, I understand things change. But this is on top of the problems I’m having building the GTKMM 3.0 example programs, and I’m afraid of tinkering too much with my system for fear of breaking things. I’ll need to do my development with the old 2.x kit until things settle down; I’m hopeful that later Mint releases will have all this sorted out. This is a real problem if Gnome and Ubuntu (as separate, but related, concerns) want to compete with Red Hat and KDE, Xfce, etc.
Of course, situations like this are part of the price of admission for the joy of running open source software. But I’m not alone: it’s very easy to find many users who aren’t happy with the current state of affairs in Gnomeland and these Debian forks of Linux.