Thursday, December 22, 2011
Blackberry SMTP servers identified as sending Spam?
Spamhaus.org was marking some BlackBerry SMTP servers as spam relays or sources of spam. The servers were added to the XBL and CBL lists and, as a result, many messages sent from BlackBerry devices did not reach their recipients. The problem is that they appeared as delivered on the BlackBerry device. The messages were probably delivered to the BlackBerry SMTP servers, but mail from those servers was then rejected by receiving SMTP servers that use a DNSBL or similar service relying on data from Spamhaus.
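For anyone who wants to verify a listing themselves, a receiving mail server's DNSBL check is just a DNS lookup against the blocklist zone. Here is a minimal sketch of that lookup (not RIM's or Spamhaus's own tooling; the IP address is a placeholder for the suspect SMTP server):

import socket

def dnsbl_listed(ip, zone="zen.spamhaus.org"):
    # DNSBLs are queried by reversing the IP's octets and appending the zone,
    # e.g. 1.2.3.4 -> 4.3.2.1.zen.spamhaus.org
    query = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        answer = socket.gethostbyname(query)  # any A record answer means "listed"
        return True, answer
    except socket.gaierror:                   # NXDOMAIN means "not listed"
        return False, None

if __name__ == "__main__":
    listed, code = dnsbl_listed("192.0.2.25")  # placeholder address
    print("listed:", listed, "return code:", code)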
The only recourse is to resend the messages from the device.
It appears to affect BIS users in Malaysia sporadically between 04:00 GMT and 07:30 GMT.
Details to follow.
Tuesday, December 13, 2011
What to do with WebOS now that it's Open Source
Never stop hoping against hope. WebOS, whose pedigree can be traced back to PalmOS and the popular Palm Pilot series of PDAs (they were mini tablets, for you young'uns), will be open source.
I've already had my say on what could be done with it here and here, so I'm not going to repeat myself. The question now is who will do what with it.
Will it be the next phone OS? Suddenly the open source world is full of phone and tablet OSes. We have Tizen, the resurrected MeeGo; Meltemi, the OS for Nokia's low-end phones; and now WebOS. Not to mention the lower-profile GridOS, LiMo and SHR. This is all to the benefit of manufacturers, who will be free to build on top of these platforms. The best thing that could happen is that phones go the way of the PC: hardware independent of software. That means you could load any OS you want onto your phone, or customize it to the hilt. Skinning would no longer be the realm of the manufacturers and mobile service providers. Ah, the prospect of high school graduation themed phones (complete with school-spirit colors and logo) warms the hearts of retailers everywhere.
Will it be on your car dashboard? Toyota is on board. Ford has flirted with it in the past; I don't know how they are going to get it past their partner, Microsoft, though. But a WebOS-powered in-car system is a great fit. It already does the touch interface well enough. Component and accessory manufacturers could just provide an interface, and it would be the Linux kernel's job to hook them up to WebOS, with WebOS then providing the human interface. That is a long way from a knob to turn on the air-conditioner.
Will it be on the TV? Finally, TV manufacturers could provide an interface worthy of the Linux kernel already running in most new TVs. If your TV does YouTube or Netflix, chances are you threw out a printed GPL license together with the FCC notices that came with it. The TV doesn't need a touchscreen for this to work, but a remote with an accelerometer would be nice. This could be the start of a shift in how we consume entertainment. Think of the ability to buy channel apps, where a channel is an app. Or packs of programs as apps. Better still, an app to keep the kids from watching too much TV.
What does this mean to Linux users out there? Your knowledge just got more valuable.
Labels:
Thinking aloud
Wednesday, November 16, 2011
Recover from a missing kernel : The Problem
You read that right: a missing kernel. Although this sounds terminal, the fix was fortunately simple enough. If you are in a jam, the solution is here. But the journey of how it got to this point is a cautionary tale of "a little knowledge is a dangerous thing". This is a long post.
A novice sysadmin in a small company had problems with CentOS VMs on VMware ESX version 3. I had set it up for them a few years ago and had been maintaining it for a while after that. I recommended that management send their sysadmin for training on VMware administration, even if it wasn't for certification. They agreed in principle but never did anything about it. Don't get me wrong: the guy was smart. He shadowed my work, understood what I was doing and knew to ask questions when he didn't. He was not formally trained, but he was experienced in administering services (e.g. Samba, printing), so I think it was a normal progression for him to take on more work closely related to installation and configuration.
The VMware config - Linux Kernel Dance
I had not heard from them for a while when I began getting calls about "network problems". A quick look and I figured out that the VMs running their DHCP and DNS servers had frozen up (if you are wondering, the Magic SysRq key, a.k.a. Alt-SysRq with the R-E-I-S-U-B sequence, "BUSIER" backwards, works in the VMI console). Apparently the VMs were running out of resources, with CPU utilization hitting and sustaining a 100% average. There weren't many VMs on the server and, being Linux, I knew I could cram in more than they were currently running. A closer look revealed that the cause was vmware-tools not being loaded. It wasn't being loaded because the sysadmin had updated the kernel but had not reconfigured vmware-tools. This had been going on for some time, despite the message during bootup warning him about it.
I call this the Linux vmware-config dance. For reasons known only to VMware, Linux is a second-class citizen. Even though VMware ESX and ESXi, and their flagship product vSphere, run on a Linux kernel, Linux support comes second in everything. The all-powerful VMware Machine Interface (VMI) client is Windows-only. And don't point me to that pathetic web-based management system: on Linux we can start and stop servers with it, but console access is broken or, at best, works sporadically, and we can't even create a VM with it. It's better with the free VMware Server (previously GSX). Its web interface provides full access, but it requires SSL 1.0 support, which is insecure and needs manual parameter changes in Firefox to work.
The vmware-tools service provides kernel-optimized access to memory, disk and network. If it's not running, the VM can't do things like share memory with other VMs. Basically, it'll run slower and eat up more memory. And apparently, if you run it long enough, some resource gets gobbled up bit by bit without being properly released. The kicker is that since reconfiguring vmware-tools affects network access, you can't do it remotely via ssh. It must be done at the console, either through the web interface (free VMware Server only) or the VMI client for the paid stuff.
VMware requires vmware-tools to be reconfigured every time there is a kernel update. The updated kernel needs to be loaded first, so a reboot is required. If you use a stock Red Hat or SUSE kernel, the prebuilt vmware-tools modules will load fine; if not, the configuration script will recompile the modules, so you will need glibc and at least the kernel headers. Depending on the distro, you may need to install the kernel sources to get the headers. It's also good to restart the server after reconfiguring and reloading vmware-tools, to test whether there are knock-on effects on other modules. So, to recap: reboot the machine to load the new kernel, reconfigure vmware-tools, then reboot again to cleanly load and test vmware-tools and the other modules. Now multiply that by the number of VMs you have and look forward to spending a lot of time on this.
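Before starting that dance, it helps to know which guests are already out of sync. Below is a minimal sketch that flags a guest whose running kernel has no matching vmware-tools modules; the module names (vmmemctl, vmxnet) and the /lib/modules/<release>/misc path are assumptions, so adjust them for your guest OS and tools version.

import os
import platform
import subprocess

TOOLS_MODULES = ("vmmemctl", "vmxnet")    # assumed module names

def modules_built_for_running_kernel():
    release = platform.release()          # e.g. "2.6.18-274.el5"
    misc_dir = os.path.join("/lib/modules", release, "misc")
    # the tools configuration script typically drops its modules here (assumption)
    return any(os.path.exists(os.path.join(misc_dir, mod + ext))
               for mod in TOOLS_MODULES for ext in (".ko", ".o"))

def modules_loaded():
    lsmod = subprocess.run(["lsmod"], capture_output=True, text=True).stdout
    return any(mod in lsmod for mod in TOOLS_MODULES)

if __name__ == "__main__":
    if not (modules_built_for_running_kernel() and modules_loaded()):
        print("vmware-tools modules missing for kernel", platform.release(),
              "- rerun the tools configuration from the console.")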
Recover from a missing kernel : The Solution
This is part two of two. You can read about the problem here.
The Solution
The solution was simple. I needed to install a new kernel.
I found that the sysadmin had an ISO of the CentOS installation DVD on the VMware server's datastore. The beauty of most modern distros is that their installation CDs or DVDs come with a Repair Mode boot option. I modified the VM's settings to mount the ISO as a CD-ROM for the VM. You may also have to change the VM's BIOS boot order so that the CD-ROM drive comes before the hard disk. The VM's settings under Boot have an option to boot straight into the VM's BIOS setup; by default, the wait is too short for you to press the F2 key to enter the BIOS.
So I booted into the installation DVD's repair mode. It was all automatic. That is one of the nice things about a VM environment: no hardware surprises. Either your distro supports the (virtual) hardware on bootup or it doesn't (the network interface driver is the usual suspect). CentOS found the network interface and configured it, found and mounted the volumes, and offered advice on how to chroot to the mounted disks. Which I took. This makes the system treat the mounted root, not the DVD, as the root directory. Basically, it boots your system from the DVD and then makes it think it booted from the hard disk. Other than the running kernel, everything else is loaded from the hard disk: /lib, /usr and /etc were where they should be. If there is no major incompatibility with the kernel, the existing utilities should run fine. I found yum running fine. Why not? All the rpm databases and config files were right where it expected them to be. I installed the latest kernel with yum, no problems, because the network card was detected and up. Once it was installed, I shut down the VM, removed the ISO from the CD-ROM settings and restarted, a-okay.
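For the record, the whole fix boils down to a couple of commands once the rescue environment has mounted the installed system. Here is a hedged sketch of those steps in script form; /mnt/sysimage is the mount point CentOS rescue mode usually reports, so check it against what the rescue prompt actually tells you before running anything like this (as root).

import os
import subprocess

SYSROOT = "/mnt/sysimage"   # assumed rescue mount point of the installed system

os.chroot(SYSROOT)          # equivalent to the `chroot /mnt/sysimage` the installer suggests
os.chdir("/")

# With the real root in place, yum sees its own config and RPM database,
# and the rescue kernel's working NIC lets it reach the repositories.
subprocess.run(["yum", "-y", "install", "kernel"], check=True)

# After this: exit, power off the VM, detach the ISO and boot from the hard disk.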
Tuesday, November 01, 2011
The introduction to the comments section on http://www.ritholtz.com:
"Please use the comments to demonstrate your own ignorance, unfamiliarity with empirical data, ability to repeat discredited memes, and lack of respect for scientific knowledge. Also, be sure to create straw men and argue against things I have neither said nor even implied. Any irrelevancies you can mention will also be appreciated. Lastly, kindly forgo all civility in your discourse . . . you are, after all, anonymous."
It goes without saying...
Tuesday, October 25, 2011
Windows vs Linux: Are we still fighting yesterday's war?
A cluttered Linux desktop (Photo credit: Wikipedia)
A similarly cluttered Windows desktop (Photo credit: Wikipedia)
I ask myself why I use Linux instead of following the crowd with Windows. The biggest reason is that it's free and I can do so much with free software. The reason most people use Windows is that it's what they are used to. I realize that I am just like them: I am used to Linux. And changing what we are used to is the biggest hurdle to moving from Windows to Linux.
For many years, Linux advocates, yours truly included, have been declaring that next year will be the year of the Linux desktop. But even with its plethora of quality free software and increasing ease of use, the Linux desktop has still not grown. The effort seemed to be going somewhere when Linux-powered netbooks were gaining global popularity. People loved the first netbooks so much that they didn't care about the OS. As long as it served their purpose, whether that was keeping files on the go, quick editing of photos or portable Internet access, people didn't care. The machines were cheap, portable and could run longer than most notebooks. The popularity of these netbooks forced Microsoft to do the unthinkable: backtrack on Vista and extend Windows XP's life (that, and the fact that businesses were not budging either). Now, given a choice between something familiar (Windows XP, for most people) and something strange (a simplified version of Linux), guess which one people chose, even for a little more money.
So the dominance of Windows was extended. Whatever ground Windows had lost was regained by the time the early adopters got their second netbook, which was as soon as they got tired of dealing with the 8GB solid-state disk. Really soon. To add insult to injury, Windows 7 looked more like KDE the more you looked at it.
So is all lost? Is Linux on the desktop going to remain the preserve of the technically competent and those who wish to extend the life of their old machines? Will Linux remain a distant third on the desktop (after MacOS) forever?
Today I realized that there are more users using Linux in the office than there were six months ago. Triple, in fact. But they are not using it on the desktop. They are using it on their mobile phones, and a few just got the latest tablets. Some people will call this cheating, calling Android a version of Linux, or say that mobile phones don't count. It doesn't matter. Android isn't shy about its Linux roots. In fact, it points to its use of the Linux kernel as the reason for its stability.
To make people change, there has to be a driver, an impetus. A reason. Why not work on ways to introduce more people to Linux via Android? Point out that they already use Linux on the phone. Why not try it on the desktop? Doesn't Apple owe some of its popularity to the dominance of its iPhones and iPads? How many people switched to MacOS because they liked their iPhones or iPads? How many people bought Macs because of their exclusive image, to be part of the in-crowd? These are all reasons for change, and people are changing.
I love open source and my wish is to have more people use it. The more people use open source, the more other people want to contribute to it. The more, the merrier.
Labels:
Commentary,
Linux
Tuesday, October 18, 2011
Linux to the Rescue... Again
The least favorite job a Linux guy can get is... supporting newbie Windows users. While we live in a virus-free, relatively trojan-less environment, our Windows brethren are waist-deep in shady toolbars, gotcha embedded web auto-downloads and unsafe USB drives. It tickles me to no end when a web pop-up tries to convince me that I am looking at my files in Windows Explorer... on Linux. And while we may feel smug in the knowledge that our understanding of the underlying technology and Internet services allows us to take the necessary precautions, it is these same skills we are often employed to use to get Windows users back to being productive in the office.
While Linux-to-the-rescue used to mean safe partition resizing courtesy of parted (and later libparted-powered tools), all-encompassing backups with partimage and harrowing hard-disk ER with PhotoRec and TestDisk, for a long time the virus and malware recovery tools were limited to ClamAV, which is itself rather limited and really designed to detect viruses in e-mail.
Well, now we have a new hero on the block: the AVG Rescue CD. I had thought about something along these lines some time ago. In the lawless Windows world of a few years back, between Microsoft threatening to turn its back on XP one more time (favoring Vista against users' wishes) and the business community's overwhelming rejection of Vista, viruses and trojans seemed to propagate at will, building botnets that kept being reconstituted long after their mother ships had been destroyed. They were getting wilder too, able to (literally) evade virus scans or render installed (but not updated) virus scanners impotent. A few years ago I talked with friends at Panda Security about the viability of building a live CD around the command-line version of their scanner, one that would mount Windows partitions and scan them. That is the best time to catch trojans and viruses: knife them while they are asleep. I once had a problem with malware on my work PC's Windows partition. I ended up booting from the Linux partition and trying to find the offending files using ClamAV. This was before the age of writable NTFS (courtesy of ntfs-3g), so it was a cycle of scanning on Linux, copying down the locations of the infected files, booting into Windows in Safe Mode to delete them, and then booting back into Linux to scan again. Repeat until ClamAV found no more, and then boot into Windows properly and run the updated Windows antivirus. It took the better part of two days.
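That scan-and-note cycle can be scripted once the Windows partition is mounted under Linux. Here is a rough sketch, assuming clamscan is installed and the partition is mounted (read-only is fine) at the placeholder path /mnt/windows; the "FOUND" suffix is how clamscan flags infected files, but treat the parsing as an assumption and check your version's output.

import subprocess

MOUNTPOINT = "/mnt/windows"                 # placeholder mount point

result = subprocess.run(
    ["clamscan", "-r", "-i", MOUNTPOINT],   # -r: recurse, -i: list only infected files
    capture_output=True, text=True,
)

infected = [line.split(":")[0] for line in result.stdout.splitlines()
            if line.endswith("FOUND")]

with open("infected-files.txt", "w") as report:
    report.write("\n".join(infected) + "\n")

print(len(infected), "infected files noted in infected-files.txt")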
I heard nothing back from Panda. But a few months back, Panda also came out with a working LiveCD version that you can boot into and scan a PC with. I've used both Panda's and AVG's, and for now I prefer AVG's LiveCD because it is light, boots quickly, makes it easy to update the signatures and the program, has a character-based, menu-driven interface and is compatible with a lot of network hardware. It wasn't as compatible a few versions ago (it didn't recognize some on-board NICs), but even then it worked on more computers than the Panda Security version. Panda's LiveCD GUI requirements, which I think involve direct VESA / framebuffer rendering, make it incompatible with a lot of the PCs I use. AVG also bundles some tools commonly found on standard Linux recovery CDs, like PhotoRec and TestDisk. Both of these solutions read the disk in its entirety and take a long time to finish (read: hours). Users can't work on anything while the scan is running either (although one friend did marvel at what he could do with the Links text-based web browser on the AVG version, which supports console switching). Users hate to wait, but then again, it's their fault they run Windows. :)
Boot-up screen. Notice the Memtest86+ memory tester.
Main menu
Labels:
Recovery
Tuesday, October 11, 2011
Wide Pictures and Bad Packages: A Hugin tale
I love my Kodak Z1012. Specifically, I love its in-camera panorama auto-stitching feature. I've tried the click-and-sweep panorama feature on some of the newer cameras and I am not impressed. The quality is a bit off, like a video grab instead of a still picture. Nothing as good as the quality of the picture below, which is built from three snaps.
It happened that I was out on a trip and my significant other asked me to stop so that she could take in a nice panoramic view. The road was narrow and downhill, so I stayed in the car while she stood outside and took a deep breath. And a few snaps, none of which used the panorama feature. When we got home, I realized what had happened and decided this was the opportunity to try out Hugin, a photo stitcher that was featured in Linux Format. There was even a tutorial for it in the following issue. It is useful both as a straightforward stitcher for creating panoramic pictures and for creating some truly interesting photos. Besides, it had to be good, because it was good enough to raise patent-violation concerns.
Interesting cloud formation in an afternoon sky
So I started by installing Hugin. Except that the Mandriva Software Manager, powered by urpmi and rpm, said it couldn't figure out where to get a particular library file. This not being the first time it has happened to me, I just copied the name of the file and fed it to rpm.pbone.net, THE place for RPMs. It gave me the name of the Mandriva package and a link where I could download it. I was tempted to click on the file and have Software Manager install it, but I also know that when I do that, it does not check for dependencies. That would open a whole can of worms.
So I found the same package in Software Manager and installed it that way. I tried installing Hugin again, but this time it flat-out refused. It didn't complain about the missing library; it just said that it couldn't install. I was getting annoyed but quickly realized that there was a way. The old-fashioned way.
Rpm.pbone.net has a feature that checks for the missing files a package requires. It does require enabling Javascript. What it does is simply look for them on your hard disk; if it can't find one, rpm.pbone.net provides a link to the package that offers that file. Nice.
So I settled on the following process (a rough script for the local half of it follows the list).
- Select a file that rpm.pbone.net says hugin needs but I don't have.
- Click on the filename to search for the package that provides it
- Find the package in Software Manager and install it.
- Re-run the check for missing required files
- Go to step 1 if there are any missing files.
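The local half of that loop (steps 1 and 4) can be approximated with plain rpm queries against the downloaded package; the rpm.pbone.net lookups in between stay manual. A hedged sketch:

import subprocess
import sys

def requires(rpm_file):
    # `rpm -qpR pkg.rpm` prints one requirement per line (file paths,
    # sonames or capabilities such as /bin/sh)
    out = subprocess.run(["rpm", "-qpR", rpm_file],
                         capture_output=True, text=True, check=True).stdout
    reqs = [line.split()[0] for line in out.splitlines() if line.strip()]
    # rpmlib(...) entries are internal rpm capabilities, not real packages
    return [r for r in reqs if not r.startswith("rpmlib(")]

def provided_locally(capability):
    # `rpm -q --whatprovides CAP` exits non-zero when nothing installed provides it
    return subprocess.run(["rpm", "-q", "--whatprovides", capability],
                          capture_output=True).returncode == 0

if __name__ == "__main__":
    pkg = sys.argv[1]                    # e.g. the hugin .rpm downloaded from pbone
    missing = [r for r in requires(pkg) if not provided_locally(r)]
    print("Still missing:" if missing else "All requirements satisfied.")
    for cap in missing:
        print("  ", cap)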
I like to work according to a process. Sure, I could bulldoze through, install hugin from the console and fix each error that pops up. Or, worse still, build from source. Been there, done that. But running through a process like the one above gives my head enough space to look at my work for flaws and deal with them. It also gives me a way to backtrack in case I do something wrong.
Well, I did find something strange. Some of the files the package requires are provided by the package itself. That is, if I clicked on one of those missing files, it would say that I need hugin installed for hugin to install. Talk about circular reasoning. It seems that whoever packaged Hugin for Mandriva got the RPM packaging metadata wrong. Another tell-tale sign that the packager messed up is that the Hugin package also requires /bin/sh. Seen that before on other bad packages. Smells like a bad cut-and-paste job.
After installing all of the files rpm.pbone.net found through Software Manager, I downloaded hugin from rpm.pbone.net and installed it (with Software Manager/urpmi automatically adding --no-deps).
It worked like a charm. Hugin, that is. I'm still fiddling with it to get a good picture out of it. It works like a charm but has a KDE-esque number of configuration options.
Now where is that tutorial?
Thursday, October 06, 2011
The real legacy of Steve Jobs
When people talk about Steve Jobs, they will most likely point to his most recent successes at Apple, notably the iPad and the iPhone. They will talk about how these brought computing and the Internet to the masses, beyond the 'computer literate' or even the 'computer interested'. They will point out how they made using a computer so natural that we have stopped talking about using a computer for something and simply use it.
Some will look beyond that and talk about him and the Macintosh. They will talk about how the Mac brought the GUI and the mouse to the forefront and raised the standard of how people expect to work with computers. Xerox PARC may have invented it, but it was the Macintosh that captured the imagination. People talking about Jobs will highlight the Macintosh's success in popularizing 'fringe' standards like a built-in network connection rather than an add-on card, and the 3.5-inch floppy disk.
But to truly comprehend Steve Jobs's influence on computing and personal computers, you have to go back to the beginning. You have to go back to when Steve Jobs and Steve Wozniak set out to build and sell the first pre-assembled personal computer. Somewhere then, a spark went off that convinced a young Steve Jobs that this was what he wanted to do. This was what he wanted to shape and influence. This was where he would put his stamp on the world. This was his domain.
Apple Computer, and later Apple, was an expression of Steve Jobs. Steve Wozniak was happy to be the engineer, but Steve Jobs knew he had to be the lead, the one in the driver's seat. Think about everything significant that came out of Apple and there is Steve's stamp. Early on, he understood not only how to improve existing technologies to make them relevant to a wider audience and market them as products, but also how to manage and handle engineers so that they could produce their best. He took on the image of the creator, basking in the light of adulation but also taking the heat of failed ventures, shielding the engineers from the public's wrath (although, according to some, not from his).
His passion was infectious and with it he sold dreams. Dreams that could only be realized through the computers Apple was making and selling. He understood that the computer was just a tool, unlike other computer companies then (and a few still now) that sold computers on how beefy the specifications were and how many features they came with. As a tool, the computer was really inconsequential; what the computer made was what really mattered. And the GUI was the first step in making the computer step out of the way and become a partner in the process of creation, the process of creating what mattered to the user.
As computers became more ubiquitous, Apple took it to the next level by looking for ways its computers would positively affect their users, enhancing that relationship and further moving the computer's technicality into the background. The computer became colorful. The computer became beautiful. The computer began doing one thing very well. Until we stopped thinking about the computer and just did things with it. In every step of that evolution was Steve Jobs's imprint, his vision. He led and others followed.
Steve Jobs should also be remembered as a new kind of CEO for a new generation: a CEO who understood that a company was about the quality of its people and not solely about the numbers on the balance sheet. He injected his passion into his work as CEO, changing the notion of the CEO from topmost manager to driver, the leader of the company, setting its path and navigating it through troubled waters. And that passion filtered down, regardless of what the company was facing. Apple was declared dead so many times (Michael Dell said this month in 1997, after Gil Amelio was fired as CEO, that Apple should be shut down) that it is more than ironic that it is now the biggest technology company in the world.
I've done my piece on Apple after Jobs, so there is not much to add to that. But don't overlook the other significant contributions of Steve Jobs. People should also remember that it was Steve's drive and money that helped keep Pixar alive long enough to fulfill its full potential. And that is the other genius of Steve Jobs: the ability to know a good idea when he saw one. He saw the potential in the first Apple computer, the potential in the technology of the Xerox Alto, the need for a cheaper Apple Lisa (resulting in the Macintosh), the genius of the magicians at Pixar, the beauty that could inspire computer users in the iMacs, the need for an easy and cheap way to get more music onto digital players, the desire to both communicate and do simple tasks on phones, and the potential of a simple, mobile touch interface to change the way we work with computers. The legacy of Steve Jobs is not just at the birth of the personal computer; it runs from there through the personal computer's evolution into part of who we are and what we do.
TWIT.TV special on Steve Jobs
Labels:
Commentary
Tuesday, October 04, 2011
New ideas for WebOS Part 2
This is Part 2 of my ideas on what to do with WebOS, for HP or whoever owns WebOS by now. Read Part 1 here.
Go vertical in Business
The Blackberry Playbook is a dud by all accounts. So why not make a proper business tablet? There is a market for one, even though Apple would like you not to think about it. A real business tablet that actually has a proper mail client that works with normal mail servers. Offer an ultra-high-security communications back-end service with premium features such as guaranteed delivery and end-to-end encryption. What does that have to do with WebOS? Not much. What does that have to do with selling the tablet? Everything. Businesses love security and centralized management. Neither is available on other tablet platforms.
Especially centralized management. It is the most counterintuitive idea for the tablet concept; the tablet is about mobility and connectivity. But fleet management is a big reason why businesses opt for Blackberry phones: remote service provisioning and data security. Bring that idea to the tablet and you have a runaway hit.
Other business-friendly features: instant messaging so simple to use that it doesn't get in the way of the message, while being secure end to end. Make it work with HP printers from the get-go. Plus whatever a businessman would expect from a tablet on the go. Edit PowerPoint on WebOS? Cool.
Business is conducted everywhere, and that is where a business tablet should work. So get the office / productivity apps to work right on the tablet. We don't like to think about it, but there are times when we just can't connect to the network. Clouds are great, but what can you do when it rains? We still need the device to work. Make the office apps run on the tablet and push the advanced options to the cloud. Most people use only 10% of the full capabilities of MS Office anyway. Give them the right 15% and you're good to go.
Go vertical in Education
The killer feature would be the ability to build a child-safe browsing environment, with child-safe browsers (via a proxy connection to your subscription-based safe proxy service) or curated links at a predefined gateway. Build tools for collaborative learning for the older kids, while centralized controls would be nice for the younger set. Think of the teachers. Find ways, or build products, to pull a particular tablet's screen onto the big monitor or project it onto the wall. Wirelessly. Or to turn off the tablet of an unruly kid until he behaves. Throw in a lockable cart for charging and evening storage and you're golden.
Combine a few from the above
Vertical markets are always afraid of being trapped into buying a product that has limited appeal. The idea is to invest in a product that everyone is using, to leverage consumer pricing. And pricing, as we have seen, will make or break the future of WebOS. So focus on the vertical markets, but offer the product through consumer channels. Sell to the consumer market by touting its market-specific features.
I am sure my ideas are half-cooked at best. I wish WebOS, the people who develop it and those who will be supporting it all the success they deserve.
Labels:
Commentary,
Thinking aloud
Thursday, September 29, 2011
Choosing an Android smartphone
This is in response to a comment at the end of a previous post asking what smartphone to buy.
There is no point in me saying such-and-such a model is good, because these things change all the time. Like anything else, it depends on what you want to use it for: gaming, social media (Facebooking), just checking mail or an attempt to replace your PC.
Three elements make up a smartphone.
- The phone
- The apps
- The service or carrier
The physical attributes of the phone are a very personal preference. Some like a big screen, others want a keyboard. Some are looking for HDMI output, others want a phone small enough for their purse or fanny pack. Nothing beats going to the shops and holding one in your hand. Choice is what Android users have in abundance.
The apps that can run are important, especially if you can't live without Adobe Flash and Firefox. Some don't care. What you do need to know are the quirks of the particular Android version. Version 2.1 can't run apps stored on the SD card (argh! Palm Pilot flashback!). Version 2.2 is what I use, but you may demand the latest and greatest, Gingerbread (v2.3). Any limitation on the apps is also tied to the particular phone; Flash, for example, wants phones with an ARMv7 processor (or so I am told). Don't settle for 2.1 unless you intend to root the phone and install custom ROMs; in that case, find the cheapest one and good luck. Try not to skimp on memory, because of the quirk that even when you have tons of free space and your apps are on the SD card, the installed apps still eat up phone memory.
Another aspect is what apps are available specifically for a phone. I love that my phone has both tethering (connecting a PC to the phone to use it as a modem) and a portable hotspot option. Apparently not every phone has these, and not all carriers allow them. Some apps are dependent on the phone's own features, like an enhanced music player. So as you are looking, notice the unique features each phone has.
Finally, you may or may not care about the service and which phones are available on which service. Remember, it is a phone, so coverage is important. For Malaysia, the rule of thumb is that Digi is the cheapest but has the worst network; don't be surprised if you're just outside the city area and left with GPRS. In town, Digi is great. Celcom has the largest network, but they charge an arm and a leg for unlimited data. I don't know about Maxis; from their website, they are about the same as Celcom. Think about where you live and what the coverage is like there. But if you are connecting mainly via Wifi, you may not care as much.
Labels:
Recommend,
Thinking aloud
Tuesday, September 27, 2011
HP seeking new directions for WebOS? Here's some.
Ok, here are a few ideas...
If you have gotten this far, chances are you aren't from HP. But if you are thinking about buying or licensing WebOS, here are a few ideas I am sharing.
Licence it out
What is Apple's business model? They sell premium-priced hardware. Their OS is designed to take maximum advantage of the hardware, since they control exactly what that hardware is. They don't charge for the OS separately, so its development cost is fixed and one-time (sort of). Selling more Macs doesn't translate into more software sales, but the more hardware they sell (PCs, tablets, phones), the more their revenue. They also get a cut from things sold on their marketplaces, but the model is still the same: more sold means more profit. More tablets sold is like more storefronts being opened. That is their strength.
That dependency on hardware sales is also their weakness. They rely on the software to sell premium-priced hardware. Hardware-wise, they are no different from most PCs; in fact, the same hardware specs cost a lot less in the PC world. To counter Apple, take the opposite direction but with the same intention: sell more hardware and make WebOS a driver for consumers to buy more hardware. That means cheaper hardware with the same performance. Focus on blanketing the market first and then break it into low-end, mid-range and premium hardware segments. Build a low-cost, easy-to-build model for the masses and exclusive, blinged-out, celebrity-endorsed versions for the trendy.
For example, make WebOS work on a commodity platform like the numerous no-name Android-compatible tablet platforms. Then licence it to any Tom, Dick and Harry (or Chen, Wong and Lee). Those hardware manufacturers would love another OS for their hardware platform, because it will sell more of their hardware. Think Microsoft in the 80s, when it sold the OS that came with IBM PCs to other manufacturers.
Labels:
Commentary,
Thinking aloud
Friday, September 23, 2011
Welcome to Techsplatter.com!
Looking at the past few posts (and the pile in draft), I have come to the realization that my posts are no longer limited to my Linux adventures. A few commentaries have crept in, and they are now the most read around here. To do the title justice, I've decided to rename this site techsplatter.com. Here are some of the changes you can expect to see:
- More commentary posts, my take on current tech news and trends that affect me
- Better tagging, specifically two sets of tags on the side, one by topic and the other by category. So if you are here for the Linux stuff, there will be a Linux tag for you to filter the posts.
- More graphics and links to external posts and videos
Thursday, September 08, 2011
HP split is nothing like IBM's
Senior management at HP are probably wondering what all the noise is about HP saying that it is interested in selling off, or at least reviewing, its PC division. It's not as if something on this scale hasn't been done before. Didn't IBM do it and walk away stronger than before?
I think they need a reality check.
When IBM sold off its PC division, there wasn't as much noise, although it was a major event. But they did it after an extremely long process. Analysts had been telling IBM to sell its PC division since the mid-90s, possibly even earlier. We all now know, from the stock market crash, how valuable advice from analysts is. IBM in the past recognized the value of the IBM PC brand and the need to keep it front and center: in front of executives and right in the center of the table. That money-losing brand was the 'public face' of the large servers and networks in the data center, where few are allowed to go, and of the large and profitable server, network and services business. This was a time when computers were becoming even more prevalent in business in general, beyond their previous enterprise domain. Having the brand on the desktop was still important. It says something about your business when you can afford to be using IBM desktop PCs instead of other brands. So it balanced out in the end. Lou Gerstner, IBM's CEO at the time, saw its value but realized the inevitable. He put in place the process to split off the PC division before he left, and it was completed only a few years afterwards.
What made IBM successful is that, although they are a large corporation, they understood the process of making sales. The tech guys would come up with great products, the sales people would shake hands and push the products out to the customers, and the senior executives would take the customer's CEO out for golf. That basic focus on Job Number One, the sale, hasn't changed even though the products and technology have. In the end, this commitment to the customer is now the main IBM brand, not the PC.
And what tech! Those original IBM PCs were built tough like tanks. I have had IBM keyboards that lasted longer than the PCs they came with. When it came time for them to sell to Lenovo, IBM made sure Lenovo carried on that tradition. I still have old-timers call their Lenovo laptops IBM laptops (talk about brand loyalty). I don't blame them. The hardware is still built tough. I don't see that kind of brand loyalty with HP. [Full disclosure: I just realized my 4-year-old PC under the table is a Lenovo. Great service: I've had two motherboards and a hard disk replaced on-site with no questions asked]
HP is not chopped liver but my experience with them is still a mixed bag. They make good hardware but the term "good service" doesn't automatically come to mind. It is spotty at best, with some products better supported than others. A telltale sign? After HP Networking bought 3Com, you have to have a support contract to get access to previously free software or firmware, often the same software that came free on a CD with the equipment (and which usually is lost within the first few days). Contrast that with the experience of accessing HP printer drivers, where software, often updated drivers, is free even after the product's warranty runs out. And guess which one I had to pay more for?
Splitting up is also not alien to HP. HP used to be even more spread out in terms of technology. It used to have business units doing medical imaging, telecommunication networking, semiconductors and scientific test equipment. These were spun off into Agilent Technologies so that HP could focus on servers, storage and computing. Looking at Agilent's history could give a clue as to what HP might look like down the road. Agilent began life as an 8 billion dollar company in 1999. Ten years later, it is worth slightly less. In that time, it has sold off its medical equipment, semiconductor and network test equipment divisions, focusing on the scientific equipment market. It seems to have a policy of increasingly narrowing its focus in pursuit of sustainability and profits. Will the to-be-split HP Personal Computing unit become like this? First splitting, then shedding its low-profit computer divisions until all that remains is the printer unit? Is this the first step on that road?
Motorola also split into two to allow each company to focus on its own market. Each is doing well enough. Motorola Mobility is being purchased by Google (for a much more complex reason than a purely financial one). I am going to take a leap here and say that the Motorola split and the HP-Agilent split were about technology and focus on markets. Each was a company with too many markets to manage that decided it would be better off splitting itself up so each part could focus its resources on its specific market. This proposed HP split is more about the direction of the CEO, not about technology. HP's PC division wasn't losing money, it just wasn't making enough profit. The litmus test is whether they sell off the printer division. If the split were about technology, HP would also sell off the profitable printer division, which sits in the consumer half of the consumer-enterprise split. They may have to include it as part of the PC division to make it attractive. But why would you sell the goose that is laying the golden eggs? Customers, myself included, will see this as HP turning its back on us.
HP is no stranger to bad decisions. Come on, this is the company that turned down Woz's Apple I. But the way the announcements were made about the fate of the WebOS tablets and the possibility of selling off the PC division is very suspect. You can look at it one of two ways. Either HP was trying to appease the markets by ditching a low-performing unit whose operating profit was 5.7% (and a mere 38 billion in revenue), or it had been working on this for some time as part of a broader plan to make over the company into something that the CEO better understands. HP's current CEO is from the financial and enterprise world, previously at SAP. The consumer market is probably something he just doesn't want to deal with.
Like in my previous post, it really makes me feel old when I see things like this happen again.
Labels:
Commentary,
I'm too old for this
Tuesday, September 06, 2011
Mandriva or Mageia?
I've been putting this off long enough. I am a long time Mandriva user from the days of Mandrake. Not exclusive, of course. It's the distro I use at home and on my laptop at work. I promote it to novices and other Linux users alike. I think I use it because it appeals to the lazy part of me. I get things done with little or no hassle. No fireworks. Not too much bling. Not many surprises.
Ever since it was forked into Mageia, I realized that I would soon have to choose. But since there was nothing concrete from the fork, I waited. Then Mageia 1 came out. I waited some more. Now the new Mandriva is out. The updates are getting fewer and farther between for my Mandriva 2010 Spring. Normally, this is the time when I turn on the backports repo and feed off that while I read the forums about possible show-stoppers. There wasn't much in the last few releases. So I think there shouldn't be much in the way.
I guess the real question is, how do I choose between them? I use Gnome on Mandriva, so there is that Gnome 3 choice as well. After see-sawing back and forth I've decided to burn LiveCDs of both to kick the tires a bit. Only Mandriva doesn't have those anymore. And they only support KDE, with a new tablet-like interface.
The Mandriva upgrade instructions look frightening, largely because the English is confusing.
When you use --download-all option urpmi will download all the packages first and then begins to install all of them. It is strongly recommended option for migration to a new release with urpmi. It is used to provide reliable update, you need to download and update a lot of packages. If you do not use this option and during update process you face Internet connection problems, you will get a very bad situation when only part of system will be updated, that will result in problems with correct system working.
I'm having nightmares about my former Russian math tutor repeating that to me again and again. Signs of influence from the Russian investors?
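For the record, my reading of those instructions boils down to something like the sketch below, run as root. The mirror URL is just a placeholder and the exact steps will depend on your mirror and release, so treat it as a rough outline rather than gospel:

    # Rough sketch of a Mandriva release migration with urpmi, per my reading
    # of the quoted instructions. The mirror URL is a hypothetical placeholder.
    import subprocess

    MIRROR = "http://mirror.example.com/mandriva/2011.0/x86_64"

    steps = [
        ["urpmi.removemedia", "-a"],              # drop the old release's media
        ["urpmi.addmedia", "--distrib", MIRROR],  # register the new release's media
        # --download-all fetches every package before installing anything, so a
        # dropped connection doesn't leave the system half-upgraded
        ["urpmi", "--auto-update", "--auto", "--download-all"],
    ]

    for cmd in steps:
        subprocess.run(cmd, check=True)           # stop immediately if a step fails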
Not everything is going against Mandriva. There is no PLF in Mageia, so I'll need to figure out how that works out for me.
This is one of those times I want to shout out, "I HAVE A GREAT SYSTEM. I CUSTOMIZED IT AND IT WORKS FOR ME. WHY DO I HAVE TO KEEP REINSTALLING AND STARTING OVER?" Especially with stuff like Wine setups lying around, upgrades mean re-configuring those all over again.
I find it funny complaining about change. Especially coming from someone who has probed monitor refresh rates to configure X Windows (look it up, kiddies; there is no more spectacular way to turn your monitor into a paperweight). Change is why I don't have to do that anymore. But I think we have reached a plateau. Hopefully my choices won't lead me to a cliff at the end of that plateau.
Labels:
Install,
Linux,
Mandriva or Mageia
Tuesday, August 30, 2011
Apple after Jobs
Steve Jobs and Bill Gates at the fifth D: All Things Digital conference (D5) in 2007 (Photo credit: Wikipedia)
There is that other company that Jobs has left and that worked out OK: Pixar. He believed in having good people around in a company. His hiring of Sculley from Pepsi in Apple's early days is an example of that. He refined this belief further at Pixar, where he had a team that took it to great successes, stayed hungry and welcomed (and looked for) change. I remember John Lasseter's comment in the mid-90s on how they at Pixar loved the fact that Jobs was getting busy back at Apple. They now effectively control Disney, with John Lasseter heading Disney Animation Studios. Jobs is Disney's largest single shareholder via Disney's purchase of Pixar. Lesson: making a company great is a team sport.
Jobs has left Apple before. Contrary to popular belief that he was fired, Jobs left of his own free will. Jobs wanted control over the direction and results of his vision. The board was worried about how much it would cost. We all know what Apple leadership went through after that. Guy Kawasaki puts it best in the documentary Welcome to Macintosh where he says "everybody wanted to be somebody else". He probably meant each wanted the others' success and tried emulating them. One CEO, Michael Spindler, wanted to sell Apple to Sun or IBM. All that while, I believed that the only way to fix Apple was to bring Steve Jobs back. Not that the Apple faithful didn't dream in those times of his return. All of us were right. Lesson: Apple is an expression of Steve Jobs's vision of looking at and creating the future, instead of just looking at the balance sheet.
Another company with iconic founders, Hewlett-Packard, began life in a garage, much like Apple. In fact, they were the original home-garage computer company. Bill Hewlett and Dave Packard built it into a multi-billion-dollar company. They enshrined their beliefs in management as "The HP Way". However, I feel that it was largely abandoned after the tech bubble burst in the late 90s, and HP began to lose its way as a technology leader. It became yet another computer company with many interests, with no distinctive features other than its corporate maneuverings, sales forecasts and stock price movements. The exception is probably in printing, where the brand is strong and the products are respected. HP is now at a crossroads, with a CEO clearly looking for a buyer for its 'low-profit' division, despite what it says otherwise. HP is a company lost not because of itself but because of its leaders, who have decided to let the numbers do the leading. Their most recent move in looking to sell their PC division is purely financial. And their customers can see right through it. They are worried about their support contracts, their investments in technology and, more importantly, in the people at HP. The question on their minds is: "Will the HP of tomorrow be the same HP I'm talking to today?" Lesson No 1: Anybody can have a vision and be visionary but the real question is: what is that vision? Lesson No 2: In the pursuit of profits, don't leave your customers behind.
Apple's current management is well suited to continue Steve Jobs' legacies. But soon it will need a new visionary leader. Someone who is committed to the values of Apple and is surrounded by a team willing to follow.
You see, Jobs's deal is that he wants to change the world. He wants to change the world by changing how people feel. He affects how people feel by changing or controlling how they interact with their world, whether that experience is visual or through touch. He believes that by making the experience of interacting with Apple products "revolutionary" and "magical", it will make people feel good and thus positively affect their world and the world in general. He understood that while the computer can do useful things, it is also a useful tool to impress people. Those that are impressed will go and buy the same computers. So, computers need to be useful and impressive. Now that's vision. Making a buck along the way is not bad, too.
Apple will survive after Jobs, it's just not going to be this Apple.
Labels:
Commentary
Monday, August 22, 2011
Will WebOS be another opportunity lost?
You know you have been in the game too long when you see things happen twice. Or more.
When I heard about what HP did with the WebOS tablets and their future 'direction' on it, I was amused and upset at the same time. Amused because the way it was announced speaks volumes about the decision makers themselves rather than about WebOS itself. You can bet that it was no knee-jerk reaction. This was planned for some time by those who opposed HP buying WebOS or did not see its value. They were just waiting for an excuse. How else to explain the suddenness of the decision? Why else would you talk smack about something that you wanted to sell? "My car is crap, would you like to buy it?" What would normally happen is that you'd talk about its good points and try to get the best value from the sale. When you say bad things about something you want to sell, you want it to be valued low enough so that a reluctant buyer sees it as a bargain instead. When the value is low enough, people who would not normally buy something like it may be tempted to do so.
I am upset because WebOS represents good technology. When the tablets came out, WebOS got favorable reviews. Some reviewers did complain about some rough edges but forgave them because the tablet was a first model and bound to have growing pains. They expected that HP would work out the kinks in the next model. I was looking forward to picking one up.
But history is littered with good intentions and great technologies. The most analogous example I can think of is PC/GEOS. For the briefest of times, PC/GEOS and DR-DOS represented a strong challenge to Windows 3.0. PC/GEOS, later GeoWorks, was a graphical desktop environment that was advanced for its day. GeoWorks came with a word processor, graphics editor and communications software. It had a Motif-like UI with advanced features such as scalable fonts and PostScript support. It provided multi-tasking, tear-away menus and an advanced API. The API provided services for almost all of the basic functions of desktop software. The word processor was about 25KB because almost all of its functions were system calls. And since it was the early 1990s, it worked well with only 640KB of RAM. Yes, the early 1990s. Windows 3.0 still had bit-mapped fonts. Some credit it with making Microsoft come out with Windows 3.1 just a year after it released 3.0, just to add scalable font support (it still used bit-mapped fonts for the OS).
If you want to try it yourself, you can download a basic version of it called Ensemble Lite from Breadbox. There are general instructions on how to get it running on DOSBox, or you could run it in a VM running FreeDOS. Here is a video of a working example.
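For what it's worth, getting it going under DOSBox is roughly a matter of mounting the directory you unpacked it into and running its launcher. The paths and the batch file name in this sketch are guesses on my part, so adjust them to whatever your download actually contains:

    # A sketch of launching GeoWorks Ensemble Lite under DOSBox. The unpack
    # directory and the GEOS.BAT launcher name are assumptions; check your copy.
    import subprocess

    GEOS_DIR = "/home/user/ensemble-lite"   # hypothetical unpack location

    subprocess.run([
        "dosbox",
        "-c", "mount c " + GEOS_DIR,  # expose the directory as drive C:
        "-c", "c:",
        "-c", "GEOS.BAT",             # guessed launcher name; check your download
    ])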
GeoWorks was a lost opportunity to put forward an easy-to-use, technically superior system that worked on existing computers. It was also a lost opportunity to make using computers less about knowing computers and more about getting work done. The two biggest gripes users had about GeoWorks then were that the word processor didn't do tables and there was no spreadsheet. People had little problem using it because it was very stable. In short, it was also a lost opportunity to put applications before operating systems.
WebOS by most accounts is a system that could offer a choice other than Apple and Android for tablets. Competition is the key to keeping innovation humming. Apple has already chosen to litigate while it innovates. The iPads are also still not considered enterprise-ready, and Apple doesn't care about the enterprise. Android will always be struggling to keep a balance between openness and security. It also has to balance between apps running locally and depending on the cloud to deliver productivity. WebOS could be that middle ground between flexibility and security, offering fewer apps but apps that just work out of the box and aren't afraid to live on the box. All the while continuing to push the OS into the background. Which Microsoft can't and won't do.
GeoWorks was a great product that didn't gain prominence because it couldn't compete against Microsoft's business practices back then (which effectively made PC makers pay for Windows for every machine shipped regardless of whether Windows was bundled or not). This was a time when people still ran other graphical operating systems on PCs and Windows was still version 3.0. GeoWorks didn't do disk operations, so it still needed MS-DOS or DR-DOS; it wasn't like it was cutting into existing DOS sales. It also failed because it was hard and expensive to develop for. Sales were so bad, the company behind it later looked to revenue from sales of the SDK to help keep it running, which we now know is suicidal. This created a catch-22. People won't use it because there are no apps, and developers won't develop for it because there are few users.
At first I ran GeoWorks on my PC but I eventually moved on to Linux 0.99pre12(?) and bought an old 640KB laptop (with a lead-acid battery!) to run GeoWorks. Printing was a snap because I printed to a file using the PostScript printer driver. I would then pipe the file to a PostScript printer for output. Sweet.
In the end, GeoWorks became an ultra-niche product and a promise of better computing unfulfilled. Don't believe me? Try it yourself, guess when you think the OS came out, and then tell yourself it came out in 1991.
Labels:
Commentary,
I'm too old for this
Wednesday, August 17, 2011
Webmin: The Unsung Hero
Webmin is probably one of the best-kept secrets of sysadmins around. Everybody uses it but rarely talks about it. Fewer still admit to using it. Why? Because it makes the difficult config jobs point-and-click easy. It turns what seems to take countless command-line invocations into a few clicks of the mouse. That is probably why it's an open secret. It does take away some of the mystique of being a sysadmin. Managers, if they knew, would demand faster turnarounds. But it still needs you to know what you are doing.
Basically, Webmin is a web-based config front-end for your system. I recommend it all around. Even if you run your own personal Linux desktop, I recommend installing it. Even if you are Mr. Security Conscious, install it and configure it so that you can only access it from localhost. Because it provides something more valuable beyond what it does superficially. I'll get to that in a minute.
Webmin is a collection of server-side scripts, separated into modules, that run local commands to configure your system. It throws up webpages that recast the various command-line options for the commands that configure one component of your system. Each module corresponds to a particular component or service. Some offer interactive tools, like access to a Java-based file manager. It covers everything from boot-up and boot services to server services like Samba and DNS.
It hides the nitty gritty and allows you to focus on the decisions, both technical and managerial. I have used Webmin for a long time, I think over a decade. I've seen its growing pains. Its epic battle over configuration control with SuSE (one of the reasons I stopped using SuSE regularly) was an example of how much respect developers should place upon users. SuSE, at boot time, kept over-writing standard configuration files (which Webmin modifies) with values from its own config file. It chose to favor its own config files over what the user had chosen. It was the first step towards a registry-like model and users voted otherwise.
Some things still don't work great, like Samba. But other than that, it is rock solid. It hides the nitty gritty so well that I used it briefly to manage a Sun server. I was thrust into that responsibility when someone foolishly bought a Sun server because "it was what the vendor uses". The big deal was that it was to run a database (for which there was a Linux version available). To Linux users, Sun is different and the same. It has different device naming conventions and a slightly different service startup mechanism, to name a few. But it is the same because it is Unix. So, I installed Webmin for Solaris and was able to manage it even though I almost never went to the command line. Manage users, assign resources. Webmin did all I needed.
But the truly valuable service Webmin gives sysadmins is the time to plan and think. When pressed by a deadline or users breathing down your neck to fix a service, Webmin offers an overall view of the command options and simplifies them to clicks, freeing you to come up with solutions and make decisions. Rather than focusing on and getting tripped up by command-line options, you can focus on what is possible and choose what is best, knowing that Webmin won't let you send the wrong command options because of typos. Less time worrying about that results in more time to think. And contrary to what some people think, thinking is a good thing.
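As for the localhost-only setup I mentioned, on my machines it comes down to Webmin's allow= directive in miniserv.conf. Here is a rough sketch, assuming the usual /etc/webmin path and init script (both can differ between distros), run as root:

    # Restrict Webmin to connections from localhost by rewriting the allow=
    # line in miniserv.conf, then restart it. Paths assume a typical install.
    import subprocess

    CONF = "/etc/webmin/miniserv.conf"

    with open(CONF) as f:
        lines = [line for line in f if not line.startswith("allow=")]  # drop any old allow= lines
    lines.append("allow=127.0.0.1\n")                                  # localhost only

    with open(CONF, "w") as f:
        f.writelines(lines)

    subprocess.run(["/etc/init.d/webmin", "restart"], check=True)      # pick up the change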
Friday, August 12, 2011
A Lawsuit a Day, Keeps Competitors At Bay
When did Apple become the Man?
Apple has always been protective of its copyrights. Some of us remember their Interface Wars with HP and Microsoft in the early 1990s. But they have been smart about it. They have been protective of their copyrights but generally shared their innovation. They popularized 3.5-inch floppy drives, Ethernet, laser printers and FireWire, to name a few.
But their lawsuit against Motorola and the injunction on Samsung's Galaxy Tab smack of a fear of competition. Is Apple so insecure about its post-Jobs era that it will do anything to milk the most out of what it has now? Apple is respected for its style and design, quality and innovation. They can always manufacture the first two. But are they running out of the third?
Labels:
Commentary
Wednesday, August 10, 2011
Nokia afraid of Linux success?
Just to follow up on the last post where I touched on the growing prominence of Linux on consumer computing devices:
The news that Nokia is not going to sell the N9 in the US yet should not be surprising. What is surprising is the decision to stop selling phones there altogether. The situation is likely this: they don't sell a lot of smartphones (Symbian phones) in the US. The market segment that they are huge in everywhere else, feature phones and basic phones, is not selling in the US and is being eroded by smaller phone makers who sell their phones cheaper. Maybe they are looking at what IBM faced with the PC and decided to take that critical step earlier.
IBM sold the original PCs but was later outsold by the other "PC clone" makers. They eventually lost money on the business but kept it around to foster brand recognition. Nothing says your business is successful like having an IBM PC on your desk. It also kept them visible to the decision makers who would decide on the more profitable server and services business. Don't get me wrong, those machines were no pushovers. IBM means quality and it shows. These decision makers can't see or touch the servers that they bought but using a quality IBM laptop makes them feel connected somehow.
It seems that Nokia is probably not waiting for that. They are losing money already. But making their brand disappear feels like putting all their eggs in the Windows Phone basket. Microsoft loses nothing either way. If Nokia wins and starts selling loads of Windows Phones, Microsoft makes money. If Nokia goes bust again for another year despite the Windows phone, MS can pick them up for a song, positioning them squarely against Apple. Why not?
A possible success of the N9, powered by MeeGo, could derail this. They probably had to release the N9 because it was so far down the production pipeline. It's not like it would be a surprise. Their previous Linux-based non-phones were a hit among the tech crowd. The N900 showed promise. But if the N9 is an improvement on that and the result is a more polished, consumer-friendly experience, it would not only be trouble for Android and the iPhone but also for other Windows Phones. Would they continue making a popular-selling phone or would they compete against themselves by having both the N9 and the Windows phone? You have to wonder how much Nokia is listening to MS (remember, this is not like MS "helping" Apple). Samsung does that just fine. They have phones for every segment; Android, Bada OS, basic phones, and they are making money. Even review units of the N9 are not available to the press. Nokia says that they are reviewing market by market. They were surprised that instead of the focus being on the hardware platform, which they wanted to highlight with the N9, the entire phone caused a stir. Missing marketplace or not, Flash on a phone solves a lot.
My guess is that it'll be released after the Windows phone to little press or in markets that cannot afford it and be killed off quietly. The march to MSNokia continues...
Labels:
Commentary,
Linux
Monday, August 08, 2011
Rise of Consumer Computing and Fall of the OS
Like it or not, Linux as we know it is changing. With the rise of the iPad, the face of the interaction between a user and their computer has changed significantly. The interface is simpler: touch interaction, full screen and instant-on. The interface between people and their machines has always been evolving. This most recent wave of change is significant in relation to Linux because it sees the goal of getting Linux everywhere being realized, but at the cost of the Linux desktop. For years the battle over the Linux desktop has stalled when it comes to MSOffice. The central role it plays at the office makes it a key target of any effort to implement Linux on the desktop. While OpenOffice is a choice, I have seen many efforts fail because OpenOffice was either too buggy or too MSOffice95. The countering point to this was the rallying cry of "focus on what you want to do, not the applications". To make it more palatable at the workplace, it was modified to "focus on producing work, not the tools". Even though users most of the time use the computer in the office as a super-typewriter, that is, without using the majority of the functions in MSOffice, they still demand that it be installed, if for nothing else, for familiarity. That, coupled with office politics and brand-consciousness, together with OpenOffice's own failings noted previously, halted most conversion efforts. Without weaning users off MSOffice, the Linux desktop efforts have either been forced to take the "tech-users only" road or give business to CodeWeavers for their brilliant CrossOver Office. This has been made worse by offices switching to web-based office systems such as Google Apps. MSOffice's resource appetite has driven users to these types of solutions, bypassing OpenOffice, the junction to desktop Linux.
I believe fundamentally that the relationship between the majority of users and their computers has changed. I am not old enough to be from the generation that had to build their first computers from kits, but I have built and restored machines of that age, enough to appreciate what personal computing might have been like then. If you trace the evolution of computers in the home, you could draw a line from the first home computers for hobbyists to the personal home computers, to the home office computers and laptops, to the iPads and tablets (and in some cases, smartphones too). At each step of this evolution, the user is still someone who wants to use a computer at home. But that person is no longer the tinkerer of yesteryear. They are not interested in how the computer works. That person now just uses the computer at home without even thinking. They don't think about using the computer; they think in terms of reading e-mail, using Facebook and watching YouTube. Call it consumer computing. And if the past is any indication, those tools and concepts will start appearing at offices as users demand tools that are familiar to them to be productive.
In short, the place of the OS in our conscious thinking of computers is almost gone. Think about it. Windows was about hiding the command line. X Windows, KDE, Gnome were about that too. Browsers, HTML, Java and Flash gave us information and interaction within a window, obscuring the OS further. Now not only we don't see the window, we don't even see the OS. It all falls away as we focus on using the computer for whatever we want to use it for.
Will the choice of OS no longer be relevant? Will that work in Linux's favor? I think yes and yes. Look at the Linux underneath Android. It touts its Linux connection to get the tech guys' buy-in, but soon enough it won't matter and Google won't mention it anymore. iPad users don't care that it runs iOS. They care that it runs.
When it comes to running stable for longer, Linux is already there. There is an opportunity here. The 'fall' of the OS's importance is an opportunity for consumer computing solutions running Linux. Key to any success is apps, and Linux has quality apps in spades. Nokia (soon to be MSNokia) bailed on MeeGo but don't count it out on tablets yet. Intel is hungry for the tablet market and may pull off another netbook-like push with tablet reference machines running MeeGo (followed by hordes of clones from China). Ubuntu is there with its Unity interface. All it needs is compatible hardware. Not to mention other efforts to make Linux work on the multi-touch interface. Each effort represents potentially more Linux everywhere.
Labels:
Commentary
Tuesday, June 28, 2011
MSSkype spells trouble for Citrix?
Did Microsoft buy Skype for itself or was this a clever way to move funds out of the US? If you don't know what the latter means, Skype is registered in Luxembourg, which has a more favorable tax rate than the US. So there is speculation that the high price for Skype is partly to save on taxes. It did seem odd that Microsoft bought it outright and made it a unit of Microsoft, instead of investing in it. Skype has a very strong brand and can stand on its own. What will happen to its existing branding agreements? Will we start seeing Skype phones from the Microsoft Hardware division?
The alleged tax reason might be a side benefit though, given the fact that Skype can play a prominent role in establishing an on-line office business suite. In the beginning, solutions like Google Docs teased the possibility of an on-line office suite. Microsoft responded with Office Web, basically providing the same functions you would get from the desktop suite. Adding Skype can push this further by adding communication. If Skype could be tightly integrated with the other MS communication tools, SharePoint and Exchange, it would allow businesses to share documents and talk with their customers, bringing them closer and extending their reach at the same time. Who wouldn't want to be able to call their supplier for free? That customer whom you prefer to talk to but who costs more in long-distance charges; well, now he is a click away. How about a supply chain tool using Office Web? Not sure what that invoice is for? Click on the person who signed for the corresponding delivery and get to talk to him.
Let's consider a scenario. A supplier and its customers are running SharePoint and Exchange. The two Exchange servers talk to each other over the Internet and exchange info on their users, including Skype accounts/numbers auto-generated by the Exchange server as part of the user creation process. Now when the need to talk arises, a click will not only send an e-mail but may also include a live invitation to talk that shows you whether the person sending it is online and accepting calls. A supplier adds the users to a customer community powered by SharePoint and they can all optionally share Skype numbers. Meetings can now be scheduled via Exchange and powered by SharePoint and Skype. Now bring in Office 365 and that offering goes over a cloud, lowering the barrier to entry with a pay-as-you-use model.
This puts Citrix, which runs GoToMeeting, in Microsoft's crosshairs. The two companies have a close relationship, primarily through cross-licensing for Windows Terminal Server and MetaFrame. Or, more accurately, MS strong-armed WinFrame from Citrix to become Windows Terminal Server. What would be worrying is if Microsoft were to build a unifying directory service (that runs on a cloud, of course) that would tie their Exchange and Skype users together worldwide. How many companies would prefer to Skype rather than pick up the phone?
Labels: Commentary
Tuesday, May 03, 2011
Printing to file lets CUPS print from Flash
(Photo credit: Wikipedia)
One great thing about Linux is that its components work well together. Even when they don't, you can usually use the way they interact to get what you want, at least in part.
My toddler was asking for a coloring picture of Elmo, the Sesame Street muppet. The official site at www.sesamestreet.org didn't have one, so I went to the Sesame Street section on the PBS site at kids.pbs.org. Both of them were basically Flash programs. Not pages with Flash elements, but pages with probably one big Flash element each, linked to other pages with the same structure. I think this is sort of a workaround to get a Shockwave-like experience without the memory drain that Shockwave is on Windows. There is no Shockwave for Linux, for whatever reason.
Anyway, I found what I was looking for and clicked on the Flash control to print the picture. A CUPS pop-up came up. Now here is another interesting component. Acquired by Apple, CUPS was the elixir that solved so many of the problems with printing over lpr for inkjet and non-Postscript, non-PCL printers. I credit CUPS and HPLIP with ending most printer setup and printing issues on Linux, essentially taking the drama away. Really smart on HP's part: keeping old HP printers printing means more ink cartridges sold, and Linux guys keep things running for a long time.
The thing about CUPS is that it prefers to work in the background; it lets the user-facing part be handled by the OS. So when I clicked on the icon to print the picture, Mandriva popped up a different print dialog than the one I would normally get. This dialog did not offer the "print to file" option. I wasn't concerned initially because I wanted to print to my inkjet. But after clicking on the inkjet and Print, the printer did not print out Elmo. I suspected there was a problem with the hand-off between Flash and CUPS/Mandriva. I looked at the print queue and there was a large job just sitting there.
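For anyone in the same spot, a quick way to inspect and clear a stuck job from a terminal looks roughly like this; the queue and job names below are made up for illustration:
# List the jobs currently queued on all printers
lpstat -o
# Example output (illustrative): DeskJet-42  myuser  1024000  Tue 03 May 2011 ...
# Cancel the stuck job by its ID, or clear the whole queue for that printer
cancel DeskJet-42
cancel -a DeskJet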
My go-to strategy for stuck jobs, which are usually a driver problem, is to use the print to file option. This creates postscript output in a text file, which I then convert to PDF. I can then print it again on another PC in the house or somewhere else. Since postscript is a printer language and PDF is based on postscript, all of the kinks related to printing have already been worked out, and the printer driver basically just has to print an image.
But the dialog didn't offer me the print to file option. I looked on the Internet and discovered that there is a separate print-to-file printer definition, called CUPS-PDF. It still uses the postscript printer driver on the back end; it just deposits the resulting file on the Desktop. I installed the driver via urpmi and printed Elmo again. I checked the Desktop and the file was there. Or so I thought. Sure enough, it was Elmo in postscript format. I converted it to PDF and printed it in no time. Total fault-to-solution time: 10 minutes.
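For the record, the whole workaround boils down to a handful of commands. This is only a sketch; the package name, queue name and file names will vary by distribution and printer:
# Install the print-to-file virtual printer (package name may differ on other distros)
urpmi cups-pdf
# After printing from the app to the CUPS-PDF queue, convert the deposited
# postscript file to PDF (file names here are just examples)
ps2pdf ~/Desktop/elmo.ps ~/Desktop/elmo.pdf
# Print the PDF on a working printer, here or on another PC
lp -d DeskJet ~/Desktop/elmo.pdf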
Upon further inspection, the file on the Desktop wasn't actually finished. But it was enough for me to create the PDF and get what I wanted. It seems the print code in the Flash app passed enough info to the printer driver to print the first page, but it never sent the code to say that the document was done. It assumed the printer driver would just take the end-of-page marker and send the job off to the printer. CUPS, being a good print system, dutifully waited for the end-of-document marker in vain.
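If you are curious whether a postscript file was closed off properly, the document structuring comments give it away; a complete file normally ends with a %%EOF trailer. A rough check, not a guarantee:
# A finished postscript document normally carries an %%EOF comment near the end
tail -n 5 ~/Desktop/elmo.ps | grep -q '%%EOF' && echo "looks complete" || echo "probably truncated"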
So not all things in Linux work well together. But even when they don't, Linux stuff offers other ways of getting what you want. And that is all I need.
Wednesday, April 20, 2011
Vista Flash Odyssey
I still maintain a Vista partition on my PC for my kids' stuff, or the stuff they bring back from school to run. These would be educational CDs and the like. Also, my support contract says I have to keep Vista there to enjoy the 3 years of on-site support (which I have used, BTW: 2 motherboards replaced FOC). But we use Linux 99% of the time. I figure that if I expose them now, their perception of computers will not be limited to the Microsoft world.
But recognizing the possibility of needing Vista for whatever reason, I maintain the partition and I maintain Vista. This means periodically logging in and updating Windows, Flash, Java, OpenOffice and the cone (VLC media player). What prompted me this time was that I wanted to move from OpenOffice to LibreOffice. My other Windows PCs already have it, so this was more about leveling the playing field, making sure I have similar programs on all of the PCs in the house. It has been a while since I used Vista; so long that I was also installing Chrome this time around.
Sometimes I wonder whether I am denying my children access to their educational software by defaulting to Mandriva. There is this great math tutorial program and an interactive language learning kit. If they ask for it, I'll boot up Vista and set things up for them. But they don't mind, and I seem to be getting better mileage from Flash demos and YouTube tutorial videos on the Internet anyway.
During the update everything went well except for updating Flash on IE. I went to the Adobe website and clicked on the button to download the latest version of Flash. It downloaded the Adobe downloader, installed it and executed it. It then threw up an error window saying it was unable to get the correct parameters. I figured the downloader was having problems with the Internet link. Checked that, and it was OK. So I followed the troubleshooting link from the Adobe download page.
Basically, it recommended that I stop every single program I could think of that runs Flash and then run the uninstaller for Flash. Well, that's great. Even Adobe has little faith in my ability to figure out, by looking at the taskbar, which apps are using Adobe Flash and locking the Flash files. Why? Because it recommended that if that didn't work, I should try again, because I probably missed a program. I humored Adobe for a while, but uninstalling and reinstalling the downloader didn't work.
So off to the Internet we go. I found some highly rated advice which told me to download a tool from the Windows Resource Kit along with a command file for it to use, which removes or fixes Flash-related stuff. What is rich is that the command file is from Adobe. I tried that, and still the dreaded "unable to obtain correct parameters" error came out.
This was getting ridiculous, but it reminded me of how lucky I am using Linux. Even with Flash and its installation instructions, which divert you to the command line, that routine has worked fine for years (don't get me started on the similar Java installation). I realised that my problem wasn't with installing Flash, it was the downloader. It was acting as the gatekeeper to getting Flash. In reality it was nothing but a billboard. So after looking around, I found a link to get the installers directly.
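For comparison, the command-line routine on Linux that I mentioned is basically this; the archive name changes with every Flash release, so treat it as a placeholder:
# Unpack the tarball downloaded from Adobe (version number is just an example)
tar xzf install_flash_player_10_linux.tar.gz
# Drop the plugin where the browser looks for it (per-user install)
mkdir -p ~/.mozilla/plugins
cp libflashplayer.so ~/.mozilla/plugins/
# Restart the browser and check about:plugins to confirm it loaded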
The brouhaha in recent years about Flash and Apple's refusal to use it (not for technical reasons, I'm sure) seemed to me like a case against progress. Not supporting Flash looked like a deal breaker, with all sorts of sites using Flash just to get past the home page. But with Apple's clout and the popularity of iPads amongst senior management, a lot of sites have had to provide Flash-free alternatives. After this episode: good riddance to Flash, and let's move on to HTML5.
Labels: Fix
Thursday, April 14, 2011
Letting go of old programs
As you may know, I am a Mandriva user. More hardcore than I thought, I discovered today.
I am lucky because I have a padawan now. Eager to learn, but patient enough not to bug me all day long.
So the need was to log onto the desktop from remote. Not just access the machine, but use the desktop. Mandriva has this tool called rfbdrake. It provides a one-stop interface for remote access, both connecting out and sharing your own screen. Basically it calls on rdesktop to connect to Windows boxes, VNC for Linux boxes, and uses rfb to share out the current desktop. Not to be confused with the brilliant remote access tool on SuSE, which spawns vncserver to provide remote desktop access from the login screen onward; this is much more pedestrian, just sharing what I am seeing. Problem is, I couldn't find it via urpmi or in the Software Installer. Now, I had procrastinated for some time over fixing a problem on that workstation which prevented some updates from completing. Since both of my problems could be rpm-related, I finally set aside some time to sort them out. The update problem was simple enough. Apparently, the Fortigate firewall triggered some false positives on the files being downloaded, so amending the rules slightly to let the updates pass through did the trick. But in the process, the various repositories had also gotten messed up, so I removed them all and downloaded a fresh set. For good measure, I plunked in PLF too.
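For reference, resetting the urpmi media came down to something like this; the PLF media lines vary by release and architecture, so I am leaving them as a placeholder rather than quoting exact URLs:
# Remove every configured urpmi repository
urpmi.removemedia -a
# Re-add the official distribution media; urpmi expands '$MIRRORLIST' itself
urpmi.addmedia --distrib --mirrorlist '$MIRRORLIST'
# (PLF media are added the same way, using the lines generated by the
#  easyurpmi / PLF site for your release and architecture)
# Refresh the media and apply pending updates
urpmi.update -a
urpmi --auto-select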
But after all the updates, I still couldn't get rfbdrake. Time to hunt RPMs on the net, then. But horrors, rpm.pbone.net was down. RPMFind was no good either; I had given up on it for finding Mandriva RPMs a long time ago. So off hunting on Google we go. I finally found it on (of all places) SUNET. Nostalgia engulfed me as I remembered the old days of going through SUNET looking for free/shareware software. Then came dependency hell: I was missing rfb itself. Hunt as I might, I could only find a build from 2008. Security-wise, not good.
Then it dawned on me. I was asking the wrong question. Why was I hung up on rfbdrake? The real question was: what would give me desktop access? If rfb is gone, what is its replacement? I should have learned by now to let go of old programs. The new guys are Vino and krfb. Turns out they work fine. I miss the unified interface, but if it is for the better, why not.
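For anyone wanting to do the same, the replacements are straightforward. The gconf keys below are the GNOME 2-era ones Vino used, so double-check them on your version; the host name is, of course, made up:
# GNOME: enable Vino desktop sharing (GNOME 2-era gconf key)
gconftool-2 --set /desktop/gnome/remote_access/enabled --type bool true
# KDE: start krfb and hand out an invitation from its window
krfb &
# From the remote machine, connect with any VNC viewer
vncviewer my-desktop:0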
P/S - I am still haunted by my failure to keep a copy of a DOS IVR program (it fit on a floppy!) that ran together with a voice modem (a 33.6kbps modem with voice capabilities). I am that old. *sigh*