Roman Sandals

December 23, 2008

Net filter

Filed under: political, technology — Craig Lawton @ 7:40 am

Ars Technica says it best:

“So, in summary, it appears that the government is trying to make up for the failure of an earlier PC-based filtering program by rolling out an alternative, ISP-level filtering program that they know won’t fully prevent access to illegal material. They promise not to state what sites are being blocked, even as they promise only illegal content will be. To prepare for the roll out, they’re doing live testing of equipment and protocols they haven’t used in the lab, and not telling the ISPs when the program will be ready. It sounds like all of the worst clichés about government incarnated in a single program.”

August 8, 2008

Lean and Mean

Filed under: business, management, technology — Craig Lawton @ 3:00 pm

Lean manufacturing principles originated in Japan.

People are now applying them in IT: lean software management, and other aspects of IT such as Service Management.

In July 2007 it all looked so promising.

A year later what went wrong?

July 29, 2008

Super-user excuses

Filed under: musing, sysadmin, technology — Craig Lawton @ 1:12 pm

System administrators always get super-user access. Third parties, increasingly located wherever, are often granted super-user access as well, usually to smooth project implementations. Super-user access is thrown around willy-nilly because it’s a hell of a lot easier than documenting privileges, which is really, really boring work.

This leads to poor outcomes: downtime, systems in “undocumentable” states, security holes etc.

The horrible truth is that somebody somewhere must be able to gain super-user access when required. It can’t be avoided.

The other horrible truth is that when you allow super-user access only because properly defining a particular role is hard, you are, in effect, giving up control of your environment. This is amplified when more than one team shares super-user access. It only takes one cowboy, or an innocent slip-up, to undermine confidence in an environment.
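Defining the role properly doesn’t have to be a huge job. As a minimal sketch (the team name, service name and paths here are hypothetical, just to show the shape of it), a sudoers fragment granting an application-support team only the commands its role actually needs might look like:

```
# Hypothetical /etc/sudoers fragment: the app-support team may restart
# one service and read its log, and gets no other root privileges.
Cmnd_Alias APP_OPS = /usr/sbin/svcadm restart svc:/app/payments, \
                     /usr/bin/tail /var/log/app/payments.log
%appsupport ALL = (root) APP_OPS
```

A few lines like this, per role, is the alternative to handing out the root password.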

In this increasingly abstracted IT world, where architecture mandates shared, re-usable applications, where global resourcing mandates virtual remotely-located teams, where IT use and server numbers increase exponentially and where businesses increasingly interact through gateways, security increasingly looks like a feature tacked on at the last minute.

Security costs a lot and adds nothing to the bottom line – though lack of it can and will lead to some big bottom line subtractions.

The mainframe guys had this licked ages ago. The super-user excuse is looking rather thin.

The Age of Authorization is upon us…

Update: An amazing story from San Francisco, which outlines how a lack of IT knowledge at the top of an organisation, and too much power devolved to too few IT staff, can cause much grief.

July 27, 2008

It’s really different this time…

Filed under: business, technology — Craig Lawton @ 11:28 am

There seems to be a common thread in the media that the IT industry is in for a downturn because the economy in general is struggling. I think it is different this time.

The last time IT struggled was at the end of the dot-com boom. The US dollar was really high, tech companies had massive inventories to clear and Cisco had been the biggest company in the world. The IT world had been set for a golden age which never arrived.

This time, the US dollar is low, tech companies are lean and in good shape having learnt their lessons, and surprise, surprise, the earnings of the big players are impressive and growing.

Intel, VMware, EMC, Apple, Microsoft and Google all increased profits impressively. Some didn’t increase earnings enough and were “punished”, but this is clearly market sentiment. For example, VMware increased earnings by 40-ish% instead of 50-ish%, and their share price dropped. Strong international revenues especially are boosting results. SUN still struggles, but they were hit hardest by the dot-com era ending, and they still pull in $4 billion in revenue each year.

Now big Australian corporate oligopolies, run by cosy, tech-ignorant boomers, have woken to the fact that they have under-invested in IT for the last decade, and have expensive legacy environments which are due for a big clear-out. They have to spend money to make their environments lean; to make their businesses internationally competitive. And it’s a good time for CapEx in US dollars. Not only is IT gear very, very cheap compared to 8 years ago, each Aussie dollar goes twice as far in US purchases as it once did.

May 23, 2008

Frustrating in-house systems

Filed under: musing, technology — Craig Lawton @ 1:25 pm

I’m constantly amazed at the crappy performance of in-house applications at the places I’ve worked. Customer-facing applications must perform, or business is lost. In-house applications are never tuned for performance it seems, and this makes work that much harder.

This difficulty is related to the level of brain-memory you are using for your current task. Very short-term memory is great, and necessary, when you are flying through a well-understood task. But short system interruptions (usually involving the hourglass) force you back onto longer-term memory, making the effort that much larger, and less enjoyable.

There are other types of interruptions of course, which have a similar effect, such as people-interruptions (“What are you doing this weekend?”) and self-inflicted-interruptions (such as twitter alerts).

If your system hangs for long enough you may start a new task altogether (so as not to look stoned at your desk) and therefore lose track completely of where you were.

This forces unnecessary re-work and brain exhaustion!

I see lots of people with “notepad” or “vi” open constantly so they can continually record their work states. This is a good idea but takes practice and is an overhead.

It comes down to this. I want a system which can keep up with me! :-)

And is that unreasonable, with gazillions of hertz and giga-mega-bits of bandwidth available?

May 1, 2008

Going with the cloud

Filed under: management, musing, technology — Craig Lawton @ 5:01 pm

Really interesting article on the Reg’ which should put data centre fretters’ feet firmly back on the ground. It seems the “thought leaders” don’t see data centres disappearing anytime soon because:

  • Security – “… there are data that belongs in the public cloud and data that needs to go behind a firewall. … data that will never be put out there. Period. Not going to happen. Because no matter how you encrypt it, no matter how you secure it, there will be concerns.”
  • Interoperability – “…figure out ways for systems that are … behind the firewall … to interoperate with systems that are in the public cloud”
  • Application licensing complexity.
  • Wrangling code to work over the grid – getting any code written that exploits parallel infrastructure seems to be very difficult.
  • Compliance – “What happens when government auditors come knocking to check the regulatory complicity of an application living in the cloud?”

Also, they didn’t cover jurisdictional issues, such as: who do you take to court, and in which country, when there is an issue with data mis-use “in the cloud”?

It makes you wonder whether cloud computing will be any different from grid computing, or thin desktop clients. A great idea, but without enough momentum to overcome ingrained corporate behaviour.

April 7, 2008


Filed under: technology — rchanter @ 8:26 pm

Despite having been more or less web-native since the mid-90s, I’ve never really done much hands-on web design or JavaScript programming. Still, I read (and listen to) enough tech stuff to get the general idea. Today I decided I needed to do a little JavaScript to flip between alternative presentations of some data. So I figured: generate all 3 up-front, put them inside divs, and set the CSS display property for the one I wanted. That much I knew before I started.
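In outline, that approach comes down to a few lines of plain JavaScript (the panel/tab ids and the helper function here are hypothetical names of my own, not the code I actually ended up with):

```javascript
// Decide the CSS display value for each of `count` panels: only the
// active one is shown, the rest are hidden.
function displayStyles(activeIndex, count) {
  var styles = [];
  for (var i = 0; i < count; i++) {
    styles.push(i === activeIndex ? "block" : "none");
  }
  return styles;
}

// Apply those styles to divs with ids panel0..panelN-1.
function showPanel(activeIndex, count) {
  var styles = displayStyles(activeIndex, count);
  for (var i = 0; i < count; i++) {
    document.getElementById("panel" + i).style.display = styles[i];
  }
}

// Wire up click handlers on tab0..tab2 when running in a browser.
if (typeof document !== "undefined") {
  for (var i = 0; i < 3; i++) {
    (function (idx) {
      document.getElementById("tab" + idx).onclick = function () {
        showPanel(idx, 3);
      };
    })(i);
  }
}
```

YUI’s TabView wraps exactly this idea up for you, which is why the grab-and-go sample was so quick to adapt.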

Off to look for sample code. I realise that for people who actually do web work this is the equivalent of “Hello World”, but I still needed a little help. A handful of JS and CSS tutorials later, I found myself on the Yahoo developer site looking at YUI.

15 minutes later, I had a fully functioning tabbed widget containing my 3 bits of data, completely integrated with the existing stuff (different display options for diff output, for what it’s worth). I am seriously impressed at how good YUI is for grab-and-go code samples. It would have taken me at least an hour from scratch (yeah, I know, I’m a sysadmin, not a proper programmer).

March 27, 2008

Infrastructure Money

Filed under: musing, technology — Craig Lawton @ 2:33 pm

It’s interesting to note that in big IT shops software often accounts for about 50% of the operating budget. Compare this to human resources, or people as they are sometimes referred to, which come in at about 10% of most budgets.

In the last downturn, server hardware got hammered on cost, hurting the likes of SUN Microsystems and helping the likes of Dell. People got done what they could with what they could budget for.

If the current economic storm clouds start hurting IT budgets, will software (and possibly also storage costs) be the sacrificial lamb, opening doors for the likes of MySQL, Debian, Tomcat, Nagios etc. to make big inroads into the corporate world? Could SUN make some inroads with ZFS (over VxFS)?

December 11, 2007


Filed under: spam, technology, Uncategorized — rchanter @ 10:27 am

So one of our mail servers got listed on Spamcop the other day. It’s just an operational hazard of running a mail service of non-trivial size really, but still a PITA. Delisting is simple enough; mopping up is harder. I don’t know who I should be most annoyed with:

  1. Spamcop, for being a trigger-happy, FP-prone list (and by extension, Ironport for not doing enough to clean up their act).
  2. The people running mail servers who think spamcop is a safe RBL. This includes a few providers that I would have expected to know better.
  3. The people running rogue autoresponders inside our network, which is the most likely way for reputable senders to hit the spamcop spamtraps.
  4. IBM/Lotus, whose Out-of-Office autoresponder is an utterly brain-dead piece of crap. (and don’t get me started on how unusable mail rules are).

The right answer, of course, is all (or none) in equal measure. But deep down, I think I want to blame Spamcop and Ironport. Now, I’m all for blacklists discouraging backscatter. But no matter what measures the service operator takes, there’s always going to be something back at the mailbox that does The Wrong Thing. And Spamcop (by which I mean Ironport) have a tool that would be exactly the right thing to help distinguish between indiscriminate backscatterers and sites that mostly have the problem under control. Grrr. B’stards.

November 22, 2007

C: Drive

Filed under: musing, technology — Craig Lawton @ 4:08 pm

I just had a thought. I should really back-up my work laptop. I should back-up my C: drive. I fired up the XP Backup tool.

Even plebs know what a C: drive is! But why is it so? Surely it should be the A: drive if it’s that important. But no, the A: drive was originally assigned to the floppy disk and, from memory, the B: drive was for a secondary floppy disk. But nobody has floppy disk drives anymore!

As if reading my mind, the XP Backup tool told me that after creating the back-up file it would ask me for a floppy to create a boot disk. Of course, you’d think it’d check to see if I had a floppy drive before asking. I don’t.

Interestingly, Wikipedia lists all the Operating Systems that use drive letter assignment. It reads like one of those Human Rights Watch charts, one which lists the countries that kill more of their own citizens than others.
