The IT Crowd?
(This article was originally published in Entelechy, edition 32, Jan 2012. It is being published here in full with some annotations.)
It has continued to surprise me over the last three and a half years how few information technology students actually use the innovations of information technology to improve their own productivity. More often than not, they are unaware the products even exist. Recently the issue was brought to the fore when Skish Champi pointed out that Zimbra Collaboration Suite has great calendar integration (and I agree), while we as a college are still struggling with sending meeting emails and reminders by hand.
We are all experts at using DC++ to share files over our network. Yet when it comes to sending the same files over the Internet, we stick to (gasp) e-mail. If you send me a 50 MB file over e-mail in 2012, I’m going to knock on your door with a gun in my hand. Use Dropbox or any of a thousand other such services. You only need to upload once; your devices continuously sync the files, and individual folders can be shared with individual people, which is great for working on projects. In addition, Dropbox keeps old versions of your files around. Heck, using the Public folder you can even host a complete static website without paying a paisa! Once your file is on Dropbox, go trigger-happy with the public link and send that in the e-mail instead. Less load on the e-mail servers, and the link can easily be shared over any other medium too.
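To make the workflow concrete, here is a minimal sketch assuming the Dropbox desktop client is installed and syncing; the file name is made up and the URL shape is only illustrative:

```
# Drop the file into the synced Public folder; the client uploads it once.
cp report.pdf ~/Dropbox/Public/

# After the sync finishes, Dropbox exposes it at a public link of the form
#   http://dl.dropbox.com/u/<your-user-id>/report.pdf
# Paste that link into your e-mail instead of attaching 50 MB.
```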
Similarly, if you are in charge of planning a lot of events (looks at the committees and clubs and faculty), create a Calendar event in Zimbra and send it to all batches. A student then simply has to accept the invitation, and will get reminders as the event approaches. Synced with a desktop calendar application, your computer can play a sound or flash a message even when webmail is not open. Now you have no reason to forget a meeting, or to turn up late.
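Under the hood, such an invitation is essentially a small iCalendar file, which is why any standards-respecting client can display the event and fire the reminder. A minimal sketch of what travels with the mail (every name, address and time here is invented):

```
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Example Club//Meetings//EN
METHOD:REQUEST
BEGIN:VEVENT
UID:club-meeting-2012-01-15@example.org
DTSTAMP:20120110T090000Z
DTSTART:20120115T113000Z
DTEND:20120115T123000Z
SUMMARY:Programming Club meeting
ORGANIZER:mailto:club@example.org
ATTENDEE:mailto:batch-2012@example.org
BEGIN:VALARM
ACTION:DISPLAY
DESCRIPTION:Meeting in 15 minutes
TRIGGER:-PT15M
END:VALARM
END:VEVENT
END:VCALENDAR
```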
If you are working on a software project, and your way to ‘work as a team’ on the code is to e-mail newer versions of the files around to everyone, your project is already dead. Why? What happens when you suddenly need an older version? Or need one feature from Amrita’s copy and another from Rajni’s? Will you spend your time copy-pasting? Let a version control system do it for you!
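Any modern version control system handles both situations in seconds. A minimal sketch with Git (the file and branch names here are hypothetical):

```
git init                       # start tracking the project
git add .
git commit -m "First version"  # a snapshot you can always return to

# Suddenly need the older version of one file? Pull it out of history:
git checkout HEAD~3 -- solver.c

# One feature from Amrita, another from Rajni? Merge their branches:
git merge amrita-parser
git merge rajni-gui
```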
I’m going to mangle The Unix Philosophy slightly to suit my purposes:
- Use software that does one thing, and does it well.
- Use software that does not lock your data in, or modify it so no other software can use it. A hyperlink should always be available.
- Software that allows its output to become the input of another program is usually better than software that doesn’t. An API can do wonders. (A concrete pipeline follows this list.)
- Understand your workflow. Don’t hesitate to throw away the clumsy parts.
- Use tools in preference to manual labour, even if you have to detour to build the tools.
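Here is the philosophy at work in one line of shell. Each program below does exactly one job, and because the intermediate output is plain text, nothing is locked in and each stage can feed the next:

```
# The five largest files and directories under ~/Documents:
# du measures, sort orders, head trims.
du -a ~/Documents | sort -rn | head -5
```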
So why do we continue to use legacy technology, mouse around, and generally fail to get the most out of our computers?
One reason, of course, is inertia. Our brains are wired for survival: if something works, we are unwilling to go and improve on it. Another is a simple lack of curiosity about new technology.
But it is also an attitude problem. Technology continues to be treated as the poor stepdaughter. If someone shows you how to peel potatoes faster, you’ll quickly adopt the technique. But as soon as the metaphor moves to the computer, there is immediate unwillingness. From the very beginning of our computer education we’re asked to treat the computer as some magic box, and are restricted from doing anything outside of that ‘education’. “Usse haath mat lagana, nahi to kharab ho jayega,” say our parents and teachers. (English: Don’t touch that, you’ll break it.) We carry this belief that computers are frail creatures with us even when we become more responsible.

The second problem is that the internals of computers remain unknowns for much of the population. Most people who drive a car have a qualitative idea of how an internal combustion engine works; they also know how to change a tyre. The same people don’t know how to add RAM, or the basics of what a processor does. Worldwide, computer literacy education focuses on office suites and the like. Even when you do eventually study the internals, they remain something from the textbook, so when your C program malfunctions you never once think about how that program is actually being executed, and how that understanding could help solve the problem. This disconnect is alarming; I’ve seen computer science professors unaware of basic computer features. The funniest example I can think of is Windows users repeatedly right-clicking on the desktop and hitting Refresh when things aren’t moving.
The media is also responsible. Mainstream news is quick to highlight the policies and laws that will affect, say, retail or corruption or some industry. Technology media, however, is focused on product reviews and the next revolutionary technology (usually some old concept rehashed), and sidelines the real problems that will impact privacy, ownership and other fundamental rights in the digital age, so the notion of computing itself never gets enough attention from the general public. If car manufacturers specified which brand of fuel you could use in their cars, there would be a hue and cry over anti-competitive behaviour and denying people choice. Yet mobile phone carriers fleece people every day with locked, underpowered cell phones. The war on computation keeps going the wrong way.
I also believe that, just as in mathematics, computers require significantly more complex mental models to be manipulated in the mind. A car engine is heat and metal and chemicals, and can be directly observed doing something. In comparison, the levels of abstraction between the electrons racing through the wires and the point-and-click metaphor of daily interaction are numerous. As a user you do not, of course, need to have any inkling of them, but even at the software layer a computer desktop is far more congested and far less tangible. Most people keep missing the subtle user interface cues that convey meaning. To those with the mental ability to hold these models and to think rationally and logically (the only way in which the computer can think), the paradigms are much clearer.
But as ICT students you are expected to have that mental ability. So:
- First, lose your fear of the computer. Modern operating systems and applications are robust enough that they won’t go down just because you clicked in the wrong place. Experiment with preferences, try buttons that have only icons; many a feature can be discovered this way.
- Second, internalise knowledge as much as you can by always trying new things yourself, until they become second nature.
- Third, think through what you are doing rather than just clicking where your friend told you to.
- Fourth, remember that, as in all engineering disciplines, convention is implicitly followed in computing. Concepts you learn in one application (say, drag and drop) can be generalised and applied everywhere.
- Fifth, start thinking of computing tasks as recipes. In cooking, you apply a series of tools to transform various inputs (the ingredients) into the final dish; the output of the knife becomes the input of the frying pan. Most software normal people use is monolithic: one tool that does all the steps. Good software has separate knives and separate frying pans, and allows a great deal of flexibility. The Unix command line is the most pervasive and powerful example (see the sketch after this list).
- Sixth, keep your eyes open for new ways of using technology better. Blogs like Lifehacker are an excellent source of such tips.
- Seventh, pay attention when governments and companies try to cripple your right to compute and your right to information.
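The classic word-frequency recipe shows the idea: six small tools chained like knife and frying pan (essay.txt stands in for any text file):

```
# The ten most frequent words in essay.txt:
# split into one word per line, lowercase, sort, count, rank, trim.
tr -cs 'A-Za-z' '\n' < essay.txt \
  | tr 'A-Z' 'a-z' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -10
```

Swap `head` for `grep`, or add another stage, and the recipe adapts; a monolithic word processor could never be recombined this way.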
With a little active effort you can streamline your computing experience, and devote your complete attention to the fantastic things you create.