Is it really possible to give one 'computer' per child in a state like Karnataka to get them to learn? The word computer is so strongly associated with machines and gadgets that the very suggestion is dismissed as unrealistic. But should a 'computer' be a physical machine? If we can digitize paper, music and video, store them in files on a computer and bring them alive when needed, why not digitize a whole computer into a file? These days, computers are so fast that it is possible to feed them a description of an ideal child's computer and have them bring it alive on the screen in a flash.

Such 'software' computers can be produced, copied and distributed inexpensively on USB flash memory chips. A child could take such a chip, insert it into a school or home computer, and start a program that resumes a computing session on the monitor. After using it, the child saves the session in a file on the chip and puts it back into the school bag. Learning involves a mix of outdoor and indoor play and reading, so a single desktop or notebook computer can be shared by many children without their sessions interfering with each other. Off school, a child can continue learning on a home computer or at a nearby cyber cafe.
A software computer is much more flexible than paper as a learning or teaching medium. The child will never run out of paper, pencils or paint. Textbook authors and teachers can combine audio, video and text to create captivating lessons. Since the software computer is personalized, it can easily adapt itself to fast learners, slow learners, and blind or deaf children. Flash memory chips have become cheaper than textbooks, so the scheme would definitely be affordable.
It is possible to put together such software computers using Squeak. Squeak is a program created specifically as a personal learning environment for children. It mimics a typical art-and-craft play session: children assemble shapes, sketches and colors into toys and use them to simulate the world around them. Squeak manages to pack a complete multimedia authoring system into a tiny 20MB file called an 'image'. The Squeak virtual machine, the program that reconstructs a live session from this image file, runs on Linux, Macintosh and Windows, and its source code is freely available for anyone to port to other computing platforms. The image file, like a music or video file, can be copied and used on any computer. Digital textbooks and articles can be loaded into Squeak as Projects, and children can author their own reports or carry out their own art and craft projects. The variety of projects done by school children across the world is a testimony to the versatility of Squeak as a digital learning medium.
I could pack a complete kit containing the code to run Squeak on Linux or Windows, two images, tutorials and documents in less than 128MB! On a 1GB USB flash memory stick, that leaves enough space to carry a few years' worth of projects.
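To give a feel for how a child's session on the stick would resume, here is a minimal sketch; the mount point, kit layout and image name are illustrative, not from an actual kit, though the general invocation (VM plus image file) is how Squeak starts:

    # resume a saved session straight off the flash stick
    cd /media/usbstick/squeak-kit    # hypothetical mount point and layout
    ./squeak child.image             # the VM brings the saved session alive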
Squeak's current handicap is its lack of support for Indian languages; nobody has gotten around to adding it so far. Once this support is in place, Indian school children can enjoy their own personal software computer, one that will never become obsolete.
Saturday, April 7, 2007
LCD monitors: not by size alone
The dominant specification of LCD monitors in ads and sales pitches is the diagonal size. Online shops and web-sites classify LCD monitors by vendor, price and diagonal size, and sizes figure prominently in press releases as well.
Diagonal size may have been a sufficient basis for purchase decisions for CRT monitors, but it is not for LCD monitors. While CRT monitors came only in the 4:3 aspect ratio, LCD monitors also come in 16:9 (aka widescreen); there are other widescreen formats like 15:9 or 16:10, but 16:9 is the most common. For the same diagonal size, widescreen monitors are cheaper for a good reason: the cost of a monitor depends on its area, and a 16:9 panel has about 11% less area than a 4:3 panel with the same diagonal.
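The 11% figure is simple geometry; here is a quick check with awk (19 inches is just an example — the percentage is the same for any diagonal):

    awk 'BEGIN {
        d = 19                                      # diagonal in inches
        a43  = (4*d/5) * (3*d/5)                    # 4:3 panel: width 0.8d, height 0.6d
        a169 = (16*d/sqrt(337)) * (9*d/sqrt(337))   # 16:9 panel: hypotenuse sqrt(16^2+9^2)
        printf "4:3 = %.0f sq in, 16:9 = %.0f sq in, %.0f%% smaller\n",
               a43, a169, 100 * (1 - a169/a43)
    }'

For a 19-inch diagonal this prints 173 and 154 square inches: an 11% difference.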
For working with documents and web surfing, I prefer standard monitors to widescreen monitors, as the former display more lines of text. Widescreen monitors are better suited for videos and games. Of course, if you have a video card that can rotate the screen and a monitor that can pivot on its axis, you can get the best of both worlds.
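On Linux, for instance, the software half of that trick is a one-line job with the RandR extension; the output name VGA-0 below is just an example and varies by video card:

    xrandr --output VGA-0 --rotate left     # portrait: more lines of text
    xrandr --output VGA-0 --rotate normal   # back to landscape for videos and games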
Next time you shop around for an LCD monitor, don't go by size alone.
Thursday, March 15, 2007
Enduring bits - Unix
Over the past few decades, certain software creations have withstood the test of time. They have not only retained their original charm but have also spawned many derivatives (and imitations :-)) around their basic architectural elements. Unix is one of them. It gave us the concept of a unified namespace, files as simple sequences of bytes, standard input/output, pipes, a rich job control language in the form of the shell, and a simple syscall interface wrapped in a standard C library.
The software came with a complete toolchain that any programmer could use to look into its innards, extend it or modify it in various ways. Back in the early eighties, when I first got my hands on a Unix distribution, the kernel was only about 60,000 lines of code and quite remarkable for its elegant implementation of these architectural elements. I am still amazed that just a couple of hours with Kernighan and Pike's book, The Unix Programming Environment, is enough for someone to put together complex pipelines for searching, editing and sorting text.
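For instance, here is a word-frequency counter of the kind the book builds up in a few pages; report.txt is just a placeholder file name:

    tr -cs 'A-Za-z' '\n' < report.txt |   # split into one word per line
    tr 'A-Z' 'a-z' |                      # fold everything to lower case
    sort | uniq -c |                      # count repeated words
    sort -rn | head                       # show the ten most frequent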
While hardware capability has gone up a thousandfold, the kernel and the syscall libraries continue to remain simple, flexible and elegant. Of course, the number of variants has grown into the thousands today, but that is a testament to the strong conceptual integrity and sound architectural foundation.
There is one quirk in Unix that has never been fixed. The unified namespace that works so well with serial, parallel, sound, storage and other devices breaks down when it comes to network interfaces. Network devices don't register themselves in the standard device tables; they use their own set of tables. There is no special file type that caters to network devices or ports: there is no /dev/eth0, /dev/tcp/53 or /dev/udp/80. Network devices use their own set of I/O system calls.
I can transfer a file to disk with "cp hello.txt /media/sda/hello.txt" and not have to worry about the physical structure of the disk, but I cannot stream a file to another machine over the network with "cp hello.txt /dev/tcp/9000". I can restrict a bunch of users from using a CDROM, but there is no /dev/eth1 whose permissions I can set on an owner or group basis.
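Tellingly, bash fakes exactly this namespace inside the shell: in a redirection it intercepts paths of the form /dev/tcp/host/port and opens a socket itself. The host and port below are arbitrary, and something must be listening at the other end:

    # works, but only because bash itself intercepts the path:
    cat hello.txt > /dev/tcp/192.168.1.5/9000
    # fails: the kernel namespace has no such file for cp to open
    cp hello.txt /dev/tcp/192.168.1.5/9000

The emulation lives in one program instead of the kernel namespace, which rather proves the point.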
Is it because the network stack was designed at Berkeley on the West Coast while the namespace work was done at Unix Systems Lab on the East Coast? When teams cannot (or should not) communicate amongst themselves on a regular basis, architectural convergence is difficult to achieve (see Conway's Law).
The video drivers and X11 are another subsystem that diverges horribly from the clean namespace interface. Taking a screenshot should have been as simple as copying /dev/window to a disk file. Instead we have hundreds of system calls, and each subsystem comes with its own set of tools.
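The contrast is easy to demonstrate; /dev/window is of course the hypothetical path from the argument above, while xwd is the protocol-specific client X11 actually ships:

    # what a namespace-clean design would allow (hypothetical):
    cp /dev/window screenshot.raw
    # what X11 actually requires, a dedicated protocol client:
    xwd -root -out screenshot.xwd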
Plan 9 set out to fix the problem, but it was hobbled by a restrictive license until a few years ago, and it continues to plod along. I guess Unix is so well entrenched in the market that people would rather put up with its quirks than take up something radically different. But then, the distributed operation intrinsic to Plan 9 could tilt the balance in its favor with the arrival of multi-core chips.
Perhaps, somewhere, some student could be cooking up a disruptive innovation with Plan 9.
Monday, March 12, 2007
Buzzwords and oxymorons
Of late, there is a deluge of buzzwords and oxymorons in the technical press. Here is an example from EETimes (emphasis mine):
"Intel on Monday introduced its first solid-state drive, a device that uses NAND flash memory for common PC or embedded application operations, instead of the slower spinning platters common in traditional hard-disk drives."
A solid-state memory is built with silicon chips and doesn't have a motor, so where is the question of a drive or a disk? This is no printer's devil; the article goes on to talk about a drive with no moving parts:
"The Z-U130 offers a faster storage alternative for locating boot code, operating systems, and commonly accessed libraries. The drive, which has no moving parts and is available in 1-Gbyte to 8-Gbyte densities...."
What exactly were the news editors doing when checking this article?
I can understand it when companies use terms like pen drive or thumb drive to push their flash memory sticks; they are trying to sell into a market dominated by disk drives. But professional journals like those of the IEEE and ACM have no excuse for using a term like "USB drive" for purely solid-state memory sticks. The disease seems to afflict only flash memory with USB interfaces; other terms like SecureDigital, CompactFlash and MMC are treated properly.
Mmmm... should a flash stick develop I/O errors, will a few drops of mustard oil get it going again smoothly :-)?
Friday, January 12, 2007
iPhone - all smoke and no fire
Apple announced the all-new iPhone this week amidst much fanfare and rhetoric about a "revolutionary new mobile phone". I hold Apple to high standards in innovation, and this product announcement came as a big disappointment. The iPhone, stripped of all rhetoric, is just a wi-fi palmtop computer with a cellular half-modem (MacModem?) bundled with a cellular broadband service. Will it meet the same fate as the Newton?
Many of the communication features touted in the demo have been around for a very long time, and some (like the threaded view of SMS) are incremental innovations. How does making a call automatically by selecting an address book entry become a "revolutionary" feature? It has been around for ages. A 2-megapixel camera is hardly something to rave about these days.
Cellphones have been around for many years now and enough use cases are available to create truly innovative phones. How many times have we been annoyed by cellphones ringing during inappropriate moments? How many times have we looked at our cellphone to check our current location? How many times have we had to answer "Where are you now?" over a cellphone and wished we knew? How about those calls from pesky telemarketers who keep calling you repeatedly? Ever wished your phone could speak out caller id instead of ringing while driving?
The iPhone would be revolutionary if only it could screen calls with whitelists (announce only calls from a select group), blacklists (don't let these calls through) and timed "sleep" (do not disturb for the next x minutes), and announce callers while I watch videos or shoot photos but hold the announcement to the end of the track while I listen to music. Then I could attend meetings in peace and block out those pesky callers. If it had a GPS receiver, I could SMS my coordinates while requesting a taxi, seeking directions on an online map or simply letting someone know where I am. While driving, I could use voice control to take a call in speaker mode without taking my hands off the wheel.
The iPhone has wi-fi and broadband but doesn't come with a VoIP service like Skype, which is a huge omission. I hope they open up the software stack for third-party developers; I don't see how the platform can survive for long without software extensions. The symmetric rectangular form factor suits a palmtop computer but not a phone: it is too easy to pick it up upside down when responding to a call. Handheld devices are tested for feel and orientation; did Apple get so bowled over by the landscape mode that they let this slip by? Soft buttons and the lack of voice control will exclude visually impaired folks. The high-resolution screen is nice for photos and movies but will be a big strain on the eyes for reading text, though it should be good for reading scanned visiting cards.
The iPhone announcement was not a total dud. The idea of using a pinching action to zoom in and out is genuinely revolutionary: the "pinch" breaks the implicit notion that a touch screen is just for pointing. Back in the 1960s, when mainframes processed jobs in batches, Ivan Sutherland invented direct interaction with his Sketchpad system and pioneered many of the graphical interfaces we take for granted today. It is good to see this line of innovation still being pursued at Apple.