Linux Challenge
Maintained by Sacha Chua (sacha@sachachua.com). Last updated 2004-12-04.
|A1||X||Write project paper (2002.12.04)|
|A2||X||Read winning entries from last year (2002.12.04)|
|A3||X||Finalize topic (2002.12.04)|
I'd like a way for people to see what I'm working on. I'd like a way to share information easily. I'd like a way to keep an insanely detailed journal. I'd like a way to leave reminders that trigger when I leave a room, or at a specified time, or when I meet someone. I'd like something that integrates with my Emacs - my BBDB, my mail, everything... but I want to be able to share this with other people by giving them scraps of paper with buffer contents.
DO NOT GET DISTRACTED.
Write file://../../notebook/school/current/cs197/proposal.tex
Write file://../../notebook/school/current/cs197/thesis.tex
Write BpiScienceAward
Wearable technology promises to make these adaptations available anywhere they are needed. Blind people with a wearable computer can use a customized environment anywhere they go. The wearable computer can act as an intermediary between them and otherwise inaccessible computing resources. Focusing on speech synthesis and low-cost devices, the Assistive Wearable Computing project seeks to develop ways in which blind people can use computing resources more effectively.
Research will initially focus on adapting existing software to a speech-only assistive wearable computing paradigm and simplifying interfaces so that the software can be used without visual output. Human-computer interaction insights, speech synthesis improvements and new forms of input and output will also be explored. Multimedia capture and playback, automatic speech recognition and optical character recognition can greatly add to the data a blind person can use.
In one year, the Center hopes to offer a complete, low-cost off-the-shelf solution for blind users. Possible applications range from simple Internet surfing, text messaging and e-mail to programming complex applications with full speech feedback.
Possible future research areas include human-computer interaction, accessibility, artificial intelligence, speech synthesis and recognition, and image recognition.
|B||X||Compile a list of packages relevant for the blind and visually impaired (2002.10.15)|
|B||C||Make speakup package for Debian (2002.10.15)|
|B||C||Help with accessible-debian-installer (2002.10.15)|
|B||C||Close delYsid's RFP: http://bugs.debian.org/160783 (2002.10.15)|
|B||C||Package mbrola for non-free (nil)|
|B||C||Find a way for blind people to manually detect hardware on boot (nil)|
|A||X||Make Emacspeak Gnus usable. (nil)|
|B||X||Get Emacspeak CVS and run it (nil)|
|B||C||Add Emacspeak advice to the ido functions. (2002.09.20)|
|B||C||Add Emacspeak advice to the planner functions. (2002.09.20)|
|B||C||Modify text-mode menus so that I can easily access it with emacspeak and a keyboard (2002.09.20)|
#B X Track down emacspeak DEL bug (2002.09.12)
Emacspeak and an emacspeak interface to Festival Lite are a pretty neat combination. I upped the pitch a little so that I could hear it better over background noise - you just have to fiddle with the source. Nice for audio-only wearables or to supplement a display. You can set character echo on so that you're sure the keys you typed are correct.
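The Emacspeak advice tasks above (adding advice to the ido and planner functions) all follow the same basic pattern: wrap an interactive command with `defadvice` so it speaks after running. A minimal sketch of that pattern, assuming Emacspeak's `dtk-speak` text-to-speech entry point - the advice body itself is illustrative, not Emacspeak's actual code:

```elisp
;; Sketch of the Emacspeak advice pattern: after an interactive
;; command finishes, speak something useful about the new state.
;; dtk-speak is Emacspeak's basic speech-output function.
(defadvice ido-find-file (after speak-new-buffer activate)
  "Speak the name of the buffer we just switched to."
  (when (interactive-p)
    (dtk-speak (buffer-name))))
```

The same shape works for any command: advise it `after`, then speak whatever state the command changed.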
In the Philippines, carriers have begun offering fixed-line SMS services that let users without mobile phones send messages using the keypad and receive them through text-to-speech synthesis.
Islaphone subscribers get text services: http://www.inq7.net/bus/2001/may/18/bus_7-1.htm
#B _ Hook into BBDB so that I can easily mail/text people (2002.09.13)
#B _ Hook into BBDB so that I can easily take notes on conversations with people (2002.09.20)
#B _ Use a real database for contacts (2002.09.20)
#B X Check out BBDB in a real database (2002.09.16)
BBDB keeps track of contacts. Regexp-searchable, with customizable fields.
Conversation logging
#B _ Add a way for me to easily start a conversation, adding BBDB records. (2002.09.21)
Popup conversation buffer
F9 c SPC - start a conversation or continue the current one
C-c h - hello. Prompts for a string and searches BBDB. If found, the person is added to the conversation. If there are any reminders for that person, an audible icon is played. The timestamp (last-spoke) is updated.
C-c b - bye. Prompts for a string. Can use the history mechanism to select the person who left.
C-c c - end of conversation. Data is saved.
C-c SPC - add quote. Select the person speaking via the history mechanism, if more than one. Insert a time-stamp and name.
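The `C-c h` behaviour above can be sketched roughly like this. Everything named `my-conversation-*` is hypothetical; `bbdb-records` and `bbdb-search` are real BBDB calls, though their exact signatures have varied across BBDB versions, and the audible icon is left as a comment:

```elisp
(defvar my-conversation-participants nil
  "BBDB records of the people in the current conversation.")

(defun my-conversation-hello (name)
  "Add NAME to the current conversation, announcing if they are known."
  (interactive "sHello: ")
  ;; Search the address book; exact bbdb-search arguments vary by version.
  (let ((matches (bbdb-search (bbdb-records) name)))
    (if matches
        (progn
          (push (car matches) my-conversation-participants)
          ;; Play an auditory icon and check reminders here,
          ;; then update the last-spoke timestamp on the record.
          (message "%s joined the conversation at %s." name
                   (format-time-string "%H:%M")))
      (message "%s is not in BBDB." name))))
```

`C-c b` and `C-c c` would be companions that pop participants off the list and save the buffer, respectively.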
#C X Write about planner.el and allout.el contention for C-c bindings
#C X Find out why files are not honoring local variables.
#C X Bind planner-renumber-tasks to a shortcut key
#B X Remove planner.el clobbering of things like C-c C-n
#B _ Add a way to see the deadline (2002.09.30)
#B X Add recurring tasks (2002.09.30)
#B X Make all names inside the TODO parenthesis wikified. (2002.09.20)
#B _ Make planner more Emacspeak-friendly. (2002.09.13)
#B _ Add a view of all tasks in a particular file; occur, maybe? (2002.09.30)
#B _ Add a way to browse through the task categories in a planner file. (2002.09.30)
#B X Figure out a way to have global task IDs so that I can consistently refer to tasks even if I raise or lower their priorities. (2002.09.19)
#B _ Add a way for me to see all active tasks that haven't been marked as completed yet. (2002.09.12)
#B _ Find planner rescheduling bug when starting from the wiki page (2002.09.20)
#B X Fix task sorting - don't sort when saving. Just do sorts manually. (2002.09.19)
#B X Add a way to move all unfinished tasks to tomorrow. (2002.09.16)
#B X Add planner-goto-tomorrow and planner-goto-yesterday (2002.09.16)
#B X Add a way to procrastinate tasks to the next day. (2002.09.13)
#B X Find way to make tasks inside planner pages automatically renumber. (2002.09.13)
#B X Fix task rescheduling. (2002.09.11)
#B X Get planner to work well with allout [either modify allout or the wiki] (2002.09.12)
#B X Find a way for planner to automagically append notes stolen from anywhere (2002.09.12)
#B X Add remember.el functionality to planner.el - the ability to pop up a window, add the timestamped note to the planner file for today, and C-c C-c to save and close. (2002.09.12)
Keeps track of tasks, appointments and notes. Allows me to easily reschedule tasks. Uses wiki tech. Can link to BBDB records, e-mail addresses, webpages...
Thu Sep 12 15:45:31 2002 - Modified planner.el to sort completed tasks down to the end of the list.
Added planner-remember and planner-remember-region to planner-config.el
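The remember-style behaviour described in the tasks above - pop up a buffer, timestamp the note, C-c C-c to file it into today's planner page - could look something like this. This is a sketch under stated assumptions, not the actual planner.el or planner-config.el code: `my-planner-remember` is a made-up name, and the one-file-per-day `YYYY.MM.DD` layout under `~/plans/` is assumed:

```elisp
(defun my-planner-remember ()
  "Pop up a buffer for a quick note filed into today's planner page."
  (interactive)
  (let ((buf (get-buffer-create "*Remember*")))
    (pop-to-buffer buf)
    (erase-buffer)
    ;; Start the note with a timestamp, journal-style.
    (insert (format-time-string "%a %b %e %T %Y\n\n"))
    (local-set-key (kbd "C-c C-c") #'my-planner-remember-finish)))

(defun my-planner-remember-finish ()
  "Append the note to today's planner file and close the popup."
  (interactive)
  (let ((note (buffer-string))
        ;; Assumed layout: one planner file per day, named YYYY.MM.DD.
        (file (expand-file-name (format-time-string "%Y.%m.%d")
                                "~/plans/")))
    (with-temp-buffer
      (insert note "\n")
      (append-to-file (point-min) (point-max) file)))
  (kill-buffer)
  (delete-window))

;; A quick global binding makes the popup one keystroke away:
;; (global-set-key (kbd "C-c r") #'my-planner-remember)
```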
#B _ Get gnus to recognize Alamin messages and bring up the appropriate phone book entry (2002.09.20)
#B X Get audible alerts for new mail and text messages (2002.09.30)
#B X Integrate mairix with gnus, providing easy mail search (2002.09.11)
#B X Retrieve a copy of my old post to firstname.lastname@example.org about my wearable configuration (2002.09.11)
#B X Find out why help buffers aren't showing up in a different buffer. (2002.09.19)
#B X Fix allout.el bug with LinuxChallenge
#B X Turn off automatic bold/underline/italic in emacs-wiki (2002.09.13)
#B X Add a Twiddler lock so that I don't accidentally enter keystrokes while walking. (2002.09.13)
Oops. I just realized that I break task references... but oh, what the heck, they're already broken when you move tasks up and down. What then is a robust way to refer to tasks? Maybe I can have an elided task-information appended to every task, or I can add a note to the parenthesis... What about links to relevant records?
Excellent way of organizing hierarchical data. I use it for my school notes. There's fontlocking code on the wiki.
#B X Get allout's fontlocking to cooperate with Planner mode. (2002.09.30)
#B _ Find a way to do the collapsing subtrees thing. (2002.09.30)
remember.el allows you to quickly jot down notes, timestamping and filing them away in an outlined ~/.notes. You can probably modify it to keep track of your context, too. I've merged this functionality into my copy of planner.el. Hey, remember.el _can_ do that. Pfft. Moving it back out of planner.el.
The Remembrance Agent suggests relevant documents based on a configurable number of words around your cursor. It can index mail and other data, making it quite useful. You can apt-get install remembrance-agent on Debian systems. I find it too slow on my 300 MHz machine, though, and screen space is fairly limited. However, it might be nice to have a searchable interface - I tend to remember query keywords anyway...
Handy for looking at webpages on my hard disk or when I'm connected to the Net. Plenty of keyboard shortcuts, so nice for wearable use.
I'm going to try and see if I can store pictures of people so that I can use that to help me learn names <-> faces...
Software distribution for normal use: automatic network detection and configuration, allowing the wearable user to use available network printers and other resources.
Software distribution for advanced use: software distributions for specific user needs - mathematical software, educational software, foreign language dictionaries...
Equipment needed: for prototyping, the Center needs a light, powerful and stable wearable computer with long battery life and wireless networking.
Actual deployment will be done on new subnotebook and notebook models.
|B||X||Subscribe to blinux-list (2002.09.20)|
|B||X||Subscribe to emacspeak list. (2002.09.20)|
|B||X||Subscribe to blinux-develop (2002.09.20)|
|B||X||Subscribe to blinux-announce (2002.09.20)|
This page is owned by Sacha Chua. I want to win the contest, so be nice, mmmkay?
I'm going to take Emacs and make it the center of my Linux system. ;) I'm going to make the most common functions accessible with a few keystrokes. I'm going to tie Emacs into practically every way I use my computer. And I'm going to write about the experience so that other people will be able to get similar setups up and running.
Why would I want to do that? Read on.
Wearable computing is an emerging field that promises constant access to information through a computer that is always on and always interacting with its wearer. Wearable computers can be used in the background while people concentrate on other tasks. But wearable computing interfaces and applications still need to be refined: studies have shown the inadequacy of the windows-icons-menus-pointers (WIMP) paradigm for wearable computing, and work is being done on developing new user interfaces and human-computer interaction guidelines for it.
An audio-based wearable makes wearable computing much more feasible. Head-mounted displays are currently too expensive for consumers and their conspicuousness makes social acceptance of wearable computing unlikely. Headphones are easy to buy and are more acceptable because they are frequently associated with listening to music. However, output is much more limited, and the interface considerations are very different.
Other applications of a simplified computing interface include assistive technology. Accessibility is often an afterthought in device design, and many of the devices we use every day do not consider disabled users. For example, blind people have a hard time sending and receiving text messages and e-mail, or even using cellphones and computers in general. They are also at a disadvantage in a desktop-oriented world.
Similarly, a simplified computing interface can help people with other disabilities. For example, typing or fine-grained mouse control may be next to impossible. Large text display on a regular-sized monitor requires careful rethinking of information displayed on screen. Adaptations could include hardware designed for specific needs such as an oversized keyboard, an eye-tracker, or even a mouth-operated input device. Typing can be slow and painful, so keystrokes must be minimized and the interface simplified. A wearable computer can help overcome some of the difficulties disabled people face - for example, people with aphasia or other speech impairments can select frequently needed phrases from a menu.
My goal is to develop a Linux system that allows people to easily communicate, store and retrieve information while relying on limited input and output capabilities. I will put together and document an easily-configured working environment suitable for wearable computing use. I will focus on audio-only output and one-handed keyboard input, but the system can easily be extended to take advantage of limited visual displays (large-font display) and other input and output methods.
By limiting the number of keystrokes needed to launch often-used commands, providing a simple interface for e-mail, text, and other tasks, and documenting the aspects of the system, I hope to make it easier for other people to get into wearable computing and assistive technology.
Possible future directions for this research:
I have been working with Emacs, the popular, extremely customizable editor that runs on many platforms.
Speech output is handled by Emacspeak, a software package that adds audio feedback to Emacs. Emacspeak interfaces with a wide range of hardware and software speech synthesizers. I am currently using a free speech engine called Festival Lite, and I am slowly simplifying the interface while learning more about Emacs.
I chose Emacs because it already has a lot of the functionality I need. Emacs has modules for almost every task, and it is easy to extend it with my own code. Emacspeak's support for various synthesizers makes the solution more portable, and I can easily share my improvements with others.
It is easy to integrate pre-existing free software into my Emacs environment. For example, a freely-available GSM SMS gateway installed on my computer allows me to send and receive text messages through my Emacs mail reader. Emacs' address book and planner capabilities are also quite promising.
I need to tie all of these different elements into a coherent, easy-to-use system. I need to write it up and make it easier for other people to get started. I've been posting descriptions of my system on various mailing lists and I've received some encouraging mail from other wearable enthusiasts telling me that they found my write-ups useful.
I need comments and feedback from other people who are interested in wearable computing or assistive computing.
For assistive computing, I want to get in touch with blind people who are interested in using computers more effectively. I want to find out how they're currently using their computers (if they are) and what their wishlist is. I also want to hear ideas on how to make computers more accessible for people who have other disabilities.
For wearable computing, I want to know who's interested and what's stopping them from getting into it. ;) It's really fun and interesting. <laugh>
Oh, and I need a _lot_ of help in learning how to write research proposals and conducting research.
It's nothing too ambitious - just a simpler system that I can use while walking around.
After I refine the system and produce lots of documentation (adding to the Wearable-HOWTO, perhaps), I can start exploring alternative interfaces and paradigms. The lifestreams metaphor looks particularly interesting...
More details to follow as I continue fleshing out the idea.
As Doc Mana's seen, I've been quite active on local mailing lists. After describing my research interests on the digitalfilipino and plug mailing lists, I received quite a few messages offering help, funding, encouragement... Even some of the students have expressed interest, although they feel that the price tag on wearable computing parts puts this field out of reach. From my research and experimentation, however, I've learned that you don't actually need fancy visual displays - audio, a good keyboard and maybe some form of wireless communication are already pretty good.
I can go Java if you guys want. We're trying to go for the wireless Java thing, right? That's certainly a possibility. I'm also having a lot of fun (and success) working with Linux. I think Ateneo will be able to get into this easily if it chose to.
Right now I'm learning more about Emacs so that I can simplify its interface. For example, text mode menus still need some rework. I've already begun to contribute to the modules I use often - it's surprisingly easy to break into, you know...
One of the interesting side effects of learning more about wearable computing is that you also get a bit exposed to assistive technology and to human-computer interaction issues. If we develop wearable computing as one of our competencies, we may also be able to branch out into assistive technology, which fits in quite nicely with our "men and women for others" principle and allows us to help the many disabled people in the Philippines. This is a possibility the department may want to explore.
As far as I know, here's the situation:
Wearable computing is pretty advanced in MIT, GaTech and other universities that have started early. CharmIT (www.charmed.com), Xybernaut and other companies sell commercial wearables and hobbyists on the wear-hard list also hack together their own.
The National University of Singapore is in the process of establishing a wearable computing research group under Dr. Adrian Cheok. They're planning to focus on Bluetooth and other wireless technologies.
We have CE students who are interested in wearables - Allan de Jesus and Rhyan Andrade, both working on a wearable keyboard in CE150. We have some CS students who are interested, but they don't have anything to hack on.
UP is thinking of getting into wearables. Contact Rommel Feria for more details.
The biggest barrier I see right now is lack of hardware. Visual isn't very important, but convenience and battery life is. I use my laptop as the core of my wearable computer setup, and at 1.67kg it was one of the lightest laptops of its time. Still, I find it somewhat inconveniently large and heavy. I don't need a built-in display or keyboard. A subnotebook like the Fujitsu Lifebook series or the Sony PictureBook would be.. well... pretty darn cool, and I feel that I'd be able to test an even more wearable computer with one of those. If I could get _any_ machine I wanted, though, I'd probably go for the Transmeta version of the CharmIT (www.charmed.com), partly because it has everything, and partly because Thad Starner (MIT guy who went to GaTech - one of the first borgs) recommends having a standard platform so that wearable users can swap improvements more easily.
Okay, time for me to get back to work..
This is the report of my Master of Engineering research project on Web accessibility for visually impaired users from the client side. With the launch of the Web Accessibility Initiative in April 1997 during the sixth International World Wide Web Conference, it became clear that building and redesigning the Web to be accessible to people with disabilities would become an important directive of the World Wide Web Consortium (W3C). With the Web overlaid with a graphical user interface (GUI), people with visual disabilities may be the most affected community of users with disabilities. Following the W3C initiative, many countries and governments made accessibility a major concern.
In this context, much research is sponsored or funded by governments and organisations. Web Accessibility Ireland is a RINCE research project funded by AIB, aimed at Web accessibility for visually impaired users on both the server and the client side.
One of the biggest problems on the client side today is the complexity and high cost of the solutions available to blind people, which puts them at a disadvantage compared to sighted people. This project's contribution is to address that problem by providing visually impaired users with a simple, cheap and customised interface for accessing the Internet.
Wearable UI White Paper http://naga.mit.edu/wearui.html
What do we want from a wearable computing interface? http://www.cs.washington.edu/sewpc/papers/clark.pdf
PhD Nine Month Report http://www.ecs.soton.ac.uk/~pass99r/research/9monthsfirst/ A short overview of the state of wearable computing and a survey of projects
Designing a Palm user interface, Part 1 http://www-106.ibm.com/developerworks/library/us-shrunk/?dwzone=usability
Designing a Palm user interface, Part 2 http://www-106.ibm.com/developerworks/library/us-shrunk2/?dwzone=usability
http://www.acm.org/sigchi/bulletin/1997.4/bass.html
http://leb.net/blinux/
T. V. Raman, a technology consultant for Adobe Systems who is also blind, believes that attempting to retrofit speech onto graphic environments is a mistake. While you could get away with speech readers in a very poor visual environment 10 years ago, says Raman, speech readers in the GUI world are like "standing around and feeling the different parts of an elephant to figure out it's an elephant."
Raman believes that Rosmaita's approach of graphics-as-difficult-to-access and text-as-simple-to-access presents a false dichotomy, especially considering the many confusing ways in which text can be formatted.
Raman designed an audio desktop that builds speech capabilities directly into applications, as opposed to adding speech capabilities afterward. At Adobe, he's currently working to take graphic PDF files and make them "usable in as many ways as possible."
Currently, Emacspeak cannot render mathematical expressions intelligently. In the early 1990s, however, T.V. Raman wrote a program called AsTeR (Audio System For Technical Readings), which is specifically designed to read LaTeX documents, including mathematics. See his web site at http://www.cs.cornell.edu/home/raman/ for further specifics. An online demonstration of AsTeR is included, along with Raman's Ph.D. thesis in various formats.
In short, whereas commercial programs for DOS, OS/2 and Microsoft Windows which provide speech access to applications are essentially concerned with providing an auditory rendering of the screen display, Emacspeak has access to, and takes full advantage of, the underlying logic of the application to present a speech interface that parallels, but is not simply derived from, the visual interface. For this reason the quality of the spoken interaction is superior to that which can be achieved via screen readers. Furthermore, Emacspeak supports auditory icons, audio formatting (that is, the use of prosodic characteristics of the speech to convey meaning) and other interface features which enhance the user's experience and improve the effectiveness of the interaction. These features aren't available from popular screen access packages.
Finally of course, Emacspeak runs under Unix-like operating systems, which have many advantages in themselves, including a wealth of useful software, a high degree of configurability and so forth. Naturally, Emacs itself superbly exemplifies these characteristics.
|B||X||Write proposals and abstracts for three projects and send it to Rommel Feria (2002.09.17)|
|B||X||Write a proposal for accessible wearable technology (2002.09.16)|
|B||X||Ask for hardware - like a subnotebook with a working sound card... (2002.09.16)|
|B||X||Meet with Rommel Feria (2002.09.13)|
|B||X||Look at IBM technology and see if there's anything I can use from there (2002.09.12)|