
On Braille and Linux

AFAIK, there isn't any braille screen input support for Linux, so I've been considering trying to implement some.  To be clear, I'm thinking about a program that detects and characterizes touch events, recognizes gestures from them, and reports the results as Unicode sequences.


My approach is based on the Elixir programming language.  For the curious, I'm planning to construct a directed acyclic graph (DAG) of lightweight processes, with data flowing from the touch screen to a back-end application that would handle the braille input and feed it to various Linux CLI tools.  The graph would be created and used by means of an Elixir pipeline:

    start_reporter() |> get_action() |> add_gesture()

start_reporter() starts up a "reporter" process that listens for touch and gesture events, boils them down a bit, and reports the resulting Unicode to the client application.  get_action() characterizes raw events from the touch screen as "actions".  add_gesture() is the front end for a set of gesture-recognition processes: it starts these up as needed, broadcasts action messages to them, and then mostly gets out of the way.
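
To make this concrete, here's a minimal sketch of what I have in mind.  All of the names are provisional, the "touch data" is hard-coded, and the lone recognizer only knows a single chord (dots 1 and 2, i.e. the letter b, U+2803):

    defmodule Braille.Sketch do
      # Reporter: listens for recognized gestures and forwards the
      # resulting Unicode to the client (here, whoever started it).
      def start_reporter do
        client = self()
        spawn(fn -> report_loop(client) end)
      end

      defp report_loop(client) do
        receive do
          {:gesture, unicode} ->
            send(client, {:braille, unicode})
            report_loop(client)
        end
      end

      # Stand-in for the touch-screen reader; the real version would
      # characterize events from the input device, not a canned list.
      def get_action(reporter) do
        {reporter, [{:down, 1}, {:down, 2}, :lift]}
      end

      # Front end for the gesture recognizers: starts one up,
      # broadcasts the actions to it, then gets out of the way.
      def add_gesture({reporter, actions}) do
        recognizer = spawn(fn -> recognize(reporter, MapSet.new()) end)
        Enum.each(actions, &send(recognizer, &1))
        reporter
      end

      # Trivial recognizer: collects "down" dots until a lift, then
      # reports dots 1+2 as the braille letter b.
      defp recognize(reporter, dots) do
        receive do
          {:down, dot} ->
            recognize(reporter, MapSet.put(dots, dot))
          :lift ->
            if MapSet.equal?(dots, MapSet.new([1, 2])) do
              send(reporter, {:gesture, "⠃"})
            end
            recognize(reporter, MapSet.new())
        end
      end
    end

    # Wiring it together, pipeline-style (e.g., in iex):
    #
    #   Braille.Sketch.start_reporter()
    #   |> Braille.Sketch.get_action()
    #   |> Braille.Sketch.add_gesture()
    #
    #   receive do
    #     {:braille, ch} -> IO.puts("recognized #{ch}")
    #   end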


One major issue is that I'm not at all clear on how to deal with the output side.  What interface(s) do back-end programs generally expect from input devices?  For example, is there an easy way on Linux for a user-mode process to emulate a keyboard device?
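
For the simplest case, where the target tool just reads stdin, an Elixir port might suffice.  Here's a minimal sketch, with cat standing in for a real back end; actual keyboard emulation presumably needs something lower-level (uinput?), which is part of what I'm asking about:

    # Feed recognized Unicode to a CLI tool's stdin via a port.
    # "cat" just echoes, so we can watch the round trip.
    port = Port.open({:spawn, "cat"}, [:binary, :exit_status])
    Port.command(port, "⠃⠗⠇\n")

    receive do
      {^port, {:data, echoed}} -> IO.write(echoed)
    after
      1_000 -> IO.puts("(no echo)")
    end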

More generally, I'd like to get some feedback on a11y, system and user interfaces, etc.  For example:

- What back-end programs should I target?
- What kinds of gestures would folks want?
- What sorts of interfaces should I present?

Advice, caveats, clues, and pointers would all be welcome...

