This project explores the use of touch gestures to interact with colors on a screen by producing sound based on the hue, saturation, and value of a pixel. By touching a pre-defined or custom image, the user can generate sounds based on a fixed mapping of color to note. A speech mode also announces the color at the touch point. Read Full Report
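A minimal sketch of what such a color-to-note mapping could look like (the project's actual mapping, platform, and audio pipeline are not specified here; the octave and volume choices below are illustrative assumptions):

```python
import colorsys

# Hypothetical 12-tone mapping: hue selects the note, saturation the octave,
# and value (brightness) the volume.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def pixel_to_note(r, g, b):
    """Map an RGB pixel (0-255 per channel) to (note name, frequency in Hz, volume 0-1)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    note_index = int(h * 12) % 12           # hue -> semitone within an octave
    octave = 3 + int(s * 3)                 # saturation -> octave 3..6 (assumed)
    midi = 12 * (octave + 1) + note_index   # MIDI note number
    freq = 440.0 * 2 ** ((midi - 69) / 12)  # equal-temperament frequency
    volume = v                              # brightness -> loudness (assumed)
    return NOTE_NAMES[note_index], freq, volume

if __name__ == "__main__":
    # Example: a fully saturated red pixel maps to a C note.
    print(pixel_to_note(255, 0, 0))
```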