
VPL advanced sensor event block - interface proposal

Created by: ddoak

GENERAL

The VPL sensor event sliders allow the user to create complex conditional events which extend the functionality and potential of advanced mode.

However, I feel that the current implementation of the interface is potentially confusing and hard to learn / interpret.

The following is a proposal for a revision of the interface. I have tried to be brief and to the point but also to justify and explain the revised elements.

I have summarised the proposal in the accompanying diagram.

(a good test of this design would be to ask if the diagram is self-explanatory - so it might be useful to try to interpret the diagram before reading my explanatory text below!)

vpl_sensor_event

DESIGN GOALS

representation should be as intuitive as possible

representation should be as consistent as possible

keep the mapping to the physical robot as simple and direct as possible (position / colour)

don't make changes just for the sake of making it distinguishable from the previous release i.e. don't throw away things that were good!

try to make a system which is self-explanatory and can be decoded without additional information

DETAILS OF PROPOSAL

SLIDERS

values should be quantised

why?

usability
consistency - this is already done for motors, acc and timer (see the sketch below)
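
(as an illustration, a minimal Python sketch of slider quantisation - the 0-4500 raw range and the 10 steps are assumptions for illustration, not values taken from the proposal or the VPL source:)

    def quantise(value, lo=0, hi=4500, steps=10):
        """Snap a raw slider value to the nearest of `steps` evenly spaced levels.

        The 0..4500 raw range and the 10 steps are illustrative assumptions.
        """
        value = max(lo, min(hi, value))   # clamp into range
        step = (hi - lo) / steps          # size of one quantum
        return lo + round((value - lo) / step) * step

    # example: dragging to 2370 snaps to the nearest level
    print(quantise(2370))   # -> 2250.0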

sliders would be better represented like micrometer calipers (see image)

why?

clarity - the asymmetry emphasises the nature of the upper / lower bound
a more prominent vertical overlap emphasises their relationship and behaviour (they can't push past each other)

lower left slider should represent close threshold (fig 1)

why?

lower and left positioning intuitively feels like closer
for 5 of the 7 sensors this is a direct mapping (up the screen means moving away from the robot)
don't let the design be led by the fact that closer proximity generates higher numbers - we can't see the numbers
mentally the image of closer is smaller
the VPL user is thinking about distance! (not magnitude of sensor reading)

black solid area should be red (fig 1)

why?

sensor led is more red as object is closer
all practical experience of touching and using the robot reinforces the red = close mapping
black has no intuitive meaning in this context

upper right slider should represent the far threshold (fig 3)

why?

upper and right positioning intuitively feels like further away (reasons as above)

white solid area is correct (fig 3)

why?

sensor led is off when object is far away
all practical experience of touching and using the robot reinforces the white / no led = far mapping

middle range between the sliders should be a pink bar (fig 2), with a red-to-white left/right gradient fill if possible

why?

mirrors the sensor led behaviour (transition between red light and off as object moves away)
reinforces the continuity of values *and* reinforces the 3 bands

representation?

this could be an axis-aligned rectangle or maybe a parallelogram between the 4 relevant slider vertices
    (the parallelogram more visually suggests the switching function but might look quirky - see the sketch below)
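
(to make the two representation options concrete, a minimal Python sketch of the band geometry and the red-to-white gradient - the coordinate conventions, the skew offset and all names are illustrative assumptions:)

    def mid_band(close_x, far_x, top_y, bottom_y, parallelogram=False, skew=6):
        """Corners of the pink band between the two slider handles.

        close_x / far_x are the horizontal positions of the close and far
        handles; the skew used for the parallelogram variant is arbitrary.
        """
        if parallelogram:
            # shear the top edge to hint at the switching function
            return [(close_x + skew, top_y), (far_x + skew, top_y),
                    (far_x, bottom_y), (close_x, bottom_y)]
        # plain axis-aligned rectangle
        return [(close_x, top_y), (far_x, top_y),
                (far_x, bottom_y), (close_x, bottom_y)]

    def band_colour(t):
        """Red-to-white blend across the band: t = 0 at the close (red) edge,
        t = 1 at the far (white) edge, mirroring the sensor led fading out."""
        return (255, int(255 * t), int(255 * t))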

SENSOR BUTTONS

Button colour should reflect the appearance of the sensor when the event is triggered (as much as possible)

why?

keep the mapping simple and intuitive
other colours require the user to learn a non-intuitive mapping


1. red
    = close (reading above the close threshold)

2. pink (gradient)
    = mid (reading between the far and close thresholds)

3. white
    = far (reading below the far threshold)

4. grey
    = any

buttons should cycle mode (on click) in this order (close -> mid -> far -> any), i.e. 1, 2, 3, 4 (sketched below)

why?

current order is counterintuitive - (close -> far -> mid -> any) feels like 1,3,2,4 to me
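
(a minimal Python sketch of the four modes, their button colours and the proposed click cycle, assuming raw proximity readings where closer objects give higher values - all names and thresholds are illustrative, not taken from the VPL source:)

    from enum import Enum

    class Mode(Enum):
        """Per-sensor event mode, in the proposed click-cycle order."""
        CLOSE = 0   # red button
        MID = 1     # pink (gradient) button
        FAR = 2     # white button
        ANY = 3     # grey button

    BUTTON_COLOUR = {Mode.CLOSE: "red", Mode.MID: "pink",
                     Mode.FAR: "white", Mode.ANY: "grey"}

    def next_mode(mode):
        """Cycle close -> mid -> far -> any -> close on each button click."""
        return Mode((mode.value + 1) % 4)

    def triggers(mode, reading, far_thr, close_thr):
        """Does a raw proximity reading satisfy the mode?

        Closer objects give higher readings, so the close threshold sits at
        a higher raw value than the far threshold (far_thr < close_thr).
        """
        if mode is Mode.CLOSE:
            return reading > close_thr
        if mode is Mode.MID:
            return far_thr < reading <= close_thr
        if mode is Mode.FAR:
            return reading <= far_thr
        return True   # ANY: this sensor is not watched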

DISPLAY OF SLIDERS (when and how to render)

(refer to attached image)

if only one prox sensor is being watched in the event (not grey/any), then the irrelevant sliders should not be displayed (or should be greyed out), i.e.

    1) close should hide upper slider and mid range bar

    2) mid should grey out the upper and lower sliders (but keep them visible as drag handles)

    3) far should hide lower slider and mid range bar

    4) no sensors watched => no sliders / bars


why? 

this directly mirrors what is happening in the code - unused conditionals are deleted (or commented out)
users will most likely use just one sensor per event (at first anyway) so don't confuse the representation with extra detail

this should also apply to cases where multiple sensors are being watched but all in the same mode (close / far / mid)

for cases in which multiple sensors are watched in different modes, the rendering is as in figs 5-8.


these visibility rules will require new code - but they are logical and well defined (a sketch follows below)
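
(to make the visibility rules concrete, a Python sketch of rules 1-4 as a single function over the set of watched modes, reusing the Mode enum from the earlier sketch - the 'shown' / 'greyed' / 'hidden' states and all names are illustrative assumptions:)

    def slider_state(modes):
        """State of the close handle, far handle and pink mid band for an event.

        `modes` is the set of modes of the watched (non-ANY) sensors.
        """
        if not modes:                   # 4) nothing watched
            return {"close": "hidden", "far": "hidden", "band": "hidden"}
        if modes == {Mode.CLOSE}:       # 1) close only: hide far handle and band
            return {"close": "shown", "far": "hidden", "band": "hidden"}
        if modes == {Mode.MID}:         # 2) mid only: handles greyed but draggable
            return {"close": "greyed", "far": "greyed", "band": "shown"}
        if modes == {Mode.FAR}:         # 3) far only: hide close handle and band
            return {"close": "hidden", "far": "shown", "band": "hidden"}
        # sensors watched in different modes: full rendering (figs 5-8)
        return {"close": "shown", "far": "shown", "band": "shown"}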

DYNAMIC BEHAVIOUR

This interface as described will give dynamic cues (hints) to the user.

Imagine clicking on a led button to transition between the close and mid range settings (i.e. red led button -> pink led button)

the visual transition fig 1 -> fig 2 gives a clear explanatory indication of the change in the event settings (further clicks will cycle fig 2 -> fig 3 -> fig 4 -> fig 1...)

CIRCULAR LED INDICATORS

These could be removed from the current implementation

why?

they are not necessary - the information they provide can be carried by other interface elements
they are ambiguous (positioning *between* sensor leds)
they clutter the screen

(in their favour they do correspond closely to the physical robot) 

It is likely that I have missed some cases or arguments - please feel free to comment.