Localisation - looking at the results in detail

As seen in the previous example, localisation is performed inside the blackboard by different localisation knowledge sources. In this example we will perform a single localisation of an anechoic white noise signal coming from 0° using GmmLocationKS. The example can be found in the examples/localisation_look_at_details folder, which consists of the following files:

Blackboard.xml
localise.m
SceneDescription.xml

For details on Blackboard.xml and SceneDescription.xml have a look at our previous example. Here, we will focus on the details of the results after the localisation has been performed. So, first run:

>> bbs = localise;

in Matlab. This runs the blackboard and does the localisation, but it does not print any results; it only returns the blackboard system as bbs. The blackboard itself stores lots of data, see Dynamic blackboard memory for details. To see what is currently available in the memory, run:

>> bbs.blackboard.getDataLabels()

ans =

    'headOrientation'
    'locationHypothesis'
    'sourcesAzimuthsDistributionHypotheses'
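
Each of these labels can be passed to getData() to retrieve the stored entries. As a small sketch (not part of the example files, and assuming that getDataLabels() returns a cell array of label strings and that getData() works for every one of them), you could count how many entries the blackboard holds per label:

% Sketch: count the entries stored for every label currently in the memory
labels = bbs.blackboard.getDataLabels();
for ii = 1:numel(labels)
    entries = bbs.blackboard.getData(labels{ii});
    fprintf('%-40s %i entries\n', labels{ii}, numel(entries));
end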

To analyse the localisation performance of the model, we ask the blackboard to return the localisation data:

>> perceivedAzimuths = bbs.blackboard.getData('locationHypothesis')

perceivedAzimuths =

1x9 struct array with fields:

    sndTmIdx
    data

It returns a relatively complicated struct array that comes with time stamps sndTmIdx and the corresponding data, which is again a struct containing different things. Here is the output for one time stamp:

>> perceivedAzimuths(9).data

ans =

  LocationHypothesis with properties:

    sourcesPosteriors: [72x1 double]
       sourceAzimuths: [72x1 double]
      headOrientation: 340
              azimuth: 20
      relativeAzimuth: 0
                score: 0.7028
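
The fields sourceAzimuths and sourcesPosteriors apparently hold the probability the model assigns to each candidate direction. If you would like to inspect this distribution yourself, a minimal sketch (relying only on the fields shown above) could look like this:

% Sketch: plot the posterior probability over all candidate azimuths
% for the last block of the localisation
hyp = perceivedAzimuths(end).data;
figure;
plot(hyp.sourceAzimuths, hyp.sourcesPosteriors);
xlabel('Azimuth / degrees');
ylabel('Posterior probability');
title(sprintf('Head orientation: %.0f degrees', hyp.headOrientation));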

But don’t worry, there is an easy way to get an overview of the results. First, we are only interested in a summary of the localisation result:

>> sourceAzimuth = 0; % the actual source position
>> [loc, locError] = evaluateLocalisationResults(perceivedAzimuths, sourceAzimuth)

loc =

     0


locError =

     0
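
As a quick sanity check you could also compute a per-block error yourself from the hypotheses. The following is only a sketch under the assumption that the azimuth field holds the perceived source angle in degrees; it is not how evaluateLocalisationResults is actually implemented:

% Sketch: per-block localisation error, wrapped to the range [-180, 180) degrees
perceivedAngles = arrayfun(@(p) p.data.azimuth, perceivedAzimuths);
blockErrors = mod(perceivedAngles - sourceAzimuth + 180, 360) - 180;
meanAbsError = mean(abs(blockErrors))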

As the output of evaluateLocalisationResults shows, the source was localised at 0°, which means that the localisation error is also 0°. Next, we would like a more detailed view of what happened during the localisation:

>> displayLocalisationResults(perceivedAzimuths, sourceAzimuth)

------------------------------------------------------------------------------------
Reference target angle:   0 degrees
------------------------------------------------------------------------------------
Localised source angle:
BlockTime   PerceivedAzimuth    (head orient., relative azimuth)    Probability
------------------------------------------------------------------------------------
  0.56        0 degrees     (  0 degrees,      0 degrees)   1.00
  1.02        0 degrees     (  0 degrees,      0 degrees)   0.65
  1.58        0 degrees     ( 20 degrees,    340 degrees)   0.56
  2.04        0 degrees     (  0 degrees,      0 degrees)   0.61
  2.51        0 degrees     ( 20 degrees,    340 degrees)   0.50
  3.07        0 degrees     (  0 degrees,      0 degrees)   0.69
  3.53        0 degrees     ( 20 degrees,    340 degrees)   0.73
  4.09        0 degrees     (  0 degrees,      0 degrees)   0.62
  4.55        0 degrees     (340 degrees,     20 degrees)   0.70
------------------------------------------------------------------------------------
Mean localisation error: 0
------------------------------------------------------------------------------------

Here, we see that the head was turned several times during the localisation and that the perceived location was always 0°, but the model did not always have the same confidence that the source was really located there, which you can see from the Probability values. Whenever they were too low, a head movement was triggered in order to see whether the confidence would be higher for another head position.
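
If you want to work with these per-block values programmatically rather than reading the printed table, you can pull them out of the hypotheses directly. Again, this is only a sketch based on the fields shown earlier (azimuth, headOrientation, score):

% Sketch: find the block in which the model was most confident
scores = arrayfun(@(p) p.data.score, perceivedAzimuths);
[~, bestBlock] = max(scores);
perceivedAzimuths(bestBlock).data
% ... and plot the confidence over all blocks
figure;
plot(scores, 'o-');
xlabel('Block number');
ylabel('Probability of perceived azimuth');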

So far, we have looked at the details going on in the blackboard. In order to localise, the blackboard uses different cues - like ITDs and ILDs - that are provided by the Auditory front-end. It might be of interest to have a detailed look at them. To see which ones are available, run:

>> bbs.listAfeData

Available AFE data:

  'filterbank'
  'time'
  'input'
  'itd'
  'innerhaircell'
  'ild'
  'crosscorrelation'
  'head_rotation'
  'ratemap'

All those cues can be plotted with bbs.plotAfeData(cueName), for example:

>> bbs.plotAfeData('time');

Fig. 73 Left and right ear signals.

>> bbs.plotAfeData('ild');

Fig. 74 ILDs between the two ear signals over time.

The next one is not really a cue provided by the Auditory front-end, but it is good to know in which direction the head was pointing at what time:

>> bbs.plotAfeData('head_rotation');

Fig. 75 Head rotations of the model during localisation.
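
If you want to look at several cues in one go, you can simply call bbs.plotAfeData in a loop. A small sketch, using a few of the cue names listed above:

% Sketch: plot a selection of the available AFE cues, one figure each
cues = {'itd', 'ild', 'ratemap'};
for ii = 1:numel(cues)
    bbs.plotAfeData(cues{ii});
end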