Tracking Snowfall with Distributed Webcams

I am interested in tracking the snow season of my favourite ski resort in the Alps, to get a good idea of when the best times to go are.  MATLAB Trendy is a good way of tracking this kind of information on a day-by-day basis.  However, in order to achieve this I would like to be able to analyse webcam data to measure snow coverage.  Fortunately the Avoriaz resort website allows users to download historical webcam data, which allowed me to train a very simple linear model that produces a metric based on the red, green and blue channels of a webcam image.

Eventually I should be able to track the average snow coverage for the entire region, by applying the model to all the available webcams in the area.

Read in Data

The model is trained on an automatically generated .MP4 file which depicts a time lapse of Avoriaz over the course of a year.  We can use this to train a basic model which will provide a metric indicative of snow coverage.

    v = VideoReader('Station_1-Year_2012-09-23.mp4');
    allFrames = v.read;

Train Predictive Model

We begin by training the model on some sample data. For the purposes of this script I’ve taken every fourth frame, and manually selected which ones have snow coverage.

    frames = allFrames(:,:,:,1:4:end);
    nFrames = size(frames, 4);
    figure
    subplot(3,1,1); imshow(allFrames(:,:,1,1)); title('Red Channel');
    subplot(3,1,2); imshow(allFrames(:,:,2,1)); title('Green Channel');
    subplot(3,1,3); imshow(allFrames(:,:,3,1)); title('Blue Channel');

Here we take each frame and compute the average red, green, and blue values over the entire image.

    [red,green,blue] = extractRedGreenBlue(nFrames,frames);
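The extractRedGreenBlue helper is not listed in this post; given how its outputs are used later (transposed into column predictors), a minimal sketch returning row vectors of per-frame channel means might look like this:

    function [red, green, blue] = extractRedGreenBlue(nFrames, frames)
        % Average each colour channel over every pixel of each frame,
        % returning one row vector per channel
        red   = zeros(1, nFrames);
        green = zeros(1, nFrames);
        blue  = zeros(1, nFrames);
        for k = 1:nFrames
            red(k)   = mean(mean(frames(:,:,1,k)));
            green(k) = mean(mean(frames(:,:,2,k)));
            blue(k)  = mean(mean(frames(:,:,3,k)));
        end
    end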

I then define my target array and train my model. In this case, zero indicates no snow, and one indicates snow.

    target = [0 0 1 1 1 1 0 0 0]';
    predictors = [red' green' blue'];
    fit = LinearModel.fit(predictors, target);

We can visualise the model response as follows

    predictions = fit.predict(predictors)
    plotBarChart(red, green, blue, predictions);
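The plotBarChart helper is not listed here either; a minimal sketch consistent with how it is called, showing the per-frame channel means alongside the model output, might be:

    function plotBarChart(red, green, blue, predictions)
        % Grouped bar chart: per-frame channel means and model output
        figure
        bar([red(:) green(:) blue(:) predictions(:)])
        legend('Red', 'Green', 'Blue', 'Prediction')
        xlabel('Frame index')
    end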

Validate Model with Separate Test Data

This model needs to be tested on some of the other frames from the video. We can follow the same process, by changing the start index and using the model parameters from our trained model.

    redCoeff =  0.042413;
    greenCoeff = -0.1688;
    blueCoeff =  0.09018;
    constant = 4.3638;
 
    frames = allFrames(:,:,:,3:4:end);
    nFrames = size(frames, 4);
 
    [red,green,blue] = extractRedGreenBlue(nFrames,frames);

We can see that the results are pretty good.

    predictions = [red' green' blue']*[redCoeff; greenCoeff; blueCoeff] + constant;
    plotBarChart(red,green,blue, predictions);
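The coefficients above were copied by hand from the trained model. Assuming the fit object from the training section is still in the workspace, they can instead be read programmatically; a LinearModel stores the intercept as the first entry of its Coefficients table:

    coefficients = fit.Coefficients.Estimate;
    constant   = coefficients(1);
    redCoeff   = coefficients(2);
    greenCoeff = coefficients(3);
    blueCoeff  = coefficients(4);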

Deploy Model to the Web

In order to perform this analysis in real time, we need to get hold of a live image. This is possible from the website http://www.snoweye.com/

    a = urlread('http://www.snoweye.com/?page=fr-ch-portesdusoleil');

Find the image which corresponds to Avoriaz

    [~, ~, ~, matches] = regexp(...
        a,...
        '/grabs/[0-9]+/[0-9]+\.jpg(?=([\W]*title[\W]*=[\W]*''Avoriaz''))'...
        );
 
    urls = cellfun(...
        @(x)['http://www.snoweye.com' x], ...
        matches, ...
        'UniformOutput', false...
        );
    myUrl = urls{1};

Read image and generate predictor values

    im = imread(myUrl);
    imshow(im)
 
    red = mean(mean(im(:,:,1)));
    green = mean(mean(im(:,:,2)));
    blue = mean(mean(im(:,:,3)));

Calculate the prediction, which is in line with what we would expect: a summer's day gives a value close to zero.

    prediction = fit.predict([red green blue]);
    disp(prediction)
 
       prediction = -0.0162
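Because the model was trained against targets of zero (no snow) and one (snow), a simple threshold converts the continuous output into a yes/no answer. The 0.5 cut-off here is an assumption, not a value tuned in this post:

    threshold = 0.5;  % assumed cut-off, not tuned against the data
    if prediction > threshold
        disp('Snow coverage detected')
    else
        disp('No significant snow coverage')
    end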

Integration with Trendy

MATLAB Trendy can be used to track the RGB information of the webcam photos which are on http://www.snoweye.com/ by downloading all images on the page, and saving the average pixel colour. Admittedly we have trained the model to work with data from Avoriaz, so it is not clear whether it will work well for the aggregated information of all the photos on the Portes du Soleil page. I can come back to this in a year to see how it’s fared, and improve the model if necessary.

    a = urlread('http://www.snoweye.com/?page=fr-ch-portesdusoleil');

Find the webcam images

    [~, ~, ~, matches] = regexp(a, '/grabs/[0-9]+/[0-9]+\.jpg');
    uniqueMatches = unique(matches);
    urls = cellfun(...
        @(x)['http://www.snoweye.com' x], ...
        uniqueMatches, ...
        'UniformOutput', false ...
        );

Read in all the images

    red = zeros(length(urls),1);
    green = zeros(length(urls),1);
    blue = zeros(length(urls),1);
 
    for i = 1:length(urls)
        myUrl = urls{i};
        im = imread(myUrl);
 
        red(i) = mean(mean(im(:,:,1)));
        green(i) = mean(mean(im(:,:,2)));
        blue(i) = mean(mean(im(:,:,3)));
    end

Find the mean value of the red, green, and blue channels across all images on the page.

    redPredictor = mean(red);
    greenPredictor = mean(green);
    bluePredictor = mean(blue);
 
    predictors = [redPredictor greenPredictor bluePredictor]
 
    coeffs = [redCoeff; greenCoeff; blueCoeff];
    constant = 4.3638;

Make a prediction for the red green and blue channels of all images on the page

    predictions = [red green blue]*coeffs + constant;
    overallPrediction = predictors*coeffs + constant;

On Trendy this code has been implemented as four separate entries. It will be a few months, though, before we can validate the results. It also appears that the RGB information is slightly different, so I will most likely have to retrain the online model to improve its performance.
