For data visualization we use NASA WorldWind.


Cumbum (Theni district, Tamil Nadu, India)

Green - Coconut farm
Yellow - Partly coconut farm
Red - Not a coconut farm

Our deep learning coconut model can predict coconut farms in a specified area with 80% accuracy; we are still refining the neural network architecture to achieve higher prediction accuracy.
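The classification step above can be sketched as tiling the satellite image and scoring each tile, then mapping the score to the three legend colours. This is a minimal illustration only: the trained CNN is stubbed out by a placeholder `predict_tile` function, and the tile size and colour thresholds are assumptions, not values from our actual model.

```python
import numpy as np

TILE = 64  # assumed tile size in pixels, for illustration only

def tiles(image, size=TILE):
    """Yield (row, col, tile) for non-overlapping tiles of `image`."""
    h, w = image.shape[:2]
    for r in range(0, h - size + 1, size):
        for c in range(0, w - size + 1, size):
            yield r, c, image[r:r + size, c:c + size]

def predict_tile(tile):
    """Stand-in for the trained CNN: returns a coconut probability in [0, 1].
    The real model would run inference here; the mean is a placeholder."""
    return float(tile.mean())

def legend_colour(p, hi=0.7, lo=0.3):
    """Map a coconut probability to the overlay legend (thresholds assumed)."""
    if p >= hi:
        return "green"   # coconut farm
    if p >= lo:
        return "yellow"  # partly coconut farm
    return "red"         # not a coconut farm

# Toy 2x2-tile image: one "coconut" tile, three empty tiles.
image = np.zeros((128, 128))
image[:64, :64] = 1.0
results = {(r, c): legend_colour(predict_tile(t))
           for r, c, t in tiles(image)}
```

In practice each tile would also carry its geographic bounds so the coloured result can be drawn as a WorldWind surface overlay.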

For crop health analysis we use 4-band multispectral satellite imagery (blue, green, red, near-infrared) with a 3 m pixel size.
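A chlorophyll-sensitive vegetation index can be computed from the red and near-infrared bands of this imagery. The document does not name the exact index used, so as an assumption the sketch below computes NDVI, the most common choice for 4-band imagery:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).
    Values near 1 indicate dense, healthy vegetation; values near 0 or
    below indicate bare soil, water, or built-up areas."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    # Guard against division by zero where both bands are 0.
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out
```

The `red` and `nir` arrays would be read from bands 3 and 4 of the 4-band product before calling this function.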

More crops coming soon with more coverage area.

For the area where our deep learning coconut model was tested, we show the vegetation index for January and February 2017.

The vegetation index displayed represents relative chlorophyll content, which correlates with vegetation vigor and productivity. Red tones indicate low relative chlorophyll content, while green tones indicate high relative chlorophyll content.

Our implementation shows that you can locate a crop (for example, coconut) using deep learning, and then use in-season multispectral imagery of the same location to assess crop health.