Repository for Land Cover Recognition Model Training Using Satellite Imagery
Welcome to the dedicated repository for advancing land cover recognition through the application of state-of-the-art models on satellite imagery. This repository serves as a comprehensive resource for researchers and practitioners in the field, providing access to research code, detailed setup instructions, and guidelines for conducting experiments with satellite image timeseries data.
Featured Research Publications
This repository highlights contributions to the field through the following research publications:
Context-self contrastive pretraining for crop type semantic segmentation -
Published in IEEE Transactions on Geoscience and Remote Sensing, this work introduces a novel supervised pretraining method for semantic segmentation of crop types, exhibiting performance gains along object boundaries. Additional information is available in the README_CSCL.md document.
Environment Setup
Installation of Miniconda
For the initial setup, please follow the instructions for downloading and installing Miniconda available at the official Conda documentation.
Environment Configuration
Creating the Environment: Navigate to the code directory in your terminal and create the environment using the provided .yml file by executing:
conda env create -f deepsatmodels_env.yml
Activating the Environment: Activate the newly created environment with:
source activate deepsatmodels
PyTorch Installation: Install the required version of PyTorch along with torchvision and torchaudio, following the official PyTorch installation instructions for your operating system and CUDA version.
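An illustrative install command is shown below. The package channel and CUDA toolkit version here are assumptions, not the repository's pinned versions; select the exact command for your setup from the official PyTorch "Get Started" page.

```
# Example only -- the CUDA version (pytorch-cuda=11.8) is an assumption;
# pick the command matching your hardware from pytorch.org.
conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia
```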
Configuration: Specify the base directory and paths for training and evaluation datasets within the data/datasets.yaml file.
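As a rough sketch, a base directory and dataset paths in data/datasets.yaml might look like the fragment below. All key and path names here are hypothetical; use the keys that the repository's data loaders actually expect.

```
# Hypothetical layout -- actual key names are defined by the repository's loaders.
base_dir: /data/satellite_imagery          # root directory for all datasets
my_dataset:
  train: /data/satellite_imagery/train     # path to the training split
  eval: /data/satellite_imagery/eval       # path to the evaluation split
```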
Experiment Configuration: Use a distinct .yaml file for each experiment, located in the configs folder. These configuration files encapsulate default parameters aligned with those used in the featured research. Modify these .yaml files as necessary to accommodate custom datasets.
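Since each experiment's .yaml file carries default parameters that you override for custom datasets, the override logic can be pictured as a recursive dictionary merge. The helper below is a minimal sketch of that idea, not the repository's actual code: nested keys named in the experiment config replace the defaults, while unnamed settings are kept.

```python
def merge_config(defaults: dict, overrides: dict) -> dict:
    """Recursively merge experiment overrides into default parameters.

    Values in `overrides` win; nested dicts are merged key by key, which
    mirrors how a per-experiment .yaml file would replace only the
    settings it names while keeping the remaining defaults intact.
    """
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)
        else:
            merged[key] = value
    return merged


if __name__ == "__main__":
    # Hypothetical parameter names, for illustration only.
    defaults = {"optimizer": {"name": "adam", "lr": 1e-3}, "epochs": 50}
    custom = {"optimizer": {"lr": 5e-4}, "dataset": "my_dataset"}
    print(merge_config(defaults, custom))
```

Keeping overrides shallow and explicit in each experiment's .yaml file makes it easy to diff two experiments and see exactly which parameters changed.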
Guidance on Experiments: For detailed instructions on setting up and conducting experiments, refer to the specific README.MD files associated with each paper or dataset.
License and Copyright
This project is made available under the Apache License 2.0. Please see the LICENSE file for detailed licensing information.