R

Deploy machine learning models with R Shiny and ONNX

Python is often the go-to language for machine learning, especially for training deep learning models with the PyTorch or TensorFlow libraries. Python also provides good tools for deploying such models on the web as REST APIs or GUI web applications. However, models can also be exported to the ONNX format and subsequently used for inference with an ONNX runtime. Converting to ONNX, as opposed to running inference through PyTorch itself, is beneficial because the ONNX runtime is a much smaller dependency and is very efficient.
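
To give a flavor of the inference side, here is a minimal sketch of running an exported model from R, assuming the Python onnxruntime package is available through reticulate; the "model.onnx" file and the input shape are illustrative, not from the post.

```r
# Minimal sketch: ONNX inference from R via reticulate and the Python
# onnxruntime package (assumed installed; "model.onnx" is hypothetical).
library(reticulate)

ort <- import("onnxruntime")
np  <- import("numpy")

# Load the exported model into an inference session
sess <- ort$InferenceSession("model.onnx")

# Many vision models expect a float32 NCHW tensor; this shape is illustrative
x <- np$array(array(runif(1 * 3 * 224 * 224), dim = c(1, 3, 224, 224)),
              dtype = "float32")

# Feed the input under the name the model declares and run the session
input_name <- sess$get_inputs()[[1]]$name
inputs <- list(x)
names(inputs) <- input_name
pred <- sess$run(NULL, inputs)[[1]]
```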

Plant ID app (part 2): REST API

In part 1 of this blog post, we downloaded ~25,000 images of 100 plant species and trained a deep learning classification model. The 100 plant species are included in the Danish stream plant index (DVPI). In part 2, we create a REST API with endpoints that can be accessed from a very simple landing page. All code from parts 1 and 2 of this blog post can be found on GitHub.
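
As a hedged sketch of what such an API can look like, a prediction endpoint built with the plumber package might be structured as below; plumber and the classify_image() helper are assumptions here, and the actual implementation is in the GitHub repo.

```r
# api.R -- illustrative sketch of a REST API with the plumber package
# (an assumption; the post's actual code lives in the GitHub repo).
library(plumber)

#* Health check
#* @get /health
function() {
  list(status = "ok")
}

#* Classify a plant image supplied as a path or URL
#* @param url path or URL of the image
#* @post /predict
function(url) {
  # classify_image() is a hypothetical helper wrapping the trained model
  classify_image(url)
}

# Launch with: plumber::pr("api.R") |> plumber::pr_run(port = 8000)
```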

Plant ID app (part 1): Data and model training

Plant species can be truly difficult to tell apart, and this job often requires expert knowledge. However, when images are available, computer vision methods can be used to guide us in the right direction. Deep learning methods are very useful for image analysis, and training convolutional neural networks has become the standard way to solve a wide range of image tasks, including segmentation and classification. Here, we will train a lightweight image classification model to identify 100 different plant species.
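
As a rough illustration of this kind of training setup, here is a transfer-learning sketch using the torch, torchvision, and luz packages; the train_dl and valid_dl dataloaders are assumptions, and the post's actual training code may differ.

```r
# Hedged sketch of transfer learning for 100 plant classes
# (illustrative only; not necessarily the post's exact setup).
library(torch)
library(torchvision)
library(luz)

# Wrap a pretrained ResNet-18 and swap the head for 100 classes
net <- nn_module(
  initialize = function(n_classes = 100) {
    self$backbone <- model_resnet18(pretrained = TRUE)
    self$backbone$fc <- nn_linear(512, n_classes)  # ResNet-18 feature size: 512
  },
  forward = function(x) self$backbone(x)
)

# 'train_dl' and 'valid_dl' are assumed dataloaders over the plant images
fitted <- net |>
  setup(loss = nn_cross_entropy_loss(), optimizer = optim_adam) |>
  fit(train_dl, epochs = 10, valid_data = valid_dl)
```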

Shiny app for interactive time-series processing

Recently, there was a need for a way to cut and manipulate some time-series data that had been collected to quantify greenhouse gas emissions. After collection, it is necessary to manually explore the data and select parts of the time series for further analysis. This was an obvious case for an R Shiny app that could easily be shared with others and used for interactive processing of the data. Furthermore, the short time from idea → sketch → prototype → test → deployment is just incredible.
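
A minimal sketch of the core idea, selecting part of a time series with a plot brush in Shiny; the data frame and its column names below are made up, not taken from the app.

```r
# Minimal sketch of interactive time-series selection with a Shiny plot brush
library(shiny)

ui <- fluidPage(
  plotOutput("series", brush = brushOpts(id = "sel", direction = "x")),
  tableOutput("selected")
)

server <- function(input, output, session) {
  # 'flux_data' stands in for the greenhouse gas time series (time, value)
  flux_data <- data.frame(time = 1:200, value = cumsum(rnorm(200)))

  output$series <- renderPlot(
    plot(flux_data$time, flux_data$value, type = "l")
  )

  # Return only the points falling inside the brushed time window
  output$selected <- renderTable(
    brushedPoints(flux_data, input$sel, xvar = "time", yvar = "value")
  )
}

shinyApp(ui, server)
```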

Creating mosaics from Sentinel 2 satellite imagery

Satellite imagery is collected at large scale and made freely available by institutions such as ESA and NASA. This data is collected at high spatial (10–30 m) and temporal (~2 weeks) resolution, making it ideal for many applications. However, going from raw satellite imagery to nice-looking image mosaics can be quite a challenge. Here, I show how to use the gdalcubes R package to produce a nationwide image mosaic of Denmark.
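
The gist of the gdalcubes workflow looks roughly like this; the file paths, extent, resolution, and time window below are illustrative assumptions, not values from the post.

```r
# Hedged sketch of building a median mosaic with the gdalcubes package
library(gdalcubes)

# Index the downloaded Sentinel-2 scenes into an image collection
files <- list.files("sentinel2_scenes", pattern = "\\.zip$", full.names = TRUE)
col <- create_image_collection(files, format = "Sentinel2_L2A")

# Define the target grid: ~100 m pixels over Denmark, one summer composite
v <- cube_view(
  srs = "EPSG:25832",
  extent = list(left = 440000, right = 900000,
                bottom = 6040000, top = 6410000,
                t0 = "2021-06-01", t1 = "2021-08-31"),
  dx = 100, dy = 100, dt = "P3M",
  aggregation = "median", resampling = "bilinear"
)

# Materialize an RGB median mosaic and write it to GeoTIFF
raster_cube(col, v) |>
  select_bands(c("B02", "B03", "B04")) |>
  write_tif(dir = "mosaic", prefix = "dk_")
```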

New R-package for flow routing on digital elevation models

Digital elevation models (DEMs) are very convenient for modeling water flow. Applications include delineation of watersheds and flowlines, or deriving other useful measures such as the 'height above nearest drainage' (HAND, covered in another post). As a consequence of climate change, the frequency of extreme precipitation events is expected to increase. Therefore, knowing the whereabouts of water is highly relevant and an important tool for the management of surface water in the landscape.
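
To illustrate the underlying idea rather than the package's own API, here is a conceptual D8 flow-direction sketch on a small DEM matrix in base R: each interior cell drains to whichever of its 8 neighbours gives the steepest downhill slope.

```r
# Conceptual D8 flow routing on a DEM matrix (illustration only; this is
# not the API of the new R package itself)
d8_direction <- function(dem) {
  nr <- nrow(dem); nc <- ncol(dem)
  # Row/column offsets and distances for the 8 neighbours
  dr <- c(-1, -1, -1, 0, 0, 1, 1, 1)
  dc <- c(-1, 0, 1, -1, 1, -1, 0, 1)
  dist <- sqrt(dr^2 + dc^2)
  dir <- matrix(NA_integer_, nr, nc)
  for (i in 2:(nr - 1)) {
    for (j in 2:(nc - 1)) {
      # Downhill slope to each neighbour; flow follows the steepest descent
      slope <- (dem[i, j] - dem[cbind(i + dr, j + dc)]) / dist
      if (max(slope) > 0) dir[i, j] <- which.max(slope)
    }
  }
  dir
}

dem <- matrix(c(9, 8, 7,
                8, 6, 5,
                7, 5, 3), nrow = 3, byrow = TRUE)
d8_direction(dem)  # the interior cell drains toward the lowest corner
```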

Shiny apps for creating lake bathymetric maps

In a previous post, I showed how to use R for creating bathymetric maps for lakes. To make this process even easier, I have created two apps using Shiny. The maps can be downloaded and opened in Google Earth on both desktop and mobile, making them easy to bring along. Try them out! The R Shiny framework is a simple way to turn R analyses or pipelines into interactive web applications.
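
As a sketch of how such an app can hand a map to Google Earth, a Shiny downloadHandler can write depth contours as KML with the sf package; the contours_sf object below is a dummy stand-in for the app's interpolated contours.

```r
# Hedged sketch: serving bathymetric contours as KML from a Shiny app
library(shiny)
library(sf)

# Dummy stand-in for interpolated depth contours (the real app computes these)
contours_sf <- st_sf(
  depth = c(1, 2),
  geometry = st_sfc(
    st_linestring(rbind(c(9.000, 56.000), c(9.010, 56.010))),
    st_linestring(rbind(c(9.002, 56.002), c(9.008, 56.008))),
    crs = 4326
  )
)

ui <- fluidPage(downloadButton("download_kml", "Download KML"))

server <- function(input, output, session) {
  output$download_kml <- downloadHandler(
    filename = function() "bathymetry.kml",
    content = function(file) {
      # KML opens directly in Google Earth on desktop and mobile
      st_write(contours_sf, file, driver = "KML")
    }
  )
}

shinyApp(ui, server)
```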

Bathymetric maps and interpolation with R

Knowing the depth of aquatic environments is of interest to many, e.g. sailors in coastal waters or anglers in lakes. We can measure the depth at different geographic coordinates and use this information to produce bathymetric maps and contour lines. Often, however, measurements are only obtained from relatively few points, which means that interpolation is required to produce continuous and pretty maps. Higher-quality maps can be produced using modern sonar and echo-sounder technology, but this may also require gaps to be interpolated.
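
A minimal interpolation sketch using inverse-distance weighting via the gstat and sf packages; the soundings below are made up, and the post may use other interpolation methods.

```r
# Hedged sketch: IDW interpolation of depth soundings onto a regular grid
library(sf)
library(gstat)

# A few made-up depth soundings (lon, lat, depth in metres)
pts <- st_as_sf(
  data.frame(x = runif(30, 9.00, 9.05),
             y = runif(30, 56.00, 56.03),
             depth = runif(30, 0, 12)),
  coords = c("x", "y"), crs = 4326
)

# Regular prediction grid over the lake's bounding box
grid <- st_make_grid(pts, n = c(50, 50), what = "centers") |> st_as_sf()

# Inverse-distance weighted interpolation of depth onto the grid
bathy <- idw(depth ~ 1, pts, newdata = grid, idp = 2)
plot(bathy["var1.pred"])
```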

Politics and environmental monitoring of aquatic systems

Monitoring of the aquatic environment is necessary to support proper management. The Environmental Protection Agency and the Danish Centre for Environment and Energy are responsible for the large-scale monitoring of the aquatic environment in Denmark. This covers lakes, streams, coastal habitats, and more. Recent and historical monitoring data are publicly available from the surface water database ODA. This is an amazing initiative, which I use extensively in my research.