AI for Conservation / Feed

Artificial intelligence is increasingly being used in the field to analyse information collected by wildlife conservationists, from camera trap and satellite images to audio recordings. AI can learn to identify which photos out of thousands contain rare species, or pinpoint an animal call in hours of field recordings, hugely reducing the manual labour required to collect vital conservation data.


AI volunteer work

Hello all, I have recently joined this group, and going through the current feeds and discussions I already feel that it's the right group I've been searching for for some time. I'm a software...

1 0

Hi Phani,

An entry point might be to participate in a challenge related to conservation on:

You could also reach out to a conservation organisation (e.g. WWF, or something smaller and more local) and ask them directly whether there's an opportunity for you to volunteer; you could even suggest an idea and see whether they find it useful.

I hope you find the opportunity you're looking for!

See full post

ChatGPT for conservation

Hi, I've been wondering what this community's thoughts are on ChatGPT? I was just having a play with it and asked: "could you write me a script in python that loads photos and...

42 9

You can already achieve both of them with your prompt.
Or, if you're not using ChatGPT specifically but another LLM that you can fine-tune, you can use RAG or fine-tuning to further train the model on the data you want it to extract information from.
With ChatGPT, you can now create your own custom GPT.
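To make the RAG idea concrete, here is a minimal, dependency-free sketch of the retrieval step: rank stored text chunks by word overlap with the question and prepend the best matches to the prompt. Real pipelines rank by embedding similarity rather than word overlap; the function name and data here are purely illustrative.

```python
# Toy retrieval-augmented prompting: rank stored text chunks by word
# overlap with the question, then prepend the top matches as context.
# (Real RAG pipelines use embedding similarity instead.)
def build_prompt(question, chunks, k=2):
    q_words = set(question.lower().split())
    ranked = sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )
    context = "\n".join(ranked[:k])
    return f"Context:\n{context}\n\nQuestion: {question}"
```

The returned string would then be sent to the LLM in place of the bare question.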

In an earlier post in this discussion, I wondered how funders are responding. Today I stumbled on the following Request for Expressions of Interest for a short-term consultancy, titled 'Development of a Scoping Study on the use of AI in Evaluations for AF-TERG, CIFs E&L Initiative and GEF IEO, GCF IEU' (emphasis mine), on the website of the Adaptation Fund. It's a huge fund (USD 1 billion), so probably bogged down in bureaucracy and probably not the fastest to respond to new developments.

Here is the link to the EoI
See full post

Mass Detection of Wildlife Snares Using Airborne Synthetic Radar

For the last year my colleagues Prof. Mike Inggs (Radar - Electrical Engineering, University of Cape Town) and...

15 2

In my experience, the preference for trapping animals using different types of snares varies depending on factors such as traditional customs, geographical location, availability and accessibility of materials, terrain, ease of transporting materials, and the type of animal targeted, ranging from buffaloes to medium or small-sized antelope. I have worked in open woodland savannah protected areas, where poachers prefer wire snares to hunt big game and even small game, and in closed-canopy rainforests, where poachers prefer nylon snares to hunt medium to small-sized antelope. It would be great if the technology could be modified to detect both types of snares.

Hi Godfrey, unfortunately the technology won't work on nylon snares. Radar is limited to detecting metal. What I am learning is that in forest habitat, where poachers are catching small antelope like duikers and sunis, there is a higher proportion of thick nylon snares. In the areas where I operate, more than 90% of the snares are metal, mainly multi-stranded cable (like brake cables) or single-strand wire (like fencing wire). The poachers use metal because the larger antelope like nyala, hartebeest, wildebeest and buffalo break nylon snares or can bite through them. They prefer multi-stranded wires like brake cable because they pull through the loop more reliably than single-strand fencing wire and are therefore more effective. Multi-stranded wires are also more flexible and easier to coil up and travel with. Radio waves at around 2 GHz can penetrate vegetation and forest canopy but cannot penetrate tree trunks and thick branches, so there is a limitation there too, but it could be dealt with by having multiple passes on different flight paths over an area, so that a snare shielded from detection by a tree trunk at one angle becomes detectable at another angle.


I have been concentrating on trying to get funding for airborne Synthetic Aperture Radar on the basis of snare detection for two reasons:

  1. Detecting and precisely locating snares will have the biggest conservation impact.
  2. Initially, running the detection algorithms will take place as post-processing in the cloud after a flight mission. The approach is therefore better suited to static targets that will still be in the location recorded during the mission.

Post-processing of the radar data will eventually shift to real-time onboard processing and reporting via a satellite connection, but this would take quite a lot more development.

See full post

Has anyone combined flying drone surveys with AI for counting wild herds?

My vision is a drone, the kind that flies a survey pattern. Modern ones have a 61-megapixel camera and LIDAR with a resolution of a few millimetres; they are for mapping before a road...

3 0

Hi Johnathan!

Here are a few examples where UAVs and AI have been used to spot animals.

A Google Scholar search such as this will find many more:

One thing often forgotten when considering UAVs for aerial surveys like these is that the maximum height above ground is normally about 100-120 m. This really limits the area one can cover.



That was one of the things I was wondering about: the height at which it can resolve animals. At some resolution it must be able to tell different animals apart.


My application is for invasive herds, or uncontrolled large animal herds such as wild horses or urban deer. In phase 2 we apply contraceptives to them to humanely reduce numbers.

I'm not an expert in this field, but I have been doing some self-study for a local project... Resolving animals in an image is not purely height-related; rather, it is a combination of height and focal length (the distance between the camera lens and the image sensor), along with some other factors. Ground Sampling Distance (GSD) and spatial resolution are often used interchangeably (though there are slight differences). Flying low with a wide-angle lens, or high with a telephoto lens, can give the same GSD...
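For reference, GSD under a simple pinhole-camera model is flight height times sensor width, divided by focal length times image width in pixels. A small sketch; the example values in the comment (a roughly 35.9 mm wide, 9504 px full-frame 61 MP sensor with a 50 mm lens) are illustrative assumptions:

```python
def ground_sampling_distance(height_m, sensor_width_mm, focal_length_mm, image_width_px):
    """Ground sampling distance in cm per pixel (simple pinhole model)."""
    return (height_m * 100 * sensor_width_mm) / (focal_length_mm * image_width_px)

# e.g. 100 m AGL, 35.9 mm sensor, 50 mm lens, 9504 px wide -> ~0.76 cm/px
```

Note how doubling the height or halving the focal length doubles the GSD, which is why a low wide-angle flight and a high telephoto flight can match.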

See the following for general discussions of GSD (and how to calculate it) vs Spatial Resolution:

I don't know much about the use of LIDAR for identifying animals but this seems a very interesting article to start with:

As to what GSD/spatial resolution is needed: it depends on the animal size. It seems that 0.5 cm GSD is best to recognise cattle-sized animals (, but elephants have been identified at 31 cm resolution in South Africa using satellite data (

For comparison, Google Earth images generally range from 60 cm GSD up to 20+ m, depending on location (

Another practical issue to deal with is that animals move, and move especially fast when disturbed by low-flying drones, which can cause significant estimation issues.

Now if we can get 1cm GSD satellite images of large areas it would be REALLY helpful :-)

As an aside, my ideal scenario is eventually to replace current plane/helicopter-based animal surveys with automated options. My specific study area is about 25,000 ha / 250 km2 / 62,000 acres, and three helicopters (with trained personnel) take a full day to count animals, at a fairly high cost in local currency (ZAR 250k quoted). I had one estimate for an image-based survey with LIDAR, and just imaging this large area at approximately 3 cm GSD would take approximately a week of plane flying time, for about the same cost...
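As a rough cross-check of why that takes so long: the image count for a fixed area scales with the inverse square of the GSD. A back-of-envelope sketch, assuming a 9504 x 6336 px sensor and typical 75%/65% front/side overlap (all parameter values are illustrative assumptions, not from the quote):

```python
import math

def images_required(area_km2, gsd_cm, px_w=9504, px_h=6336,
                    overlap_front=0.75, overlap_side=0.65):
    # Ground footprint of one image, in km.
    foot_w_km = px_w * gsd_cm / 100 / 1000
    foot_h_km = px_h * gsd_cm / 100 / 1000
    # New ground covered per image once overlap is removed.
    effective_km2 = (foot_w_km * (1 - overlap_side)) * (foot_h_km * (1 - overlap_front))
    return math.ceil(area_km2 / effective_km2)

# 250 km2 at 3 cm GSD -> on the order of 50,000 images
```

Halving the resolution requirement (6 cm instead of 3 cm) would cut the image count, and hence flight time, by roughly a factor of four.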

See full post

MegaDetector v5 release

Some folks here have previously worked with our MegaDetector model for categorizing camera trap images as person/animal/vehicle/empty; we are excited to announce MegaDetector...

4 4

Hi @dmorris,

Might you have encountered this issue while working with MegaDetector v5?

The conflict is caused by:
pytorchwildlife depends on torch==1.10.1
pytorchwildlife depends on torch==1.10.1
pytorchwildlife depends on torch==1.10.1


If so, what solution helped?

See full post

Pytorch-Wildlife: A Collaborative Deep Learning Framework for Conservation (v1.0)

Welcome to Pytorch-Wildlife v1.0At the core of our mission is the desire to create a harmonious space where conservation scientists from all over the globe can unite, share, and...

11 4

Hi everyone! @zhongqimiao was kind enough to join Variety Hour last month to talk more about Pytorch-Wildlife, so the recording might be of interest to folks in this thread. Catch up here: 

Hi @zhongqimiao ,

Might you have faced this issue while using MegaDetector?

The conflict is caused by:
pytorchwildlife depends on torch==1.10.1
pytorchwildlife depends on torch==1.10.1
pytorchwildlife depends on torch==1.10.1


If so, how did you solve it, or do you have any ideas?

torch 1.10.1 doesn't seem to exist
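One likely cause, worth checking: torch 1.10.1 wheels were only published for Python 3.6-3.9, so on Python 3.10 or newer pip reports that no matching distribution exists, which looks like the version "doesn't exist". A quick interpreter check before installing (the supported-version range is the key assumption here):

```python
import sys

def torch_1_10_1_wheel_exists(version_info=None):
    """True if the running Python can install torch==1.10.1 from PyPI
    (wheels were published for CPython 3.6-3.9 only)."""
    vi = version_info or sys.version_info
    return vi[0] == 3 and 6 <= vi[1] <= 9
```

If this returns False, creating an environment with Python 3.9 (e.g. via conda) before running `pip install pytorchwildlife` may resolve the conflict.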

See full post

Drop-deployed HydroMoth

Hi all, I'm looking to deploy a HydroMoth, on a drop-deployed frame, from a stationary USV, alongside a suite of marine chemical sensors, to add biodiversity collection to our...

3 1

Hi Sol! This seems like an awesome project! I have a few questions in response: Where were you thinking of deploying this payload and for how long? 

Regarding HydroMoth recorders, several concerns have popped up in my work deploying them at this depth, because it's a contact-type hydrophone: it uses the case to transmit the sound vibrations of the marine soundscape to the microphone, unlike piezo-element-based hydrophones.

  • At 30-60 m you will likely have the case leak after an extended period of time, if not immediately. The O-ring will deform at this depth, especially around the hinge of the housing. The square-prism shape is not ideal for the deep deployments you describe.
  • Beyond that depth, and really starting at about 50 m, a major concern is implosion, since the small air pocket of the HydroMoth has no pressure-release valve, and the lithium-ion batteries would then be exposed to salt water. This type of reaction would probably cause your other instruments to break or fail as well.
  • You are unlikely to get a signal with a reinforced enclosure. The signal is generated via the material and geometry of the housing. The plastic will probably deform and degrade your frequency response and signal-to-noise ratio. If you place it against metal, it will dampen the sound quite a lot. We tried to do this, but the sensitivity was quite low, with a large amount of self-noise.
  • A side note: for biodiversity assessments, the HydroMoth is not characterised and is highly directional, so you wouldn't be able to compare sites through standard acoustic indices like ACI and SPL.

    That said, if you are deploying for a short time, a hydrophone like an Aquarian H1a, attached through a penetrator to a Blue Robotics housing containing a field recorder like a Zoom recorder, may be optimal for half a day and relatively cheaper than some of the other options. You could also add another battery pack in parallel for a longer duration.
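For readers unfamiliar with the SPL index mentioned above: it starts from an RMS level of the recorded samples, here expressed in dB relative to full scale. A minimal sketch; calibrated SPL would additionally need the hydrophone's sensitivity, which is exactly what is uncharacterised on the HydroMoth:

```python
import math

def rms_dbfs(samples):
    """RMS level in dB relative to full scale, for samples in [-1, 1]."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)
```

A full-scale signal reads 0 dBFS; every halving of amplitude drops the level by about 6 dB.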


Hi Matthew,

Thanks for your advice, this is really helpful!

I'm planning to use it in a seagrass meadow survey for a series of ~20 drops/sites to around 30 m, recording for around 10 minutes each time, in Cornwall, UK.

At this stage I reckon we won't exceed 30 m, but based on your advice, this doesn't sound like the best setup for the surveys we want to try.

We will try the Aquarian H1a, attached to the Zoom H1e unit, through a PVC case. This is what Aquarian recommended to me when I contacted them too.

Thanks for the advice. To be honest, the software component is what I was most interested in when it came to the AudioMoth - is there any other open-source software you would recommend for this?

Best wishes,


Hey Sol, 

No problem at all. Depending on your configuration, the AudioMoth software would have to run on a PCB built around the same chip as the AudioMoth/HydroMoth (a Silicon Labs EFM32), so you would have to make a PCB centred on that chip. Alternatively, you could mimic the functionality of the AudioMoth software on another chip, for example on a Raspberry Pi with Python's pyaudio library. The problem you would have is that the H1a requires phantom power, so it's not plug and play. I'm not too familiar with the H1e, but maybe you can control the microphone through the recorder, triggered by the RPi (not that this is the most efficient MCU for this application, but it is user-friendly). A simpler solution might be to record continuously and play a sound, or take notes of when each 10-minute deployment starts. I think it should last you more than 6 hours with a set of lithium Energizer batteries. You may want to think about putting a penetrator on the PVC housing for a push button or switch to start recording when you deploy; they make a few waterproof options.
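To illustrate the Raspberry Pi idea, here is a sketch of the file-writing half of a timed recording, using only the standard library. The synthetic tone stands in for the frames that pyaudio's `stream.read()` would deliver from the hydrophone; sample rate, duration, and filename are illustrative assumptions:

```python
import math
import struct
import wave

SAMPLE_RATE = 48000   # Hz; typical field-recorder rate
DURATION_S = 2        # would be 600 for a 10-minute drop

# Synthetic 440 Hz tone in place of pyaudio capture from the H1a.
frames = b"".join(
    struct.pack("<h", int(32767 * 0.3 * math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)))
    for n in range(SAMPLE_RATE * DURATION_S)
)

with wave.open("drop_recording.wav", "wb") as wf:
    wf.setnchannels(1)            # mono hydrophone channel
    wf.setsampwidth(2)            # 16-bit PCM
    wf.setframerate(SAMPLE_RATE)
    wf.writeframes(frames)
```

On the Pi, the generator expression would be replaced by a loop reading fixed-size buffers from the audio stream until the 10 minutes are up.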

Just something else that occurred to me: if you're dropping these systems, you'll want to ensure the system isn't wobbling in the seagrass, as that will probably be all you hear on the recordings, especially if you plan to deploy shallower. For my studies in Curacao, we aim to be 5 lbs negative, but this all depends on your current and surface action. You might also want to think about the time of day you're recording biodiversity in general. I'd suggest recording the site for a bit (a couple of days or a week) prior to your study to see what you should account for (e.g. tide flow/current/anthropogenic disturbance) and to determine diel patterning of the vocalisations you are aiming to collect if subsampling at 10 minutes.



See full post

WILDLABS AWARDS 2024 - No-code custom AI for camera trap species classification

We're excited to introduce our project that will enable conservationists to easily train models (no code!) that they can use to identify species in their camera trap images. As we...

7 4

Happy to explain for sure. By Timelapse I mean images taken every 15 minutes, and sometimes the same seals (anywhere from 1 to 70 individuals) were in the image for many consecutive images. 

Got it. We should definitely be able to handle those images. That said, if you're just looking for counts, then I'd recommend running MegaDetector, an object detection model that outputs a bounding box around each animal.
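For the counting use case, MegaDetector's batch output is a JSON file listing detections per image, so counting animals is a small post-processing step. A sketch; the category code "1" (animal) and the 0.2 confidence threshold follow common MDv5 conventions, but verify them against your own output file:

```python
def count_animals(md_output, conf_threshold=0.2):
    """Map image file -> number of animal detections above threshold."""
    counts = {}
    for image in md_output["images"]:
        counts[image["file"]] = sum(
            1
            for det in image.get("detections", [])
            if det["category"] == "1" and det["conf"] >= conf_threshold
        )
    return counts
```

For the timelapse seal images, this would give a per-frame count directly, with no need to crop individuals first.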

Hi, this is pretty interesting to me. I plan to fly a drone over wild areas and look for invasive species incursions. Feral hogs are especially bad, and in the Everglades there is a big invasion of huge snakes. In various areas there are big herds of wild horses that will eat themselves out of habitat, just to name a few examples. The data would probably also be useful in looking for invasive weeds; that is not my focus, but the government of Canada is thinking about it.

Does your research focus on photos, or can you analyse LIDAR? I don't really know what emitters are available to fly over an area, or which beam type would be best for each animal type. I know that some drones carry LIDAR alongside a camera, for example. Maybe a thermal camera would be best for flying at night.

See full post


We are incredibly thankful to WILDLABS and Arm for selecting the MothBox for the 2024 WILDLABS Awards. The MothBox is an automated light trap that attracts and...

7 5

Already an update from @hikinghack

Yeah, we got it about as bare-bones as possible for this level of photo resolution and duration in the field. The main costs right now are:

  • Pi: $80
  • PiJuice: $75
  • Battery: $85
  • 64 MP camera: $60

which lands us at $300 already. But we might be able to eliminate the PiJuice and have fewer moving parts, cutting a quarter of our costs! Compared to something like a single Logitech Brio camera, which sells for $200 and only gets us about 16 MP, we've made this thing as cheap as we could figure out! :)

See full post

WILDLABS AWARDS 2024 - BumbleBuzz: automatic recognition of bumblebee species and behaviour from their buzzing sounds 

The 'BumbleBuzz' team (@JeremyFroidevaux, @DarrylCox, @RichardComont, @TBFBumblebee, @KJPark, @yvesbas, @ilyassmoummad, @nicofarr) is very pleased to have been awarded the...

2 5

Super great to see that there will be more work on insect ecoacoustics! So prevalent in practically every soundscape, but so often over-looked. Can't wait to follow this project as it develops!

See full post

Completely irrational animals...

Article from Ars Technica about how difficult it is to detect and avoid kangaroos...

3 2
Thanks for the info Rob! Lots of research is going on in this field. After decades of trying to warn the animals (first red reflectors, which do not work with ungulates, and now...
The idea is not new. It was tested twenty years ago: short English summary: https://...
Hi Robin, Thanks for all that information! Yes, the 'at-grade' crossing idea (the second article summary you linked to) is a really good one, I think. Going to try and...
See full post

AI for Conservation!

Hi everybody! I just graduated in artificial intelligence (master's) after a bachelor's in computer engineering. I'm absolutely fascinated by nature and wildlife and I'm trying to...

7 5

Welcome! Have you considered participating in any of the AI for Good challenges? I find it's a good way to build a nice portfolio of work. Also, contributing to existing open-source ML projects such as MegaDetector, or to upstream libraries such as PyTorch, is a good way to get hired.




We could always use more contributors in open-source projects. At open-source companies such as Red Hat, Anaconda, and Mozilla, people often end up getting hired largely on the strength of their contributions to open-source projects. These contributions can be technical, such as writing code, or non-technical, such as writing documentation and translating tools into local languages.


See full post

The Variety Hour: 2024 Lineup

You’re invited to the WILDLABS Variety Hour, a monthly event that connects you to conservation tech's most exciting projects, research, and ideas. We can't wait to bring you a whole new season of speakers and...

See full post

Catch up with The Variety Hour: March 2024

Variety Hour is back! This month we're talking about making AI more accessible with Pytorch, new developments from WildMe and TagRanger, and working with geospatial data with Fauna & Flora. See you there!

3 10
Unfortunately, I can't be there. When will you upload the recording?    
See full post