Group

Camera Traps / Feed

Looking for a place to discuss camera trap troubleshooting, compare models, collaborate with members working with other technologies like machine learning and bioacoustics, or share and exchange data from your camera trap research? Get involved in our Camera Traps group! All are welcome whether you are new to camera trapping, have expertise from the field to share, or are curious about how your skill sets can help those working with camera traps. 

discussion

Conservation Technology for Human-Wildlife Conflict in Non-Protected Areas: Advice on Generating Evidence

Hello, I am interested in human-dominated landscapes around protected areas. In my case study, the local community does not get compensation because they are unable to provide...

2 0

This is an area where my system would do very well.

Also, since you mention areas dominated by humans, there is a high likelihood that there will be enough power available to support this system, which provides very high performance and flexibility, though it comes with a power cost and some monetary cost.



Additionally, its lifeblood is generating alerts and making security and evidence gathering practical and manageable, thanks to its flexible state management system.



Ping me offline if you would like to have a look at the system.

Hi Amit,

The most important thing is that the livestock owners contact you as soon as possible after finding the carcass. We commonly do two things if they contact us on the same day or just after the livestock was killed:

  1. Use CyberTracker (or similar software) on an Android smartphone to record all tracks, bite marks, feeding patterns and any other relevant signs of the reason for the loss, with pictures and GPS coordinates. [BTW, compensation is a big issue -- What do you do if the livestock was stolen? What do you do if a domestic animal killed the livestock? What if it died from disease or natural causes and was scavenged upon by carnivores afterwards?]
  2. In the case of most cats, they will hide the prey (or just mark it by covering it with grass or branches and urinating in the area). In this case you can put up a camera trap on the carcass to capture the animal when it returns to its kill (Reconyx is good if you can afford it - we use mostly Cuddeback with white flash). This will normally only work if the carcass is fresh (so other predators cannot yet smell it or know where it is), so the camera only has to be up for 3-5 days max.

This is not really high-tech, but it can be very useful not only to establish which predator was responsible (or whether a predator was responsible at all), but also to record all the evidence for it.

See full post
discussion

Passionate engineer offering funding and tech solutions pro-bono.

My name is Krasi Georgiev and I run an initiative focused on providing funding and tech solutions for stories with a real-world impact. The main reason is that I am passionate...

2 1

Hi Krasi! Greetings from Brazil!



That's a cool journey you've started! Congratulations. And I felt like theSearchLife resonates with the work I'm involved in around here. In a nutshell, I live at the heart of the largest remaining stretch of Atlantic Forest on the planet - one of the most biodiverse biomes in existence. The subregion where I live is named after and bathed by the "Rio Sagrado" (Sacred River), a magnificent water body with a very rich cultural significance to the region (it has served as a safe zone for fleeing slaves). The river and the entire bioregion are currently under threat from a truly devastating railroad project which, to say the least, is planned to cut through over 100 water springs!



In the face of that, the local community (myself included) has been mobilizing to raise awareness of the issue and hopefully stop this madness (fueled by strong international forces). One of the ways we've been fighting this is by seeking recognition of the sacred river as an entity with legal rights, one that can manifest itself in court against such threats. To illustrate what this would look like, I've been developing an AI (LLM) powered avatar for the river, which could perhaps serve as its human-relatable voice. An existing prototype of such an avatar is available here. It has been fine-tuned with over 20 scientific papers on the Sacred River watershed.



And right now I and others are mobilizing to bring together the conditions/resources to develop the next version of the avatar, which would include remote sensing capabilities so the avatar is directly connected to the river and can possibly write full scientific reports on its physical properties (e.g. water quality) and the surrounding biodiversity. In fact, three other members of the WildLabs community and I have just applied to the WildLabs Grant program in order to accomplish that. Hopefully the results are positive.



Finally, it's worth mentioning that our mobilization around providing an expression medium for the river has been multimodal, including the creation of a short film based on theatrical mobilizations we did during a fest dedicated to the river and its surrounding more-than-human communities. You can check that out here:



 

https://vimeo.com/manage/videos/850179762



 

Let's chat if any of that catches your interest!

Cheers!

Hi Danilo, you seem very passionate about this initiative, which is a good start.
It is an interesting coincidence that I am starting another project for the coral reefs in the Philippines, which also requires water analytics, so I can probably work on both projects at the same time.

Let's have a call and discuss; I'll send you a PM with my contact details.

There is a tech glitch and I don't get email notifications from here.

See full post
discussion

Underwater camera trap - call for early users

Hi! The CAMPHIBIAN project aims to develop an underwater camera trap primarily targeting amphibians such as newts, but co-occurring taxa are recorded as well, such as frogs, grass...

7 4

Many thanks for your contribution to the survey! We are now summarizing the list of early users and doing our best to offer a newtcam to everyone in due time.

All the best!

Xavier

See full post
discussion

Jupyter Notebook: Aquatic Computer Vision

Dive into underwater computer vision exploration: OceanLabs Seychelles is excited to share a Jupyter notebook tailored for those intrigued by the...

3 0

This is quite interesting. Would love to see if we could improve this code using custom models and alternative ways of processing the video stream. 

This definitely seems like the community to do it. I was looking at the thread about wolf detection and it seems like people here are no strangers to image classification. A little overwhelming to be quite honest 😂

While it would be incredible to have a powerful model capable of auto-classifying everything right away and storing all the detected creatures and correlated sensor data straight into a database, I wonder whether, in remote cases where power (and therefore CPU bandwidth), data storage, and network connectivity are at a premium, it would be more valuable simply to highlight moments of interest for lab analysis later. Or, if you do have a cellular connection, you could download just those moments of interest rather than hours and hours of footage.
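To sketch what that "moments of interest" idea could look like in practice, simple frame differencing with OpenCV is often enough to flag segments worth reviewing or uploading, without any model at all; the file name and thresholds below are just illustrative:

```python
# Sketch: flag "moments of interest" in a video using simple frame differencing.
# Assumes OpenCV (cv2) is installed; the video path and thresholds are illustrative.
import cv2

def find_moments_of_interest(video_path, diff_threshold=25, min_changed_fraction=0.01):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30
    prev_gray = None
    interesting_seconds = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)
        if prev_gray is not None:
            diff = cv2.absdiff(gray, prev_gray)
            _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
            changed = cv2.countNonZero(mask) / mask.size
            if changed > min_changed_fraction:
                interesting_seconds.append(frame_idx / fps)
        prev_gray = gray
        frame_idx += 1
    cap.release()
    return interesting_seconds  # timestamps (in seconds) worth reviewing later

if __name__ == "__main__":
    print(find_moments_of_interest("clip.mp4"))
```

A list of timestamps like this is tiny compared to the raw footage, so it could be sent over a weak cellular link and used to pull down only the matching clips.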

I am working on a similar AI challenge at the moment. Hoping to translate my workflow to wolves in the future if needed.

We are all a little overstretched, but if there are no pressing deadlines, it should be possible to explore building an efficient model for object detection and to look at suitable hardware for running these models on the edge.

 

 

See full post
discussion

Replacement screen for Bushnell cameratrap

Hi all, I have an arboreal camera trap array using the Bushnell E3 Trophy Cam. One of the cameras has suffered damage at the hands of white-faced capuchins. The camera trap still...

3 0

Hey Lucy!

You should be able to pick up a small piece of infrared-transmitting plastic online for super cheap that would allow the IR light to pass through but block UV from coming in. It can be glued and sealed using epoxy, which shouldn't damage any electronic components but will ensure weatherproofing.

 

Good luck!

 

Best,

Travis

I have fixed Bushnell TrophyCam IR windows with plastic cut from the bottom of a supermarket fruit package. Any thin, clear plastic will be OK. I stuck it in with silicone, but make sure you get the neutral cure type that does not emit acetic acid as it sets.

See full post
discussion

Need advice - image management and tagging 

Hello Wildlabs, Our botany team is using drones to survey vertical cliffs for rare and endangered plants. It's going well and we have been able to locate and map many new...

6 0

I have no familiarity with Lightroom, but the problem you describe seems like a pretty typical data storage and lookup issue.  This is the kind of problem that many software engineers deal with on a daily basis.  In almost every circumstance this class of problem is solved using a database.

In fact, a useful way to frame it is that the Lightroom database is not providing the feature set you need.

It seems likely that you are not looking for a software development project, and setting up your own DB would certainly require some effort, but if this is a serious issue for your work, if you hope to scale your work up, or if you want to bring many other participants into your project, it might make sense to have an information system that better fits your needs.

There are many different databases out there optimized for different sorts of things.  For this I might suggest taking a look at MongoDB with GridFS, for a few reasons.

  1. It looks like your metadata is in JSON format.  Many DBs are JSON compatible, but Mongo is JSON native.  It is especially good at storing and retrieving JSON data.  Its JSON search capabilities are excellent and easy to use.  It looks like you could export your data directly from Lightroom into Mongo, so it might be pretty easy actually.
  2. Mongo with the GridFS package is an excellent repository for arbitrarily large image files.
  3. It is straightforward to make a Mongo database accessible via a website.
  4. It is open source (in a manner of speaking) and you can run it for free.

Disclaimer: I used to work for MongoDB.  I don't anymore and I have no vested interest at all, but they make a great product that would really crush this whole class of problem.
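To make that concrete, a minimal sketch of storing an image plus its JSON metadata in MongoDB/GridFS and querying it back might look like the following; the connection string, field names, species and file names are all illustrative:

```python
# Sketch: store an image in GridFS with its JSON metadata, then query by metadata.
# Requires pymongo; connection string, database name and fields are illustrative.
from pymongo import MongoClient
import gridfs
import json

client = MongoClient("mongodb://localhost:27017")
db = client["cliff_survey"]
fs = gridfs.GridFS(db)

# Metadata exported (e.g. from Lightroom) as JSON
metadata = json.loads(
    '{"species": "example_species", "lat": 22.17, "lon": -159.64, "flight": "2024-05-12_cliff3"}'
)

with open("DJI_0042.JPG", "rb") as f:
    image_id = fs.put(f, filename="DJI_0042.JPG", metadata=metadata)

# Later: find every image tagged with a given species without touching the files themselves
for doc in db.fs.files.find({"metadata.species": "example_species"}):
    print(doc["filename"], doc["_id"])

# And retrieve the actual image bytes when needed
image_bytes = fs.get(image_id).read()
```

The point is that the tags live as queryable JSON alongside the files, so lookups by species, location or flight become one-line queries rather than a manual trawl through folders.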

See full post
discussion

Recycled & DIY Remote Monitoring Buoy

Hello everybody, My name is Brett Smith, and I wanted to share an open source remote monitoring buoy we have been working on in Seychelles as part of our company named "...

2 1

Hello fellow Brett. Cool project. You mentioned a waterseal testing process. Is there documentation on that?

I don't have anything written up, but I can tell you what parts we used and how we tested.



It's pretty straightforward; we used this M10 Enclosure Vent from Blue Robotics:

 

Along with this nipple adapter:

Then you can use any cheap handheld brake pump to connect to your enclosure. You can pull a small vacuum and make sure the pressure holds.

Here's a tutorial video from Blue Robotics:

 





Let me know if you have any questions or if I can help out.

See full post
discussion

Cheap camera traps with "Timelapse+" mode?

Hi everyone, I have a fairly specific query about camera trap time-lapse functionality. I am looking for cheap models that have something similar to Bushnell's "Timelapse+" mode,...

10 0

Thank you @mguins  and @NickGardner for your praise and addition. I had not thought of the backup possibility, but it sure is a good point, Michelle. I find it amazing how often one reads about and experiences camtrap malfunction. Even the relatively cheap ones are still quite a lot of money for what is, at the end of the day, a relatively simple piece of electronics and a plastic container.

Frank's idea of using 2 camera traps is inspired!

I've fiddled with cheap camera traps a bit, and some (most?) of them use a low-power, inaccurate timer for the time-lapse function instead of the accurate real-time clock.  This is OK for Michelle's purpose, but not for Nick's, as he needs to specify the exact time of day to trigger.

I made this interface to allow a camera trap to be triggered by an external device.  To it you could attach, say, a timer programmed to fire at the desired times, to cause a capture.  A $4 DS3231 RTC module could do the job, after the alarm times have been programmed into it with, for example, an Arduino.
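For illustration, programming such a daily alarm might look roughly like this in Python with the smbus2 library on a Linux single-board computer (rather than the Arduino mentioned above). Register addresses follow the DS3231 datasheet, but treat the details as a sketch to verify against your own module:

```python
# Sketch: set a daily Alarm 1 on a DS3231 RTC over I2C so its INT/SQW pin can
# pulse an external camera-trap trigger interface at a fixed time of day.
# Uses smbus2 on Linux (e.g. a Raspberry Pi); double-check register values
# against the DS3231 datasheet before relying on this.
from smbus2 import SMBus

DS3231_ADDR = 0x68

def to_bcd(value):
    return ((value // 10) << 4) | (value % 10)

def set_daily_alarm(hour, minute, second=0, bus_num=1):
    with SMBus(bus_num) as bus:
        # Alarm 1 registers: 0x07 seconds, 0x08 minutes, 0x09 hours, 0x0A day/date.
        # Mask bits A1M1..A1M3 = 0 (match h/m/s), A1M4 = 1 (ignore date) -> daily alarm.
        bus.write_byte_data(DS3231_ADDR, 0x07, to_bcd(second))
        bus.write_byte_data(DS3231_ADDR, 0x08, to_bcd(minute))
        bus.write_byte_data(DS3231_ADDR, 0x09, to_bcd(hour))
        bus.write_byte_data(DS3231_ADDR, 0x0A, 0x80)  # A1M4 set: ignore day/date
        # Control register 0x0E: INTCN = 1 (interrupt mode), A1IE = 1 (enable Alarm 1)
        bus.write_byte_data(DS3231_ADDR, 0x0E, 0b00000101)
        # Clear any pending Alarm 1 flag in the status register 0x0F
        status = bus.read_byte_data(DS3231_ADDR, 0x0F)
        bus.write_byte_data(DS3231_ADDR, 0x0F, status & ~0x01)

set_daily_alarm(6, 30)  # assert the INT/SQW line at 06:30 every day
```

The INT/SQW output could then drive the trigger interface described above; after each alarm the A1F flag has to be cleared again before the next one will fire.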

Hi Nick, 

Any update from your project? Did you find good-value camera traps?

We in Indonesia don't have local suppliers for any research-grade camera traps like Bushnell, Browning or Reconyx, so we need to import them and the price inflates a lot even without a distributor. So my team and I have recently been using Chinese models like GardePro or Meidase ($40-60), though we buy them in the US in small quantities when some of our friends travel back to Indonesia. They have more features than a typical Bushnell in the same price range. The images are AI-upscaled, but that doesn't really bother us. So I am curious whether you have found any good camera traps to recommend? Thanks!

Cheers,

Dhanu

See full post
discussion

Using "motion extraction" for animal identification

Hi all, I am no expert in the underlying machine learning models, algorithms, and AI-things used to identify animals in the various computer-vision tools out there (...

6 0

Hi Dhanu,

Our group moved to Wildlife Insights a few years back (for a few reasons, but mostly ease of data upload/annotation by multiple users), so I haven't tried EcoAssist. That being said, I will look into it as a pre-Wildlife Insights filter to analyze the tens of thousands of images that get recorded when camera traps start to fail or get confused by sun spots (which can be common at one of our sites, a south-facing slope with sparse canopy cover).

Thanks for sharing!

 

See full post
discussion

Wildlife Conservation for "Dummies"

Hello WILDLABS community, For individuals newly venturing into the realm of Wildlife Conservation, especially Software Developers, Computer Vision researchers, or...

3 4

Maybe this is obvious, but maybe it's so obvious that you could easily forget to include this in your list of recommendations: encourage them to hang out here on WILDLABS!  I say that in all seriousness: if you get some great responses here and compile them into a list, it would be easy to forget the fact that you came to WILDLABS to get those responses.

I get questions like this frequently, and my recommended entry points are always (1) attend the WILDLABS Variety Hour series, (2) lurk on WILDLABS.net, and (3) if they express a specific interest in AI, lurk on the AI for Conservation Slack.

I usually also recommend that folks visit the Work on Climate Slack and - if they live in a major city - to attend one of the in-person Work on Climate events.  You'll see relatively little conservation talk there, but conservation tech is just a small subset of sustainability tech, and for a new person in the field, if they're interested in environmental sustainability, even if they're a bit more interested in conservation than in other aspects of sustainability, the sheer number of opportunities in non-conservation-related climate tech may help them get their hands dirty more quickly than in conservation specifically, especially if they're looking to make a full-time career transition.  But of course, I'd rather have everyone working on conservation!

Some good overview papers I'd recommend include: 

I'd also encourage you to follow the #tech4wildlife hashtags on social media! 


 

 

I'm also here for this. This is my first comment... I've been lurking for a while.

I have 20 years of professional knowledge in design, with the bulk of that being software design. I also have a keen interest in wildlife. I've never really combined the two; and I'm starting to feel like that is a waste. I have a lot to contribute. The loss of biodiversity is terrifying me. So I’m making a plan that in 2024 I’m going to combine both.

However, if I’m honest with you – I struggle with where to start. There are such vast amounts of information out there I find myself jumping all over the place. A lot of it is highly scientific, which is great – but I do not have a science background.

As suggested by the post title.. a “Wildlife Conservation for Dummies” would be exactly what I am looking for. Because in this case I’m happy to admit I am a complete dummy.

See full post
discussion

Testing Raspberry Pi cameras: Results

So, we (mainly @albags ) have done some tests to compare the camera we currently use in the AMI-trap with the range of cameras that are available for the Pi. I said in a thread...

9 0

And finally for now, the object detectors are wrapped by a Python websocket network wrapper to make it easy for the system to use different types of object detectors. Usually it takes me about half a day to write a new Python wrapper for a new object detector type. You just need to wrap it in the network connection and make it conform to the YOLO way of expressing the hits, i.e. the JSON format that YOLO outputs with bounding boxes, class names and confidence levels.
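To give a flavour of the idea, a minimal sketch of such a wrapper might look like this (not the actual code from the project); the run_detector stub, JSON field names and port are placeholders, and it assumes a recent version of the `websockets` package:

```python
# Sketch: wrap an arbitrary object detector behind a websocket so the rest of
# the system can swap detectors freely. The detector call and JSON fields are
# placeholders mimicking YOLO-style output (boxes, class names, confidences).
import asyncio
import json
import websockets

def run_detector(image_bytes):
    # Placeholder: call your actual model here and return YOLO-style hits.
    return [{"name": "person", "confidence": 0.91,
             "box": {"x1": 100, "y1": 80, "x2": 220, "y2": 340}}]

async def handle(websocket):
    async for message in websocket:          # each message is one encoded frame
        detections = run_detector(message)
        await websocket.send(json.dumps(detections))

async def main():
    async with websockets.serve(handle, "0.0.0.0", 8765):
        await asyncio.Future()               # serve forever

if __name__ == "__main__":
    asyncio.run(main())
```

Because every detector speaks the same JSON over the same socket interface, swapping in a new model is mostly a matter of replacing the run_detector body.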

What's more, you can even use multiple object detector models on different parts of a single captured image, and you can cascade the logic to require, for example, multiple object detectors to match, or a choice from different object detectors.

It's the perfect anti-poaching system (If I say so myself :) )

Hey @kimhendrikse , thanks for all these details. I just caught up. I like your approach of supporting multiple object detectors and using the python websockets wrapper! Is your code available somewhere?

Yep, here:

Currently it only installs on older Jetsons; in the coming weeks I'll finish the install code for current Jetsons.


Technically speaking, if you were an IT specialist you could even make it work in WSL2 Ubuntu on Windows, but I haven't published instructions for that. If you were even more of a specialist you wouldn't need WSL2 either. One day I'll publish instructions for that once I've done it. Though it would be slow unless the Windows machine had an NVIDIA GPU and you got PyTorch working with it.

See full post
discussion

Open-Source design guide for a low-cost, long-running aquatic stereo camera

Katie Dunkley's project has been getting a heap of attention in the conservation tech community - she very kindly joined Variety Hour to give us a walkthrough of her Open-Source...

2 0

This is awesome - thanks for sharing Stephanie!! We actually were looking around for a low-cost video camera to augment an MPA monitoring project locally and this looks like a really great option!

 

Cheers, Liz

Thank you for sharing! Super interesting, as we don't see many underwater stereo cameras! We also use Blue Robotics components in our projects and have found them reliable and easy to work with. 

See full post
discussion

Apply to Beta test Instant Detect 2.0

Hi WildLabs, ZSL is looking for Beta testers for Instant Detect 2.0. If you are a conservationist, scientist or wildlife ranger with experience working with innovative...

1 2

Will you accept personal/hobbyist users focused on conservation on their own small plots of land (10-100 acres)?

I would, and I know others who would, happily pay more than the official conservationist rate for the service, which could help to further subsidize the project. (Referring to your statement here: https://wildlabs.net/discussion/instant-detect-20-and-related-cost)

See full post
discussion

Mesh camera trap network?

Does anyone have something to share about wireless camera traps that make use of a mesh-network type of architecture? One such solution, BuckeyeCam, allows cameras to route images...

24 0

Hi Sam,

Impressive!  Any chance the LoRa code is open source?  I should like to take a gander.

Thanks

See full post
discussion

Subsea DIY Burnwire for Deep-sea BRUVS

Hello everyone. I'm part of a team working on a low-cost, deep-sea camera (BRUVS) project and we're currently facing challenges with our subsea burnwire release system. We're...

9 0

Yeah from memory we found it difficult to get the relatively high voltage (~50VDC) and current (can't remember) in a small package, but we had almost no experience back then and gave up fairly quickly. We also found it difficult to get much help from the company if I remember correctly...

So is the problem with the nichrome waterproofing everything? I picture something like coating the nichrome in high-temp grease (especially where it's in contact with the nylon line, and the line itself) and encapsulating the entire thing in a semi-flexible silicone (so the line can slip out after detachment), with something buoyant to help pull it towards the surface maybe? Speaking of, how are tags being recovered (i.e. do they need to pop to the surface)?

Hi Titus,

We've used this design/procedure for many years with our Deep Sea Camera systems, with good reliability.  Not OTS but not hard to make, and most of the materials come out to be inexpensive per unit.  The most expensive item is the M101 connector ($25ea), but if you get them with extra length on the cable, you can essentially cut it off at the point where it joins the burn-loop and reuse that connector until it gets too short.  You'd also need an F101 connector integrated with your BRUV, this connecting with the burnwire and forming the positive side of the circuit, and a ground - our ground connection goes to a large bolt on the frame near the burnwire loop - but that connector generally shouldn't need replacement unless it gets damaged.

These burnwires generally break in 3-7 min, burning at about 1 A, ~14.5 V.  A thinner version of the coated wire could go faster or with less power required.
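As a rough sanity check on the energy budget implied by those figures (treating ~1 A at ~14.5 V for ~5 min as nominal values):

```python
# Back-of-envelope energy budget for one burnwire release, using the figures quoted above.
current_a = 1.0        # ~1 A burn current
voltage_v = 14.5       # ~14.5 V supply
burn_time_s = 5 * 60   # assume ~5 min (quoted range is 3-7 min)

power_w = current_a * voltage_v                 # ~14.5 W while burning
energy_wh = power_w * burn_time_s / 3600        # ~1.2 Wh per release
capacity_mah = energy_wh / voltage_v * 1000     # ~83 mAh at 14.5 V

print(f"{power_w:.1f} W, {energy_wh:.2f} Wh, ~{capacity_mah:.0f} mAh per release")
```

So a single release is well under 100 mAh of battery capacity, which is small compared with what a multi-day BRUV deployment already carries.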

We do also employ galvanic releases as backups.  I really like redundancy on recovery mechanisms!  The ones we use are made by International Fishing Devices, Inc.  Various distributors sell certain models of their products (i.e. different time durations) but if you contact them directly, they can also make custom duration ones for you.

 

Hi Titus,

I've used latching solenoids as a release in a fresh water application. The product linked to is the one I have used, but has been discontinued (it's been quite a while).  Anyway these little devices hold a plunger in place with a permanent magnet, but release the plunger when a coil is energised that counters the magnet.  The holding force is not great, but more than enough to keep the safety on a mechanical trigger.  The whole device can be potted and sealed (ideally under vacuum to eliminate voids).  When pushing the plunger in to arm the solenoid, there is a definite click when the magnet kicks in, to confirm the locked state.

A similar device is the electropermanent magnet, which doesn't have a plunger; in fact it has no moving parts.  You provide the steel piece that this device will release when energised, as with a latching solenoid. It generally has greater holding force than a latching solenoid. I've used these in a seawater application.  It's worth noting that there exist ferromagnetic stainless steels that can be used here to avoid corrosion.

Thanks,

-harold

See full post
discussion

Thermal cameras for monitoring visitors in highly vulnerable conservation areas

Hi everybody, I'm Alex González, a consultant and researcher in sustainable tourism and conservation. I'm currently consulting for a conservation organisation on the development...

7 0

Hi,



This is a really late answer, but I was new to WildLabs then. I have a security appliance that uses state-of-the-art AI models and user-defined polygon areas of interest, and that generates video alerts of intrusions typically in under a second.
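To give a rough idea of how a polygon area of interest gates alerts (this is a minimal sketch, not the appliance's actual code), the basic check is just point-in-polygon on each detection; the detection format mimics YOLO-style JSON output and the coordinates are illustrative:

```python
# Sketch: only raise an alert when a detection's foot point falls inside a
# user-defined polygon area of interest. Pure-Python ray casting; the detection
# dicts mimic YOLO-style JSON output and all values are illustrative.
def point_in_polygon(x, y, polygon):
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

AREA_OF_INTEREST = [(100, 700), (900, 700), (900, 1050), (100, 1050)]  # pixel coords

def alerts(detections, min_confidence=0.5):
    hits = []
    for det in detections:
        if det["confidence"] < min_confidence:
            continue
        box = det["box"]
        foot_x = (box["x1"] + box["x2"]) / 2   # bottom-centre of the bounding box
        foot_y = box["y2"]
        if point_in_polygon(foot_x, foot_y, AREA_OF_INTEREST):
            hits.append(det)
    return hits
```

Anything the detector finds outside the drawn polygon is simply ignored, which is what keeps false alerts from paths or roads at the edge of the frame under control.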

Although it's set up to install automatically on NVIDIA AI-on-the-edge boxes, if your intention were to monitor a great many cameras you could also install it on a desktop with a high-end GPU for very high performance. At home I use a desktop with an RTX 2080 Ti and monitor around 15 cameras plus an (old) thermal imaging camera.

I have also tested a high-end model (YOLOv7) on a high-end thermal imaging camera image and it works fine as well.

Thermal yolov7

Thermal imaging cameras are hellishly expensive though, and I've found that new extremely light-sensitive cameras like the Hikvision ColorVu series almost make them obsolete for people detection at night, at a fraction of the cost.

If you are interested I'd be happy to show you a demo in a video meeting sometime. I'm pretty sure it would meet all your intrusion detection and alerting needs.

My project page is

See full post
discussion

Video camera trap analysis help

Hello, I'm a complete newbie so any help would be appreciated. I have never trained any ML models (I have junior/mid experience with Python and R) nor annotated data, but would like to...

8 0

Hi there!

You should definitely check out VIAME, which includes a video annotation tool in addition to deep learning neural network training and deployment. It has a user-friendly interface, a publicly available server option that removes the need for a GPU-enabled computer for network training, and an amazing support staff who help you with your questions. You can also download the VIAME software for local use. The tool was originally developed for marine life annotation, but can be used for any type of video or annotation (we are using it to annotate pollinators in video). Super easy to annotate as well. Worth checking out!

Cheers, 
Liz Ferguson

See full post
discussion

Is anyone or platform supporting ML for camera trap video processing (id-ing jaguar)? 

Hi wildlabbers, I have another colleague looking for support for getting started using AI for processing videos coming out of their camera traps - specifically for species ID...

9 0

Hey there community! I'm new here and looking for lots of answers too! ;-)

We are searching as well for the most ideal app / AI technology to ID different cats, but also other mammals if possible:

- Panthera onca

- Leopardus wiedii

- Leopardus pardalis

 

and if possible:

 

- Puma concolor

- Puma yagouaroundi

- Leopardus colocolo

- Tapirus terrestris

 

Every recommendation is very welcome, thanks!

Sam

See full post
discussion

Automatic extraction of temperature/moon phase from camera trap video

Hey everyone, I'm currently trying to automate the annotation process for some camera trap videos by extracting metadata from the files (mp4 format). I've been tasked to try...

7 0

Hi Lucy

As others have mentioned, camera trap temperature readouts are inaccurate, and you have the additional problem that the camera's temperature can rise 10C if the sun shines on it.

I would also agree with the suggestion of getting the moon phase data off the internet.

 

Do you need to do this for just one project?  And do you use the same camera make/model for every deployment?  Or at least a finite number of camera makes/models?  If the number of camera makes/models you need to worry about is finite, even if it's large, I wouldn't try to solve this for the general case, I would just hard-code the pixel ranges where the temperature/moon information appears in each camera model, so you can crop out the relevant pixels without any fancy processing.  From there it won't be trivial, exactly, but you won't need AI. 

You may need separate pixel ranges for night/day images for each camera; I've seen cameras that capture video with different aspect ratios at night/day (or, more specifically, different aspect ratios for with-flash and no-flash images).  If you need to determine whether an image is grayscale/color (i.e., flash/no-flash), I have a simple heuristic function for this that works pretty well.
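As a rough illustration (not necessarily the exact heuristic mentioned above), a grayscale-vs-colour check can be as simple as measuring how far the colour channels diverge; the threshold here is an arbitrary guess:

```python
# Sketch: guess whether a camera-trap frame is a night/IR (grayscale) image by
# checking how far its colour channels diverge. The threshold is a rough guess.
import cv2
import numpy as np

def is_grayscale_frame(image_bgr, mean_diff_threshold=2.0):
    b, g, r = cv2.split(image_bgr.astype(np.int16))  # signed ints avoid wrap-around
    channel_spread = np.mean(np.abs(r - g)) + np.mean(np.abs(g - b))
    return channel_spread < mean_diff_threshold
```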

Assuming you can manually define the relevant pixel ranges, which should just take a few minutes if it's less than a few dozen camera models, I would extract the first frame of each video to an image, then crop out the temperature/moon pixels.

Once you've cropped out the temperature/moon information, for the temperature, I would recommend using PyTesseract (an OCR library) to read the characters.  For the moon information... I would either have a small library of images for all the possible moon phases for each model, and match new images against those, or maybe - depending on the exact style they use - you could just, e.g., count the total number of white/dark pixels in that cropped moon image, and have a table that maps "percentage of white pixels" to a moon phase.  For all the cameras I've seen with a moon phase icon, this would work fine, and would be less work than a template matching approach.
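Putting those pieces together, a minimal sketch of that pipeline could look like the following; the pixel ranges, Tesseract config, brightness threshold and phase table are all placeholders you would tune per camera model:

```python
# Sketch: grab the first frame of a video, crop hard-coded pixel ranges for the
# temperature readout and moon icon, OCR the temperature with pytesseract, and
# map the fraction of bright pixels in the moon icon to a phase. All coordinates,
# thresholds and the phase table are placeholders to tune per camera model.
import cv2
import pytesseract

TEMP_BOX = (980, 1010, 600, 720)   # (y1, y2, x1, x2) for the temperature readout
MOON_BOX = (980, 1010, 540, 580)   # (y1, y2, x1, x2) for the moon icon

def first_frame(video_path):
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    cap.release()
    return frame if ok else None

def read_temperature(frame):
    y1, y2, x1, x2 = TEMP_BOX
    crop = cv2.cvtColor(frame[y1:y2, x1:x2], cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(
        crop, config="--psm 7 -c tessedit_char_whitelist=0123456789-"
    )
    return text.strip()

def read_moon_phase(frame, bright_threshold=200):
    y1, y2, x1, x2 = MOON_BOX
    crop = cv2.cvtColor(frame[y1:y2, x1:x2], cv2.COLOR_BGR2GRAY)
    white_fraction = (crop > bright_threshold).mean()
    # Placeholder mapping from "fraction of lit pixels" to a phase label
    if white_fraction < 0.1:
        return "new"
    if white_fraction < 0.4:
        return "crescent"
    if white_fraction < 0.6:
        return "half"
    if white_fraction < 0.9:
        return "gibbous"
    return "full"

frame = first_frame("video.mp4")
print(read_temperature(frame), read_moon_phase(frame))
```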

FYI I recently wrote a function to do datetime extraction from camera trap images (it would work for video frames too), but there I was trying to handle the general case where I couldn't hard-code a pixel range.  That task was both easier and harder than what you're doing here: harder because I was trying to make it work for future, unknown cameras, but easier because datetimes are relatively predictable strings, so you know when you find one, compared to, e.g., moon phase icons.

In fact maybe - as others have suggested - extracting the moon phase from pixels is unnecessary if you can extract datetimes (either from pixels or from metadata, if your metadata is reliable).

camtrapR has a function that does what you want. I have not used it myself, but it seems straightforward to use and it can run across directories of images:

https://jniedballa.github.io/camtrapR/reference/OCRdataFields.html

See full post
Link

Eliminatha, WiCT 2023 Tanzania

Passionate wildlife researcher and tech user, making strides in Grumeti, in the heart of the western Serengeti, Tanzania, using camera traps to gain priceless insights into the lives of its unique fauna and contributing greatly to understanding and preserving the Serengeti's ecosystems.

4
discussion

setting up a network of cameras connected to a server via WIFI

We need to set up a wildlife monitoring network based on camera traps in Doñana National Park, Spain (see also wildlifeobservatory.org).  We are interested in setting...

12 0

Great discussion! Pet (and other 'home') cams are an interesting option as @antonab mentioned. I've been testing one at home that physically tracks moving objects (and does a pretty good job of it), connects to my home network and can be live previewed, all for AUD69 (I bought it on special. Normal retail is AUD80): 

On the Wi-Fi front, and a bit of a tangent, has anyone done any work using 'HaLow' (see below for an example)? It seems like an interesting way to extend Wi-Fi networks.

Cool thread!

I will be testing Reolink Wi-Fi cameras in combination with solar-powered TP-Link long-range Wi-Fi antennas/repeaters later this field season, for monitoring arctic fox dens at our remote off-grid site in Greenland. The long-range Wi-Fi antennas are rather power-hungry, but with sufficient solar panel and battery capacity I am hopeful it will work.
I am looking forward to exploring the links and hints above after the field season.
Cheers,

See full post