Global Feed

There's always something new happening on WILDLABS. Keep up with the latest from across the community through the Global view, or toggle to My Feed to see curated content from groups you've joined. 

Header image: Laura Kloepper, Ph.D.

discussion

AI Identification Models on Thermal Data

Hello, my name is Siddhi. I recently joined WildLabs and am getting to know the different groups. I hope that this is the right place to post! I am interested in AI to identify...

2 0

Wow! This dataset seems great and definitely worth trying out. Do you perhaps have a dataset for deer, elk, and animals of that sort?
I live in a mountainous region, so deer are very common and easily hit.

Thank you again

See full post
discussion

Integrating AI models with camera trap management applications

Hi All, As part of extending the work we are doing at the BearID Project, we are thinking about integrating the models we are developing into open source camera trap project. This...

27 4

Thank you for the update, @pvlun! I have installed and started using the Amazon model. It looks like the model is a genus rather than species classifier. It's a shame they don't seem to have Puma or Panthera classes; those are key genera for the researchers in Ecuador. Anyway, I should have access to more data when I get to the university in Quito next week. I'll test more then and see what they think of the classes.

@bluevalhalla those are indeed only genera. And yes, I also found that some crucial genera are missing. Nevertheless, it might still be handy for you to make a rough separation before doing any human verification.

Regarding the video processing: I didn't catch that you're looking for an MD results file that covers all frames. MD creates that, but EcoAssist deletes it again (to avoid confusion with the video-level detection file). If you want the frame-level detection file, just comment out these lines. Then you'll have both.

Answering Peter's question to me above, about how MegaDetector's video processing stuff decides which detection to include in the video-level output file:

  • By default, the function that matters just chooses the highest-confidence detection for each category (person/animal/vehicle) as the "canonical detection" for that video (for posterity, that happens here); a rough sketch of that selection follows after this list.
  • The frame number corresponding to that detection was not previously included in the output (since 99.99% of users would have no way of using this, but it sounds like you're in the other 0.01% :) ), so Peter just submitted a PR to the MegaDetector repo to include this in the output for a video; I'll merge that PR today, so I expect you'll see this in a forthcoming version of EcoAssist.
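
In rough code, the selection in the first bullet boils down to something like this (my simplified sketch, not the actual MegaDetector implementation; the field names only loosely follow the MD output format):

    # Simplified sketch: pick the highest-confidence detection per category
    # across all sampled frames, and remember which frame it came from.
    def canonical_detections(frames):
        # frames: list of dicts like
        #   {"frame_number": int,
        #    "detections": [{"category": str, "conf": float, "bbox": [...]}]}
        best = {}  # category -> (confidence, frame_number, detection)
        for frame in frames:
            for det in frame["detections"]:
                cat = det["category"]
                if cat not in best or det["conf"] > best[cat][0]:
                    best[cat] = (det["conf"], frame["frame_number"], det)
        return best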

But ditto what Peter said, it sounds like you actually want the frame-level output file, and Peter gave you a tip about how to get that.

Also FWIW you will almost never want to process every frame of a video, nor will you likely want to train your models on every frame of a video; you're not getting a lot of new information from adjacent frames.  Typically when MegaDetector users process videos, I advise sampling videos down to ~3fps, which is usually every 10th frame.
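
If you are extracting frames yourself rather than letting the tooling do it, sampling down to ~3 fps is just a matter of keeping every Nth frame. A minimal OpenCV sketch (my own illustration, not EcoAssist or MegaDetector code; the target rate is a placeholder):

    import cv2

    def sample_frames(video_path, target_fps=3.0):
        # Keep roughly every Nth frame so the output is ~target_fps.
        cap = cv2.VideoCapture(video_path)
        native_fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
        step = max(1, round(native_fps / target_fps))  # e.g. 30 fps -> every 10th frame
        frames = []
        index = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if index % step == 0:
                frames.append((index, frame))
            index += 1
        cap.release()
        return frames

Sampled frames then go to the detector as if they were still images; keeping the original frame index makes it easy to map detections back to a time in the video.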

See full post
discussion

Has anyone combined flying drone surveys with AI for counting wild herds?

My vision is a drone, the kind that flies a survey pattern. Modern ones have a 61-megapixel camera and LIDAR with a few-millimetre resolution; they are used for mapping before a road...

25 0

Actually my Raspberry Pi application is a sound localizer, not related to image recognition. My image-recognition projects run on Jetsons and more powerful hardware.

But I think recognizing bugs from a drone would likely be challenging. You would need sufficient detail to get good recognition, which means a very narrow field of view, and then vibration also becomes an issue.

For example, training on just the COCO dataset seems to distill the recognition of people down to a multi-segmented thing with bits sticking out, so spiders on the camera lens are highly likely to be seen as people. To get better results, much more training data is needed. I expect the same is true for insects: really large amounts of training data would be needed to tell the difference between different types.
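
To put rough numbers on the "sufficient detail" point, the ground sampling distance falls straight out of the sensor geometry. A back-of-the-envelope sketch with assumed, purely illustrative camera parameters (not the specs of any particular drone):

    # Ground sampling distance (GSD) in cm per pixel; all inputs are assumptions.
    def gsd_cm_per_px(sensor_width_mm, image_width_px, focal_length_mm, altitude_m):
        pixel_pitch_mm = sensor_width_mm / image_width_px
        return pixel_pitch_mm * (altitude_m * 100.0) / focal_length_mm

    # e.g. a 61 MP full-frame sensor (36 mm wide, ~9500 px across) with a 35 mm
    # lens at 40 m altitude:
    print(gsd_cm_per_px(36.0, 9504, 35.0, 40.0))  # ~0.43 cm/px

At roughly 0.4 cm per pixel a deer covers thousands of pixels, but a 1 cm insect covers only two or three, which is why insects would need either a much longer lens or a much lower altitude.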

Hi Johnathan, 



There is a Canadian company more or less doing that. They have their own endurance drone and optical/thermal cameras. Very much keyed into surveys and they may have success given the number of helicopter accidents we have had in Western Canada. Not sure if the AI part is there yet. 



I know they've done surveys with at least one department here, but not much beyond that. I talked to one of the developers there just as a point of interest. The current leadership today looks different than I remember, though.

The camera can be aimed at the greenhouse background, which is like a huge green screen. Inside the greenhouse there are only a few flying insects, and they would all have to fly between the optics and the wall or roof eventually. Or, if the bot is flying, have it look upwards.

It's pretty much a programming question. Unfortunately I am not the type of person who is good at both building and troubleshooting hardware and writing code. I took some programming back in college, but I am not sure if I want to get myself up to speed. It's starting to sound like I need a few years of college before I can even get started, which I already did; too bad none of it counts for anything anymore. Or I guess I can compete in the marketplace with people with real money behind them, which is the only thing that means anything. If you are brilliant and not funded, you might as well be a scarecrow.

See full post
discussion

How to add a salt water switch

Hi – I’m working on developing a GPS / LoRa tracker for Diamondback Terrapins (DBT) with some colleagues. DBTs spend a lot of time in brackish water and we’...

6 0

Hey Ned, 

If you want to add a transmitting component to your tag, let me know. I would happily provide the open-source Argos boards and some free satellite service time. 

Hi, @nedhorning, sounds fascinating! I'm going to listen in to hear how things progress, including with the capacitive sensor described in this thread. That sounds wise as it insulates the electronics. 

Also wanted to mention that we use a comparator circuit following a reference design from TI that I can't find at the moment but probably could with more digging if desired. We described its basic design and use in a saltwater switch in this paper (open access) if you want to read a bit more. 

https://www.sciencedirect.com/science/article/pii/S0278434322001029
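
For anyone who wants to prototype the behaviour before committing to the comparator hardware, the same wet/dry logic can be roughed out in software. A MicroPython-style sketch, purely illustrative: the pin number and threshold are made-up placeholders, and this is not the TI reference design or the circuit from the paper:

    from machine import ADC, Pin
    import time

    electrodes = ADC(Pin(26))   # electrode divider wired into an ADC pin (placeholder)
    WET_THRESHOLD = 20000       # 16-bit counts; would need tuning in brackish water

    def in_saltwater():
        # Software stand-in for the comparator: above the threshold counts as "wet".
        return electrodes.read_u16() > WET_THRESHOLD

    while True:
        if in_saltwater():
            pass  # e.g. hold off GPS/transmitter activity while submerged
        time.sleep(5)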

Hello @ThomasGray_Argos ,

I need to add a transmitting component, so please could you provide me with the list of parts? Actually, I don't know much about it, but I am interested in learning new things.

I am new here but a real enthusiast and loving this community so far. I have a background in technology and feel I could help with documentation, at least for starters. If you have any other questions apart from this, you can ask, and I will try my best to help.

See full post
discussion

Auto-Wake a Pi5 without a Pijuice! (Mothbox)

Motivation: Replicate the functionality of a Pijuice with just a Pi5 and a $5 battery! Like many folks here, we use a Pijuice on the Raspberry Pi on the...
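
For context, the mechanism this relies on, as I understand it, is the Pi 5's onboard RTC: it can wake the board from a halted state if the bootloader EEPROM is set with POWER_OFF_ON_HALT=1 and WAKE_ON_GPIO=0. A rough sketch of that pattern (not the actual Mothbox script; the wake interval is a placeholder, and it must run as root):

    import subprocess

    WAKE_IN_SECONDS = 6 * 60 * 60   # placeholder: wake again in 6 hours

    def schedule_wake_and_halt(seconds):
        # Clear any stale alarm, arm the RTC for "now + seconds", then halt.
        with open("/sys/class/rtc/rtc0/wakealarm", "w") as f:
            f.write("0")
        with open("/sys/class/rtc/rtc0/wakealarm", "w") as f:
            f.write(f"+{seconds}")
        subprocess.run(["halt"], check=True)

    schedule_wake_and_halt(WAKE_IN_SECONDS)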

6 4

Hey Andrew, this is really awesome. I'm sure there are loads of people who ran into the same problems as you and will find your script super useful.

As you might know, we launched The Inventory last month--it's a wiki-style database for all things conservation technology. Based off this post, it sounds like you have some experience using Raspberry Pi in your work. It would be a huge help to the community if you could take some time and leave a review on Raspberry Pi's product page on The Inventory! Your review will help others in the community immensely. Let me know if you have any questions!  

 

Hi Andrew,
this is great! 
Did you consider the Witty Pi 4 L3V7 Real Time Clock (RTC) and Power Management Module for Raspberry Pi?
(That one has no included battery.) Or the one with the capacitor?

Greetings from Austria,
Robin

See full post
discussion

Welcome to WILDLABS!

Hello and welcome to the WILDLABS community! With 6,000 members and counting, we want to get to know you a little better. In a couple of...

274 10

Hi everyone,

I work for Landscape, a connected web and mobile app for the full range of land conservation work from acquisition to stewardship. Hundreds of conservation organizations in the US and Canada use Landscape for land protection project management, field visit data collection, data and document storage, contact and communications tracking, mapping analysis, and collaborative team work.

We are a small team and we're looking for a full stack software engineer/developer to join our team. I'd appreciate your help sharing this posting with any qualified candidates!

 

Welcome, @Dan_Ford ! I'm Alex from the WILDLABS community team. We're so happy to have you here. 

 

I definitely recommend sharing that job opening on WILDLABS as a career opportunity; that way it will show up in our biweekly digest and get seen by more people! To do that, just click +Post in the top right corner, then click "Careers" and fill in all the relevant information! We'd be happy to share the opening on our socials as well. 

 

I've also gone ahead and created a page for Landscape on The Inventory, our wiki-style database of conservation tech information. It would be great if you could take a look and add any missing information; I've given you editing access. Also, you can connect your WILDLABS profile to Landscape's page so that people know to contact you with any questions about the organisation! To do this, go to your profile and select “Settings” in your side-bar, then select the “About you” tab. Search for and select your organisation name in the “Organisation(s)” section and click save.

 

Let me know if you have any questions :) 

Hi!

I am Grace Mchome, an MSc candidate in Wildlife Management and Conservation at Sokoine University of Agriculture, interested in wildlife conservation and the use of conservation technologies.

See full post
discussion

Where do I go with technical questions about this platform?

The 'Getting Started' page promises "Use this thread in the Community Base to ask questions about navigating the platform," but the link does not work. I would also like to...

3 0

@JakeBurton is the go-to person for all questions and problems relating to the website! Jake, can you help with this? 

 

We're in the process of updating some of the content on the website, so thank you for drawing our attention to this!

See full post
discussion

Testing, Deployment, Solar, Conferences - Mothbox update v3.21

The last two months have been bustling with fun Mothbox developments! Deployment in Azuero: Our first full deployment went wonderfully! Andy and Kitty trained @Hubertszcz's...

3 3

Thank you for the detailed post. This is amazing! 

 

Have you experimented with ways of reducing power consumption? How much of the power is used by the LEDs vs the RPi? You could try having the LEDs on for less time, or at a lower brightness. 

 

Julian

The LEDs use most of the energy. The UV LEDs are about 12-15 watts, the Pi hangs out around 3 watts, and the white flash LEDs use about 10 watts each, but we reduced their on-time to about 0.8 seconds for a flash photo, so they aren't as big a deal.

 

You basically need a lot of power to make a really bright thing to attract the bugs, but we are testing out some more efficient UV LEDs.

I was thinking that you might be able to reduce the amount of time the lights were on by blinking them, but this paper seems to show that flickering mostly reduces the number of specimens attracted, though it affects some orders more than others and, rarely, increases the number attracted. On the other hand, the number of insects captured only seemed to go down by around 50% even with quite extreme flickering (10% duty cycle, i.e. light on only 10% of the time), and if that had a big effect on how long the battery lasted it might be worth it, as you could run more capture sessions between charges.
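
Back-of-the-envelope, using the wattage figures mentioned earlier in the thread (illustrative only, and ignoring the brief white-flash LEDs):

    UV_W = 13.5          # UV LEDs, roughly the middle of the quoted 12-15 W
    PI_W = 3.0           # Raspberry Pi
    DUTY = 0.10          # 10% duty-cycle flicker from the paper
    CAPTURE_LOSS = 0.50  # ~50% fewer insects captured at that duty cycle

    continuous_w = UV_W + PI_W               # ~16.5 W
    flicker_w = UV_W * DUTY + PI_W           # ~4.35 W
    runtime_gain = continuous_w / flicker_w  # ~3.8x longer on the same battery
    print(f"{runtime_gain:.1f}x runtime for {1 - CAPTURE_LOSS:.0%} of the catch per session")

So, very roughly, a 10% duty cycle could buy nearly four times the runtime at half the catch per session, which nets out ahead if the total number of specimens over a deployment is what matters.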

 

See full post
discussion

looking for help

Hello, I am an independent researcher doing my study on assessing the effectiveness of Conservation incentives (Beehives) in deterring problematic animals apart from elephants (...

4 0

Hi Agripina! 
I have always used Browning, and I have found the basic models to be perfectly adequate; they use only 6 AA batteries (not 8 like the majority of camera traps). Now I am using the model BTC-8E, which has been a great success due to its excellent definition.
I am from Chile, and I know of a company that imports them. Perhaps you should look for someone in your country that does the same, because importation and customs duties are often a headache.
I recommend visiting trailcampro.com to read customer reviews of different brands and models of camera traps.
I hope this information is helpful.

See full post
article

New WildLabs Funding & Finance group

WildLabs will soon launch a 'Funding and Finance' group. What would be your wish list for such a group? Would you be interested in co-managing or otherwise helping out?

3 1
This is great, Frank! @StephODonnell, maybe we can try to bring someone from #Superorganism (@tomquigley ?) or another venture company (#XPRIZE) into the fold!
I find the group to be dope; fundraising in the realm of conservation has been tough, especially for emerging conservation leaders. There are no centralized grants tracking common...
See full post
discussion

VIHAR-2024 deadline extension, June 30th (Interspeech satellite event) 

Dear Wildlabs community,The submission deadline for VIHAR-2024 has been extended to June 30th, 2024. VIHAR-2024 (https://vihar-2024.vihar.org) is the fourth international...

1 0

Thanks for sharing this @nkundiushuti ! I think this post would be better suited as an event, that way it will show up on the WILDLABS event calendar page. Let me know if you have any questions on how to make an event post! You just click the +Post button in the top right corner, then click "event."

See full post
discussion

Mass Detection of Wildlife Snares Using Airborne Synthetic Radar

Mass Detection of Wildlife Snares Using Airborne Synthetic Radar. For the last year my colleagues Prof. Mike Inggs (Radar - Electrical Engineering, University of Cape Town) and...

24 4

Hi @DaveGaynor regarding funding, have you reached out to any lodges? I know that snaring is rife in the Manyaleti ... maybe some of the lodges in the Sabi Sands would be interested in helping you with your goal? 

See full post
discussion

Meta: Does anyone know how to Stay logged in to Wildlabs.net?

One challenge I have on wildlabs is that I get logged out after maybe half a day or so. It makes it a bit tricky for me to use the forums because I'll go to a thing, but then have...

15 1

"But the use of the browser password manager should make it no big deal for people even if they got logged out once a day. Which I’m sure is not the case for the wildlabs site."

"Big deal" is very subjective...

I use Firefox on Linux and have tabs unloaded when not in use for ~30 minutes. After 24-48 hrs, if I click on the wildlabs tab I'm logged out. I then have to click on login and have my password manager fill out the form. Then it brings me to my profile page (why? why doesn't it go back to where I was, which is where I wanted to be, just logged in?), then I have to click on the feed page, which displays the global feed, which is also not of interest to me. Finally I get to click on my feed. Is it a "big deal"? I can't claim it is. Does it keep me from checking wildlabs? Definitely! I participate in a dozen forums/discords and wildlabs is the most annoying to check in on for a quick "oh, let's see whether there's something interesting".

It's still a choice to close your tabs. Closing all your tabs tends to mean you have to log in again.

However, despite the expiry date on the cookie, linkedin.com logs you in without ever having to re-enter your password. So I guess there is, in principle, a way to make it so you don't have to log back in. Such a way would then also work with your auto tab closer.

However, that's not something that I'm going to investigate. If someone wanted to, the clue would be in the cookies and the final answer would be available from the wildlabs site developer. Myself, I would never have guessed that so many people would find this annoying. But everyone is different.

See full post
discussion

Data loggers for sewage monitoring

Hi WILDLABS community. Recently I've been noticing some signs that raw sewage may be being discharged at our local beach in St Andrews, Scotland. The monitoring by SEPA, as far as I...

2 1

Hi Jamie,

Nitrate sensors for sewage are quite pricey, so I might go with a bunch of OpenCTD loggers, on the theory that the conductivity will spike in sewage.  There may also be a detectable temperature signal.

Jamie/Harold: OpenCTDs are awesome, but unfortunately conductivity won't indicate sewage against a background of seawater. I think dissolved oxygen (DO) is an interesting avenue to explore, but I'm not aware of much published research on correlations there. I've heard of folks using a variety of fluorescence-based sensors (e.g., CDOM to indicate the organic matter, tryptophan-like fluorescence, and possibly optical brightener fluorescence). I think it's all still a very active field of research, so I can't wait to hear what you learn!

BTW, I thought of SMRU's tags when I just saw this other thread to which @htarold recently replied over here: 

If you can share how SMRU implements your tags' saltwater switch(es) in that thread, @jamie_mac, that would be a huge help!

See full post
discussion

Time drift in old Bushnell

Hi everyone! Have you ever experienced time drift in the old Bushnell NatureView Cam HD? If so, is there a way to fix it? The camera traps are still working pretty well for...

1 0

Depending on how much drift there is, it may be a fixed offset caused by the timer not restarting until you have finished putting in all the settings. You set the time, then do all the other settings for a couple of minutes, then exit settings, and the timer starts from the time you set; in other words, two minutes slow. The apparent drift will be small and fairly consistent, and will not increase with time (it is a bias). The solution is to leave the time setting until last and exit setup immediately after you enter the time.

If it is genuine drift then you can correct for it to an extent by noting the time on the camera and on an accurate timepiece when you retrieve the images. If you want to get fancy, you can take an image of a GPS screen with the time on it and compare it to the time stamp on the image.
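
For anyone wanting to apply that correction across a whole deployment, it is a simple linear interpolation between the offsets noted at deployment and retrieval. A rough sketch, my own illustration rather than an established workflow:

    from datetime import datetime, timedelta

    def correct_timestamp(image_time, deploy_cam, deploy_true, retrieve_cam, retrieve_true):
        # All arguments are datetimes. deploy_*/retrieve_* are the camera clock and a
        # reference clock (e.g. GPS) read at deployment and retrieval.
        start_offset = (deploy_true - deploy_cam).total_seconds()
        end_offset = (retrieve_true - retrieve_cam).total_seconds()
        frac = (image_time - deploy_cam).total_seconds() / (retrieve_cam - deploy_cam).total_seconds()
        offset = start_offset + frac * (end_offset - start_offset)
        return image_time + timedelta(seconds=offset)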

See full post
discussion

Remote Sensing & GIS Group Leadership

Hi Folks, I'm interested in being the group leader for the Remote Sensing & GIS group, but it would be great to have a co-lead. Anyone care to join me? Thanks! Kind...

4 1

Hi Cathy! Please check out the below article on group managers. Essentially, we're asking for a 12 month commitment of 1 hour a week (more or less) to promote engagement in WILDLABS groups by sparking conversations, engaging with people's content, and planning 1 virtual event a month to bring the community together. (This could be bringing a speaker in, or just having a 45 minute coffee call for people in the sector to get to know each other and share their work.)

As a group manager, you have the full support of the WILDLABS community team. We have a dedicated Slack channel, monthly calls to support you in your group management, and so much more.

Being a group manager is a great way to give back to the conservation tech community, steer conversations in the direction you think is important, foster a vibrant space for collaboration, and build your network. Shoot me an email if you're interested ([email protected])! I'm happy to hop on a call to discuss more in-depth. 

Hi Cathy, please check out what Alex sent and DM me if you're still interested in co-leading the group. Thanks.

Vance

Unfortunately I may fail to deliver, so I cannot take it up. I am off the net at times for two weeks straight.

But I am excited about everything GIS, so I follow keenly.

Thank you for the explanation. 🤍

See full post
discussion

Research in community-based conservation programs: best practices & challenges

A practice-based research project: I am carrying out a qualitative research project to better understand the challenges and best practices when designing and...

3 0

Hey there! Here are some people you may want to reach out to:

  • @EstherGithinji 
  • Kate Tointon from Fauna and Flora International's Conservation Leadership Programme
  • @Abigail 

I'll keep thinking about other possible contacts!

Hi Yanna,

Your project seems not particularly technology-oriented, which is okay, of course. However, if you're not already calling or searching there, I would suggest checking out 

cheers

Frank

See full post
event

Bioacoustics and AI 101

Recent developments in AI have not only led to a dramatic increase in the accuracy of detecting/classifying sounds, but have simultaneously made these tools accessible to people with little to no prior knowledge of AI. This...

0
See full post