# Signal Group User Activity

"How active is each user in the signal groups I'm in?"

This project generates a spreadsheet that looks like this:

|                   | Alice | Bob | ... |
| ----------------- | ----- | --- | --- |
| Neighborhood Chat | 1     | 5   | ... |
| Roommate Chat     | 12    | 0   | ... |
| ...               | ...   | ... | ... |

Each cell is the total message count for a user in a chat.

The goal is to enable distributed collection of group participation metadata from a community that uses many disparate Signal groups. No single participant is a member of every group, so we need to combine Signal data from multiple users.

I personally would not feel at all comfortable sending someone my decrypted Signal database, so I implemented this as a Dockerized Python command-line tool that runs locally. It generates a simple CSV file that is easy to inspect before sharing.
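To make the approach concrete, here is a minimal sketch of the kind of counting the tool performs. It assumes Signal Desktop's usual on-disk layout (an SQLCipher-encrypted `sql/db.sqlite` with a hex key in `config.json`), the `sqlcipher3` Python bindings, and a guessed `messages`/`conversations` schema; the real implementation lives in `code/` and may differ.

```python
"""Illustrative sketch only: paths, key handling, and the messages/conversations
schema below are assumptions about Signal Desktop's database, not the project's
actual code."""
import csv
import json
import sys
from collections import Counter
from pathlib import Path

from sqlcipher3 import dbapi2 as sqlcipher  # assumed dependency


def count_messages(signal_config_dir: str, out_csv: str) -> None:
    config_dir = Path(signal_config_dir)

    # Many installs keep the raw SQLCipher key hex-encoded in config.json;
    # newer Signal Desktop versions may wrap it with the OS keychain instead.
    key = json.loads((config_dir / "config.json").read_text())["key"]

    db = sqlcipher.connect(str(config_dir / "sql" / "db.sqlite"))
    db.execute(f"PRAGMA key = \"x'{key}'\";")  # raw-key PRAGMA syntax

    # Count incoming messages per (group, sender). Table and column names
    # here are a guess at the Signal Desktop schema.
    counts = Counter()
    query = """
        SELECT c.name, m.source
        FROM messages m
        JOIN conversations c ON c.id = m.conversationId
        WHERE c.type = 'group' AND m.type = 'incoming'
    """
    for group_name, sender in db.execute(query):
        counts[(group_name or "unknown group", sender or "unknown sender")] += 1

    # Pivot into one row per group, one column per sender.
    senders = sorted({s for _, s in counts})
    groups = sorted({g for g, _ in counts})
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow([""] + senders)
        for g in groups:
            writer.writerow([g] + [counts.get((g, s), 0) for s in senders])


if __name__ == "__main__":
    count_messages(sys.argv[1], sys.argv[2])
```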

## Usage

The following command builds the necessary Docker image, runs it against the Signal config directory you point it to, and writes the output to ./output/table.csv:

```
$ ./run.sh -c ~/.config/Signal

Sending build context to Docker daemon  380.4kB
Step 1/12 : FROM archlinux:base-20241110.0.278197
 ---> 8f94599caa7b
...
Successfully tagged sigint:latest
INFO:main:writing message count table to /output/table.csv
```
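The resulting table.csv is a plain pivot of the counts. An illustrative excerpt (the values and exact quoting here are made up) might look like:

```
,Alice,Bob
Neighborhood Chat,1,5
Roommate Chat,12,0
```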

Adding -f filters the groups under consideration to only those listed in config/chat_list.txt:

```
$ ./run.sh -c ~/.config/Signal -f
```

You might want to run the command once without filtering, paste the chat name column from the spreadsheet into config/chat_list.txt, narrow it down to only the groups you care about, and then run the command again with -f.
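For reference, config/chat_list.txt is assumed here to hold one chat name per line, matching the names in the first column of the spreadsheet (the exact format expected is defined by the code in code/):

```
Neighborhood Chat
Roommate Chat
```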