Sending group messages

I am working on a modification to the messaging plugin to allow a PM to be sent to all the users (1000+) on my site. I have used the "Messages improved - Groups & Collections of friends support" plugin as a basis and have it working.

The issue I have is that when I look at the db, I now have the same message saved in the objects_entity table for every user, which seems a waste of space.

Is it possible to modify the way an Elgg object is created so that the message is saved only once, and the db entries for all the users who have received it refer to that single row in objects_entity?
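
Something along these lines is what I have in mind (an untested sketch: "message_to" is a made-up relationship name, add_entity_relationship() is Elgg's relationship API, and $subject, $body and $recipient_guids are assumed to be set already):

    <?php
    // Save the message once - a single row in objects_entity.
    $message = new ElggObject();
    $message->subtype = 'messages';
    $message->title = $subject;
    $message->description = $body;
    $message->save();

    // Point each recipient at that one object via a lightweight row in
    // entity_relationships instead of a full per-user copy of the object.
    foreach ($recipient_guids as $guid) {
        add_entity_relationship($guid, 'message_to', $message->getGUID());
    }

Each user's inbox would then be built by querying that relationship rather than per-user copies.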

Thanks in advance!

 

  • Hmm.. very interesting.. I will have to go through your notes in more detail!
    What we did was not to "grab all the users" -
    instead we did a one-time (and then incremental) export of users' emails from the users_entity table
    into files named userlist_123_999.data,
    where 123 is the GUID of the first user and 999 the GUID of the last user in the file.
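
    The export itself can be something like this (a rough sketch, not our actual code: the chunk size of 100, the file location, and the helper name are assumptions, and it presumes an open MySQL connection). This is the one-time pass; the incremental top-up would start from the last exported GUID:

    <?php
    // Dump user emails from users_entity into chunked userlist files,
    // each named after the first and last GUID it contains.
    $dir = '/www/httpdocs/mod/newsletter/';            // assumed location

    $chunk = array();
    $result = mysql_query("SELECT guid, email FROM users_entity ORDER BY guid");
    while ($row = mysql_fetch_assoc($result)) {
        $chunk[$row['guid']] = $row['email'];
        if (count($chunk) == 100) {
            write_chunk($dir, $chunk);                 // made-up helper, below
            $chunk = array();
        }
    }
    if ($chunk) {
        write_chunk($dir, $chunk);                     // remainder
    }

    function write_chunk($dir, $chunk) {
        $guids = array_keys($chunk);
        $file = sprintf('%suserlist_%d_%d.data', $dir, $guids[0], end($guids));
        file_put_contents($file, implode("\n", $chunk) . "\n");
    }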

    next -->
            listdir userlist_*.data
            if (not found)
                all emails have been sent
                rename newsletter_123.mail --> newsletter_123.done
            else
                grab the 1st userlist_123_999.data
                process it with newsletter_123.mail
                rename userlist_123_999.data --> userlist_123_999.done

    while a userlist_123_999.data file is found -
    it gets processed by the CRON'ed newslettersend.php (sketched in PHP below, after the cron script)

    The CRON is simple -->
    CRON x 5mins, PHP CLI := newslettersend.php
    #!/bin/bash
    ...
    # log a timestamped start marker, run the PHP worker, log an end marker
    echo ".START::NEWSLETTERSEND:: "$DATE
    date
    php /www/httpdocs/mod/newslettersend.php > $LOGFILE
    echo ".END::NEWSLETTERSEND.SH"

    • I guess one could call this a quasi-queue ;)
    • Looks like the CRON @ 5 mins does the same for us as the sleep(1) does for you.
  • Thanks. That looks much much simpler.

    The "grab all users" approach is a disaster, PHP runs out of memory in a lot of other places where we have used it, replacing it with something that cycles through the list, 100 at a time (which seems to be the sweet spot on the current configuration).

  • Thanks all - had thought that CRON might be an option. Will see how I get on : )