
On 18 June 2012 16:34, Odhiambo Washington <odhiambo@gmail.com> wrote:
On Mon, Jun 18, 2012 at 4:01 PM, Thuo Wilson <lixton@gmail.com> wrote:
On 18 June 2012 10:52, gisho <gichuhie@gmail.com> wrote:
- as @wash said above, only move the specific file, not everything (which is what "*" denotes), so this should look like: mv /tmp/$f /tmp2
- instead of reading the mail spool, you can pipe incoming mails to a script, either by using procmailrc or via the /etc/aliases file if you are using postfix. It could look like this:
smsemail: "|/usr/libexec/postfix/trap.py"
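The trap.py in that alias is presumably a Python script; just to illustrate the same idea as a bash sketch (parse_sms is a made-up name): the MTA writes the raw message to the script's stdin, so the Subject line can be pulled out directly instead of polling the spool file.

```shell
# Hypothetical pipe target for the alias above: read the raw mail on stdin
# and split the Subject ("$CELLPHONE $MESSAGE" per the thread) into its parts.
parse_sms() {
  local subject number text
  subject=$(sed -n 's/^Subject: //p' | head -n 1)   # first Subject: header
  number=${subject%% *}    # first word of the subject: destination number
  text=${subject#* }       # remainder: the SMS body
  printf '%s\t%s\n' "$number" "$text"
  # the real script would instead run something like:
  #   /usr/bin/gammu --sendsms TEXT "$number" <<< "$text"
}
```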
Thanks Gisho, that is *part* of what I needed. I will use the alias. But note that if I am doing bulk SMS, this may (I am not certain) fail when the number of emails processed falls behind the inflow... Second: I decided to change the script to read as follows; simpler, but doing better [?]
Imagine I don't have the script called from the alias. How do I read one line at a time from the file "email_to_sms" and exec the command to send the SMS? That is my main issue.
I am sure it's something simple; I haven't scratched my head enough!
#!/bin/bash
grep Subject /var/spool/mail/sms > email_to_sms
cat /dev/null > /var/spool/mail/sms
killall -9 gammu-smsd   # since for some reason gammu-smsd is using use_lock and won't disable
echo `cat email_to_sms` | /usr/bin/gammu --sendsms TEXT `awk '{print $2}' email_to_sms`
sleep 15                        # wait for the above to finish
/etc/init.d/gammu-smsd start    # start again to continue checking any incoming sms
cat email_to_sms >> email_to_sms2   # keep a backup copy, just in case
cat /dev/null > email_to_sms        # empty it for the next message :)
for sure slowly we are getting there...
So the file email_to_sms is a text file containing some data you need to work on, one line at a time:
for data in `cat email_to_sms`; do echo $data | /usr/bin/gammu $ARGS .....; done
Isn't this doing the same thing as my script? :)
That will surely iterate through all the data in the file... You can keep looping until there is nothing left, or always check whether a file exists before you act on it.
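To answer the "one line at a time" question directly, the usual idiom is a while/read loop; note that, unlike `for data in \`cat file\``, it does not split a line at every space. A minimal sketch using the thread's file name and format, with the gammu call left as a comment:

```shell
# sample data in the format shown later in the thread
printf 'A +254XXXXX F\nA +255YYYYYY G\n' > email_to_sms

# read the file line by line; $2 is the phone number column
while IFS= read -r line; do
  number=$(printf '%s\n' "$line" | awk '{print $2}')
  printf 'to %s: %s\n' "$number" "$line"
  # real call would be:
  #   printf '%s\n' "$line" | /usr/bin/gammu --sendsms TEXT "$number"
done < email_to_sms
```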
However, at this point I suggest you work on a file which has been moved to another directory: move the file first and reference it from its new location. Since the filenames seem to clash at some point (or why else would you even want to move the files?), my initial suggestion stands - append the current timestamp to the filename to make it unique.
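A sketch of that suggestion (drain_queue and process_one are invented names): handle each queued file, then move it aside with a timestamp suffix so tomorrow's identically-named file cannot clash.

```shell
process_one() { cat "$1" > /dev/null; }   # stand-in for the real mail/SMS action

# one pass over a queue directory: act on each file, then archive it
drain_queue() {
  local src=$1 dst=$2 f
  for f in "$src"/*; do
    [ -e "$f" ] || continue               # glob matched nothing: queue is empty
    process_one "$f"
    mv "$f" "$dst/$(basename "$f").$(date +%s)"   # timestamp avoids name clashes
  done
}
```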
Please try to break down what this command line does:
echo `cat email_to_sms` | /usr/bin/gammu --sendsms TEXT `awk '{print $2}' email_to_sms`
Assuming the file content looks like:

A +254XXXXX F
A +255YYYYYY G
A +256ZZZZZ K

echo `cat email_to_sms` - the first part just gets the file content;
/usr/bin/gammu --sendsms TEXT `awk '{print $2}' email_to_sms` - sends an SMS to the number in the second column (2) of the file email_to_sms.
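One detail worth adding to that breakdown (a small demo with the same made-up numbers): the backticks word-split the file, so echo flattens all three lines into one, and awk prints one number per input line - which is why the one-liner only behaves when the file holds a single message.

```shell
printf 'A +254XXXXX F\nA +255YYYYYY G\nA +256ZZZZZ K\n' > email_to_sms

flat=$(echo `cat email_to_sms`)        # word-splitting joins all lines into one
nums=$(awk '{print $2}' email_to_sms)  # one number per input line: three here
printf '%s\n' "$flat"
printf '%s\n' "$nums"
```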
Then again, if you need a program to re-exec, there is no need to give it a SIGKILL (-9); just send a SIGHUP (-1). So you can re-exec gammu after it completes processing the first file, while making it loop to look for the existence of another file in the parent directory... and the loop continues...
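A tiny demo of the point (nothing gammu-specific; the child process here is just a stand-in): a process can trap SIGHUP and shut down or re-exec cleanly, which SIGKILL never allows.

```shell
# child installs a HUP handler that performs a clean exit
bash -c 'trap "echo cleanup done; exit 0" HUP; while true; do sleep 1; done' &
pid=$!
sleep 1            # give the child time to install its trap
kill -HUP "$pid"   # politely ask it to stop (kill -9 would skip the trap)
wait "$pid"
status=$?          # 0 means the trap ran and the child exited cleanly
```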
Not a bad thought!
Wilson.
smsemail being the account receiving the mails. If your script is elsewhere, you will need to tell SELinux about that, otherwise it will fail (that's if you have SELinux enabled).
--
-erastus
+254733725373
Nairobi, Kenya
On Mon, Jun 18, 2012 at 10:24 AM, gisho <gichuhie@gmail.com> wrote:
This is just a pointer: whichever way you do it, keeping files in the same directory means the time consumed by the filter will increase as the files accumulate, making the process slower and slower with every new file.
--
On Mon, Jun 18, 2012 at 9:41 AM, Thuo Wilson <lixton@gmail.com> wrote:
On 16 June 2012 23:38, Odhiambo Washington <odhiambo@gmail.com> wrote:
On Sat, Jun 16, 2012 at 1:38 PM, Thuo Wilson <lixton@gmail.com>wrote:
> On 16 June 2012 13:28, Odhiambo Washington <odhiambo@gmail.com> wrote:
>> On Sat, Jun 16, 2012 at 12:57 PM, Thuo Wilson <lixton@gmail.com> wrote:
>>>
>>> I have a bash scripting question that has really bothered me:
>>>
>>> First, I want to list files in a directory with a command - no problem.
>>> Second, I want to get the name of the file - no problem.
>>> Third, I want to read the content of the file - no problem.
>>> Fourth, I want to extract some information, columns and rows, from the file - no problem.
>>>
>>> My idea:
>>> - Receive an SMS and forward it to an email - working, but only one at a time! #script 1
>>> - Whenever a reply is sent from the email address "x" with subject "$CELLPHONE $MESSAGE", the script is called by cron to send the SMS to $CELLPHONE - working, but only one at a time! #script 2
>>>
>>> *Problem:*
>>> The directory contains multiple files with dynamic names (generated by the system), some properties being date, time and cellphone number.
>>> The problem is I don't know how to read multiple file names, read their content into a file for processing in one go, and output the results into another file using bash.
>>> No. 2 - I want the *filename* of the other files in the same directory, and each file to be processed one at a time.
>>>
>>> Take this example:
>>>
>>> cd /tmp
>>>
>>> -rw-r--r-- 1 root root 0 Jun 16 12:42 a.2012.1242pm
>>> -rw-r--r-- 1 root root 0 Jun 16 12:42 b.2012.1243pm
>>> -rw-r--r-- 1 root root 0 Jun 16 12:42 c.2012.1244pm
>>> -rw-r--r-- 1 root root 0 Jun 16 12:42 d.2012.1245pm
>>> -rw-r--r-- 1 root root 0 Jun 16 12:42 e.2012.1246pm
>>>
>>> Bear in mind the files are generated randomly and I have no control over that :) - so to speak, for question purposes.
>>>
>>> I have the below script to read one file at a time and move what is read to another folder - not good practice. I would want to retain the file in the same folder, but my bash script should know that the file has been read!
>>>
>>> How do I perfect this? This reads one file at a time and sends an email, but if multiple files arrive at one go, all the rest, regardless, are moved to the tmp2 folder.
>>>
>>> #!/bin/bash
>>> cd /tmp
>>> for f in "`ls`"; do
>>>     cat $f | mail -s "$f" lixton (ati) gmail.com
>>>     mv /tmp/* /tmp2
>>
>> ^^^^^^^^^^^^^^^^^^^^^^^^ Replace this line:
>>
>>     mv /tmp/$f /tmp2/
>>
>> You could perfect that by appending a timestamp to /tmp2/$f so that there is no name clash the next day :-)
>
> Hi Wash,
>
> That could only deal with one issue: any files that come in later than when the directory's files are listed.
>
> #!/bin/bash
> cd /tmp
> for f in "`ls`"; do
>     cat $f | mail -s "$f" lixton (ati) gmail.com
>     mv /tmp/$f* /tmp2
> done
Why do you use the asterisk in there - $f* ?
> Looking at the example above... if multiple files are listed, only one file (the first one) is read, and for the rest only the content is read using "cat".
In all those files, how many matches do you get when parsing for the content you want?
For instance:
find . -type f -exec grep -li 'SOME_STRING' {} \;
If only 1 file matches, then you can use substitution to get the file, perhaps:
for f in ` find . -type f -exec grep -li 'SOME_STRING' {} \;`; do cat $f ....
This will output content of the file :(
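To do more than just cat the matches, a sketch (with a throwaway test directory and made-up file names) that feeds the matching file names into a per-file loop:

```shell
# set up two sample files, only one containing the marker string
d=$(mktemp -d)
printf 'hello SOME_STRING\n' > "$d/match.txt"
printf 'nothing here\n'      > "$d/other.txt"

# -l prints only the matching file names; the loop then acts on each one
find "$d" -type f -exec grep -li 'SOME_STRING' {} \; |
while IFS= read -r f; do
  printf 'matched: %s\n' "$f"   # act on "$f" here (mail it, move it, ...)
done
```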
> Wilson
>
>>> done
>>>
>>> *Now for reading an email reply file from above:*
>>> Well, this will send, BUT only one at a time, and if multiple emails are received at one go, all the others are deleted!
>>> How do I perfect this...
>>>
>>> #!/bin/bash
>>> for i in 30   # (corresponds to line 30 of a linux user's mailbox, where the subject is)
>>> do
>>>     awk -v a=$i 'NR==a {print $0}' /var/spool/mail/root > testfile2
>>>     cat /dev/null > /var/spool/mail/root
>>>     echo `cat /root/testfile2` | /usr/bin/gammu --sendsms TEXT `awk '{print $2}' testfile2`
>>
>> Use the same idea as above. Appending a timestamp will make the files unique.
>
> Well, if I want the SMS sent, I must read the content. A timestamp may not help unless I can read all the files and parse the output to different files, or even one, for SMS delivery :)

I am not sure I understand you, and as such I may come out as more misleading :)
What MTA do you use BTW? Postfix?
I can use postfix/sendmail.
I think I have somewhere a script that can help you in this endeavor, but I only used it with Exim.
May I look into the script? Maybe it has what I want...
--
Best regards,
Odhiambo WASHINGTON,
Nairobi, KE
+254733744121 / +254722743223
"I can't hear you -- I'm using the scrambler."
_______________________________________________
Skunkworks mailing list
Skunkworks@lists.my.co.ke
List info, subscribe/unsubscribe: http://lists.my.co.ke/cgi-bin/mailman/listinfo/skunkworks
Skunkworks Rules: http://my.co.ke/phpbb/viewtopic.php?f=24&t=94
Other services @ http://my.co.ke