Batch processing
on 18.06.2005 17:33:38 by wcb
I have a small perl routine.
perl -p -e 's/\n/<br>\n/'
What this does is take a file and put a <br>
at the end of every line in the file, which it does well.
perl -p -e 's/\n/<br>\n/' myfile > myfile.br
does this to myfile and gives me a myfile.br.
But if I have a bunch of files, * does not work.
I can't do perl -p -e 's/\n/<br>\n/' * > *.br.
It does one file and stops.
If I make a directory D and do cp * D/ it copies
all files to D. But in Perl, apparently * does
not go through the directory as expected.
Is there some way to do this in a one-line script
as above? I note that likewise sed doesn't do batch
processing as expected either.
I dug through Programming Perl and Learning Perl,
but neither seems to deal with batch programming like this.
I dug through a lot of sed examples online, but nobody
mentions batch processing much.
What's the best general approach to a down and dirty
batch command to go through a directory with Perl?
Is there a canonical method for doing this in Perl?
--
When I shake my killfile, I can hear them buzzing!
Cheerful Charlie
Re: Batch processing
on 18.06.2005 19:04:13 by Jim Gibson
In article <11b8f9hp94hov02@corp.supernews.com>, wcb wrote:
> I have a small perl routine.
>
> perl -p -e 's/\n/<br>\n/'
>
> What this does is take a file and put a <br>
> at the end of every line in the file, which it does well.
>
> perl -p -e 's/\n/<br>\n/' myfile > myfile.br
> does this to myfile and gives me a myfile.br.
> But if I have a bunch of files, * does not work.
> I can't do perl -p -e 's/\n/<br>\n/' * > *.br.
> It does one file and stops.
Oh, really? On my system, this line complains:
tcsh: *.br: No match.
if I have no *.br files in my directory, or
tcsh: xxx.br: File exists.
If I have one *.br file (because I have 'noclobber' set), or
tcsh: *.br: Ambiguous.
If I have more than one.
Note that these are errors from my shell, not Perl. I can get it to
work if I redirect to just one file, and perl will process all of the
files, but the results all end up in the one .br file.
>
> If I make a directory D and do cp * D/ it moves
> all files to D. But in perl, apparently * does
> not go through the directory as expected.
Yes, it does, but maybe not as you expect. The -p option tells Perl to
iterate over the files given as command-line arguments. The files are
opened, read, each line is printed after your statements have been
executed, and the file is closed.
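For reference, perlrun documents -p as shorthand for wrapping the supplied code in a read-and-print loop; roughly this, with the substitution from the original post filled in:

  # What `perl -p -e 's/\n/<br>\n/' file1 file2 ...` effectively runs:
  while (<>) {        # <> reads each line of every file named on the command line
      s/\n/<br>\n/;   # the code given with -e, with the current line in $_
  }
  continue {
      print;          # -p prints $_ (possibly modified) after each pass
  }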
>
> Is there someway to do this on a one line script
> as above? I note likewise sed doesn't do batch
> processing as expected either.
Use the -i option to edit the files in place while at the same time
creating a backup of each file. For example:
perl -pi'.bak' -e 's/$/<br>/' *
will do what you want, except that it is the original files that are
changed, and the originals end up as *.bak.
>
> I dug through Programming Perl, and Learning Perl,
> but neither seem to deal with batch programming like this.
> I dug through a lot of sed examples on line but nobody
> mentions batch processing much.
See Chapter 19 of Programming Perl, 3rd edition, or 'perldoc perlrun'
for a description of the -i option.
>
> What's the best general approach to a down and dirty
> batch command to go through a directory with Perl?
> Is there a canonical method for doing this in Perl?
There are very few 'canonical methods' in Perl. The usual saying is
"there's more than one way to do it (TMTOWTDI)", which is the
antithesis of canonicalism I should say.
Re: Batch processing
on 18.06.2005 19:27:21 by Justin C
On 2005-06-18, wcb wrote:
> I have a small perl routine.
>
> perl -p -e 's/\n/<br>\n/'
>
> What this does is take a file and put a <br>
> at the end of every line in the file, which it does well.
>
> perl -p -e 's/\n/<br>\n/' myfile > myfile.br
> does this to myfile and gives me a myfile.br.
> But if I have a bunch of files, * does not work.
> I can't do perl -p -e 's/\n/<br>\n/' * > *.br.
> It does one file and stops.
I don't know what OS or shell you're using but if it's bash or ksh or sh
the following should work:
for file in `ls` ;
do perl -p -e 's/\n/<br>\n/' $file > $file.br ;
done
Those are back-ticks, by the way, not apostrophes.
Justin.
--
Justin C by the sea.
Re: Batch processing
on 18.06.2005 19:57:49 by wcb
Justin C wrote:
> On 2005-06-18, wcb wrote:
>> I have a small perl routine.
>>
>> perl -p -e 's/\n/<br>\n/'
>>
>> What this does is take a file and put a <br>
>> at the end of every line in the file, which it does well.
>>
>> perl -p -e 's/\n/<br>\n/' myfile > myfile.br
>> does this to myfile and gives me a myfile.br.
>> But if I have a bunch of files, * does not work.
>> I can't do perl -p -e 's/\n/<br>\n/' * > *.br.
>> It does one file and stops.
>
> I don't know what OS or shell you're using but if it's bash or ksh or sh
> the following should work:
>
> for file in `ls` ;
> do perl -p -e 's/\n/<br>\n/' $file > $file.br ;
> done
>
>
> Those are back-ticks, by the way, not apostrophes.
I'm using Bash on Linux. What does the `ls` mean?
What I need is something that does a named directory and
its subdirectories. I have been trying to google for perl batch
processing, and looking at my Perl books, but basic, simple,
everyday batch processing like this seems to be left out of
all the whizbang howtos and books.
Amazing: for simple everyday batch processing it seems to be
almost impossible to find anything quickly except a
few very complex examples.
--
When I shake my killfile, I can hear them buzzing!
Cheerful Charlie
Re: Batch processing
on 18.06.2005 20:09:20 by wcb
Jim Gibson wrote:
> In article <11b8f9hp94hov02@corp.supernews.com>, wcb wrote:
>
>> I have a small perl routine.
>>
>> perl -p -e 's/\n/<br>\n/'
>>
>> What this does is take a file and put a <br>
>> at the end of every line in the file, which it does well.
>>
>> perl -p -e 's/\n/<br>\n/' myfile > myfile.br
>> does this to myfile and gives me a myfile.br.
>> But if I have a bunch of files, * does not work.
>> I can't do perl -p -e 's/\n/<br>\n/' * > *.br.
>> It does one file and stops.
>
> Oh, really? On my system, this line complains:
>
> tcsh: *.br: No match.
>
> if I have no *.br files in my directory, or
>
> tcsh: xxx.br: File exists.
>
> If I have one *.br file (because I have 'noclobber' set), or
>
> tcsh: *.br: Ambiguous.
>
> If I have more than one.
I get the "ambiguous" error a lot too. And if I try to pipe it to a
directory, it does, but it tells me it's a directory.
Most frustrating.
> Note that these are errors from my shell, not Perl. I can get it to
> work if I redirect to just one file, and perl will process all of the
> files, but the results all end up in the one .br file.
>
>>
>> If I make a directory D and do cp * D/ it moves
>> all files to D. But in perl, apparently * does
>> not go through the directory as expected.
>
> Yes, it does, but maybe not as you expect. The -p option tells Perl to
> iterate over the files given as command-line arguments. The files are
> opened, read, each line is printed after your statements have been
> executed, and the file is closed.
>
>>
>> Is there someway to do this on a one line script
>> as above? I note likewise sed doesn't do batch
>> processing as expected either.
>
> Use the -i option to edit the files in place while at the same time
> creating a backup of each file. For example:
>
> perl -pi'.bak' -e 's/$/<br>/' *
>
> will do what you want, except that it is the original files that are
> changed, and the originals end up as *.bak.
>
>>
>> I dug through Programming Perl, and Learning Perl,
>> but neither seem to deal with batch programming like this.
>> I dug through a lot of sed examples on line but nobody
>> mentions batch processing much.
>
> See Chapter 19 of Programming Perl, 3rd edition, or 'perldoc perlrun'
> for a description of the -i option.
>
OK, thanks. I knew there had to be a way but the bits and
pieces are scattered through these books. Actually,
Programming Perl hit the wall rather hard about half
an hour ago.
>> What's the best general approach to a down and dirty
>> batch command to go through a directory with Perl?
>> Is there a canonical method for doing this in Perl?
>
> There are very few 'canonical methods' in Perl. The usual saying is
> "there's more than one way to do it (TMTOWTDI)", which is the
> antithesis of canonicalism I should say.
I just wish that somewhere in the Perl books somebody would have
set down basic, simple, batch-style programming for simple
things like this.
This is a beginner's utter nightmare.
Thanks.
--
When I shake my killfile, I can hear them buzzing!
Cheerful Charlie
Re: Batch processing
on 18.06.2005 20:50:48 by Jim Gibson
In article <11b8nnrilk6h0c6@corp.supernews.com>, wcb wrote:
> Justin C wrote:
>
> > On 2005-06-18, wcb wrote:
> >> I have a small perl routine.
[problem editing multiple files]
> > for file in `ls` ;
> > do perl -p -e 's/\n/<br>\n/' $file > $file.br ;
> > done
> >
> >
> > Those are back-ticks, by the way, not apostrophes.
>
> I'm using Bash on Linux. What does the `ls` mean?
> What I need is something that does a named directory and
> its subdirectories. I have been trying to google for perl batch
> processing, and looking at my Perl books, but basic, simple,
> everyday batch processing like this seems to be left out of
> all the whizbang howtos and books.
>
> Amazing, simple everyday batch processing seems to be
> almost impossible to find anything on quickly except a
> few very complex examples.
Perhaps you could explain what you mean by 'batch processing'. If you
are talking about processing multiple files, then Perl offers many ways
of accomplishing that. Look up the functions opendir and readdir or the
function glob. Check out the module File::Find, part of the standard
Perl distribution. All of these give you ways of finding and processing
multiple files, possibly in multiple directories. Of course, you will
probably want to write multi-line scripts for this purpose, rather than
one-liners.
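To make that concrete, here is a minimal File::Find sketch for the job in this thread: walk a tree and write a .br copy of every plain file with <br> appended to each line. The starting directory and the .br naming are just assumptions carried over from the earlier posts.

  #!/usr/bin/perl
  # Sketch: recurse from a starting directory and write a ".br" copy
  # of every plain file, with <br> appended to each line.
  use strict;
  use warnings;
  use File::Find;

  my $top = shift || '.';                # directory to start in

  find(sub {
      return unless -f $_;               # skip directories, symlinks, etc.
      return if /\.br\z/;                # don't reprocess copies we made
      open my $in,  '<', $_      or do { warn "Cannot read $_: $!";     return };
      open my $out, '>', "$_.br" or do { warn "Cannot write $_.br: $!"; return };
      while (my $line = <$in>) {
          $line =~ s/\n/<br>\n/;
          print {$out} $line;
      }
      close $in;
      close $out;
  }, $top);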
A good source for typical Perl processing is the Perl Cookbook,
O'Reilly. For example, Recipe 9.5 "Processing All Files in a Directory"
is relevant to your problem.
Re: Batch processing
on 18.06.2005 21:23:26 by wcb
Jim Gibson wrote:
> In article <11b8nnrilk6h0c6@corp.supernews.com>, wcb wrote:
>
>> Justin C wrote:
>>
>> > On 2005-06-18, wcb wrote:
>> >> I have a small perl routine.
>
> [problem editing multiple files]
>
>> > for file in `ls` ;
>> > do perl -p -e 's/\n/<br>\n/' $file > $file.br ;
>> > done
>> >
>> >
>> > Those are back-ticks, by the way, not apostrophes.
>>
>> I'm using Bash on Linux. What does the `ls` mean?
>> What I need is something that does a named directory and
>> its subdirectories. I have been trying to google for perl batch
>> processing, and looking at my Perl books, but basic, simple,
>> everyday batch processing like this seems to be left out of
>> all the whizbang howtos and books.
>>
>> Amazing, simple everyday batch processing seems to be
>> almost impossible to find anything on quickly except a
>> few very complex examples.
>
> Perhaps you could explain what you mean by 'batch processing'. If you
> are talking about processing multiple files, then Perl offers many ways
> of accomplishing that. Look up the functions opendir and readdir or the
> function glob. Check out the module File:Find, part of the standard
> Perl distribution. All of these give you ways of finding and processing
> multiple files, possibly in multiple directories. Of course, you will
> probably want to write multi-line scripts for this purpose, rather than
> one-liners.
Thanks. I am used to DOS batch files and general Bash stuff.
So this is a bit of alien territory. It's one of those things
where if you know a little bit about it, you can look it up, but
I didn't know enough about it to do that.
I have the books; it's just that nowhere do they really mention doing
batch processes a la DOS. That is, taking a named directory and
processing all files, and files in subdirectories.
You are right, it looks like I will be needing to learn a
little scripting. I just managed to figure out a tricky
deletion problem with a regex, so it's not totally hopeless.
I am beginning to see a little bit about how it wants to work.
What I lacked was simple examples; once one has a few, it's
easier to work backwards and then forwards again.
The lack of simple roadmaps for general batch processing
in these books is very frustrating when you know what you
need to do but the books seem to have never heard of that
concept. It's kind of there, but in bits and pieces scattered
around.
>
> A good source for typical Perl processing is the Perl Cookbook,
> O'Reilly. For example, Recipe 9.5 "Processing All Files in a Directory"
> is relevant to your problem.
>
--
When I shake my killfile, I can hear them buzzing!
Cheerful Charlie
Re: Batch processing
on 19.06.2005 00:19:47 by Jim Gibson
In article <11b8soc68pu74b8@corp.supernews.com>, wcb wrote:
> Jim Gibson wrote:
>
> > In article <11b8nnrilk6h0c6@corp.supernews.com>, wcb wrote:
> >
> >> Justin C wrote:
> >>
> >> > On 2005-06-18, wcb wrote:
> >> >> I have a small perl routine.
> >
> > [problem editing multiple files]
> >
> >> > for file in `ls` ;
> >> > do perl -p -e 's/\n/<br>\n/' $file > $file.br ;
> >> > done
> >> >
> >> >
> >> > Those are back-ticks, by the way, not apostrophes.
> >>
> >> I'm using Bash on Linux. What does the `ls` mean?
> >> What I need is something that does a named directory and
> >> its subdirectories. I have been trying to google for perl batch
> >> processing, and looking at my Perl books, but basic, simple,
> >> everyday batch processing like this seems to be left out of
> >> all the whizbang howtos and books.
> >>
> >> Amazing, simple everyday batch processing seems to be
> >> almost impossible to find anything on quickly except a
> >> few very complex examples.
> >
> > Perhaps you could explain what you mean by 'batch processing'. If you
> > are talking about processing multiple files, then Perl offers many ways
> > of accomplishing that. Look up the functions opendir and readdir or the
> > function glob. Check out the module File:Find, part of the standard
> > Perl distribution. All of these give you ways of finding and processing
> > multiple files, possibly in multiple directories. Of course, you will
> > probably want to write multi-line scripts for this purpose, rather than
> > one-liners.
>
> Thanks. I am used to DOS batch files and general Bash stuff.
> So this is a bit of alien territory. Its one of those things
> if you know a little bit about this, you can look it up, but
> I didn't know enough about it to do that.
> I have the books, its just nowhere does it really mention doing
> batch processes ala DOS. That is taking a named directory and
> processing all files and files in subdirectories.
>
> You are right, it looks like I will be needing to learn a
> little scripting. I just managed to figure out a tricky
> deletion problem with regex, so Its not totally hopeless.
>
> I am beginning to see a little bit about how it wants to work.
>
> What I lacked was simple examples, once one has a few, its
> easier to work backwards and then forwards again.
>
> The lack of simple roadmaps for general batch processing
> in these books is very frustrating when you know what you
> need to do but the books seem to have never heard of that
> concept. Its kind of there, but in bits and pieces scattered
> around.
OK. I think I understand now. What you are calling 'batch processing'
most of us would call 'programming' or 'scripting'. You are talking
about taking things you normally do one-at-a-time via a command line
processor (DOS) and putting them in a 'batch file'. Most people would
call this 'shell scripting' or just 'scripting'. You can do a lot by
programming in a shell language. Unix has bash and csh for scripting.
However, for more complex tasks, you want to use a programming language
like C, Visual Basic, or Perl. Perl has many of the best qualities of a
shell scripting language and a general purpose programming language,
which is why many people use it.
So search for 'scripting', 'shell scripting', or 'programming' to find
more information on automating repetitive manual tasks.
Re: Batch processing
on 19.06.2005 11:36:04 by Justin C
On 2005-06-18, wcb wrote:
>
[snip]
> I just wish in the Perl books somewhere somebody would have
> set down basic simple, batch style programing for simple
> things like this.
Have you got the Llama book? Learning Perl I think it is, it's very
good. Once you've finished it the other two ("Programming" and
"Cookbook") are a lot easier.
Justin.
--
Justin C by the sea.
Re: Batch processing
on 19.06.2005 14:54:34 by wcb
Justin C wrote:
> On 2005-06-18, wcb wrote:
>>
> [snip]
>
>> I just wish in the Perl books somewhere somebody would have
>> set down basic simple, batch style programing for simple
>> things like this.
>
> Have you got the Llama book? Learning Perl I think it is, it's very
> good. Once you've finished it the other two ("Programming" and
> "Cookbook") are a lot easier.
>
> Justin.
>
I have it. It's useless.
What I need is a cookbook with the routines
real-world people use in real-world situations
98% of the time. An unknown thing in the stack
of books I do have.
Emulating a -r switch, which Perl does not have,
much to my regret.
Start in present directory.
Walk tree.
If a file is found, do -e (command)
end.
I spent an hour looking for something like that
and the book ended up hitting the wall hard.
I just spent two %!$!$!$! hours googling for basic, atomic
Perl scripts to do simple, common things. And failed.
I have only weeks of experience with Perl and I hate its guts.
It's a time-suck waste.
I started to do a simple job and a week later I am not even close.
It's quicker to do 600 directories and sub-directories by hand.
Sed, no better; Rexx, no better.
Unix - We do it the hard way.
I am incredibly frustrated, a simple job is made hard.
Time just flies by and nothing is done. Real, useful,
real-world repositories of the dozen or so
really needed ways of doing this cannot be found
on Google among the vast masses of garbage in the way.
--
When I shake my killfile I can hear them buzzing.
Re: Batch processing
on 19.06.2005 16:20:09 by Justin C
On 2005-06-18, wcb wrote:
> Justin C wrote:
>
>> On 2005-06-18, wcb wrote:
>>> I have a small perl routine.
>>>
>>> perl -p -e 's/\n/<br>\n/'
>>>
>>> What this does is take a file and put a <br>
>>> at the end of every line in the file, which it does well.
>>>
>>> perl -p -e 's/\n/<br>\n/' myfile > myfile.br
>>> does this to myfile and gives me a myfile.br.
>>> But if I have a bunch of files, * does not work.
>>> I can't do perl -p -e 's/\n/<br>\n/' * > *.br.
>>> It does one file and stops.
>>
>> I don't know what OS or shell you're using but if it's bash or ksh or sh
>> the following should work:
>>
>> for file in `ls` ;
>> do perl -p -e 's/\n/<br>\n/' $file > $file.br ;
>> done
>>
>>
>> Those are back-ticks, by the way, not apostrophes.
>
> I'm using Bash on Linux. What does the `ls` mean?
`ls` is the "list directory contents" command. By putting it in the
back-ticks it means (the whole first line): For each file[1] in `generate
list of files` - or, perhaps more correctly, "evaluate the output of the
command in back-ticks and use it as input for the first line". I'm sure
someone could explain this better than me.
[1] in Bash when you declare a variable (in this case I called it file -
it seemed logical, it could equally have been "for eggs in `ls`; do
....$eggs > $eggs.br; ...") you don't use the leading '$'
> What I need is something that does a named directory and
> its subdirectories.
I've had a look at how to do this with Bash and Perl on the command
line, I'm sure it's possible but I've not the experience or knowledge to
crack it. I can, however, crack it in Perl (see below). This script is
inelegant in the extreme, in fact it is extremely long-winded, but it's
at the limit of what I know about Perl. I'm sure there are subscribers
to this group who, if they read it, would be able to achieve exactly the
same thing in far fewer lines. I wouldn't be surprised if it
could be done on one command line, but so often these things are more
useful kept saved as a script, so they can be used time and again without
having to remember what does what. Anyway, here is my solution
(ugly though it is):
===== START =====
#!/usr/bin/perl
use warnings ;
use strict ;
use Cwd ;

unless ( @ARGV ) {
    print "Usage: add_br [/path/to/directory]\nWithout a directory name I don't know which files you want to process.\n" ;
    exit ;
}

my $workingdir = pop @ARGV ;
chdir $workingdir
    or die "Cannot change working directory to $workingdir: $!" ;

filetest() ;

sub filetest {
    foreach (glob "*") {
        if ( -d $_ ) {
            my $olddir = getcwd ;
            chdir $_
                or die "Cannot change to sub directory $olddir/$_: $!";
            filetest() ;
            chdir $olddir ;
        }
        elsif ( -f $_ ) {
            br() ;
        }
    }
}

sub br {
    open INFILE, "<$_"
        or die "Cannot open file $_: $!" ;
    open OUTFILE, ">$_.br"
        or die "Cannot create file $_.br: $!" ;
    select OUTFILE ;
    while (<INFILE>) {
        s/\n/<br>\n/ ;
        print "$_" ;
    }
    close INFILE ;
    close OUTFILE ;
    select STDOUT ;
}
===== END =====
> Amazing, simple everyday batch processing seems to be
> almost impossible to find anything on quickly except a
> few very complex examples.
On the whole, for simple everyday stuff Bash can be better - you just
have to learn to use the other tools at your disposal (and I don't know
anywhere near enough about those): sed and awk may have been the best
way of achieving what you are trying to do but, like I said, I don't
know about those!
Anyway, HTH.
Justin.
--
Justin C by the sea.
Re: Batch processing
on 19.06.2005 16:27:45 by Justin C
On 2005-06-19, WCB wrote:
> Justin C wrote:
>
>> Have you got the Llama book? Learning Perl I think it is, it's very
>> good. Once you've finished it the other two ("Programming" and
>> "Cookbook") are a lot easier.
>
> I have it. Its useless.
No, it is not useless, it is excellent if you use it for the intended
purpose. It is a course, you start at the beginning and by the time you
finish you know an awful lot more about Perl than you did. It's how I
started to learn Perl and, with that knowledge I've solved your problem
(see another post).
Frankly, if your attitude is going to be that expressed in the whole of
the post to which I am replying, then I'm not inclined to waste any more
time on you. I thank you for setting the problem that has made me *teach
myself* how Perl sub-routines work.
Justin.
--
Justin C by the sea.
Re: Batch processing
on 19.06.2005 22:07:13 by wcb
Jim Gibson wrote:
> In article <11b8soc68pu74b8@corp.supernews.com>, wcb wrote:
>
>> Jim Gibson wrote:
>>
>> > In article <11b8nnrilk6h0c6@corp.supernews.com>, wcb wrote:
>> >
>> >> Justin C wrote:
>> >>
>> >> > On 2005-06-18, wcb wrote:
>> >> >> I have a small perl routine.
>> >
>> > [problem editing multiple files]
>> >
>> >> > for file in `ls` ;
>> >> > do perl -p -e 's/\n/<br>\n/' $file > $file.br ;
>> >> > done
>> >> >
>> >> >
>> >> > Those are back-ticks, by the way, not apostrophes.
>> >>
>> >> I'm using Bash on Linux. What does the `ls` mean?
>> >> What I need is something that does a named directory and
>> >> its subdirectories. I have been trying to google for perl batch
>> >> processing, and looking at my Perl books, but basic, simple,
>> >> everyday batch processing like this seems to be left out of
>> >> all the whizbang howtos and books.
>> >>
>> >> Amazing, simple everyday batch processing seems to be
>> >> almost impossible to find anything on quickly except a
>> >> few very complex examples.
>> >
>> > Perhaps you could explain what you mean by 'batch processing'. If you
>> > are talking about processing multiple files, then Perl offers many ways
>> > of accomplishing that. Look up the functions opendir and readdir or the
>> > function glob. Check out the module File:Find, part of the standard
>> > Perl distribution. All of these give you ways of finding and processing
>> > multiple files, possibly in multiple directories. Of course, you will
>> > probably want to write multi-line scripts for this purpose, rather than
>> > one-liners.
>>
>> Thanks. I am used to DOS batch files and general Bash stuff.
>> So this is a bit of alien territory. Its one of those things
>> if you know a little bit about this, you can look it up, but
>> I didn't know enough about it to do that.
>> I have the books, its just nowhere does it really mention doing
>> batch processes ala DOS. That is taking a named directory and
>> processing all files and files in subdirectories.
>>
>> You are right, it looks like I will be needing to learn a
>> little scripting. I just managed to figure out a tricky
>> deletion problem with regex, so Its not totally hopeless.
>>
>> I am beginning to see a little bit about how it wants to work.
>>
>> What I lacked was simple examples, once one has a few, its
>> easier to work backwards and then forwards again.
>>
>> The lack of simple roadmaps for general batch processing
>> in these books is very frustrating when you know what you
>> need to do but the books seem to have never heard of that
>> concept. Its kind of there, but in bits and pieces scattered
>> around.
>
> OK. I think I understand now. What you are calling 'batch processing'
> most of us would call 'programming' or 'scripting'. You are talking
> about taking things you normally do one-at-a-time via a command line
> processor (DOS) and putting them in a 'batch file'. Most people would
> call this 'shell scripting' or just 'scripting'. You can do a lot by
> programming in a shell language. Unix has bash and csh for scripting.
> However, for more complex tasks, you want to use a programming language
> like C, Visual Basic, or Perl. Perl has many of the best qualities of a
> shell scripting language and a general purpose programming language,
> which is why many people use it.
>
> So search for 'scripting', 'shell scripting', or 'programming' to find
> more information on automating repetitive manual tasks.
I have done a lot of searching. I want to know the time, they want to tell
me how to build a clock. All I want is the small generic routine to
run a routine on a directory and subdirectories and files.
I really don't want to do much programming, but it looks like I will either
have to do it by hand, or spend a lot of time reinventing wheels.
I am beginning to hate anything to do with Unix.
All I want is a small collection of basic wheels.
For
sed -i -e {your-script-here}
done
Neither Bash nor Perl nor Rexx seem to have anything
like that I can get my hands on.
I really don't want to paw through a 450-page book
and reinvent all of this and troubleshoot it.
I started a week ago to figure this out and still
haven't, and just don't have the time to learn it now
to do a job I could do, by hand, in three hours.
Everything to do with Unix/Linux seems to be this way.
A time-wasting, do-it-the-hard-way, reinvent-the-wheel
problem, with large massive books utterly useless for
getting it done now.
If Perl had a -R switch, none of this would be a problem.
I just wasted three hours looking for repositories
of basic wheels I don't want to reinvent and failed.
95% of the stuff 90% of people want and need to do.
It's just not there among the mountain of chaff
Google spits out.
Lord, I hate wasting time for nothing this way.
I have three thick, expensive Perl books and they are
utterly useless for doing this now and quickly.
Learning Perl has half a page on this stuff, where they demur
on giving us details of how to do stuff like this.
That book hit the wall hard.
--
When I shake my killfile I can hear them buzzing.
Re: Batch processing
on 19.06.2005 22:20:03 by wcb
Justin C wrote:
> On 2005-06-19, WCB wrote:
>> Justin C wrote:
>>
>>> Have you got the Llama book? Learning Perl I think it is, it's very
>>> good. Once you've finished it the other two ("Programming" and
>>> "Cookbook") are a lot easier.
>>
>> I have it. Its useless.
>
> No, it is not useless, it is excellent if you use it for the intended
> purpose. It is a course, you start at the beginning and by the time you
> finish you know an awful lot more about Perl than you did. It's how I
> started to learn Perl and, with that knowledge I've solved your problem
> (see another post).
The part I was interested in, recursive commands, walking a tree,
it covered in half a page, where it explained it wasn't going to explain that.
Splat! Up against the wall hard.
I don't have TIME to sit down and do that 'course', figure out what it
explicitly said it would NOT cover, write a script, debug it, with the other
8 things likewise wasting my time trying to figure out, including HTML and
CSS.
I am extremely frustrated because all these books are utterly useless for
the one thing I want now.
I don't want to study Perl for weeks to be able to write the one little
script that should be explained up front, the stuff 90% of people will be
using 90% of the time.
I don't WANT to learn Perl. If it takes weeks of work to get to the point I
have a little script that allows me to walk a directory tree and diddle
a few files, it's a broken language.
My frustration level is now exceeded by the lack of any decent explanation of
the little scripts 90% of people would use 90% of the time with a scripting
language.
I want to use the wheel, not work for weeks to reinvent it.
The time-waste factor here sucks deeply.
I am going to do it by hand because I have put in a lot of time, not to
mention money on deeply useless books and am not getting good return for my
time.
I just wanted to get my files fixed, not a vast struggle with a
language that, as far as I can find, has made no attempt to
give us a basic way to do something that should be done by -R.
(do this) -R -f *
I don't have time to screw with this now.
I have to go read some HTML books.
Perl is not meeting me halfway.
Sorry to vent, but it is incredibly frustrating that
all these books are so useless for doing something quickly
with little fuss.
>
> Frankly, if your attitude is going to be that expressed in the whole of
> the post to which I am replying, then I'm not inclined to waste any more
> time on you. I thank you for setting the problem that has made me *teach
> myself* how Perl sub-routines work.
>
> Justin.
>
--
When I shake my killfile I can hear them buzzing.
Re: Batch processing
on 19.06.2005 22:42:00 by dha
On 2005-06-19, WCB wrote:
>
> I don't have TIME to sit down and do that 'couurse' figure out what it
> explicity said it would NOT cover, write a script, debug it, with the other
> 8 things likewise wasting my time trying to figure out, including HTML, and
> CSS.
>
> I am extremely frustrated because all thsese books are urttterly useless for
> the one thing I want now.
> I don't want to study Perl for weeks to be able to wrote the one little
> script that shouild be explained up front, the stuff 90% of people will be
> using 90% of the time.
>
> I don't WANT to learn Perl. If it takes weeks of work to get to the point I
> have a little script that allows me to walk a directory tree and diddle
> a few files, its a broken language.
No, you just don't want to learn the language. Programming is not the
sort of thing where you just say "I want to write a program" and
suddenly you can. You need to actually learn how to do it.
If you don't want to, that's fine, but that's not the fault of the
technology but, rather, of your lack of patience.
Frankly, if this is your attitude, you'll be better off - for *any*
programming task - to hire someone to write the program for you. This,
however, is not the appropriate place to do that.
For what it's worth, you may want to look at the Perl Cookbook. However,
given your reluctance to learn Perl, I'm not sure it will help.
dha
--
David H. Adler - - http://www.panix.com/~dha/
"Something I'm hoping to achieve is, rather than have the film look
like we went out in New Zealand and shot on location, is that it looks
like we went out to Middle Earth and shot on location." - Peter
Jackson
Re: Batch processing
on 19.06.2005 22:46:28 by dha
On 2005-06-19, WCB wrote:
>
> I just wasted three hours looking for repositories
> of basic wheels I don't want to reinvent and failed.
Then you must not have looked very hard: http://search.cpan.org
I find it unlikely that whatever wheel you're looking for would not be
there. If, however, it isn't, them's the breaks. I note we don't have
all those neat flying cars that you see in lots of science fiction
movies yet either. Sometimes you wind up being the guy who has to build
the first one. Although, from what I've seen of what you're asking for,
I doubt that's the case.
dha
--
David H. Adler - - http://www.panix.com/~dha/
Apparently the left hand doesn't know what a right hand IS.
- zenham in #perl
Re: Batch processing
on 20.06.2005 01:59:53 by wcb
David H. Adler wrote:
> On 2005-06-19, WCB wrote:
>>
>> I just wasted three hours looking for repositories
>> of basic wheels I don't want to reinvent and failed.
>
> Then you must not have looked very hard: http://search.cpan.org
>
> I find it unlikely that whatever wheel you're looking for would not be
> there. If, however, it isn't, them's the breaks. I note we don't have
> all those neat flying cars that you see in lots of science fiction
> movies yet either. Sometimes you wind up being the guy who has to build
> the first one. Although, from what I've seen of what you're asking for,
> I doubt that's the case.
>
> dha
>
Fret - Font REporting Tool MHOSKEN
Font::TFM RpdO Read info from TeX font metric files JANPAZ
Font::TTF bpdO TrueType font manipulation module MHOSKEN
FrameMaker cpdO Top level FrameMaker interface PEASE
FrameMaker::Control cpdO Control a FrameMaker session PEASE
FrameMaker::FDK icdO Interface to Adobe FDK PEASE
FrameMaker::MIF cpdO Parse and Manipulate FrameMaker MIF files PEASE
FrameMaker::MifTree apdOp A MIF Parser RST
Frontier::RPC Performs Remote Procedure Calls using XML KMACLEOD
Language::VBParser apd?g Visual Basic 6 source parser FRETT
Lyrics::Fetcher::Google apdfg Uses google to fetch song lyrics NEBULOUS
MMDS Mpdhp Minimal M
These are complex programs.
Not what I am desperately looking for:
very, very basic "wheels".
--
When I shake my killfile I can hear them buzzing.
Re: Batch processing
on 20.06.2005 02:33:36 by wcb
David H. Adler wrote:
> On 2005-06-19, WCB wrote:
>>
>> I don't have TIME to sit down and do that 'couurse' figure out what it
>> explicity said it would NOT cover, write a script, debug it, with the
>> other 8 things likewise wasting my time trying to figure out, including
>> HTML, and CSS.
>>
>> I am extremely frustrated because all thsese books are urttterly useless
>> for the one thing I want now.
>> I don't want to study Perl for weeks to be able to wrote the one little
>> script that shouild be explained up front, the stuff 90% of people will
>> be using 90% of the time.
>>
>> I don't WANT to learn Perl. If it takes weeks of work to get to the
>> point I have a little script that allows me to walk a directory tree and
>> diddle a few files, its a broken language.
>
> No, you just don't want to learn the language. Programming is not the
> sort of thing where you just say "I want to write a program" and
> suddenly you can. You need to actually learn how to do it.
I want to get work done.
rm -f -R *
Dead simple. That works.
Perl something *.
No recursion. I need a script. I need somewhere to see
the most basic, primitive script I can use.
To replace the missing -R
I refuse to spend weeks picking bits and pieces out
of thick books that do not show any useful real world examples,
reinvent these wheels, debug them, and THEN weeks later,
try to use them.
There are no easy-to-find Perl "wheel" repositories or
tutorials that do anything remotely like what I need. Not want.
Need.
Now. Quickly.
Quick and Now and Perl are mutually exclusive.
I don't want to have to become a propeller head
to do a simple task.
Or what should be a simple task.
This is why Linux won't be bumping off Windows anytime soon.
Granny wants to add <br> to a bunch of stuff in a lot of
subdirectories. She will look for the easy way to do it
and will, with Perl and Bash, end up doing it by hand.
This should not be hard. But because Perl breaks with user
comfort and the sane ability to do things the right way, no -R,
it's not going to be used by the masses.
It's conceptually broken from the beginning.
The recommended books suck high vacuum.
It's simply a hard language lacking creature comforts,
ease of use, and adequate real-world documentation, except
for propellerheads with time on their hands.
All this time wasted because Wall didn't understand
why -R beats the script the books don't tell you, up front and
plain and simple, how to write.
The sort of simple script to do the stuff 90% of people
do 95% of the time.
All I have done is waste far too much time trying to
do something the Perl world grudgingly will not tell
us up front.
If Perl does not understand -R, I just wonder how many other
stupidities of that ilk it lacks.
Unfortunately, all alternatives I have looked at seem
to be equally brain-damaged.
If I knew C, I would take something like rm with its nice -R
and -f switches, find out how rm implements recursion, and would
hack it so one had f() with -R, -f, et al. Insert f(my-sed-routine;
routine.sh) and be able to do a sed routine with full recursion as per
rm.
This would be sane and worth 10 Perl-like languages.
f(myroutine.sh) -R -f *, and it's done.
Yet everybody spends their time inventing new broken
languages doing it the hard way.
f(add<br>.sh) -R *
sed -i.bak -p -e -n 's/\n/<br>\n/'
Make it a simple shell script named add<br>.sh
and poke it in /bin. Done.
I had hoped these hordes of propellerheads would have
figured out that easy is better, but they haven't.
I had hoped to like Perl, but its reputation as a
text-oriented scripting language is undeserved for everyday use
by people who need quick and easy and now.
It simply has not, for me, cut the mustard.
>
> If you don't want to, that's fine, but that's not the fault of the
> technology but, rather, of your lack of patience.
>
> Frankly, if this is your attitude, you'll be better off - for *any*
> programming task - to hire someone to write the program for you. This,
> however, is not the appropriate place to do that.
>
> For what it's worth, you may want to look at the Perl Cookbook. However,
> given your reluctance to learn Perl, I'm not sure it will help.
>
> dha
>
--
When I shake my killfile I can hear them buzzing.
Re: Batch processing
on 20.06.2005 08:00:51 by dha
On 2005-06-20, WCB wrote:
>
> I refuse to spend weeks picking bits and pieces out
> of thick books that do not show any useful real world examples,
> reinvent these wheels, debug them, and THEN weeks later,
> try to use them.
>
> There are no easy to find Perl "wheel" depositories or
> tutorials that do anything remotety like I need. Not want.
> Need.
Clearly, you wish to ignore CPAN. This is your first stumbling block.
> Now. Quickly.
Here's your second. You basically seem to want to abandon all
responsibility for what is, after all, what *you* need to do. If you say
"Hey! Fix this for me! Chop chop!" you will be ignored or chastized.
If windows makes you happy, stick with it. If you don't want to know how
to program, don't.
*shrug*
dha
--
David H. Adler - - http://www.panix.com/~dha/
There are 6 billion people in the world, and only 30 billion of those
are Canadians - Headline in the Toronto Globe and Mail
Re: Batch processing
on 20.06.2005 08:04:58 by dha
On 2005-06-19, WCB wrote:
> David H. Adler wrote:
>
>> On 2005-06-19, WCB wrote:
>>>
>>> I just wasted three hours looking for repositories
>>> of basic wheels I don't want to reinvent and failed.
>>
>> Then you must not have looked very hard: http://search.cpan.org
>>
>> I find it unlikely that whatever wheel you're looking for would not be
>> there. If, however, it isn't, them's the breaks. I note we don't have
>> all those neat flying cars that you see in lots of science fiction
>> movies yet either. Sometimes you wind up being the guy who has to build
>> the first one. Although, from what I've seen of what you're asking for,
>> I doubt that's the case.
>>
>> dha
[snip list of very specialized modules]
> These are complex programs.
>
> Not what I desperately am looking for.
> Very, very basic "wheels"
Well, yes. If you go looking for modules that do complex tasks you will
find complex solutions. Of course, none of those modules seem to have
anything to do with your problem.
The task you wanted to do probably could be done with one of the modules
in the range of File::Find.
Again, you don't like perl, fine. Don't use it. But, really, we don't
need to hear about it.
dha
--
David H. Adler - - http://www.panix.com/~dha/
* Nathan_Roberts screams out loud, "IS THIS ENTIRE CHANNEL COMPOSED
OF BOTS?!?!?"
Kill the human. - from the #drwhochat Quotefile
Re: Batch processing
on 20.06.2005 11:01:41 by Joe Smith
WCB wrote:
> I want to get work done.
>
> rm -f -R *
>
> Dead simple. That works.
>
> Perl something *.
> No recursion. I need a script. I need somewhere to see
> the most basic, primitive script I can use.
>
> To replace the missing -R
Use File::Find to do recursive things with Perl.
> I refuse to spend weeks picking bits and pieces out
> of thick books that do not show any useful real world examples,
The books do show real-world examples.
> There are no easy to find Perl "wheel" depositories or
> tutorials that do anything remotety like I need. Not want.
> Need.
http://learn.perl.org and http://search.cpan.org take care of that.
> Quick and Now and Perl are mutually exclusive.
Oh, pish and tosh. I do one-line perl scripts all the time,
and some of them are recursive.
> I don't want to have to become a propeller head
> to do a simple task.
>
> Or what should be a simple task.
It is quite simple, if you break the problem down into two parts.
1) Recursively search a directory to find files that you wish to
process.
2) Process the files, either one at at time, or do them all after
step 1 is finished.
You've accomplished step 2, all you have to do is put step 1
in front of that.
> This is why Linux won't be bumping of Windows anytime soon.
Windows is no help in this area.
> Granny wants to add <br> to a bunch of stuff in a lot of
> subdirectories. She will look for the easy way to do it
> and will, with Perl and Bash, end up doing it by hand.
It is quite simple with perl and bash and find:
find public_html -type f -print | perl fixfiles.pl
find2perl public_html -type f -print >temp.pl
perl temp.pl | perl fixfiles.pl
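(fixfiles.pl above is just a placeholder name; one way to flesh it out, as a sketch, is to read the filenames that find prints on standard input and then lean on Perl's in-place-edit machinery, the same thing the -i switch turns on:)

  #!/usr/bin/perl
  # Hypothetical fixfiles.pl for the pipeline above: reads one filename
  # per line on STDIN and appends <br> to every line of each file,
  # keeping a .bak backup of each original.
  use strict;
  use warnings;

  chomp(my @files = <STDIN>);     # the filenames piped in by find
  die "No filenames on STDIN\n" unless @files;

  $^I   = '.bak';                 # enable in-place editing, as -i.bak would
  @ARGV = @files;                 # <> below reads (and rewrites) these files

  while (<>) {
      s/\n/<br>\n/;
      print;                      # with $^I set, print goes back into the file
  }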
> This should not be hard. But because Perl breaks with user
> comfort and sane ability to do things the right way, no -R,
> its not going to be used by the masses.
>
> Its conceptually broken from the beginning.
So, you're saying that bash is conceptually broken because it
does not have a -R switch. And that gcc is conceptually broken
because it does not do recursion. I don't think so.
> The recommended books suck high vaccuum.
Nope.
> Its simply a hard language lacking creature comforts,
> ease of use, adequate real world documentation except
> for propellerheads with time on their hands.
And how does this differ from the documentation for the
C compiler, or for bash?
> All this time wasted because Wall didn't understand
> why -R beats that script the books don't tell you up front,
> plain and simple how to write.
Have you attempted to write a C program from scratch that
does recursion? Are you saying that the C language is
busted because it does not have -R as built-in functionality?
> The sort of simple script to do the stuff 90% people
> do 95% of the time.
Perl is very good at doing simple stuff.
> All I have done is waste far too much time trying to
> do something the Perl world grudgingly will not tell
> us up front.
> If Perl does not understand -R, I just wonder how many other
> stupities of that ilk it lacks.
Why should any programming language have -R?
Are there any programming languages that have -R?
If no, why are you blaming perl for this?
> Unfortunately, all alternatives I have looked at seem
> to be equally braindamaged.
> If I knew C, I would take something like rm with its nice -R
> and -f switch, find out how rm implments recursion and would
> hack it so one had f() with -R -f et al. Inset f(my-sed-routine;
> routine.sh) and be able to do a sed routine wit full recursion as per
> rm.
That's easy to do with Perl, just 'use File::Find;' and create
a suitable wanted() function.
> This would be sane and worth 10 Perl like languages.
>
> f(myroutine.sh) -R -f *, its done.
find . -type f -print | xargs perl myroutine.pl
> Yet everybody spends their time inventing new broken
> languages doing it the hard way.
Ah, you're talking about broken user interfaces,
not broken languages.
> I had hoped to like Perl but its reputation as a text
> orientated scripting language is undserved for everyday use
> by people who need quick and easy and now.
Oh, it is most deserved for the type of everyday things that
I use it for. I'm sorry that your unrealistic expectations
have left you bitter.
-Joe
Re: Batch processing
on 20.06.2005 11:13:02 by Joe Smith
WCB wrote:
> If Perl had a -R switch, none of this would be a problem.
You keep saying that, but it is not true.
When it comes to parsing command-line switches, it is
the responsibility of the program (script) to do that, not
the language compiler.
I can understand your frustration at not finding any
templates or sample scripts showing how to recognize a
request for recursion and to implement recursion.
But it is not Perl, the language, that needs -R; it is
your program that does.
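(To make the point concrete: a script of your own can grow a -R switch in a few lines with Getopt::Long and File::Find, both shipped with Perl. A sketch only; the name addbr.pl and the .br output convention are simply carried over from earlier in the thread:)

  #!/usr/bin/perl
  # Hypothetical addbr.pl: appends <br> to each line of the named files,
  # writing .br copies, and recurses into directories only when its own
  # -R switch is given.
  use strict;
  use warnings;
  use Getopt::Long;
  use File::Find;

  GetOptions('R' => \my $recurse)
      or die "Usage: addbr.pl [-R] file-or-directory ...\n";

  my @files;
  for my $target (@ARGV ? @ARGV : ('.')) {
      if (-d $target && $recurse) {
          find(sub { push @files, $File::Find::name if -f }, $target);
      }
      elsif (-f $target) {
          push @files, $target;
      }
  }

  for my $file (@files) {
      open my $in,  '<', $file      or die "Cannot read $file: $!";
      open my $out, '>', "$file.br" or die "Cannot write $file.br: $!";
      while (my $line = <$in>) {
          $line =~ s/\n/<br>\n/;
          print {$out} $line;
      }
      close $in;
      close $out;
  }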
> I just wasted three hours looking for repositories
> of basic wheels I don't want to reinvent and failed.
The perl module File::Find is very powerful and does
the low-level basics of what you need.
> 95% of the stuff 90% of people want and need to do.
> Its just not there among the mountain of chaff
> Google spits out.
You can blame Google for that.
> Lord I hate wasting time for nothing this way.
If you had asked nicely, you might have had an answer by now.
Badmouthing the language is not the way to get sympathy.
-Joe
Re: Batch processing
on 20.06.2005 11:21:17 by Joe Smith
Justin C wrote:
> sub filetest {
> foreach (glob "*") {
Yuck.
sub br {
$file = shift;
return unless -f $file;
... # open in, open out, process in writing to out, close out
}
use File::Find;
find(\&br,'.');
exit;
-Joe
Re: Batch processing
on 20.06.2005 17:20:59 by Justin C
On 2005-06-20, Joe Smith wrote:
> Justin C wrote:
>
>> sub filetest {
>> foreach (glob "*") {
>
> Yuck.
Oh, I know! I only took the task on because I thought it interesting
and, for me, educational.
>
> sub br {
> $file = shift;
OK, yeah, that makes sense.
> return unless -f $file;
That's definitely an improvement.
> ... # open in, open out, process in writing to out, close out
> }
> use File::Find;
> find(\&br,'.');
> exit;
And that's *all* you could find wrong with it?! I am pleased!
Justin.
--
Justin C by the sea.
Re: Batch processing
on 21.06.2005 20:37:18 by Phillip Hartfield
wcb wrote:
> I have a small perl routine.
>
> perl -p -e 's/\n/<br>\n/'
> What's the best general approach to a down and dirty
> batch command to go through a directory with Perl?
> Is there a canonical method for doing this in Perl?
for i in `find . -type f`; do perl -pi.'bak' -e 's/\n/<br>\n/' $i; done
This is 50% shell command 50% perl. To do subdirectory crawls with
perl, you'd need to use a module or opendir, readdir. stat will tell
you which are directories, files, symlinks, etc. In a unix file system
you'd want to deal with each according to need. You probably wouldn't
want those <br>'s in your .htaccess files...
for i in `find . -type f|grep -v .htaccess`; do perl -pi.'bak' -e 's/\n/<br>\n/' $i; done
or
for i in `find . -type f|grep \.htm`; do perl -pi.'bak' -e 's/\n/<br>\n/' $i; done
See what the find . -type f shows and you'll see what to grep in or out.
Dos filesystems don't have symlinks, files that look like directories,
directories that look like files or files that have undisplayable
characters. Files that begin with a dash aren't nearly as bad...
File names with spaces in them are no fun in the shell or in perl.
I have found times when -R breaks. It isn't pretty and it always goes
up the ladder to the sysadmin types to find a fix. Directories nested
over several hundred deep for instance... rm -R * doesn't work
I forget whether find worked on those or not, interesting exercise in
any event.
Phil
Re: Batch processing
am 21.06.2005 23:59:25 von wcb
Joe Smith wrote:
> WCB wrote:
>
>> I want to get work done.
>>
>> rm -f -R *
>>
>> Dead simple. That works.
>>
>> Perl something *.
>> No recursion. I need a script. I need somewhere to see
>> the most basic, primitive script I can use.
>>
>> To replace the missing -R
>
> Use File::Find to do recursive things with Perl.
I found that. All I found left me going "Hunh?"
Can you give me an example of how you do what you say it can?
>
>> I refuse to spend weeks picking bits and pieces out
>> of thick books that do not show any useful real world examples,
>
> The books do show real-world examples.
I have about $100 worth of Perl books right here.
No, they don't.
I have real-world examples of how to do propellerhead things.
Nothing like, oh, say, romping through a directory and
subdirectories and files doing simple things.
I wasted hours looking.
>
>> There are no easy to find Perl "wheel" depositories or
>> tutorials that do anything remotety like I need. Not want.
>> Need.
>
> http://learn.perl.org and http://search.cpan.org takes care of that.
>
No. It does not. Again, lots of propellerhead projects.
No little wheels. I wasted hours again.
>> Quick and Now and Perl are mutually exclusive.
>
> Oh, pish and tosh. I do one-line perl scripts all the time,
> and some of them are recursive.
>
Mutually exclusive.
>> I don't want to have to become a propeller head
>> to do a simple task.
>>
>> Or what should be a simple task.
>
> It is quite simple, if you break the problem down into two parts.
I can do that. Now I need to know how to do X.
But I don't have that in all these useless books.
The simple things 90% of people do 95% of the time are
utterly ignored.
Hence my fury at wasting time trying to figure it out with sources that
simply don't cut the mustard.
> 1) Recursively search a directory to find files that you wish to
> process.
Yes, if I knew how, and I don't.
Again, the books are useless, the websites totally useless.
If the info is out there, it's buried under 50 tons of dross.
> 2) Process the files, either one at at time, or do them all after
> step 1 is finished.
> You've accomplished step 2, all you have to do is put step 1
> in front of that.
Just whip, zip, do it and it's done.
No, it isn't like that.
It's spend hours with books, hours googling, and at the end of the day you are
no closer to knowing how than before.
That is why I am angry. It should not be that way.
But it is that way. With everything.
Rexx, Perl, Bash.
>
>> This is why Linux won't be bumping of Windows anytime soon.
>
> Windows is no help in this area.
>
Linux is far, far worse. Mac and Windows have programs that do
the heavy lifting.
>> Granny wants to add <br> to a bunch of stuff in a lot of
>> subdirectories. She will look for the easy way to do it
>> and will, with Perl and Bash, end up doing it by hand.
>
> It is quite simple with perl and bash and find:
> find public_html -type f -print | perl fixfiles.pl
> find2perl public_html -type f -print >temp.pl
> perl temp.pl | perl fixfiles.pl
>
Granny isn't going to spend days figuring out Bash, and find, and
Perl to do what -R does. If Granny needs to do that, she will be getting a
Mac. Linux is for server propellerheads, not end users with needs.
>> This should not be hard. But because Perl breaks with user
>> comfort and sane ability to do things the right way, no -R,
>> its not going to be used by the masses.
>>
>> Its conceptually broken from the beginning.
>
> So, you're saying that bash is conceptually broken because it
> does not have a -R switch. And that gcc is conceptually broken
Very broken. That's why we have had so much pain over the last two years
with broken compilers and Linux kernels. Real programmers use Intel
compilers anyway. Less broken, better optimized, better tools.
> because it does not do recursion. I don't think so.
>
I do. I KNOW it.
The motto is "we're the Unix boys. We've been doing it the
hard way for 35 years and we aren't going to stop now."
>> The recommended books suck high vaccuum.
>
> Nope.
Yes. I have them here.
Learning Perl. Useless.
Totally, utterly bone, stick, stone, hard-core useless.
>
No -R switch, and that book does not tell you how to
deal with it.
It mumbles about the problem for half a page and then
runs off to waste time on other useless things.
Frustrating as hell. All I want to know, and no, we aren't
going to get into that.
I want quick, dirty, and now.
Nope. They chirp that you might get into writing programs
if they told you that.
Fuck yes! That is why I got the book!
>> Its simply a hard language lacking creature comforts,
>> ease of use, adequate real world documentation except
>> for propellerheads with time on their hands.
>
> And how does this differ from the documentation for the
> C compiler, or for bash?
Bash sucks too. Same problem. Documentation is for building a database
from end-to-end 400-page books that go over all the tedious little details of
1,001 commands et al. that I will never use.
What I need, how to do what most people do 90% of the time, is conspicuously
absent.
I do NOT want to spend months studying hard so I can reinvent all the little
wheels over again.
>
>> All this time wasted because Wall didn't understand
>> why -R beats that script the books don't tell you up front,
>> plain and simple how to write.
>
> Have you attempted to write a C program from scratch that
> does recursion? Are you saying that the C language is
> busted because it does not have -R as built-in functionality?
If you have to build a C program to get what better bits of Bash
do with -R, you must be using Linux.
(some command) -r -f *
perl -i -p -e 's/junk//' -R -f *
-R is the right way to do things. Perl does not do things the right way.
It does it the brain-damaged, all-around-the-houses way.
I have a theory. Command -f -r is pretty good programming.
It's right. It works. But it's probably hard to program things that way that
seem simple. But anybody can program an ugly kludge.
You can download thousands of them all day long.
Which is exactly what Perl is. A clumsy kludge.
>> The sort of simple script to do the stuff 90% people
>> do 95% of the time.
>
> Perl is very good at doing simple stuff.
Bwahahahahahahah!
Why, with all these Perl books, can I not find how to do that?
Half a page in the Learning Perl book, where they dismiss that
and romp off on things that do not matter.
A book only good for throwing if a simple, basic recursive script
to replace the missing -R is what you need NOW.
Actually, I have a 5-foot stack of expensive computer books
which has mainly been useless; I won't be spending more $$$.
I now realize most computer book writers are disorganized and
incapable of analyzing what needs to be in a book.
>
>> All I have done is waste far too much time trying to
>> do something the Perl world grudgingly will not tell
>> us up front.
>> If Perl does not understand -R, I just wonder how many other
>> stupities of that ilk it lacks.
>
> Why should any programming language have -R?
> Are there any programming languages that have -R?
> If no, why are you blaming perl for this?
>
If everybody else is stupid and you follow their example,
are you still stupid?
Why didn't he realize early on: "Switches are so easy by comparison to
doing these the long, hard, time-consuming, roundabout way. They do
the hard work for you."
Long ago, people had to program in assembly. So they wrote
languages to make the work of programming easier.
Wall didn't get it.
>> Unfortunately, all alternatives I have looked at seem
>> to be equally brain-damaged.
>
>> If I knew C, I would take something like rm with its nice -R
>> and -f switches, find out how rm implements recursion, and
>> hack it so one had f() with -R, -f, et al. Insert f(my-sed-routine;
>> routine.sh) and be able to do a sed routine with full recursion as per
>> rm.
>
> That's easy to do with Perl, just 'use File::Find;' and create
> a suitable wanted() function.
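For reference, a minimal sketch of what that suggestion amounts to; the
starting directory and the placeholder action are illustrative, not anything
from the post above:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Recurse from the current directory; find() calls the callback once for
# every file and directory it visits, with $_ set to the item's name and
# $File::Find::name holding its path from the starting point.
find(\&wanted, '.');

sub wanted
{
    return unless -f $_;                          # plain files only
    print "would process $File::Find::name\n";    # real work goes here
}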
I saw a little about find, but it all drifted off into propellerhead
ramblings till it made no sense.
Nobody has the organized mind to say, here are basic,
quick ways to do the stuff you do 90% of the time.
I wasted two hours looking for somebody who had brains enough to see the
problem and lay it out, but if somebody does do that, he's buried under 1,001
morons who don't and who clog the bandwidth with useless chatter.
>> This would be sane and worth 10 Perl-like languages.
>>
>> f(myroutine.sh) -R -f *, and it's done.
>
> find . -type f -print | xargs perl myroutine.pl
Utterly opaque and meaningless to me.
>
>> Yet everybody spends their time inventing new broken
>> languages doing it the hard way.
>
> Ah, you're talking about broken user interfaces,
> not broken languages.
YES! YES! YES! You want to talk about computer
languages, not broken interfaces and the utter lack of ease of
use and the utter lack of well-documented examples of the most basic
little routines needed to emulate the missing -R switch.
-R does a LOT, so well that most people have no idea how it works.
You don't NEED to know. You just have to know -R *.
That is the way it should be.
But way back, somebody foisted that broken horror awk
on us, missing all switches. And ever since, people have been emulating awk,
avoiding good programming and papering over the lack with complex
scripts.
Why write a script when you can type -R? Why write a script, debug it, and
then go to the next problem and again write a script and debug it?
Because most of them get paid by the hour.
Job security: there is no incentive to make it quick and easy.
That is why the books are bad. You have to spend months with a 480-page
manual just to begin to do real work. Again, job security. Writing little
scripts that are almost impossible to write with that book without lots of
time-consuming study, rather than typing command regex -f -R *,
is job security. So for 30 years, nobody cared.
In the end, it didn't work. They study the same books in India
and work at 1/3 the pay anyway.
>> I had hoped to like Perl, but its reputation as a
>> text-oriented scripting language is undeserved for everyday use
>> by people who need quick and easy and now.
>
> Oh, it is most deserved for the type of everyday things that
> I use it for. I'm sorry that your unrealistic expectations
> have left you bitter.
> -Joe
I wanted something quick and dirty.
Can't get there from here.
--
When I shake my killfile I can hear them buzzing.
Re: Batch processing
am 22.06.2005 00:08:58 von wcb
David H. Adler wrote:
> On 2005-06-20, WCB wrote:
>>
>> I refuse to spend weeks picking bits and pieces out
>> of thick books that do not show any useful real world examples,
>> reinvent these wheels, debug them, and THEN weeks later,
>> try to use them.
>>
>> There are no easy to find Perl "wheel" depositories or
>> tutorials that do anything remotely like what I need. Not want.
>> Need.
>
> Clearly, you wish to ignore CPAN. This is your first stumbling block.
No, I didn't. Nor did I find what I needed there.
Lots of projects. Need to build a front end to MySQL in Perl?
I don't. Lots of stuff like that.
>
>> Now. Quickly.
>
> Here's your second. You basically seem to want to abandon all
> responsibility for what is, after all, what *you* need to do. If you say
> "Hey! Fix this for me! Chop chop!" you will be ignored or chastized.
No. You don't get it.
I bought a stack of Perl books expecting something.
It is not there. I want to do simple things NOW.
It turns out simple, now, and Perl are not congruent.
$100 worth of useless books.
What I wanted to do was DISMISSED IN HALF A PAGE
IN LEARNING PERL.
The Algorithms book pays no attention to the 101
little things people really want to do in the real world.
-R is missing, so you need a bloody script,
which CPAN, as far as I can see, and the books simply
do not explain in a quick, timely, organized, thoughtful, real-world-useful
manner.
And that is it in a nutshell.
And no, I don't use Windows. I used DOS,
which at least, for what it was, worked.
DOS documentation, at least, was an order of magnitude better
than anything the Linux world seems to be able to produce.
Re: Batch processing
am 22.06.2005 00:27:29 von wcb
Joe Smith wrote:
> WCB wrote:
>
>> If Perl had a -R switch, none of this would be a problem.
>
> You keep saying that, but it is not true.
>
> When it comes to parsing command-line switches, it is
> the responsibility of the program (script) to do that, not
> the language compiler.
>
> I can understand your frustration at not finding any
> templates or sample scripts showing how to recognize a
> request for recursion and to implement recursion.
> But it is not Perl, the language, that needs -R; it is
> your program that does.
>
-R beats a script any day of the week.
You know it works and how.
You don't have to worry about that.
*, ? and a few switches can do powerful things.
>> I just wasted three hours looking for repositories
>> of basic wheels I don't want to reinvent and failed.
>
> The perl module File::Find is very powerful and does
> the low-level basics of what you need.
Again, I failed to find documentation.
I did find it, but as usual, some propellerhead started
in on it and drifted off into complex irrelevancies.
After wasting an hour googling that, I gave up.
Again, it's a lack of documentation.
Real-world documentation.
Get-it-done-now, no-bullshit documentation.
At least I know one good thing now.
Don't even bother with Perl books.
It's $35 wasted. I'll never buy another.
The problem is, if there is good documentation out there, it's
one guy among 1,000 fools clogging Google with nonsense.
A problem of high noise to signal.
And too many people with good computer skills but no
sense writing books.
There are apparently several little modules out there that could be useful.
Maybe. I couldn't find enough examples to tell, just opinions.
I want some "let's cut the guff" real-world, simple explanations
with a few examples.
And those are rare as hen's teeth in the fetid Perl world,
which people use because there is nothing better.
>
>> 95% of the stuff 90% of people want and need to do.
>> It's just not there among the mountain of chaff
>> Google spits out.
>
> You can blame Google for that.
>
>> Lord I hate wasting time for nothing this way.
>
> If you had asked nicely, you might have had an answer by now.
> Badmouthing the language is not the way to get sympathy.
> -Joe
--
When I shake my killfile I can hear them buzzing.
Re: Batch processing
am 22.06.2005 00:52:41 von Jim Gibson
In article <11bh31fj0bui1fa@corp.supernews.com>, WCB
wrote:
> Joe Smith wrote:
>
> > WCB wrote:
> >
> >> I want to get work done.
> >>
> >> rm -f -R *
> >>
> >> Dead simple. That works.
> >>
> >> Perl something *.
> >> No recursion. I need a script. I need somewhere to see
> >> the most basic, primitive script I can use.
> >>
> >> To replace the missing -R
> >
> > Use File::Find to do recursive things with Perl.
>
>
> I found that. All I found left me going "Hunh?"
> Can you give me an example of how you do what you say it can?
> >
It's 17 lines instead of one, but this may do what you want:
#!/usr/local/bin/perl
use strict;
use warnings;
use File::Find;

find(\&process, './test');

sub process
{
    return if -d || /\.br$/;    # skip directories and already-processed files
    open(my $in,  '<', $_)      or die("Can't open $_: $!");
    open(my $out, '>', "$_.br") or die("Can't open $_.br for output: $!");
    while (<$in>) {
        chomp;
        print $out "$_<br>\n";  # put a <br> back on the end of each line
    }
    close($in);
    close($out);
}
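For comparison, roughly the same thing can be driven from the shell with
find(1) and the in-place one-liner; this is only a sketch, and it edits the
files in place with *.bak backups instead of writing separate *.br copies:

find ./test -type f ! -name '*.bak' -exec perl -pi.bak -e 's/$/<br>/' {} \;

Because find hands each path straight to -exec, file names with spaces in
them are not a problem here.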
Re: Batch processing
am 22.06.2005 00:56:41 von wcb
chaosppp wrote:
> wcb wrote:
>> I have a small perl routine.
>>
>> Perl -p -e 's/\n/<br>\n/'
>
>
>> What's the best general approach to a down and dirty
>> batch command to go through a directory with Perl?
>> Is there a canonical method for doing this in Perl?
>
>
> for i in `find . -type f`; do perl -pi.'bak' -e 's/\n/<br>\n/' $i; done
>
> This is 50% shell command, 50% Perl. To do subdirectory crawls with
> Perl, you'd need to use a module or opendir/readdir. stat will tell
> you which are directories, files, symlinks, etc. In a Unix file system
> you'd want to deal with each according to need. You probably wouldn't
> want those <br>'s in your .htaccess files...
>
> for i in `find . -type f|grep -v .htaccess`; do perl -pi.'bak' -e
> 's/\n/<br>\n/' $i; done
>
> or
>
> for i in `find . -type f|grep \.htm`; do perl -pi.'bak' -e
> 's/\n/<br>\n/' $i; done
>
> See what the find . -type f shows and you'll see what to grep in or out.
Thanks. Something to chew on.
>
> DOS filesystems don't have symlinks, files that look like directories,
> directories that look like files, or files that have undisplayable
> characters. Files that begin with a dash aren't nearly as bad...
> File names with spaces in them are no fun in the shell or in Perl.
>
DOS did in fact have undisplayable characters and spaces
and malformed names that could not be deleted with del.
I used to know a bunch of ways of dealing with them.
I shudder to think what you could do with Linux.
Symlinks are generally not a problem with mere batch processing of
lots of small organized directories of files. A general solution to
walking a tree might need to know that for future uses.
If a symlink, ignore it.
If a file, do something with it.
If a directory, open it.
If no more files or directories to find, exit the directory.
Close the directory.
This is where I need to go: basic low-level stuff, and
build back up, something like the sketch below.
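A rough Perl sketch of exactly that walk, using nothing but opendir,
readdir, and file tests; the starting directory and the per-file action are
placeholders:

#!/usr/bin/perl
use strict;
use warnings;

walk('.');    # start at the current directory

sub walk
{
    my ($dir) = @_;
    opendir(my $dh, $dir) or die "Can't open $dir: $!";
    for my $name (readdir $dh) {
        next if $name eq '.' or $name eq '..';
        my $path = "$dir/$name";
        next if -l $path;               # if a symlink, ignore
        if (-d $path) {                 # if a directory, open it and recurse
            walk($path);
        }
        elsif (-f $path) {              # if a file, do something
            print "would process $path\n";
        }
    }
    closedir($dh);                      # no more entries: close the directory
}

File::Find does this bookkeeping for you, but this is the low-level shape
of it.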
> I have found times when -R breaks. It isn't pretty and it always goes
> up the ladder to the sysadmin types to find a fix. Directories nested
> over several hundred deep for instance... rm -R * doesn't work
> I forget whether find worked on those or not, interesting exercise in
> any event.
I would bet the guy who wrote the -R switch didn't think there would ever
be hard disks big enough to nest directories that deep. Probably the
switch ran out of allotted resources somewhere along the way.
Of course, some of the blame goes to the guys who allowed a file system
that lets you break it like that.
This is one of my objections to Perl. Switches may break, but only in unusual
circumstances or in the hands of fools (writing and debugging functions as
root, for example). But more than that, scripts need debugging.
That is my problem: I wanted time-tested and well-understood basic
routines, well-tested wheels. Stuff that has stood the test of time,
with tested spokes. Little routines like "if this file belongs to root, do
not remove or change it." It would be nice to have stuff like that too.
>
> Phil
--
When I shake my killfile I can hear them buzzing.