Grep For Knowledge, Grep For Industry

Posted Saturday, 06 January 2007, 4:48 pm

I’ve seen a number of stories about standard Unix commands promoted to the front page of digg.com. It’s a peculiar phenomenon, at least to me. Most of the articles have been straightforward explanations of how to use these commands in completely "normal" ways, rather than in novel or innovative ways. I’ve been using Unix since 1986, and have worked as a systems administrator since 1994, so perhaps I’m jaded. The popularity of Linux has vastly increased the audience for Unix over the years, and that means there are a LOT of inexperienced people out there who use a Unix-like OS but have little experience with what was formerly the bailiwick of experts.

That introduction tendered, I’d like to share a mildly novel use of grep that I’ve employed for years. I make no claim that it’s truly innovative. However, it is an approach I don’t often see used, and even a casual understanding of it shows it has broader application.

So, on with it!

grep is a standard Unix command, included with virtually every flavor of Unix and Unix-like OS. There are variants, of course; GNU grep has considerably more functionality than the default grep one would find in Solaris a decade ago, for example. This example depends upon the GNU version of grep.

One task a systems administrator performs regularly is checking his systems for ‘unusual’ processes that might suggest something is running on the system that shouldn’t be. Whether it’s an unexpected instance of ‘w’ being run, or ‘rlogin’, or some such—it’s handy to be able to see those oddball apps with as few distractions as possible.

If I run ps -ecf on my Solaris server, I’ll get a listing of every process running at that moment (I prefer the POSIXly functioning ps to the BSDish invocation, which would typically be ps -aux). The problem is, there are a hundred or so applications listed that I know should be running, so I don’t care terribly much about them. I’m interested in the few commands scattered through the output that should not be running.

Culling those unwanted entries from the list is easy enough to do by piping the output through several grep -v invocations, for example,

ps -ecf | grep -v httpd | grep -v imapd

and so on. The problem, of course, is that you want to cull dozens of listed applications, maybe a hundred, and that can make for an awfully ungainly pipeline!

There’s an easy option to grep, though, that simplifies the matter considerably: the -f option. -f tells grep to read its patterns from a file, rather than taking them on the command line. In the file, you list, one entry per line, all the apps that you don’t want listed. So, for example, you create a file called curiouspat (or any name to your liking, of course), and in it you list

httpd
imapd
tcsh
mysqld

and so on. You then invoke your pipeline like this:

ps -ecf | grep -v -f curiouspat

and now you’ll get the same output as a long pipeline would produce. The nice side-benefit is that it’s repeatable—you don’t have to keep typing out some ungainly pipeline for all those commands you don’t want to see—and you don’t have to remember them either.
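Keeping the list current is just as painless. When some new daemon becomes part of the normal furniture, appending it to the pattern file is a one-liner (squid here is only a stand-in for whatever you’ve decided is routine):

echo squid >> curiouspat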

To simplify things further, you can dump the above line into an executable file (I call mine, naturally enough, "curious") so it’s even easier to run at any time. Obviously, your pattern file will likely be quite long, but that’s what makes using a pattern file so much easier than manually pipelining multiple grep -v’s together.
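To spell it out, here’s a minimal sketch of what my "curious" wrapper amounts to; I’ve assumed here that the pattern file sits in my home directory, so point the path wherever you actually keep yours:

#!/bin/sh
# curious: show processes that don't match any of the known-good patterns
ps -ecf | grep -v -f $HOME/curiouspat

Make it executable with chmod +x curious, drop it somewhere on your PATH, and the whole check becomes a single short command.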

So that’s my tip. Not groundbreaking. Not crazy-innovative. But it’s a handy method to keep in your toolbag.

