Topic: Lex and Yacc fried my brains

I've been writing a configuration parser with Lex and Yacc.

Not only have I learnt a lot, but I have also come to realise just how powerful they are. I'm not saying I've learnt all there is about them or all the features they have to offer; in fact, I'd say I've only touched on the most minimal features. However, I'm seriously thinking I'll never manually write a config parser again.

On the downside, it has seriously fried my brains trying to work out how to move the data around and how to get it from the config file into memory.

For completeness, I used this guide: http://tldp.org/HOWTO/Lex-YACC-HOWTO.html
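
To give an idea of the scale of thing I'm talking about: a "key = value" style config only needs a grammar about this size. This is just a stripped-down sketch in the spirit of that HOWTO, not my actual parser, and the token names are made up for illustration (values here have to be quoted strings or plain integers):

    /* config.l -- tokenizer for "key = value" lines */
    %{
    #include <stdlib.h>
    #include <string.h>
    #include "y.tab.h"
    %}

    %%
    [A-Za-z_][A-Za-z0-9_]*  { yylval.str = strdup(yytext); return KEY; }
    \"[^\"]*\"              { yylval.str = strdup(yytext); return VALUE; }  /* quotes kept for simplicity */
    [0-9]+                  { yylval.num = atoi(yytext); return NUMBER; }
    "="                     { return '='; }
    \n                      { return '\n'; }
    [ \t]+                  ;   /* skip blanks */
    #.*                     ;   /* skip comments */
    %%
    int yywrap(void) { return 1; }

    /* config.y -- grammar: a config file is a list of "key = value" lines */
    %{
    #include <stdio.h>
    #include <stdlib.h>
    int yylex(void);
    void yyerror(const char *s) { fprintf(stderr, "parse error: %s\n", s); }
    %}

    %union {
        char *str;
        int   num;
    }

    %token <str> KEY VALUE
    %token <num> NUMBER

    %%
    config:   /* empty */
            | config line
            ;

    line:     '\n'
            | KEY '=' VALUE '\n'  { printf("string option %s -> %s\n", $1, $3);
                                    free($1); free($3); }
            | KEY '=' NUMBER '\n' { printf("numeric option %s -> %d\n", $1, $3);
                                    free($1); }
            ;
    %%

    int main(void) { return yyparse(); }

Build it with "yacc -d config.y && lex config.l && cc y.tab.c lex.yy.c -o configtest". The printf calls are where my real code stuffs the values into a struct -- which is exactly the part that fried my brains.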

"UBER" means I don't drink the coffee... I chew the beans instead
             -- Copyright BSDnexus

Re: Lex and Yacc fried my brains

They are indeed very powerful. However, to truly appreciate the power of Lex and Yacc, you have to try building a lexical analyzer (what Lex builds) and a parser (what Yacc builds) by hand. That exercise is usually part of a compiler construction class (typically a senior-level computer science course).
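
To give a feel for the difference: even a toy hand-written scanner that only recognizes identifiers, integers, and single-character operators turns into explicit character-by-character bookkeeping. This is just a rough sketch (not from any real compiler) of what the by-hand version of a couple of Lex rules looks like:

    #include <ctype.h>
    #include <stdio.h>

    /* Toy hand-written scanner: reads stdin and reports identifiers,
       integers and single-character tokens.  A sketch only -- a real
       scanner also tracks line numbers, keywords, errors, etc. */
    enum token { TOK_IDENT, TOK_NUMBER, TOK_OP, TOK_EOF };

    static char lexeme[256];

    static enum token next_token(void)
    {
        int c = getchar();

        while (c == ' ' || c == '\t' || c == '\n')   /* skip whitespace */
            c = getchar();

        if (c == EOF)
            return TOK_EOF;

        if (isalpha(c) || c == '_') {                /* [A-Za-z_][A-Za-z0-9_]* */
            int i = 0;
            while (isalnum(c) || c == '_') {
                if (i < (int)sizeof lexeme - 1)
                    lexeme[i++] = (char)c;
                c = getchar();
            }
            lexeme[i] = '\0';
            if (c != EOF) ungetc(c, stdin);          /* push back the lookahead */
            return TOK_IDENT;
        }

        if (isdigit(c)) {                            /* [0-9]+ */
            int i = 0;
            while (isdigit(c)) {
                if (i < (int)sizeof lexeme - 1)
                    lexeme[i++] = (char)c;
                c = getchar();
            }
            lexeme[i] = '\0';
            if (c != EOF) ungetc(c, stdin);
            return TOK_NUMBER;
        }

        lexeme[0] = (char)c;                         /* anything else: one-char token */
        lexeme[1] = '\0';
        return TOK_OP;
    }

    int main(void)
    {
        enum token t;
        while ((t = next_token()) != TOK_EOF)
            printf("token %d: %s\n", (int)t, lexeme);
        return 0;
    }

A Lex specification replaces all of that with two regular expressions and generates the table-driven automaton for you, and the gap gets even wider once you compare a hand-written recursive-descent parser with a Yacc grammar.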

Re: Lex and Yacc fried my brains

I've not used Yacc or Lex, but your post does touch on something I've wondered about for some time. That is, the old Unix tools (including sed, awk, grep and friends, troff and its preprocessors, and so on) are really wonderful, powerful tools for those who know a little programming. Are we really any better off these days with all of the WIMP stuff? Certainly the web has changed things a bit, and Unix is not really aware of that, but for mundane computer "stuff", have things really improved much at all?