• grrgyle@slrpnk.net · ↑2 · 43 minutes ago

    Hey I throw a /^regexp.*/ {print $NF} in there sometimes!

    …but yes, it’s mostly print $1—but only because I mix up the parameters whenever I try to use cut!
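
    For illustration, here’s roughly what both halves of that look like (app.log is a hypothetical input file):

    # pattern + action: print the last field of every line starting with ERROR
    awk '/^ERROR/ {print $NF}' app.log

    # the cut near-equivalent of awk '{print $1}': -d sets the delimiter, -f picks the field
    cut -d ' ' -f 1 app.log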

  • Lauchmelder@feddit.org · ↑48 ↓1 · 13 hours ago

    Why spend 30 seconds manually editing some text when you can spend 30 minutes cobbling together a pipeline involving awk, sed and jq?

      • Tangent5280@lemmy.world · ↑2 · 2 hours ago

        The important part is to learn the limits of any tool. Nowadays I no longer use jq for anything long or complicated. Filter and view data? jq is fine. Anything more and I just cook up a Python script.
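
        As a sketch of where that line sits (data.json and its fields are made up):

        # fine for jq: filter and project
        jq '.items[] | select(.status == "active") | .name' data.json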

          • Tangent5280@lemmy.world · ↑1 · 50 minutes ago

            How do you get complex data structures to work? I was alienated from scripting in zsh because I wanted something like a dict and realised I would have to write my own implementation. Is there a workaround for that?
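
            For what it’s worth, zsh does ship a native associative array via typeset -A, so no hand-rolled implementation is needed; a minimal sketch (names are made up):

            typeset -A latency             # declare an associative array (zsh builtin)
            latency[web01]=120
            latency[db01]=45
            for host in ${(k)latency}; do  # (k) expands to the keys
              print "$host -> $latency[$host] ms"
            done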

  • lime!@feddit.nu · ↑20 · 15 hours ago

    my favorite awk snippet is !x[$0]++ which is like uniq but doesn’t need the input sorted. basically, it’s equivalent to print_this_line = line_cache[$current_line] == 0; line_cache[$current_line] += 1; if $print_this_line then print $current_line end.

    really useful for those long spammy logs.
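
    Expanded into an explicit awk program (spammy.log is a placeholder), the idiom reads:

    awk '{
      if (seen[$0] == 0) print  # first occurrence of this exact line: print it
      seen[$0]++                # remember it so later duplicates are skipped
    }' spammy.log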

      • tal@lemmy.today · ↑10 · edited · 10 hours ago

        To be fair, a lot of programs don’t use a single-character delimiter and put multiple spaces between fields, and cut doesn’t collapse whitespace, so you probably want something more like tr -s " "|cut -d" " -f3 if you want behavior like awk’s field-splitting.

        $ iostat |grep ^nvme0n1
        nvme0n1          29.03       131.52       535.59       730.72    2760247   11240665   15336056
        $ iostat |grep ^nvme0n1|awk '{print $3}'
        131.38
        $ iostat |grep ^nvme0n1|tr -s " "|cut -d" " -f3
        131.14
        $
        
        • ThunderLegend@sh.itjust.works · ↑1 · 2 hours ago

          This is awesome! Looks like an LPI1 textbook. Never got the certification but I’ve seen a couple books about it and remember seeing examples like this one.

        • TechLich@lemmy.world · ↑5 ↓1 · 8 hours ago

          I never understood why so many bash scripts pipe grep to awk when regex matching is one of awk’s main strengths.

          Like… Why

          grep ^nvme0n1 | awk '{print $3}'

          over just

          awk '/^nvme0n1/ {print $3}'
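
          The pattern can even be anchored to a single field instead of the whole line, which avoids accidental matches elsewhere:

          iostat | awk '$1 == "nvme0n1" {print $3}'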

          • FooBarrington@lemmy.world · ↑6 · edited · 2 hours ago

            Because by the time I use awk again, I’ve completely forgotten that it supports this stuff, and the discoverability is horrendous.

            Though I’d happily fix it if ShellCheck warned against this…

    • Laurel Raven@lemmy.zip · ↑5 ↓1 · 9 hours ago

      This is definitely somewhere that PowerShell shines; all of that is built in and really easy to use.

      • Laser@feddit.org · ↑1 ↓1 · 3 hours ago

        People are hating on PowerShell way too much. I don’t really like its syntax, but it has a much better approach to handling data in the terminal. We have nu and elvish nowadays, but MS was really early with the concept, and I think they learned from the shortcomings of POSIX-compatible shells.

  • Ŝan@piefed.zip · ↑62 ↓19 · edited · 18 hours ago

    Ok, þe quote misplacement is really confusing. It’s

    awk '{print $1}'
    

    How can you be so close to right about þis and still be wrong?

    • hddsx@lemmy.ca · ↑14 ↓2 · 18 hours ago

      Who downvoted this? If you use awk, you know Sxan is using the correct syntax.

      • teft@piefed.social · ↑35 ↓4 · edited · 17 hours ago

        People have been downvoting him because he uses the letter thorn in his comments.

        Some people will hate on anyone different.

        • mexicancartel@lemmy.dbzer0.com · ↑12 · 16 hours ago

          I recently noticed many people on lemmy have that thing rn. Why are they using it? Is that an autocorrect thingy or something? I didn’t downvote them, but I hate seeing this. And it’s not just this letter.

          • Badabinski@kbin.earth · ↑15 · 16 hours ago

            I’m not using it because it would be extremely inconvenient for me, but I think that the English language deserves to have the thorn returned to it.

            • teft@piefed.social · ↑11 ↓2 · 15 hours ago

              The English alphabet needs to be completely redone. We should bring back thorn, eth, and wynn. We should also increase the vowels to actually represent the crazy number of vowel sounds we have; diphthongs are dumb. 5 vowels are not sufficient for 15+ phonemes.

            • scathliath@lemmy.dbzer0.com · ↑3 · 13 hours ago

              I used to use it for math notation, so I’d welcome its use again if I can keep using it as a placeholder for “then this happens” between steps of functions.

  • DreamButt@lemmy.world · ↑19 · 17 hours ago

    In all my years I’ve only used more than that a handful of times. Just don’t need it really

    Now jq on the other hand…

  • CubitOom@infosec.pub · ↑19 ↓1 · edited · 18 hours ago

    I’ve become a person that uses awk instead of grep, sed, cut, head, tail, cat, perl, or bashisms
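
    For the curious, a few of those stand-ins look like this (file is a placeholder):

    awk '/pattern/' file               # grep pattern file
    awk '{gsub(/foo/, "bar")} 1' file  # sed 's/foo/bar/g' file
    awk 'NR <= 10' file                # head -n 10 file
    awk 'NR > 5' file                  # tail -n +6 file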

  • pelya@lemmy.world · ↑10 ↓2 · 16 hours ago

    Everything you do with awk, you can do with Python, and it will also be readable.
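
    For a sense of the trade-off, the same first-field extraction both ways (the input is a placeholder):

    $ echo 'foo bar' | awk '{print $1}'
    foo
    $ echo 'foo bar' | python3 -c 'import sys; [print(l.split()[0]) for l in sys.stdin]'
    foo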

  • otacon239@lemmy.world · ↑15 · edited · 15 hours ago

    I used awk for the first time today to find all the MD5 sums that matched an old file I had to get rid of. Still have no idea what awk was needed for. 😅 All my programming skill is in Python. Linux syntax is a weak point of mine.

    • Ephera@lemmy.ml · ↑6 · 16 hours ago

      Probably the very same thing that the post talks about, which is extracting the first word of a line of text.

      The output of md5sum looks like this:

      > md5sum test.txt
      a3cca2b2aa1e3b5b3b5aad99a8529074 test.txt
      

      So, it lists the checksum and then the file name, but you wanted just the checksum.
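
      Piping it through the one-liner from the post gives exactly that:

      > md5sum test.txt | awk '{print $1}'
      a3cca2b2aa1e3b5b3b5aad99a8529074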

    • bulwark@lemmy.world · ↑6 · 17 hours ago

      I remember when I first stumbled across this manual: I was trying to look up a quick awk command and wound up reading the whole thing. It’s really one of the better GNU manuals.