I’ve been trying nushell and words fail me. It’s like it was made for actual humans to use! 🤯 🤯 🤯

It even repeats the column headers at the end of the table if the output is taller than your screen…

Trying to think of how to do the same thing with awk/grep/sort/whatever is giving me a headache. Actually just thinking about awk is giving me a headache. I think I might be allergic.

I’m really curious, what’s your favorite shell? Have you tried other shells than your distro’s default one? Are you an awk wizard or do you run away very fast whenever it’s mentioned?

  • Obin@feddit.org · 3 months ago

    I’m really curious, what’s your favorite shell?

    Emacs eshell+eat

    It essentially reverses the terminal/shell relationship: here it's the shell that starts a terminal session for each command. Eshell is also tightly integrated with Emacs and has access to all of its extended functionality. You can use Lisp in one-liners, pipe output directly into an Emacs buffer, and write custom commands as Lisp functions; you also get full shortcut customization not limited to terminal keys, history search via the completion framework (i.e. consult-history), easy prompt customization, etc.

    There’s also Tramp, which lets you transparently cd into remote hosts over ssh, Docker containers, SMB/NFS shares, and archive files, and work with them as if they were normal directories (obviously with limited functionality in some cases, like archives).

    And probably a lot of stuff I’m missing right now.

  • priapus@piefed.social · 3 months ago

    I love Nushell, it’s so much more pleasant for writing scripts IMO. I know some people say they’d just use Python if they need more than what a POSIX shell offers, but I think Nushell is a perfect option in between.

    With a Nushell script you get types, structured data, and useful commands for working with them, while still being able to easily execute and pipe external commands. I’ve only ever had two very minor gripes with Nushell: the inability to detach a process, and the lack of a -l flag for cp. Now that uutils supports the -l flag, Nushell support for it is a WIP, and I’ve realized systemd-run is a better option than just detaching processes when SSH’d into a server.
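
    For instance, a made-up helper like this (the name and the choice of git are purely illustrative) combines a typed signature with an external command in one pipeline:

        def newest-tags [repo: path, n: int = 3] {
            # external git emits plain text; lines/first turn it into structured data
            ^git -C $repo tag --sort=-creatordate | lines | first $n
        }

    Call it as newest-tags ~/src/nushell and you get back a real list of strings instead of a blob of text.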

    I know another criticism is that it doesn’t work well with external CLI tools, but I’ve honestly never had an issue with any. A ton of CLI tools support JSON output, which can be piped into the from json command to make working with them in Nushell very easy. Simpler tools often just output a basic table, which can be piped into detect columns to turn it into a Nushell table automatically. Sometimes strange formatting will make this a little weird, but fixing that formatting with some string manipulation (which Nushell also makes very easy) is usually still easier than trying to parse it in Bash.
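
    To make that concrete (curl and df are just stand-ins for whatever tool you’re actually wrapping, and the GitHub URL is only an example):

        # a tool that emits JSON: parse it and pull a field straight out
        curl -s https://api.github.com/repos/nushell/nushell | from json | get stargazers_count

        # a tool that emits a plain text table: let detect columns turn it into a real one
        df -h | detect columns | where Filesystem =~ "^/dev"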

  • communism@lemmy.ml · 3 months ago

    Nushell looks cool but I prefer to stick with the POSIXes so that I know my scripts will always work and the syntax always does what I expect it to. I use zsh as a daily driver, and put up with the various bashes, ashes, and dashes that come pre-installed on systems I won’t be using loads (e.g. temporary VMs).

    • nimpnin@sopuli.xyz · 3 months ago

      Always confuses me when people say this. You can use multiple different shells / scripting languages, just as you can use multiple programming languages.

  • esa@discuss.tchncs.de · 3 months ago

    I’ve been using fish (with starship for prompt) for like a year I think, after having had a self-built zsh setup for … I don’t know how long.

    I’m capable of using awk, but only in a very simple way; I generally prefer being able to use jq. IMO both awk and Perl are sort of remnants of the age before JSON became the standard text-based structured data format. We used to have to write a lot of dinky little regex-based parsers in Perl to extract data. These days we’re likely to get JSON and can operate on actual data structures.

    I tried nu very briefly but I’m just too used to POSIX-ish shells to bother switching to another model. For scripting I’ll use #!/bin/bash with set -eou pipefail but very quickly switch to Python if it looks like it’s going to have any sort of serious logic.

    My impression is that there are likely more of us who’d like a less wibbly-wobbly, better shell language for scripting purposes, but that efforts to design such a language very quickly go in the direction of nu and oil and whatnot.

    • Overspark@piefed.social · 3 months ago

      nu’s commands also work on JSON, so you don’t really need jq (or xq or yq) anymore. It offers a unified set of commands that work on almost any kind of structured data.
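
      For example (the file names and the name/version columns are made up here), the exact same pipeline works whether the input happens to be JSON, YAML or CSV:

          open deps.json | where name =~ "^lib" | select name version
          open deps.yaml | where name =~ "^lib" | select name version
          open deps.csv  | where name =~ "^lib" | select name version

      open parses the file based on its extension, and from there on it’s all just Nushell tables.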