- cross-posted to:
- programmerhumor@lemmy.ml
The older you get, the more things are like programming in bash.
Meh. I had a bash job for 6 years. I couldn’t forget it if I wanted to. I imagine most people don’t use it enough for it to stick. You get good enough at it, and there’s no need to reach for python.
I used PowerShell, and even after trying every other shell, as a die-hard Linux user I’ve considered going back to PowerShell, because damn, man.
I am a huge fan of using PowerShell for scripting on Linux. I use it a ton on Windows already and it allows me to write damn near cross-platform scripts with no extra effort. I still usually use a Bash or Fish shell but for scripting I love being able to utilize powershell.
Yeah. The best way to write any `bash` script is: `apt/yum install powershell; pwsh script.ps1`
I still have to look up the exact syntax of ifs and whiles.
I’ve coded in bash for a while
To be honest, I agree, and I thought we would be using something more intuitive by now.
Everything is text! And different programs output in different styles. And certain programs can only read certain styles. And certain programs can only convert from some into others. And don’t get me started on `IFS`.
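(For anyone who hasn’t been bitten yet, a tiny made-up illustration of why `IFS` gets that reaction:)

```bash
#!/usr/bin/env bash
# Unquoted expansions get split on whatever IFS currently contains.
files="a.txt b.txt c.txt"

printf '%s\n' $files    # default IFS: splits on whitespace, prints three lines

IFS=":"                 # change the field separator...
printf '%s\n' $files    # ...and the exact same line now prints a single line

unset IFS               # restore the default behaviour
```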
I think the cool kids are using Nu now.
No, Makefile syntax is more extreme.
I swapped from Make to Just: https://github.com/casey/just
Way better, IMO. Super simple logic, just as flexible.
I find `Makefile` isn’t too bad, as long as I can stay away from `automake` and `autoreconf`.
Sure, but bash is more relatable, I think.
I relate to this so much! Replying in French for my first reply on Lemmy, just to see how it gets handled!
Realizing now that language selection is mainly for filtering. It would be cool if it auto-translated for people who need it.
Yes, I understand it too, so what’s the need to comment in English all the time?
Ever since I switched to Fish Shell, I’ve had no issues remembering anything. Ported my entire catalogue of custom scripts over to fish and everything became much cleaner. More legible, and less code to accomplish the same things. Easier argument parsing, control structures, everything. Much less error prone IMO.
Highly recommend it. It’s obviously not POSIX or anything, but I find that the cost of installing fish on every machine I own is lower than maintaining POSIX-compliant scripts.
Enjoy your scripting!
It’s the default on CachyOS and I’ve been enjoying it. I typically use zsh.
If you’re going to write scripts that require installing software, might as well use something like Python though? Most Linux distros also ship with Python installed.
I love fish, but sadly it has no proper equivalent of `set -e` as far as I know. `|| return;` in every line is not a solution.
I switched to fish a while back, but haven’t learned how to script in it yet. Sounds like I should learn.
I’ve been meaning to check out `fish`. Thanks for the reminder!
PSA: Run ShellCheck on your shell scripts. It turns up a shocking number of programming errors. https://www.shellcheck.net/
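(For anyone who hasn’t tried it, a made-up example of the kind of thing it flags; the rule number and wording are from memory, so check against the real output:)

```bash
#!/usr/bin/env bash
# cleanup.sh (hypothetical): shellcheck flags the unquoted expansion below
# as SC2086 ("Double quote to prevent globbing and word splitting").
target=$1
rm -rf $target/cache   # if $target is empty, this silently becomes "rm -rf /cache"

# Checking the file is just:
#   shellcheck cleanup.sh
```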
Thank you for this. About a year ago I came across ShellCheck thanks to a comment just like this on Reddit. I also happened to be getting towards the end of a project which included hundreds of lines of shell scripts across dozens of files.
It turns out that despite my workplace having done quite a bit of shell scripting for previous projects, no one had heard about ShellCheck. We had been using similar analysis tools for other languages but nothing for shell scripts. As you say, it turned up a huge number of errors, including some pretty spicy ones when we first started using it. It was genuinely surprising to see how many unique and terrible ways the scripts could have failed.
I wish it had a more comprehensive autocorrect feature. I maintain a huge bash repository and have tried to use it, and it commonly makes mistakes. None of us maintainers have time to rewrite the scripts to match standards.
I honestly think autocorrecting your scripts would do more harm than good. ShellCheck tells you about potential issues, but it’s up to you to determine the correct behavior.
For example, how could it know whether `cat $foo` should be `cat "$foo"`, or whether the script actually relies on word splitting? It’s possible that `$foo` intentionally contains multiple paths.
Maybe there are autofixable errors I’m not thinking of.
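(Concretely, both of these are plausible, and only the author knows which was meant; paths invented for illustration:)

```bash
#!/usr/bin/env bash
# Case 1: $foo deliberately holds several paths, so word splitting is wanted.
foo="/etc/hosts /etc/hostname"
cat $foo          # cat receives two arguments

# Case 2: $foo is one path that happens to contain a space, so quoting is required.
foo="/tmp/my notes.txt"
cat "$foo"        # cat receives exactly one argument
```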
FYI, it’s possible to gradually adopt ShellCheck by setting `--severity=error` and working your way down to warnings and so on. Alternatively, you can add one-off `# shellcheck disable=SC1234` comments before offending lines to silence warnings.

> For example, how could it know whether `cat $foo` should be `cat "$foo"`, or whether the script actually relies on word splitting? It’s possible that `$foo` intentionally contains multiple paths.

Last time I used ShellCheck (yesterday, funnily enough) I had written `ports+=($(get_elixir_ports))` to split the input, since `get_elixir_ports` returns a string of space-separated ports. It worked exactly as intended, but ShellCheck still recommended making the splitting explicit rather than implicit. The ShellCheck docs recommended `IFS=" " read -r -a elixir_ports <<< "$(get_elixir_ports)"` followed by `ports+=("${elixir_ports[@]}")`.
Then you’ll have to find the time later when this leads to bugs. If you write against bash while declaring it POSIX shell, but then a random system’s `sh` doesn’t implement a certain thing, you’ll be SOL. Or what exactly do you mean by “match standards”?
Unironically love powershell
For a de facto Windows admin, my PowerShell skills are… embarrassing lol, but I’m getting there!
When I was finishing off my degree at Uni, I actually spent a couple of months as an auxiliary teacher giving professional training in Unix, which included teaching people shell scripting.
Nowadays (granted, almost 3 decades later), I remember almost nothing of shell scripting, even though I’ve stayed on the Technical Career Track doing mostly Programming since.
So that joke is very much me irl.
Every control structure should end in the backwards spelling of how it started.
Once you get used to it, it is kind of fun.
Shame about `do` though. It could have been `not`, since there’s no `try`.
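(For reference, a minimal sketch of the pattern, and the `do`/`done` exception to it:)

```bash
#!/usr/bin/env bash
# if closes with fi, case closes with esac...
if [ -f /etc/os-release ]; then
    echo "found os-release"
fi

case "${1:-}" in
    start) echo "starting" ;;
    *)     echo "unknown command" ;;
esac

# ...but loops close with done rather than od.
for n in 1 2 3; do
    echo "$n"
done
```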
And I thought I was the only one… For smaller bash scripts, ChatGPT/DeepSeek does a good enough job at it. Though I still haven’t tried VS Code’s Copilot on bash scripts. I have only tried it with C code and it kiiiinda did an ass job at helping…
AI does decently enough on scripting languages if you spell it out enough for it lol, but IMO it tends to not do so well when it comes to compiled languages
I’ve tried Python with VScode Copilot (Claude) and it did pretty good
Yeah, I tried that, Claude with some C code. Unfortunately the AI only took me from point A to point A. And it only took a few hours :D
That’s because scripted languages are more forgiving in general.
I was chalking it up to some scripting languages just tending to be more popular (like python) and thus having more training data for them to draw from
But that’s a good point too lol
Both can be true, Python does have a lot of examples floating online.
That’s why I use nushell. Very convenient for writing scripts that you can understand. Obviously, it cannot beat Python in terms of prototyping, but at least I don’t have to relearn it every time.
So the alternative is:
- either an obtuse script that works everywhere, or
- a legible script that only works on your machine…
I am of the opinion that production software shouldn’t be written in shell languages. If it’s something which needs to be redistributed, I would write it in python or something
I tend to write anything for distribution in Rust or something that compiles to a standalone binary. Python does not an easily redistributable application make lol
Yeah but then you either need to compile and redistribute binaries for several platforms, or make sure that each target user has rust/cargo installed. Plus some devs don’t trust compiled binaries in something like an npm package
For a bit of glue, a shell script is fine. A start script, some small utility gadget…
With python, you’re not even sure that the right version is installed unless you ship it with the script.
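(A rough sketch of the kind of guard that concern pushes into a launcher script; the script name and minimum version here are made up:)

```bash
#!/usr/bin/env bash
# run.sh: refuse to start unless a suitable python3 is on PATH.
if ! command -v python3 >/dev/null 2>&1; then
    echo "error: python3 not found" >&2
    exit 1
fi

if ! python3 -c 'import sys; sys.exit(0 if sys.version_info >= (3, 10) else 1)'; then
    echo "error: python3 >= 3.10 required" >&2
    exit 1
fi

exec python3 "$(dirname "$0")/main.py" "$@"   # main.py is hypothetical
```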
I try to write things to be cross-platform; with node builds, I avoid anything using shell scripting so that we can support Windows builds as well. As such, I usually write the deployment scripts in Node itself, but sometimes python if it’s supported by our particular CI/CD pipeline
I keep forgetting windows exists.
Most common development platform in the world
I quit using it in the WfW days and never looked back.
> a script that only works on your machine
That’s why docker exists :D
Ruby, and calling bash like this: `` `cat a.txt` ``
Nu is great. Been using it for many years. Clearly a superior shell. The only problem is that it constantly faces breaking changes, so you need to frequently update your modules.
Not a problem for me in Nix, seems like a skill issue /j
They’ve slowed down with those a bit recently, haven’t they?
Yesterday, I upgraded from `0.101.0` to `0.102.0`, and `date to-table` was replaced with an equivalent (actually better) `into record`; however, it was not documented well in the error. I had to research for 5 to 10 minutes, which does not sound like much, but if you get this every second version, the amount of time adds up quickly.
It had actually been deprecated beforehand; you should have gotten a warning. The deprecation cycle certainly is quite short. I’m still on `0.100.0`; if I were to upgrade now, I’d jump over the version with the warning.
Yes, I switched to an older version and there was the warning. However, there was no warning on `0.101.0` whatsoever, so upgrading just one patch version broke my master module.
Sometimes I skip some versions, so I am certain that I jumped from < `0.100.0` straight to `0.101.0`, and here we are, without any deprecation warning.
Not really. They’ve been on the stabilising path for about two years now, removing stuff like dataframes from the default feature set to be able to focus on stabilising the whole core language, but 1.0 isn’t out yet and the minor version just went three digits.
And it’s good that way. The POSIX CLI is a clusterfuck because it got standardised before it got stabilised. `dd`’s syntax is just the tip of the iceberg there; you gotta take out the nail scissors and manicure the whole lawn before promising that things won’t change.
Even in its current state it’s probably less work for many scripts, though. That is, updating things, especially if you version-lock (hello, nixos), will be less of a headache than writing `sh` could ever be. nushell is a really nice language, occasionally a bit verbose, but never in the boilerplate-for-boilerplate’s-sake way; rather, in the “in two weeks I’ll be glad it’s not perl” way. Things like command line parsing are ludicrously convenient (though please, nushell people, land collecting repeated arguments into lists).
Fully agree on this. I’m not saying it’s bad. I love innovation, and this is what I love about Nushell. Just saying that using it at work might not always be the best idea. ;)
We have someone at work who uses it and he’s constantly having tooling issues due to compatibility problems, so… yeah.
I’m sure it’s fine for sticking in the shebang and writing your own one-off personal scripts, but I would never actually main it. Too much ecosystem relies on bash/posix stuff.
It seems like it does stuff differently for the sake of it being different.
It’s more like bash did it one way and everyone who came after decided that was terrible and should be done a different way (for good reason).
Looking right at you, `-eq`, and your weird-ass syntax: `if [[ $x -eq $y ]]`
That was the point where I closed the bash tutorial I was on and decided to just use Python and `subprocess.run()`.

> `-eq`

Yeah, it’s like an infix operator, so it sits between the operands, but it’s dashed like a flag, so it looks like it should come before the arguments. Very odd.
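(For comparison, the two ways bash spells a numeric equality check; values invented:)

```bash
#!/usr/bin/env bash
x=3
y=3

# test/[[ ]] style: the operator is a dashed word between the operands.
if [[ "$x" -eq "$y" ]]; then
    echo "equal (test syntax)"
fi

# Arithmetic style: (( )) accepts the more familiar == instead.
if (( x == y )); then
    echo "equal (arithmetic syntax)"
fi
```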
You better not look at PowerShell in that case :p