@cm0002@lemmy.world to Programmer Humor@programming.dev • 1 month ago
Like programming in bash (lemmy.ml) • 185 comments • 1.69K upvotes, 18 downvotes • cross-posted to: programmerhumor@lemmy.ml
@perishthethought@lemm.ee • 46 points • 1 month ago
I don’t normally say this, but the AI tools I’ve used to help me write bash were pretty much spot on.
@marduk@lemmy.sdf.org • 20 points • 1 month ago
Yes, with respect to the grey-bearded uncles and aunties: as someone who never “learned” bash, in 2025 I’m letting an LLM do the bashing for me.
@SpaceNoodle@lemmy.world • 40 points • 1 month ago
Until the magic incantations you don’t bother to understand don’t actually do what you think they’re doing.
@embed_me@programming.dev • 41 points • 1 month ago
Sounds like a problem for future me. That guy hates me lol
@MBM@lemmings.world • 14 points • 1 month ago
I wonder if there’s a chance of getting rm -rf /* or zip bombs. Those are definitely in the training data, at least.
@furikuri@programming.dev • 3 points • 1 month ago
The classic rm -rf $ENV/home, where $ENV can be empty or contain spaces, is definitely going to hit someone one day.
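The pitfall in the comment above can be demonstrated safely. This is a hypothetical sketch that substitutes echo for rm, so nothing is actually deleted:

```shell
#!/usr/bin/env bash
# Demo of the unset/empty-variable pitfall; echo stands in for rm,
# so this is safe to run.
unset ENV

# Unquoted and unset, $ENV expands to nothing, so the command
# becomes "rm -rf /home".
echo rm -rf $ENV/home

# ${ENV:?} makes the expansion fail loudly when ENV is unset or empty,
# and the quotes keep a value containing spaces as a single argument.
( echo rm -rf "${ENV:?}/home" ) 2>/dev/null || echo "refused: ENV is unset or empty"
```

Adding set -u (or set -euo pipefail) at the top of a script makes any reference to an unset variable abort immediately, which catches this whole class of bug before rm ever runs.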
@arendjr@programming.dev • 10 points • 1 month ago (edited)
In fairness, this also happens to me when I write the bash script myself 😂
@kameecoding@lemmy.world • 2 points • 1 month ago
Yes, I never wrote a piece of code that didn’t do what I thought it would before LLMs, no sir.
@SpaceNoodle@lemmy.world • 17 points • 1 month ago (edited)
Yeah, an LLM can quickly parrot some basic boilerplate that’s shown up in its training data a hundred times.
@henfredemars@infosec.pub • 6 points • 1 month ago
For building a quick template that I can tweak to my needs, it works really well. I just don’t find bash to be an intuitive scripting language.
@ewenak@jlai.lu • 1 point • 1 month ago
When (not if) the script gets too complicated, AI could also convert it to Python. I tried it once at least, and it did a pretty good job, although I had to tell it to use some dedicated libraries instead of calling programs with subprocess.
Yeah fuck that guy
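On the bash-to-Python point above, the difference between a literal translation that shells out and one that uses dedicated libraries looks roughly like this (a sketch with made-up file names; the subprocess variant assumes a POSIX cp on PATH):

```python
import pathlib
import shutil
import subprocess
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    src = pathlib.Path(tmp) / "a.txt"  # throwaway example file
    src.write_text("hello")

    # A literal bash translation keeps shelling out, e.g. for `cp`:
    subprocess.run(["cp", str(src), str(pathlib.Path(tmp) / "b.txt")], check=True)

    # The dedicated-library version the commenter asked for instead:
    shutil.copy(src, pathlib.Path(tmp) / "c.txt")

    print(sorted(p.name for p in pathlib.Path(tmp).iterdir()))
    # ['a.txt', 'b.txt', 'c.txt']
```

The shutil/pathlib version is portable and raises ordinary Python exceptions on failure, while the subprocess call silently depends on an external cp binary, which is exactly the kind of half-converted script the comment describes.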