… I've been very happy with Llama 3 as an artificial intelligence Best Friend Forever for a few days now: … I do see it makes mistakes, but, as a newbie, it helps me quickly link together some notions that were hard for me 😋👍.
“Learning how to learn” is an important skill that I think people overlook. It’s important to know how to get information and understand things when the AI isn’t helpful.
My worry isn’t about “AI taking our jobs”, it’s that people will not learn the skills and abilities to progress beyond junior levels due to relying on AI.
… Or maybe that’s just me being a grumpy old fogey.
Out of interest, what did you have trouble understanding? I’m always curious about how people learn programming and how resources can be improved.
Years ago I was developing some programs to help me in my laboratory, and I wanted those programs, running as different processes, to communicate rapidly with each other.
One way I thought would be fast was to share a block of memory between them, but I could not figure out how to do it … so instead, I used files (and/or pipes).
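(For what it's worth, that shared-memory idea is much easier to pull off these days. Here's a minimal sketch using Python's `multiprocessing.shared_memory` module from the standard library; the block name `lab_results` and the sizes are just made up for illustration, and in practice the writer and reader would be two separate scripts.)

```python
# Minimal sketch of sharing a block of memory between processes, using
# Python's multiprocessing.shared_memory (Python 3.8+). The name
# "lab_results" and the 1024-byte size are invented for illustration.
from multiprocessing import shared_memory

# Writer side: create a named block of shared memory and put some bytes in it.
block = shared_memory.SharedMemory(name="lab_results", create=True, size=1024)
block.buf[:5] = b"42.7\n"            # e.g. a measurement, written as raw bytes

# Reader side: attach to the same block by name.
# (Shown inline here for brevity; normally this runs in a second process.)
other = shared_memory.SharedMemory(name="lab_results")
print(bytes(other.buf[:5]))          # b'42.7\n'

# Cleanup: every process closes its handle; the creator unlinks the block once.
other.close()
block.close()
block.unlink()
```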
I would urge caution in using it. I already see too many of my juniors starting to rely on it.
The problem (when it works perfectly, and it doesn't all the time) is that you don't learn anything from using it. You aren't learning why it picked that particular way to write a specific piece of code. You can ask it to explain, but you have to go out of your way to do that in standard GPT; in VS Code it won't. This is incredibly important when trying to get code to fit into an organization's or team's codebase, where there are already standard patterns and concerns about things like runtime or scalability. You need to understand, line by line, what that code is doing.
So, I won’t say don’t use it, but don’t depend on it. Learn why it made the choice it did, and dig deeper into the ramifications. Ask it for alternatives and why it chose that one. Ask it for runtime information, how it performs at scale, if it’s concurrent and thread safe, anything you can. I use it to help me think outside the box, and it’s great at that, but I wouldn’t want an engineer working for me who didn’t understand what the code was doing.
Just as long as you can tell whether it's holding your hand or leading you by the nose.
I find it helpful for doing small things in archaic languages, or to wrap my head around something abstract where googling it just takes me to pages where someone already asked the question and got "lol why are you doing that" as a response.
For automating things you already know, it's useful; I would be wary of using it to learn new stuff.
If it gives you a correct answer: you learn nothing, since you just rely on it to do the work for you.
If it gives you a completely wrong answer: you learn nothing, have to do the work again, and risk learning the wrong stuff.
If it gives you a slightly wrong answer: you learn something wrong without realizing it, and you may keep using it, producing wrong results or passing on the wrong things you learned.
So I wouldn’t use it to learn.
Half of programming is copying and pasting, so yeah, probably.
With an error rate of 52%? Nope.
I’m not a dev, but I do some scripting. Mostly in a Windows environment professionally, Linux for personal use.
That being said, AI is great for getting the rough outline of a script done, and you just gotta tweak a few things.
You still have to know what you’re doing, but it expedites things.