GitHub AI programmer

I saw this https://twitter.com/github/status/1409883156333879300?s=21 and for me this was the news of the day.

I’m feeling so many things at once seeing this tweet.

1: gut reaction: this is bullshit / not valuable.
2: finally programmers will know why “test automation” is a shitty term now that “programming automation” is here. It’s about thinking!
3: who tests this and how?

What are you thinking and feeling about this?

2 Likes

Why would it be BS? Autocomplete and macros have been a thing for a while, and now they’re just getting smarter. 90% of code is copy and paste; you pay devs to know which code to copy.

3 Likes

I agree with @crunchy. I think it’s a good thing; a lot of code is already auto-generated or copy-pasted from Stack Overflow :stuck_out_tongue:

I don’t see how it would be bullshit, since it makes our lives easier. Bill Gates likes to hire lazy people just because they get the job done faster. They will definitely use this! Haha.
I would say don’t hate it until you’ve tried it.

:thinking: :man_shrugging:

Apparently it’s from OpenAI: “Trained on billions of lines of public code”

My gut reaction to many things is “this is bullshit!”; that’s why I wrote it down as a note :joy:
I’m just extremely sceptical by nature, can’t really help it.

It’s not based on nothing, though. AI and “AI” do a lot of harm in the world, so that’s the first thing my gut screams whenever I see a new “AI solution” being brought into the world.

I’m going to follow this thing and see what people say about it! Maybe I can try it out myself too.

Just saw it’s trained only on public projects, so yeah, I’m interested in how many bugs it’s going to reproduce.
Since it’s trained on code that contains bugs, it’s only natural that it will produce bugs.

How about security? :stuck_out_tongue:

FYI I made a Racket on it, as an answer to @maguay’s Volley: What do you think about GitHub Copilot? | Racket

EDIT: found another Copilot Racket, by @neil: What will GitHub Copilot mean for the future of testing? | Racket

3 Likes

Ha Ha!

Good to know your bullshit-o-meter 2000 is still working well after TestBash Home. We all need to stop saying AI when we really mean ML. I was just reading an article about SoftBank canning a product that was selling “AI” human-like empathic robots, probably because the robots were not empathic enough, but mostly because nobody has a need for them at this time. I can see how this peaked too early for them: RIP Pepper robot? SoftBank ‘pauses’ production - BBC News. These days, every time I see “AI”, my brain reads it as “ML”.

1 Like

OpenAI’s GPT-3 model already writes code from a text description, so this isn’t so far-fetched.

I have no idea how good GPT-3’s code output is, but the worries above about bugs and security could lead to some very secure jobs for testers!
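
To make that concrete: the demos people are sharing mostly look like comment-to-code completion, where you write a plain-English comment and the model fills in the function underneath. A rough, hypothetical sketch (illustrative only, not actual GPT-3 or Copilot output):

```python
# Prompt: a plain-English comment describing what we want.
# The lines below are the kind of completion such a model might produce
# (hypothetical illustration, not real GPT-3/Copilot output).

# Return only the email addresses in `text` that use the given domain.
import re

def emails_with_domain(text: str, domain: str) -> list[str]:
    pattern = r"[A-Za-z0-9._%+-]+@" + re.escape(domain)
    return re.findall(pattern, text)

print(emails_with_domain("ann@test.io, bob@example.com", "example.com"))
# ['bob@example.com']
```

Whether a completion like that handles the edge cases you actually care about is exactly the testing question.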

2 Likes

There we go :stuck_out_tongue:

May I be the first to say, “Oh dear…”?

2 Likes