
The Facebook TransCoder Explained: Converting Coding Languages with AI

by Louis Bouchard, April 15th, 2021

Too Long; Didn't Read

TransCoder, Facebook's new model, converts code from one programming language to another without supervision. It can take a Python function and translate it into a C++ function, and vice versa, without any prior examples. It understands the syntax of each language and can thus generalize across programming languages.


This new model converts code from one programming language to another without any supervision. It can take a Python function and translate it into a C++ function, and vice versa, without any prior examples. It understands the syntax of each language and can thus generalize to any programming language!

Watch the video

This week my interest was directed towards TransCoder, a new model by the AI team at Facebook Research. Ask any questions or leave remarks in the comments; I will gladly answer everything!

Subscribe to not miss any AI news and terms, clearly explained! Share this with someone who needs to learn more about Artificial Intelligence! Spread knowledge, not germs!

The TransCoder Paper: https://arxiv.org/abs/2006.03511

Video transcript

This new model converts code from one programming language to another without any supervision. It can take a Python function and translate it into a C++ function, and vice versa, without any prior examples. It understands the syntax of each language and can thus generalize to any programming language. Let's see how they did that.

[Music]

This is What's AI, and I share artificial intelligence news every week. If you are new to the channel and want to stay up to date, please consider subscribing to not miss any further news.

To understand this new model by Facebook, called the TransCoder, we first need to introduce what it is. In short, it's a transcompiler, meaning that it is a source-to-source translator which converts source code from a high-level programming language, such as C++ or Python, to another one.

Currently, this type of translation requires manual modifications to work properly. Because of the complexity of the task, it is hard to respect the target language's conventions when changing from a programming language A to a programming language B. Plus, since it is a complex task, it requires a lot of knowledge of both the target and source programming languages, which means a lot of examples and computation time.

This is why Facebook tried this new approach, using unsupervised machine translation to train this fully unsupervised neural transcompiler. The model was trained on open-source GitHub projects, mainly to translate functions between three programming languages: C++, Java, and Python. They achieved that using a sequence-to-sequence model with attention, composed of an encoder and a decoder with a transformer architecture, trained in an unsupervised way specifically on functions.

At first, they initialized the model with cross-lingual masked language model pre-training. This is done by randomly masking some of the tokens and training the TransCoder to predict these masked tokens based on their context, allowing the model to create high-quality sequence representations regardless of the programming language.
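The masking objective described above can be sketched in a few lines. This is a minimal illustration, not Facebook's implementation; the whitespace tokenization, mask symbol, and 15% default rate are assumptions for the sketch.

```python
import random

MASK = "<mask>"

def mask_tokens(tokens, mask_rate=0.15, seed=0):
    """Randomly replace a fraction of tokens with a mask symbol.

    Returns the corrupted sequence plus the positions and original
    values the model must recover -- the masked-LM training signal.
    """
    rng = random.Random(seed)
    corrupted, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            corrupted.append(MASK)
            targets[i] = tok  # model is trained to predict this token
        else:
            corrupted.append(tok)
    return corrupted, targets

# The same objective applies to any language's token stream, which is
# what makes the pre-training cross-lingual:
py_tokens = ["def", "add", "(", "a", ",", "b", ")", ":", "return", "a", "+", "b"]
corrupted, targets = mask_tokens(py_tokens, mask_rate=0.3, seed=42)
```

Because the objective only needs raw token streams, the same encoder can be pre-trained on C++, Java, and Python functions at once.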

Once the pre-training is done, the decoder is then trained, using the pre-trained TransCoder as input, to always generate valid code sequences even when fed with noisy data, increasing the encoder's robustness to input noise.
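The paper calls this step denoising auto-encoding: the input is deliberately corrupted and the decoder learns to emit the clean, valid sequence anyway. Here is a minimal sketch of that corruption; the specific noise model (token dropping plus a local shuffle) is an assumption for illustration.

```python
import random

def add_noise(tokens, drop_rate=0.1, shuffle_window=3, seed=0):
    """Corrupt a token sequence by randomly dropping tokens and
    locally shuffling the survivors. The decoder is then trained to
    reconstruct the original, valid sequence from this noisy input."""
    rng = random.Random(seed)
    kept = [t for t in tokens if rng.random() >= drop_rate]
    # Local shuffle: each token's sort key is jittered by less than
    # shuffle_window, so tokens only move a few positions.
    keyed = [(i + rng.uniform(0, shuffle_window), t) for i, t in enumerate(kept)]
    return [t for _, t in sorted(keyed)]

clean = ["int", "add", "(", "int", "a", ",", "int", "b", ")"]
noisy = add_noise(clean, seed=1)
# Training pair: noisy -> clean; the loss pushes the decoder to
# always produce the valid original sequence.
```

Training on (noisy, clean) pairs is what lets the decoder later cope with imperfect translations during back-translation.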

In order to translate functions from one language to another, we need to add this last part: back-translation, which is composed of two models, as you can see in this image. The target-to-source model is used to translate the target code into the source code, producing a noisy source sequence. This sequence is then used in the source-to-target model to reconstruct the target sequence from this noisy input.
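The back-translation loop described above can be sketched as follows. The two "models" here are trivial stand-ins (string rewrites), not real neural networks; the point is the data flow: the target-to-source model manufactures a noisy source sequence, which gives the source-to-target model a supervised pair to train on.

```python
def back_translation_step(target_fn, tgt2src, src2tgt_train):
    """One back-translation step:
    1. translate the target code into a (noisy) source sequence,
    2. use (noisy source, original target) as a supervised pair
       to train the source-to-target model."""
    noisy_source = tgt2src(target_fn)       # target -> noisy source
    src2tgt_train(noisy_source, target_fn)  # learn to reconstruct target
    return noisy_source

# Stand-in "models" for illustration only:
pairs = []
tgt2src = lambda cpp: cpp.replace("int ", "def ").replace(";", "")
train = lambda src, tgt: pairs.append((src, tgt))

noisy = back_translation_step("int add(int a, int b);", tgt2src, train)
```

Running both directions in alternation is what lets the system improve without ever seeing a human-written parallel pair.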

If this doesn't tell you anything, I invite you to check out the videos I made on attention and transformers, which are linked in the description below.

As the results show, there is still a lot of progress to be made in the field, but the best translation shows over 90% success, which is a huge jump in accuracy. Here are some examples of functions translated using the TransCoder. I invite you to pause the video and take a deeper look at these examples.
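The examples themselves are shown on screen in the video. As a stand-in, here is the kind of self-contained, function-level pair such a transcompiler targets; this pair is illustrative, not taken from the paper. The C++ translation is shown as a comment alongside a runnable Python original.

```python
def fibonacci(n):
    """Iterative Fibonacci -- the kind of small, self-contained
    function TransCoder translates between languages."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# A faithful translation preserves the semantics while adopting the
# target language's conventions, e.g. in C++:
#
#   int fibonacci(int n) {
#       int a = 0, b = 1;
#       for (int i = 0; i < n; i++) {
#           int t = a + b; a = b; b = t;
#       }
#       return a;
#   }
```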

With these improvements in translations, programmers will be able to easily improve and build robust cross-platform software, with little or no modifications to make to their code, using a single code base. The code and pre-trained models will be publicly available soon, as they said in the paper.

Of course, this was just a simple overview of this unsupervised code translator. I strongly recommend reading the paper, linked in the description, to learn more about it, and trying out the code whenever it's available. Leave a like if you went this far in the video, and since over 90% of you watching are not subscribed yet, please consider subscribing to the channel to not miss any further news, clearly explained. If you want to support the channel, I now have a Patreon linked in the description where you can…