Here is a transcript generated by Otter.ai of The Content Mix podcast episode about AI translation and dubbing tools:
Shaheen Samavati 0:02
Hi, everyone, Shaheen from VeraContent here with a special episode of the podcast. You've probably seen some of your favorite YouTubers and social media personalities using a tool that makes it look pretty convincingly like they're speaking another language. The tool they're most likely using is the AI video translation tool from HeyGen, which came out in August and has been rapidly expanding into more languages. And since at VeraContent our expertise is in adapting content for diverse international markets, we decided to analyze the output of this tool, which works in 13 languages. So far, we've analyzed the results in four of those languages: Spanish, German, Mandarin and English. And I have three expert linguists from our team here to share their insights. We've got Irene Zamora, who's going to review the Spanish; we have Lara Luig, who's going to review the German; and then we have Tom, from China, who's going to review the Mandarin. And as a native English speaker and CEO of VeraContent, I'll also be sharing my thoughts on the English output. So let's go right into the topic. I'd like to ask each of you what your first thoughts were when you saw the results from this tool, and we can start with Lara.
Lara Luig 1:12
I was really surprised, because I'd never had experience with translating something with the lip sync as well. I've only used tools like ChatGPT, as most people probably know, where I was already surprised that ChatGPT can help you out very much, especially in a marketing context. But I was really, really surprised about the lip sync and everything that HeyGen was able to do.
Shaheen Samavati 1:45
Yeah, absolutely. This is a new functionality. I think that's really the thing that's a breakthrough with this technology, because there are other dubbing tools that I was discovering, but not ones that actually move your mouth to match the speech. So Tom, what were your thoughts when you first saw this?
Liu Jian 2:07
Yeah, for me, also, when I saw that, I was stunned. It's really awesome. Because I've tried different kinds of AI: I use AI to translate, I use AI to create writing or pictures, I even tried to use AI to generate code. But this video thing is truly amazing, because it not only can get it translated into the local language, it can also dub it into the local language. When I watch the video, for me it seems like the foreigner knows my language: she knows Chinese, and she can speak it. So it looks real.
Shaheen Samavati 3:04
Yeah, absolutely. So we're going to get more into our analysis of how well it actually worked in each language. But before we do that, let's get Irene's first impressions.
Irene Zamora 3:14
Yeah, well, I actually already knew about these kinds of AI tools for videos. Just last week I was discussing this topic with my boyfriend, because he's into video filmmaking and so on. He was showing me, look what technology can do at the moment, and we were watching videos of, as you mentioned, famous people actually speaking whatever language, a few sentences each. So it was like, wow. And I was surprised about how good the lip movement, the lip sync, was, because we've heard AI tools with these kinds of robotic voices in TikTok videos, but this was actually quite natural, although we know these people don't speak those languages. And I can't judge for languages other than Spanish, right? But it was surprisingly good. So when you suggested, let's dive into it, let's analyze a person I know speaking other languages, now that I know how you speak, I know your normal voice, it was like going into a deeper analysis: how is this really happening, and what is the current state of development of the tool?
Shaheen Samavati 4:35
Yeah, so obviously, my perspective when I saw this was, as I said in the video: as the director of a marketing localization agency, should I be worried about this? Is this going to take away work from us? But I also see it as something really exciting, and an opportunity to learn how to use these tools to do even better work. So it obviously caught my attention. The technology is so realistic. Other things that I'd seen didn't speak to me as much, but when you see it with the voice clone, it actually sounds like the person, and the mouth is moving, so it really looks like they're speaking the language. It really got my attention. It was like, okay, this is something we need to look into further: the technology is here now; how can we maybe apply this or do something with it? So, well, I wanted to give some context about the test that we did, because basically we decided to do this about a week ago. What we did was: I recorded a one-minute video of myself reading a script, in English and in Spanish. We did it in two languages, which are the only two languages I speak, so we were limited to that. We did our own localization of the text from English to Spanish. We wanted to test the AI on how well it translated idioms, and how well it could adapt to the different tone and variation in my voice. So I tried to use a lot of expression, go up and down in the way I was speaking, and use expressions that machine translation would be unlikely to catch. And then, basically, we recorded this, and the tool was really easy to use. There's a beta version of it right now: it's a simple upload, then you wait a few minutes for processing, and you download the video. There is a free version that you can use, but you have to wait a really long time to download, and there's a limit on how much you can do. So we went ahead and got the paid version for this test, which is 59 euros a month. And then we created a survey for each of the linguists involved in this. Each of you filled out a survey, but maybe, Lara, you could share a little bit about what your approach was in doing this test.
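Editor's note: for readers curious how a test like this could be scripted at scale, here is a minimal Python sketch of the upload, wait-for-processing, download loop Shaheen describes. The base URL, endpoint paths, field names and response keys are hypothetical placeholders for illustration only, not HeyGen's actual API; check the vendor's documentation for the real interface.

```python
# Minimal sketch of an upload -> poll -> download dubbing workflow.
# NOTE: BASE_URL, endpoint paths, and response fields are hypothetical
# placeholders, NOT HeyGen's real API. Consult vendor docs before use.
import time
import requests

BASE_URL = "https://api.example-dubbing-tool.com/v1"  # hypothetical
API_KEY = "your-api-key-here"                         # hypothetical
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

def translate_video(path: str, target_lang: str) -> bytes:
    # 1. Upload the source video and request a target language.
    with open(path, "rb") as f:
        resp = requests.post(
            f"{BASE_URL}/translations",
            headers=HEADERS,
            files={"video": f},
            data={"target_language": target_lang},
        )
    resp.raise_for_status()
    job_id = resp.json()["job_id"]

    # 2. Poll until processing finishes (a few minutes in the test).
    while True:
        status = requests.get(
            f"{BASE_URL}/translations/{job_id}", headers=HEADERS
        ).json()
        if status["state"] == "done":
            break
        time.sleep(30)

    # 3. Download the dubbed, lip-synced result.
    return requests.get(status["download_url"], headers=HEADERS).content

if __name__ == "__main__":
    video = translate_video("test_video_1min.mp4", "de")
    with open("output_de.mp4", "wb") as out:
        out.write(video)
```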
Lara Luig 7:09
Yes. So at first, I watched the German AI-generated video without headphones, to see if I could recognize any errors, if I got any weird vibe, or if I could actually recognize that it's AI-generated. Then I went for my headphones to listen a bit deeper to the voice, and to the words, the pronunciation and everything. And then I did the same thing with the English version you did, to compare and see any differences. I hadn't heard your voice before, so I needed to hear your voice and the AI-generated voice as well, so I could compare them. And that was pretty much how I did it.
Shaheen Samavati 7:58
Okay, so did either of you take a different approach, or was it a similar process? Irene? Tom?
Irene Zamora 8:07
I just started with the original video first. Then, for the translation, I was watching the two side by side, clip by clip: now I hear this part, and now I hear this part. And then I did a last pass with the script you provided in English in front of me while I was watching the AI-generated video, so I could hear the translation and the choice of words in more detail.
Liu Jian 8:34
Yeah, I think for me, I used a similar approach. I didn't use any special techniques to analyze it.
Shaheen Samavati 8:45
Okay, yeah. I mean, that was kind of the idea: just your first impressions of how the results were. And then we all used the same rating system, where we rated a few different things, so we're going to go into those now. But I guess, overall, Tom, maybe you can share: what did you think the AI did well, and what could have been better?
Liu Jian 9:06
Yeah, I think what the AI did well is the translation. I think especially the word-by-word translation is pretty clear, and it is correct. As for the areas where the AI needs to improve: on the translation side, it's localization, or contextualization, because when I watched the video and listened, it sounded like machine translation. It doesn't sound that natural. So I think one part that can be improved is localization. And another one is emotions. I couldn't get the same emotions as in the original video, because when I watch the original English video, I can feel the excitement, but in the Chinese video I couldn't sense that feeling. And I think another challenging part, which maybe is specific to our language, Chinese, is tones. In English you don't have tones, but in Chinese we have four tones, so if you give the wrong pronunciation, it can mean different things. I think the tones might be a challenge for AI.
Shaheen Samavati 10:46
Did it actually make me say the wrong thing in Chinese? Like, were there any incorrect tones that changed the meaning?
Liu Jian 10:55
Yes, I think in some parts, because the tone was incorrect, I couldn't understand that part, so I didn't know what you were trying to say.
Shaheen Samavati 11:10
I see. Okay. And it wasn't possible to figure it out based on the context? Or were you able to kind of guess?
Liu Jian 11:18
Yes, I could guess. Yeah.
Shaheen Samavati 11:20
Okay. Yeah. Well, let's go a little bit deeper on the translation aspect. Irene, how accurate was the translation from English into Spanish? And how well did it translate expressions?
Irene Zamora 11:35
Yeah, well, I would say that, overall, the message was transmitted. I could say that what you were saying in English was basically said in Spanish. But where you were using, as Tom mentioned, emotions or inflections in your voice, it was more monotone, all of it. Then expressions, especially when you were being more colloquial: the recording was like you were speaking directly to the viewer, like a one-on-one conversation, so the tone was colloquial, and I think at that point the tool wasn't accurate at all. So in terms of accuracy, as this would be a transcreation sort of thing, because you're expressing emotions and speaking in a colloquial tone, I think the transcreation wasn't there. That being said, there were also a lot of omissions. I noticed that the AI tool in Spanish was omitting adverbs, words that the tool wasn't considering essential to the message. So it was making a summary of what you were saying. The summary is accurate, but it's not exactly what you said, and the little nuances that you add with the choice of adjectives or adverbs were lost, most of them. So you could understand the message, but you could clearly see this is not a person, because nobody speaks like that. The choice of expressions in Spanish was weird; nobody says that, actually, it's not a real expression that people use in Spanish. Although I must say, the output in Spanish has a Latin American accent, and I am a Castilian speaker. But it was strange, and I think I can say positively that not even in the international Latin American variant that is used in the news do they use those kinds of expressions when speaking naturally in a conversation.
Shaheen Samavati 13:47
It's interesting, the point about omissions, because I noticed that as well. And I'm not sure if this is the reason, but it seems like some of those sentences would have been very long in Spanish and maybe wouldn't have fit in the time period of the video. Basically, when you export the video, it has to be exactly the same length as the original. And I was actually comparing this tool with another dubbing tool called Rask, and in that tool, in the Spanish version, it actually sped up my speaking a lot. It was a better translation, I thought, but it made me speak very fast, I guess to fit in all the words that were needed to say the same thing in Spanish. So that's definitely a challenge in localization.
Irene Zamora 14:35
Added to that, in general, when translating from English into Spanish we get more text, because English is a more synthetic way of expressing things. So that can be a challenge too: either you speed up the person speaking, or you omit. And I must say that the tool was choosing the right words to omit, because the essential message was there anyway; it was omitting the words it considered not essential.
Shaheen Samavati 15:05
That's super interesting. So, Lara, would you like to comment more on the translation into German? How was the quality there? Did it omit things, did it make mistakes?
Lara Luig 15:20
Overall, it was pretty much okay. The translation was very good, and the feeling of the video was also pretty good, I would say. I just noticed something which kind of destroyed the whole thing; I would give it 100% if the beginning and the end were different. You started the original video with "Listen up," and this was translated word by word into German, and you would never say that. Everyone would recognize: okay, this is not German, it doesn't make sense at all. If you don't know what the person wants to say, or if you don't know the English version of the video, you probably wouldn't even get it, because it doesn't make sense at all. So the beginning was kind of weird, because it was translated word by word. Then everything was very good; I was actually shocked by how good it was. And then the end was rude, actually, because it was translated correctly, but the pronunciation of "tschüss," which is "bye" in German, was very rude. So you watch the whole video and you're like, okay, that's very nice, the person in the video is very friendly, and then it ends in such a rude way that you'd be like: okay, thank you? Sorry, what did I do wrong? You'd feel affected by it. So that's pretty much what I can say about the German: the translation overall was good, but the beginning and end should definitely be optimized.
Shaheen Samavati 16:56
That was something strange, actually, in both German and Spanish (and you have to tell me, Tom, if it was the same in Chinese): it added this "bye" at the end. But I didn't say that. I ended the video saying, you know, "let us know in the comments," but for some reason it said "bye" in Spanish and in German. How did it end the video in Chinese, Tom?
Liu Jian 17:20
In Chinese it's good. Like, it's good, it's complete.
Shaheen Samavati 17:26
Okay, it just said, "Leave us your comments." Yes. Yeah, interesting. So, different results.
Irene Zamora 17:34
And so rude, as I said, because suddenly you say it, like, very flat: "I don't care."
Shaheen Samavati 17:43
Yeah, like, "I'm tired. Bye." It was out of tone. Yeah, so not only were there omissions, there were also additions for no reason. So, yeah, just speaking in general about the feeling and the tone of the whole thing: like I was saying, I was trying to be very expressive in the video, and it didn't seem like it got that at all. It was very monotone, and it also made my face look serious, because the movement was kind of one expression on the face; I just looked forward the whole time. But speaking of that, how realistic did you guys think the lip movements were in each of your languages? Maybe starting with Tom: how was it for you?
Liu Jian 18:37
I think it's perfect. It looks really good. I mean, yeah, it's great, it's perfect. I didn't find any mismatch.
Shaheen Samavati 18:52
Okay, interesting. So it seems that's what this tool is really good at. Would you say the same, Lara? Were the lip movements accurate in your language?
Lara Luig 19:01
In German it was also pretty good. I just noticed that when you touched your lips somewhere in the middle, your fingers were blurred. So I was like, okay, that's something which needs to be optimized as well, because if you see that the fingers are blurred when you touch somewhere near your mouth, you'll recognize that it's not a person speaking there, that it's somehow edited. That was the only point I can mention here.
Irene Zamora 19:32
Yeah, the same as Lara said: if you didn't cover your mouth, then without the sound I would say that's you actually speaking. And I must say that the lip movement was matching the Latin American accent you had in that video. I could see differences, because I had time to compare with the Spanish video you recorded, so I could also see the differences between how you move your mouth speaking Spanish and how it moves with the Latin American accent. That was really on point.
Shaheen Samavati 20:03
Yeah, I should say also that in the English version I actually had two different options for the type of English. You could choose an American accent, or you could choose "your accent," which is a complete mystery to me, how they determine your accent, because in that video the source language was Spanish. So how do they possibly know what my accent is in English? But I tried it both ways, and, overall, the quality sounds similar to what you guys described in the other languages. But I think the accent was curious. First of all, the one that was supposedly American didn't sound American to me at all. I'm American, I have an American accent, and it made me sound totally different; it actually sounded really Australian in a few parts. I don't know if that had to do with the way the AI pronounced the vowels. And then I felt like the other one, the "your accent" version, sounded more like non-native English; it almost sounded like Indian English at some points, with the way it was pronouncing. So it was interesting, these two versions, and they also had some slight differences in the translation. Another step I took in my analysis was that I put the script through Google Translate and compared the Google translation to what the AI did. I would say, overall, it was about the same, but a little bit worse than Google Translate in some places; Google Translate actually did better with some of the expressions. But curiously, that beginning part, the "Listen up, this is wild" part, was difficult to translate, obviously. Well, okay, sorry, that was from Spanish to English: in the Spanish, where I say "qué fuerte," both Google Translate and one of the English versions changed it to "how strong," but the other one translated it to "how powerful," which is a little better, although not exactly what it means. It actually means something like "how crazy" or "how impactful," right? But yeah. So: interesting, similar, but different results across the languages. Maybe we can each just give a final summary, or anything else we wanted to note about how well the tool actually worked. We can start with Tom on that.
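Editor's note: a minimal sketch of how the side-by-side check Shaheen describes could be scripted, using Python's standard-library difflib to flag where the tool's output diverges from a Google Translate reference. The sentence pairs below are illustrative placeholders, not the actual test script.

```python
# Sketch of the baseline comparison: score how closely the dubbing
# tool's translation tracks a Google Translate reference, so a linguist
# knows which lines to inspect first. Sentences are made-up examples.
from difflib import SequenceMatcher

tool_output = [
    "Qué fuerte, esto es una locura.",
    "Déjanos saber en los comentarios.",
]
reference = [  # e.g. the same lines run through Google Translate
    "Qué fuerte, esto es salvaje.",
    "Háznoslo saber en los comentarios.",
]

for tool_line, ref_line in zip(tool_output, reference):
    # ratio() returns 0.0-1.0; lower values mean bigger divergence.
    ratio = SequenceMatcher(None, tool_line, ref_line).ratio()
    print(f"{ratio:.2f}  tool: {tool_line!r}  ref: {ref_line!r}")
```

A low score does not mean the tool is wrong, only that the two translations differ enough to deserve a human look.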
Liu Jian 22:32
Yeah, generally speaking, I think it works pretty well, because, as we just talked about, the lip movements work pretty well. But definitely there is some room for improvement. I just mentioned how the AI could express the real feelings behind the words; and also, as I mentioned, for Chinese, because we have different tones, how can we make the AI pronounce the right tone? And we also mentioned the omission part: how could we avoid that problem? So, in my opinion, it's good, and it's pretty positive. I think we can definitely use it to improve our work.
Shaheen Samavati 23:52
Do you think that if I had recorded a video that was very clear and easy to translate in the first place, using really common terminology (because obviously, when I was making this video, I was purposefully trying to trip up the AI), it would have done a better job and come up with a result that was easier to understand?
Liu Jian 24:12
Yeah, I think so. It will be better. Yeah.
Shaheen Samavati 24:18
Um, okay, well, we'll go to Lara. What are your overall thoughts on the output?
Lara Luig 24:24
Yeah, I agree here as well. I think it's possible to use it. I would be very curious to see someone watch the video who doesn't know that it's AI, and do the rating and everything afterwards, because it would be interesting to see whether they'd actually recognize it. Of course, we were watching the video in a lot of detail, and we knew that it was AI, so it's harder to say; it's a different way of watching the video, you know what I mean? What I said earlier is very important: that it doesn't end rudely. I would say we can use the video, but if we were to send it to a client or something like that, or use it, for example, on the website, we'd really need to cut out the end, because people would be very offended, especially Germans, because Germans are like this. If they watch a video which ends rudely, they won't be happy about it, and then they won't like you, for example, but it's not your fault at all; they just don't know that it's AI-translated. So that's something that would be very helpful. And also, maybe start with an easier, common sentence, like "hey." As for "wild": we actually use the word in German, Gen Z people say "wild," the German version of it, but you'd probably use it differently from the way it was used in the video.
Shaheen Samavati 26:04
Okay, yeah. I mean, it's true, even in English that's not the most common way you would say that; you'd probably say "how crazy" or "this is crazy," or something like that. But yeah. Irene, your thoughts on the output in Spanish?
Irene Zamora 26:22
If you want to use this kind of video for a fast-paced environment, like social media posting, quite fast and so on, it can work, because, as you said, it's a tool that's pretty easy to use, and quite fast too. However, we need to take into account what you're transmitting in the video. So I would say it will work better, as you suggested, if the video uses short sentences, something more neutral or even formal, like a corporate style. That will work better than trying to use this tool for a video that is more colloquial, that is expressing emotions, that has a lot of differences in intonation, a varied tone. Because I think all of that gets quite lost in it, yeah.
Shaheen Samavati 27:17
Exactly. So if you want to get a straightforward message across, with a simple, straightforward translation where it looks like you're saying it, this tool can do that. So that leads to the next question: what applications could there be for this? Where do you see potential for using it? And maybe you guys could also share at this point a little bit more about what each of you does, what kinds of translations you usually work on, or what kinds of content projects, and whether you see potential for this in the kind of work that you do. We can start with Irene here.
Irene Zamora 27:53
Well, as far as videos go, I've worked more in subtitling; it's a work of translation, transcribing it into subtitles so you can maintain the style and the tone of the speaker. I've done that for more formal videos, like ones from universities presenting the study plan of a program, and for things a little more informal, more conversational. I would say this could be used, but, in my case for example, I would only recommend using it for a Latin American market. Because if you publish this and your target market is Spanish from Spain, people are going to be quite shocked watching the video: suddenly you're like some kind of weird Mexican person. You just don't relate as much when someone is speaking to you with an accent that is not your own; you'll always connect more if you feel like the video is really addressing you. So for a Latin American audience it could be good. I'd have to speak with Latin American colleagues to be 100% sure, but my recommendation would be: okay, you can use this kind of tool, but only for Latin America, because, as far as I have seen, they don't have the option for Castilian Spanish.
Shaheen Samavati 29:31
I imagine it's something that they will eventually do, or try to do. But definitely there's something left to be desired on the different variants. They do have the two variations of English, and, like I said, the American English doesn't sound American. So I'm sure that's something they're actively working on and going to make better. But yeah.
Irene Zamora 29:47
I would like to add that this could be used, as was mentioned before, if there were an option to edit it and give your thoughts, because the tool is choosing which words to omit, how to shorten sentences and so on. If you could have the input of a human translator doing that, I think the output would be much better. So if the tool, if these AI tools, add this feature, I think it could be used much more in actual translation work.
Shaheen Samavati 30:31
And that's something I'd like to investigate more, other tools, because, as we said, we did this kind of quickly, from last week to this week, so we were really only looking at HeyGen. But in the meantime I did also have a quick look at some other tools, and one that stood out to me is one called Rask AI, which actually says they're either about to add, or have already added, this lip movement thing as well. HeyGen is actually a tool that was doing avatars and voiceovers, and they've just expanded into this language thing, but Rask actually started out as a dubbing tool, so I feel like they're a little bit more advanced on the technology for actually cloning the voice and doing the dubbing. I tested it with the video, but in the test I did, it didn't do the lip thing; it just did a dubbing. And I actually liked the results more, because you can still see my expression in the way that I'm talking, and you're only hearing the voice in the other language. I could see more applications for that in some of the kinds of videos that we typically do subtitles for. Most of our clients are only asking us for subtitles, so we're just adding subtitles and the video stays in the original language, but if it were very easy to dub it, I could see a lot of applications for that. And I think it's more honest, actually, to leave the video without the lips moving, so you see the real person talking. Even though the lip moving thing is really cool, there's this uncanny valley thing about it; it just seems weird and fake. So I don't know if it's necessarily something you'd always want to use. There could be some cool applications, though. I'm curious to hear what you guys think the applications could be. Maybe Lara, if you have any ideas?
Lara Luig 32:17
Yeah. So I basically work on brand marketing in the German market through organic social media, and I actually see the value here as well, especially for personal branding. It would be nice, because there are a lot of influencers doing separate German and English accounts, because they can reach so many more people and build a community all around the world, and it's more personal if they're speaking the same language as their community. And a lot of big influencers, for example English, American or Russian ones, I saw were starting to adapt their original content, which was, for example, English, to Russian, and then localize it to Italian as well. But I think it's very hard to produce all that content, because you need to find a person who has a similar voice to yours, or at least a voice and pronunciation you like; then they have to do the translation and then speak for you in the video. And even then the lip sync still isn't there. So I think this could actually be a breakthrough in social media, if you want to localize your social media accounts, which goes for brands as well: if you're doing a US account and then different European accounts, you always have someone doing the content in the local language, and that could be something where the AI tool can really help out. Absolutely.
Shaheen Samavati 33:55
Yeah, Tom, do you have any thoughts on how you think this technology could be used?
Liu Jian 34:00
Yeah, I'm thinking about my own case, because I also translate videos. I'm thinking about one of my customers: normally he sends me English videos, and I translate the scripts into Chinese and put the subtitles on the video. Afterwards, he gives the video and the scripts to another Chinese person, and that person has to do the dubbing. So it's a little bit complicated. I'm just thinking that if I mention this AI to him, it could save a lot of work, because, while it's not perfect, in the middle maybe I can do some fine-tuning, make some adjustments to make it better. I definitely think it would improve the speed of the work.
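Editor's note: the subtitling step in the pipeline Tom describes is straightforward to automate. Here is a minimal Python sketch that writes translated, time-coded segments out as a standard .srt file; the segment data is hypothetical example content.

```python
# Sketch of the subtitling step in the pipeline Tom describes:
# write translated, time-coded segments out as a standard .srt file.
# The segment data here is hypothetical example content.

def to_timestamp(seconds: float) -> str:
    """Format seconds as the SRT timestamp HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

segments = [  # (start_sec, end_sec, translated text) - hypothetical
    (0.0, 2.5, "大家好，欢迎收看本期视频。"),
    (2.5, 5.0, "今天我们来聊聊人工智能配音。"),
]

with open("subtitles_zh.srt", "w", encoding="utf-8") as srt:
    for i, (start, end, text) in enumerate(segments, start=1):
        # Each cue: index, time range, text, then a blank line.
        srt.write(f"{i}\n{to_timestamp(start)} --> {to_timestamp(end)}\n{text}\n\n")
```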
Shaheen Samavati 35:10
Absolutely. Actually, a couple of other things I learned in my research on this topic. One is that YouTube has announced that they're incorporating this kind of technology into their platform: they have some startup that they bought that does this, and they're going to integrate it into YouTube. And something I've seen a lot on YouTube is people having different channels in different languages, but on their main channel they have, say, 1,000 videos, while on the other-language channel they only have 10, because, like Lara was explaining, it takes a lot of resources and time to dub it into those languages, make it match and upload it. Obviously, if that whole process can be done at least semi-automatically, it's going to open up access to this content in different languages a lot more. And I don't know if I mentioned this before, but one more thing I wanted to say about this Rask tool is that you can edit it, and I think that's really cool. The interface was really well done: it actually gives you the transcript side by side, here's the original thing you said as an auto transcript, and next to it the translation, and you can actually change the translation so that the output is what you want. You say "redub," and it does it over again. So it looks like that technology is coming, and it's going to be editable. And Tom said something that's in line with thoughts I've had on this, because obviously it's concerning; I think the first reaction oftentimes is fear: oh, is this going to take away the work of translators or interpreters? But think back to when machine translation came out: there was a lot of fear around that as well, and in my opinion it hasn't actually taken work away; it's actually created work. But it has completely transformed and changed the way that we do translation now. Every translation tool has machine translation built in, and it's a tool that every translator uses now. I personally think it has made the work more fun, because we get to work on the creative problems that only humans can solve, and not on the tedious work of just changing one word into another. But what do you guys think about the future of our industry, of translation and interpretation? Do you think it's going to change for the better or for the worse?
Irene Zamora 37:39
Every change brings something good and something that maybe we don't appreciate so much. And this is probably going to stay, and it's going to develop even further. So you can resist the change, you can criticize it and be against it and so on, or you can say: okay, this has come to stay, so how can I integrate it into my work? How can I adapt what I do to work with these new technologies, machines, etc.? That's basically what I'm doing at the moment. All these new tools are entering the market, and I think they're going to be more and more present. I've seen AI applications in so many different things. I've seen AI tools to create your weekly menu plan, to cook and do the shopping; it's crazy. You want a list of books, and you go to ChatGPT and say, okay, give me books on this topic that I can read, blah, blah, blah, and you get it. So this has come to stay, and it's going to be more and more in our daily life, and so in our professional life too. So I keep thinking: how can I retrain? How can I adapt? How can I work with the machine, with the AI, instead of resisting it? Because it doesn't make sense; in the end it's going to stay. As you mentioned, machine translation in the beginning was a huge wow, and everybody was against it, and now most of us work with it on a daily basis; you just work more on the transcreation side of the chain. Basically, with these AI tools it's going to be the same. I don't see this working 100% autonomously, without any human check at any point, at least not anytime soon. So you can work on that: okay, what can I add to the AI's work as a professional linguist?
Shaheen Samavati 39:55
And I think another parallel with machine translation is that, when machine translation was introduced, it led to a proliferation of translation: things being translated that had never been translated in the past, because people didn't have the resources to do it. So that created more work in editing those translations, and I'm sure something similar could happen here, where there's going to be a need to revise and check the results of these AI tools. Tom, your thoughts on this?
Liu Jian 40:22
Yeah, for me, I like it, because I like high technology. So if there's any kind of high technology we can use to improve the quality of our work, I'd like to use it. And also, I don't think this AI will take the place of our position, because AI is not perfect; it still needs to be improved. And there are certain areas where you, as a translator, need to jump in: when to check for quality, when to fine-tune it and make it acceptable and make it better. So, in my own opinion, I like it, and I want to use it more often if possible.
Shaheen Samavati 41:22
And Lara, what do you think the future holds?
Lara Luig 41:25
Yeah, I can pretty much just agree, in my opinion. I mean, I sometimes use AI to help me find content ideas, or to optimize a content strategy, or something like that. But you can't count on AI 100%. For example, if I translate something with Google Translate from English to German, you can't just use it; you have to adjust it as a human. It's the same if you, for example, write a copy with ChatGPT: you can't use the copy 100% as ChatGPT gives it to you; you'll have to adjust it. For example, if it includes hashtags, they're probably not the hashtags that are viral on Instagram at the moment, so it's not necessarily useful to include them, and you'll have to do extra hashtag research, which ChatGPT isn't 100% doing for you. The same goes for Google Translate: you can't translate with Google Translate, copy, paste, and then upload it somewhere, because people will really recognize that it's 100% Google Translate. So I think it's nice to have something like this for getting ideas, for brainstorming, to help you out sometimes. But for now it's definitely a must that a human does a check afterwards and adjusts it in the perfect way.
Shaheen Samavati 42:54
Yeah, that's a good point that extends to all the AI tools, not only the dubbing tools. So, we're about at time to wrap up the conversation, but I wanted to give each of you a final chance to share any last thoughts on this overall conversation, anything you wanted to add. And we can go in reverse order, back to Lara, if you have any final takeaways.
Lara Luig 43:18
Yeah, I think the final takeaway for me is actually that I would love to try out the dubbing tool you used for this conversation, because I think it's different if you're watching yourself in the dubbed AI version. It would be interesting, because I think you can analyze the video even more, since you know yourself better: you know your pronunciation, your lip sync, and the feeling you're transmitting to the people. So a takeaway for me would be to check it out, try it, and see what the results are if I'm the one doing the video.
Shaheen Samavati 43:56
Absolutely. Tom?
Liu Jian 43:59
Yeah, for me it's the same. I would also like to test different kinds of dubbing tools, because, as I just mentioned, one of my customers constantly sends me videos to get translated. So if I have a good recommendation, maybe I can mention it to him as a potential customer.
Shaheen Samavati 44:25
Definitely. And actually, I wanted to mention that we've been using AI tools for transcription for quite a while now at VeraContent, so this could be a next step from there. It has helped greatly; we did transcription before those tools existed, and it was always a very tedious job, so it's something that has really made the job a lot easier, and I think it has improved the quality of our transcriptions, of course always checked by a human. Anyway, I'll leave the final word to Irene. Any final takeaways?
Irene Zamora 45:01
Definitely to check out the other tool you mentioned, Rask, because I want to see how it works when a human can intervene before the final result. I think that combination will be, I don't know if perfect, but it will be optimal: combining what you can do automatically, with the lip sync and the dubbing, with the human touch to adapt it.
Shaheen Samavati 45:28
Yeah, and it will be really interesting to see the additional functionality that all these tools come up with, because when I was trying to research this, there were so many startups doing different related things that it's actually really difficult to test all of them. But I'm sure we're going to see the winners, I guess, end up being incorporated into tools that we use every day, like YouTube or social media platforms, or video software and so forth. So it'll be really exciting to see that. Well, thank you all so much for taking part in this conversation. I really appreciate your feedback, and I'm sure people will find this really valuable; I think it was a unique chance to hear firsthand perspectives from people who really know about language and about working in these different languages. So thank you all. And I wanted to let everybody know that if you want to see all the videos we translated using this HeyGen video translation tool, you can check out our blog post that is going live on veracontent.com/mix. We've embedded all the videos there, in all 13 languages, and we've also put up our full written analysis. And if you'd like to get in touch with feedback or topic ideas for future episodes of the podcast, feel free to reach out at mcs@veracontent.com. So thanks again, everybody, and see you soon.
Transcribed by https://otter.ai