I'm seeking to run googletrans to translate a series of 300 .txt files in a single folder. I'm struggling to construct a loop that will allow me to run a translation of each file and write the output in new .txt files. Googletrans has a limit on bulk translations, so I'm happy to limit the iterations to 50 files at a time.
Here's the code for translating a single file. It prints the original txt file, then the translated file, and finally outputs the file into a new txt file.
from googletrans import Translator

# Forward slashes avoid '\t' in 'Translation Project\trans_...' being read as a tab escape
with open('Translation Project/page_323.txt', 'r', encoding='utf-8') as f:
    contents = f.read()
print(contents)

translator = Translator()
result = translator.translate(contents, dest='en')
print(result.text)

with open('Translation Project/trans_page_323.txt', 'w', encoding='utf-8') as f:
    f.write(result.text)
Any thoughts? New to Python and still wrapping my head around loops.
Assuming that there are 999 pages, that the source files are named page_1.txt rather than page_001.txt, and that the first page is page 1, not page 0:
from googletrans import Translator

translator = Translator()
for page_number in range(1, 1000):  # pages 1 through 999 inclusive
    with open(f'Translation Project/page_{page_number}.txt', 'r', encoding='utf-8') as f:
        contents = f.read()
    print(contents)
    result = translator.translate(contents, dest='en')
    print(result.text)
    with open(f'Translation Project/trans_page_{page_number}.txt', 'w', encoding='utf-8') as f:
        f.write(result.text)
This doesn't limit how many files are translated per run, but you can do that by narrowing the range (e.g. range(1, 51) for the first 50 pages) or by splitting the page numbers into batches.
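If you want to honor the 50-files-at-a-time limit explicitly, one option is to slice the page numbers into fixed-size batches and translate one batch per run. Here's a minimal sketch; the `chunked` helper and the batch size are my own, not part of googletrans:

```python
def chunked(items, size):
    """Yield successive lists of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

pages = list(range(1, 1000))        # page numbers 1..999
batches = list(chunked(pages, 50))  # 20 batches: 19 of 50 pages, plus one of 49

# Translate one batch at a time, e.g. the first:
# for page_number in batches[0]:
#     ... open, translate, and write, as in the loop above ...
```

You could also make reruns resume automatically by skipping any page whose trans_page_N.txt already exists (e.g. with os.path.exists) before translating it.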