Rejected Translations III
Okay, the whining continues. The sentence is "Was mir sehr wurscht ist.", which is basically expressing approximately the sentiment of "I don't really give a damn.", but I don't feel comfortable translating it exactly that way, so I go with "I couldn't possibly care less.". No! You suck, chubbard! 17%! Go away! The best translation to date? "I really do not care what is." Yeah, much better. Not only does the English sound completely native, it does so while pretty much expressing the original sentiment exactly (or not). That's awesome! At least I have no way to see what the other translations were (if there were others), and no way to assess them. Oddly this is not really encouraging me to expend further time and effort on translations. Why bother? Let's just hand the keys over to Google and call it a day.
Well, this sentence isn't a full sentence, just a subordinate clause. So it is actually incorrect in German anyway, and should complete a German sentence after a comma ("..., was mir sehr wurscht ist."). This translates literally to "..., which is all the same to me." And that isn't exactly what I would call good English.
I think your translation is best!
Remember that that "17%" isn't actually grading your sentence. It's just comparing it to the sentences that have already been entered. If there's only 1 or 2 to compare to, your percentage may be very low, even if it's correct (or uses synonyms, for example). But now that your sentence is in there, when someone ELSE enters a sentence like yours, their percentage will be much higher, because the bots will be able to reference that many people seem to agree on this alternative wording.
Also, do you do a lot of rating of other people's sentences? That will help a lot too. The bots can only use the info they have.
Once you rate the "best" sentence, you can click to see other people's.
elae, thank you for the response. You've put to rest one of my biggest concerns, which was that if my candidate is rejected by the system (that is to say, if it scores so low that it's not acceptable to fulfill the lesson challenge), it simply gets deleted. But from what you're saying, it sounds like that's not the case, which is great news!
As for voting, yes, if I am reasonably confident about my own translation, I take the time to vote on (and perhaps suggest edits to) EVERY available translation candidate. After all, like you point out, if people don't rank the submissions, there is no way for the good ones to float to the top. On the flip side, if I am not very confident in my translation, I typically won't vote. I figure if I'm not sure I understand the source material, I'm not competent to offer an opinion (although I may still reject those that are obviously from a machine translator and make no sense in English).
My other concern is this, though. In the example I gave where I was ranked at 17%, I wasn't even given the opportunity to rank the other submissions, even though, in this particular case, I was pretty sure of my own translation. It's possible that's because I was doing the translation as part of a lesson challenge, but when I went back to the translation from the "Translation" page, I still couldn't see or vote on any other submissions. If the system works that way, that's a problem, because it means people with more appropriate (albeit currently low-rated) translations will never have the opportunity to vote, and improved translations will never surface.
I fear that's already happening. I was looking at some articles this afternoon via the new translations interface, and I noticed that, for the particular article I was reviewing, most of the sentences were marked with a green 100% circle. I take this to mean that the translation is "closed". That is, the system feels the current best is good enough. The problem is, of the many sentences that were so marked, a few of them were written in poor English. I think the ideas they expressed were proper, it's just that they were often expressed in a way that no native English speaker would ever actually say. For some of these, I tried to rank submissions, but that functionality seemed unavailable (although it appeared that I could still suggest an edit for the "best").
Yeah, I do the same thing when voting. If I'm confident, I rate a lot of the submissions, but if I'm less than confident about one of my translations, I'll only rate sentences that I feel do really poorly (because even if I'm not confident in my own, I can identify ones that have really bad grammar and vote them down).
I totally agree that the 100% circle is problematic. Doing a translation seems to fill it by 5%... which means only 20 translations are taken into account?! That seems really really low if you want to crowdsource a strong translation, especially since no one else can vote on them.
I came across a sentence that was closed already. Perhaps enough more or less identical translations had been given. BUT most of these translations are absolutely unacceptable. I reported it to Duolingo to get this fixed.
I don't think Duolingo is doing a good job choosing texts, because they are often either blogs in modern spoken German (which differs from the written language to a certain degree, as in English) or much older texts. The Fables were probably written down in the 19th century or earlier, so the use of certain words doesn't match modern (written and spoken) usage in many cases. For instance, the verb "witzigen" is not in the standard German dictionary because it hasn't been used for decades. Related words like "Witz" and "witzig" are still used nowadays, in the sense of something funny. And this leads to misinterpretation of the text, which Duolingo compounds by giving the wrong translation hint.