Suggestions for Poor "Final" Translations
I came across a translation today that was marked with a green circle indicating 100% complete. The translation was not exactly wrong, but it wasn't exactly right either. Among other things, an instance of "sie" that should have been translated as "she" was translated as "it". Also, in general, the English was awkward, and certainly not phrased in a way that a native English speaker would have actually spoken. Out of curiosity, I gave Google a shot at it, only to discover that this, presumably the VERY BEST translation, was EXACTLY the version spit out by Google.
That doesn't bode well for this site's main objective, and personally, I think the problem has a lot to do with people not voting. Clearly this site can't be successful if nobody translates, but it also cannot be successful if nobody takes the time to rank the hopefuls. I propose the following changes as a possible solution. Feel free to tear this apart or suggest modifications or additions.
1) Don't accept new translations into the database until the submitter has also ranked at least 10 other submissions (10 is an example number, adjust as you feel is appropriate). The idea here is to force more people to rank. Without real people competent in English ranking the submissions, we're nowhere.
2) Of those 10, make the first 3 the three translations currently ranked the highest. Make the next 7 the translations with the fewest votes. The point here is to make sure everyone has a chance to vote down Google dross that has temporarily been rated undeservedly high, and to make sure that, over time, ALL translations, even latecomers, get voted on about the same number of times. (Again, 3 and 7 are example numbers; use a different ratio if that makes more sense.)
3) Once a translation has received a certain number of bad votes (say 5 red circles), toss that one out. In this way, over time, clearly bad translations go away and the overall fitness of the remaining population goes up.
4) Don't close a translation as complete (100%) after a fixed number of submissions (maybe that isn't even currently done, I can't tell). Instead, use a different metric, such as "the total number of votes must be X, and the ratio of good to kinda-good votes (that is, green to yellow) must be at least some predetermined ratio (4:1? I don't know)."
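To make point 2 concrete, here's a rough Python sketch of how the 10-item rating queue could be assembled. Everything here is my own assumption, not anything the site actually does: the field names (`score`, `votes`), the data shape, and the 3/7 split are all just placeholders.

```python
def pick_review_queue(translations, top_n=3, low_n=7):
    """Build the rating queue from point 2: the top_n highest-ranked
    submissions, plus the low_n submissions with the fewest votes.
    'score' and 'votes' are hypothetical fields; 3 and 7 are example numbers."""
    by_score = sorted(translations, key=lambda t: t["score"], reverse=True)
    top = by_score[:top_n]
    # Everything not already picked, ordered by vote count ascending,
    # so the least-voted submissions get seen first.
    rest = [t for t in translations if t not in top]
    by_votes = sorted(rest, key=lambda t: t["votes"])
    return top + by_votes[:low_n]
```

This guarantees that every rating session both sanity-checks the current front-runners and pushes votes toward neglected submissions.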
Perhaps a translation should never be rated as 100%, but rather 99% or something similar, so that edits, improvements, or variations would ALWAYS be possible.
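The completion rule in point 4 could be expressed roughly like this. Again, this is a hypothetical sketch: the vote threshold, the 4:1 ratio, and my reading of the ratio as green-to-yellow are all just my guesses at tunable parameters.

```python
def is_complete(green, yellow, red, min_votes=20, min_ratio=4.0):
    """Proposed completion check from point 4: enough total votes,
    and enough 'good' (green) votes relative to 'kinda good' (yellow).
    All thresholds are example values, not real site parameters."""
    total = green + yellow + red
    if total < min_votes:
        return False
    # No yellow votes at all: any green support satisfies the ratio.
    if yellow == 0:
        return green > 0
    return green / yellow >= min_ratio
```

Even when this check passes, per the suggestion above, the translation could be capped at 99% so that later edits and improvements remain possible.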
The people that are just putting Google Translate output into the translations to rack up points are a problem, but here's my worry. If you make them rate in order to get credit for translating, then what's to stop them from just randomly clicking whatever ratings they want? They're clearly after points, so why wouldn't they just say, "Okay, if I have to rate, then I'll just click whatever and be done with it"? The issue is quality, and in my opinion it needs to apply to the ratings as much as to the translations being rated.
What would be neat is if you only received credit/points when your translation is reasonably close to the final one. Many times the Google translation is very off, and a better translation is too different from it to be counted as close enough to earn any points. This happens to me a lot when no one has gone through a document except DuoBot. I'll spend a bit of time trying to figure out how to express something in English, but it varies too much from what DuoBot spits out, so I get 41% accuracy compared to the "top translation".
What would be nifty is if, as the translations are rated, my percentage went above some predetermined number (say 50) and I suddenly received credit. Likewise, if the garbage I just plugged in from Google became too dissimilar from the final translation (fell below 50%, let's say), then I would lose credit. This would make it really difficult for people to game the system, and it would also encourage people to a) take more time with translations, and b) check back on documents they translated to follow up on/edit their original translation. I don't know how difficult this would be to implement, though.
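A rough sketch of what I mean, with a made-up similarity score as input and made-up point amounts. The 50% threshold and the +/-10 points are placeholders, and how the similarity percentage itself gets computed is a separate (hard) problem I'm not addressing here.

```python
def update_credit(similarity_pct, has_credit, threshold=50, points=10):
    """Grant or revoke credit as a submission's similarity to the
    current top translation crosses a threshold. Hypothetical sketch:
    returns (new_credit_state, points_delta). All numbers are examples."""
    if similarity_pct >= threshold and not has_credit:
        return True, points    # translation rose above the bar: award points
    if similarity_pct < threshold and has_credit:
        return False, -points  # translation fell below the bar: revoke points
    return has_credit, 0       # no threshold crossing, nothing changes
```

Because points can be taken back when the community's final version drifts away from a submission, quick Google dumps become a losing strategy over time.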
@duodanny: I'm not sure I agree with you that people are submitting Google-level translations just to rack up points, although you may well be right. That idea just hadn't crossed my mind before. But here's the thing: I think the majority of people that are here, are here to learn a new language, polish up old language skills, or specifically to do translations. I'm not convinced that points are actually a great motivator, but I could be wrong. After all, I do like to see my own points go up. The points system is fun, and it's a concrete indicator of progress, but it's certainly not my main reason for being here, and, actually, if the points system were to go away, I wouldn't care.

But let's say for a minute that points are a primary motivator for the majority of the Googleites. There are other ways to rack up points that don't take that much longer and are certainly more productive (10 points every time you practice the material you've been through to date, more points any time you do a vocab practice, and no limit to the number of those you can do).

As for your second idea about people just voting any which way on 10 translations so they can get their points: I'll point out that you already get extra points for ranking translations in the current system, yet very few people seem to actually vote. If points were a strong driver, I'd expect people to quickly throw in a Google translation and then vote on as many candidates as possible, thereby maybe doubling the number of points received. But we don't see that. Instead, we see that typically most submissions haven't been voted on AT ALL!
In my own opinion, the real thing driving the high number of Googleisms is that translations are offered as a challenge in the lesson stream much too early. I think people feel somehow compelled to do them, and then, when they try, they find themselves in way over their heads and resort to Google. Actually, when I first appeared on the scene, it wasn't clear to me that I could still advance through the lessons without first doing the requested translations. I am assuming that other people may have had (or still have) this misconception as well.
In my opinion, all translation challenge requests should be removed from all the lessons before the first unlock point, and perhaps from the second group of lessons as well. I would replace them all with "refresh" lessons. This doesn't prevent people who already have a German background from translating early on; anyone can translate as much as they want at any time by going to the "Translation" tab. It does take the pressure off people who are actually new to the language, though, so they don't feel like they have to do translations. I think it would help. Then again, I may be wrong, and it really is all a points deal.
@petro626: I do like your idea, but it may already be (partially) that way anyway. You know, when I wrote my original post yesterday, it was my understanding that translations marked "100%" were completely closed. But later I discovered that, although I could offer no new translation for one marked at 100%, I was still able to go through and vote on the submissions that had been made so far. I don't know if those new votes actually had any effect, but it was more than I thought you could do.
Before I came to Duolingo, I translated a German novelette using Google Translate. I downloaded the novelette from an Austrian website and copied each sentence, one at a time, into Google Translate. Then I would copy its translation into my word processor. Using a dictionary, I would edit the translation into acceptable English. Except for very short, simple sentences, I needed to do this every time. Many of Google's translations did not make sense. Often Google had chosen a completely wrong meaning from those listed in the dictionary, sometimes with hilarious results. I cannot say whether my translation of the novelette was a good one, whether it conveyed the literary values the author intended; I have not tried it out on a reader who doesn't know any German. However, if I had just left the Google output untouched, the reader would have found it utter gibberish.