Let us offer more than one translation so that we can experiment and get more feedback
Today I had a fairly simple sentence: "Il est quasiment prêt."
I gave a fairly safe response: "he is practically ready." It would be nice to experiment by having a second blank, or being able to add additional blanks.
In the second blank, I could experiment with "he is nearly ready." In a third, "he is almost ready." And so on.
As far as credit is concerned, I am fine with credit being given only for the first response. What I am much more interested in is Duolingo's opinion of each answer. Since Duolingo rates translations, I know you must have a sense of how good a translation is, how natural, how close to the speech or writing of a native speaker.
So what would be wonderful is to have the system rate the answers I give. Ratings would help me make my own speech or writing more natural. And in the meantime Duolingo gets data on what students know and are thinking of as possible answers.
No, I am not assuming that, actually. I know that in the Immersion tab, Duolingo rates translations, even for texts that are still in the process of being translated. So, I merely suppose that Duolingo has a method for arriving at those ratings and so might apply the same to translations in exercises, too. Nothing superhuman required.
How do you know that the translations are rated? Is it done personally, or by software? All I see is that they tell you how much of a text still needs translating, or how much of what has been translated still needs review. You can do that by having a computer keep track of which sentences have been translated and which translations have been reviewed by someone. That's quite far from somebody entering a single sentence and being told how natural it is or how close to what a native speaker would say. How do you imagine that would work?
Try visiting this page: http://www.duolingo.com/translation/9d19419b60310411a5d3b55c7191345d
The words Duolingo uses are "Quality of Translation." That's a rating of the quality. Every Immersion article has one. However Duolingo manages those ratings, I am suggesting that they use the same technique for their lessons.
Yes, the word "quality" is used. Then it's qualified by "estimate", which means that the computer makes the same entry for every translation that is entered. There's no evidence that a human, other than the translator, has even seen it. The same technique applied to the lessons would mean the computer is telling you "yes, you just entered what appears to be a sentence." How would that help? I think you're looking for magic.
I have been more than civil with you, putting up with all your criticisms, including ones based on your assumptions about me. Are you a Duolingo staff person? Because if you are you would not do them any credit. "Looking for magic," indeed.
And if it is such an extraordinary request, then it does no harm to Duolingo for me to think of it. They can ignore it or aspire to it as they please.
But you? I don't intend to waste any more time with you. Heap criticisms on this thread if you like. That you find no possible way for it to be practical does not impress me in the least.