When should we see the 3.0 Tree rolled out to all users?
Your progress will be kept for all skills and lessons not touched by the update, but where a lesson has been changed, it will have to be redone before you can progress further.
The 3.0 version contains fewer changes and less new vocabulary than the 2.0 version did, so in most cases where a previously completed skill has turned incomplete, you'll only have to redo one or two of the lessons inside it. These will feel like reviews with some new content mixed in. :)
We've asked Duo repeatedly for an update on how the test is going and had no response. The issue seems to be that we don't teach a language starting with "Jap" and ending with "nese". ;)
There used to be two different metrics visible to us, one showing drop-out rate (user retention), and the other attempting to measure actual learning (knowledge retention) by testing users in already learned content at the end of a lesson.
These "test questions" were given without hints, so they measured actual retention rather than just the ability to use hover hints. Apparently this frustrated users, impacting user retention negatively, so Duo removed it.
The only metric left is the drop-out rate, which in my opinion is the wrong thing to measure a tree's success by. This metric will show better results if we simplify all the content, even if the users end up learning less. If we add more challenging content, it will show worse results, as users get frustrated by getting things wrong. Basically, we're forced through a test that says nothing about the tree's ability to teach Norwegian, only its ability to keep people using it.
Not the answer either of us was looking for, I'm sure, but there it is.
Thanks. There are some data points I've been curious about for a long time that Duolingo seems to keep close to its chest, such as what percentage of people who start a course actually finish it, or how many reach level 25 within, say, a year. My guess is that the number is under 1%, which would still mean a significant number of people are gaining a certain level of proficiency in a new language. I'd also guess that over 90% of users give up in the first week.
I wouldn't be surprised if that were the case, or at least close to it. Since the courses are free (thankfully!), there's nothing stopping people from creating an account just to poke around a bit. I know I started mine over a year before I actually started using it actively.
There's likely quite a bit of variation between courses too, though; some stand between the learner and a job opportunity, while others are more likely to be learned for fun, or started out of curiosity. The length of the course and the complexity of the grammar also play in.
Edit: I remembered that Duo actually wrote a blog post about this a while back. They only measured results over a 90-day period, so understandably very few users managed to complete the longer courses like ours in that amount of time, but it would be nice to see the same thing monitored over a longer span.