This is a question linguists don’t want to answer, because it raises the spectre of glottochronology.
Glottochronology rests on an assumption made in the fifties: that a core vocabulary of 100 or 200 words is lost at a constant rate in all languages. The figures the original study came up with were 86% retention per millennium for the core 100-word list, and 81% per millennium for the core 200-word list.
Glottochronology is derived from lexicostatistics, which uses the same core vocabulary to classify languages. The rule of thumb that field linguists apply is that two languages are separate if they share only 80% of the core 100 words. Combining the two, you get maybe 1300 years for two languages to separate.
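The back-of-envelope arithmetic can be sketched as follows. The 86% rate and 80% threshold are the figures quoted above; the two variants (only one lineage drifting from a fixed ancestor, versus both daughter languages drifting independently, which is what the textbook glottochronology formula assumes) are my own framing, and they bracket the rough figure quoted:

```python
import math

# Figures quoted above (not authoritative constants): 86% retention per
# millennium on the 100-word list, 80% shared cognates as the threshold
# for calling two languages separate.
retention_per_millennium = 0.86
shared_threshold = 0.80

# Variant 1: only one lineage drifts away from a fixed ancestor.
#   shared = retention ** t   =>   t = log(shared) / log(retention)
t_one_lineage = math.log(shared_threshold) / math.log(retention_per_millennium)

# Variant 2: both daughter languages drift independently (the form used
# in the standard glottochronology formula), so the exponent doubles.
#   shared = retention ** (2 * t)   =>   t = log(shared) / (2 * log(retention))
t_both_lineages = t_one_lineage / 2

print(f"one lineage drifting:   ~{t_one_lineage * 1000:.0f} years")
print(f"both lineages drifting: ~{t_both_lineages * 1000:.0f} years")
```

Depending on which convention you pick, this comes out to somewhere between roughly 700 and 1500 years, so any single figure in that range involves a choice of model plus rounding.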
Lexicostatistics is still used in poorly attested language families, when you have no other choice. It gets a lot of use in Papua New Guinea. 80% seems to me to be on the low side, though.
Glottochronology, on the other hand, was discredited very early. The statistical study behind it was heavily flawed: the sample was almost exclusively European, and Latin ended up being counted five times. A study done in 1962 found that Icelandic (universal literacy) had lost just one word out of 100 in a thousand years, whereas Inuit (taboo substitution of words) had lost close to half in the same period.
So there is no constant rate at which languages separate.
But we have plenty of instances in history where people migrated away and the language slowly diverged. The instances that first come to mind, such as early modern resettlement within Europe or colonialism in the New World, show that 300 years is clearly not enough. A ballpark figure is going to be closer to somewhere between 500 and 1000 years. With all the provisos already given.