• A geometric series fits the form Gn = sum (k=0 to infinity) ar^k. Setting r = 1 gives Gn = sum (k=0 to infinity) a*1^k. Since 1^k = 1 for every integer k, the series is just an infinite sum of whatever "a" is: a + a + a + a + ... The terms never get smaller, so the partial sums grow without bound and the series cannot converge (assuming a is nonzero). Since a divergent series is defined as a series whose partial sums have no finite limit, this is a divergent series. To take a more methodical approach, the closed form for the partial sum is Sn = a(1 - r^(n+1)) / (1 - r), and when r = 1 its limit becomes lim(n->inf) a(1 - 1^(n+1)) / (1 - 1) = 0/0, which is undefined. So, actually, the second part of that statement is not true: the formula's limit does not exist when r = 1. (The actual partial sum in that case is a(n+1), which clearly diverges.)
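A quick numeric sketch of the point above (names and values are my own, not from the thread): with r = 1 the partial sums Sn = a(n+1) grow without bound, while with |r| < 1 they settle toward a / (1 - r).

```python
def partial_sum(a, r, n):
    """Partial sum of the first n+1 terms of the geometric series sum a*r^k."""
    return sum(a * r**k for k in range(n + 1))

a = 2.0

# With r = 1 every term equals a, so S_n = a*(n+1): 22.0, 202.0, 2002.0, ...
for n in (10, 100, 1000):
    print(n, partial_sum(a, 1.0, n))

# With |r| < 1 the partial sums approach a / (1 - r); here both print ~4.0.
print(partial_sum(a, 0.5, 50), a / (1 - 0.5))
```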
    • S@m21
      Okay, I understand now. I was confused by the words "sequence" and "series". So, if Gn were a geometric sequence rather than a geometric series, it would converge when r = 1, since its limit as n goes to infinity exists, right?
    • bostjan64
      Correct. I agree that the term "series" is confusing.
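The sequence-versus-series distinction in the last two replies can be sketched numerically (a hypothetical example, not from the thread): with r = 1 the geometric sequence a*r^n is constant, so it converges to a, while the series (the running total of those terms) diverges.

```python
a, r = 3.0, 1.0

# The sequence a*r^n: every term is 3.0, so it converges to a.
terms = [a * r**n for n in range(5)]

# The series is the sequence of partial sums: 3, 6, 9, 12, 15, ...
# It grows by a each step, so it diverges.
partials = [sum(terms[: n + 1]) for n in range(5)]

print(terms)     # [3.0, 3.0, 3.0, 3.0, 3.0]
print(partials)  # [3.0, 6.0, 9.0, 12.0, 15.0]
```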

Copyright 2020, Wired Ivy, LLC
