Please solve with clear steps and exactly as written in the question.
Grade distribution:
- Correct code: 25 points.
- Programming style (comments and variable names): 5 points.

Write a Python program that implements the Taylor series expansion of the function ln(1+x) for any x in the interval (-1, 1], as given by:

ln(1+x) = x - x^2/2 + x^3/3 - x^4/4 + x^5/5 - ...

The program prompts the user to enter the number of terms n. If n > 0, the program prompts the user to enter the value of x. If the value of x is in the interval (-1, 1], the program calculates the approximation to ln(1+x) using the first n terms of the above series and prints the approximate value. The program should validate the user input: if an invalid value is entered, it should output an appropriate error message and keep prompting as long as the input is not valid.

Sample program run:

Enter the number of terms: 0
Error: Zero or negative number of terms not accepted
Enter the number of terms: 9000
Enter the value of x in the interval (-1,1]: -2
Error: Invalid value for x
Enter the value of x in the interval (-1,1]: 0.5
The approximate value of ln(1+0.5000) up to 9000 terms is 0.4054651081
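
A minimal sketch of one possible solution is shown below. Only the series, the validation rules, and the output format come from the problem statement; the variable names, prompts, and loop structure are my own assumptions.

```python
# Taylor series approximation of ln(1+x) for x in (-1, 1],
# summing the first n terms entered by the user.

# Keep prompting until a positive number of terms is entered.
n = int(input("Enter the number of terms: "))
while n <= 0:
    print("Error: Zero or negative number of terms not accepted")
    n = int(input("Enter the number of terms: "))

# Keep prompting until x lies in the interval (-1, 1].
x = float(input("Enter the value of x in the interval (-1,1]: "))
while x <= -1 or x > 1:
    print("Error: Invalid value for x")
    x = float(input("Enter the value of x in the interval (-1,1]: "))

# Sum the first n terms: x - x^2/2 + x^3/3 - x^4/4 + ...
approx = 0.0
for k in range(1, n + 1):
    approx += ((-1) ** (k + 1)) * (x ** k) / k

print("The approximate value of ln(1+%.4f) up to %d terms is %.10f" % (x, n, approx))
```

With the inputs from the sample run (9000 terms, x = 0.5), this sketch prints 0.4054651081, matching the expected output.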