I am getting an error telling me that I can't multiply two variables of a certain type:
TypeError: can't multiply sequence by non-int of type 'str'
I am trying to implement the Pythagorean theorem in Python for school. I need the values as floats so I get a decimal answer.
I have already tried a couple of different things:
l_1 = float(input())
l_1 = float(l_1)
l_1 = str(l_1)
print ("The long side is: " + l_1)
l_2 = float(input())
l_2 = float(l_2)
l_2 = str(l_2)
print ("The short side is: " + l_2)
l_2 = int(l_2)
l_1 = int(l_1)
wor1 = math.sqrt(l_1 * l_1 - l_2 * l_2)
print (wor1)
I expect the output to actually be the answer, without any errors; I just need it to calculate with the variables it is given.
A couple of changes to the code and you are good to go.
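For context, the TypeError in the question is what Python raises when both operands of * are strings; multiplying a sequence is only defined for an integer count:

>>> "3.0" * "4.0"
Traceback (most recent call last):
  ...
TypeError: can't multiply sequence by non-int of type 'str'
>>> "ab" * 3   # multiplying a sequence by an int is repetition
'ababab'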
Note that when calculating the square root, you should pass the absolute difference of the squares to the sqrt function. This also lets you drop the distinction between the short and long side: just take any two sides and the code will handle it for you.
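Without the absolute value, entering the shorter side first would make the difference negative, and math.sqrt would fail:

>>> import math
>>> math.sqrt(3.0 * 3.0 - 5.0 * 5.0)
Traceback (most recent call last):
  ...
ValueError: math domain error

With that in mind, here is the corrected code: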
import math

l_1 = float(input())
print("The long side is: " + str(l_1))

l_2 = float(input())
print("The short side is: " + str(l_2))

# l_1 and l_2 are already floats, so the subtraction yields a float
difference = l_1 * l_1 - l_2 * l_2

# Take the absolute difference, since math.sqrt raises an error for negative input
absolute_difference = math.fabs(difference)

# Square root of the absolute difference
wor1 = math.sqrt(absolute_difference)
print(wor1)
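For example, entering 5 and 3 gives:

The long side is: 5.0
The short side is: 3.0
4.0

Entering the sides in the other order (3, then 5) still produces 4.0, thanks to math.fabs.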