When I sum an empty Series I expect the result to be zero, as indicated in the pandas documentation. Indeed, if I follow the example there, create an empty Series, and sum it, that is what I get:
(Pdb) empty = pd.Series([])
(Pdb) type(empty)
<class 'pandas.core.series.Series'>
(Pdb) empty.sum()
0.0
However, I have a DataFrame that, based on the underlying data, can sometimes be empty. In that case, I have something like:
(Pdb) prior
Empty DataFrame
Columns: [ILC, FCTC, AWD, PD]
Index: []
(Pdb) prior['PD'].sum()
False
This is unexpected, because
(Pdb) type(prior['PD'])
<class 'pandas.core.series.Series'>
This seems identical to the situation in the first code block. Can someone help me understand what I'm missing? Why does the sum in the second code block return False, whereas the first one returns a numerical value?
Edit
I've been asked to post the code that creates prior. Several steps go into creating it, but I can reproduce the issue by creating an empty DataFrame and summing over one of its columns. See below:
In [1]: import pandas as pd
In [2]: %paste
dfObj = pd.DataFrame(columns=['User_ID', 'UserName', 'Action'])
print("Empty Dataframe ", dfObj, sep='\n')
## -- End pasted text --
Empty Dataframe
Empty DataFrame
Columns: [User_ID, UserName, Action]
Index: []
In [3]: dfObj['User_ID'].sum()
Out[3]: False
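For what it's worth, one difference I can observe between the two cases is the dtype of the empty Series. A Series built directly from an empty list defaults to a numeric dtype, while the columns of an empty DataFrame constructed with only column names default to object dtype. A minimal sketch of that comparison (the `dtype="float64"` argument is explicit here to silence the deprecation warning newer pandas versions emit for `pd.Series([])`):

```python
import pandas as pd

# An empty Series built directly defaults to a numeric dtype.
empty = pd.Series([], dtype="float64")
print(empty.dtype)   # float64
print(empty.sum())   # 0.0

# Columns of an empty DataFrame created from column names alone
# default to object dtype, which may be relevant to .sum()'s result.
dfObj = pd.DataFrame(columns=['User_ID', 'UserName', 'Action'])
print(dfObj['User_ID'].dtype)   # object
print(dfObj['User_ID'].sum())   # False here, but may vary by pandas version
```

Whether the object dtype alone explains the False is exactly what I'm unsure about.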