
I am trying to use pandas DataFrame.to_sql to insert values into a table of my Postgres database. I have some NaN values in a column of integers that is not subject to any constraint.

I get the following error:

sqlalchemy.exc.DataError: (DataError) integer out of range

When I substitute the NaN values with zeros, the insertion works as expected, so the NaN values really are the cause of the error.

I have tried converting the NaN values to None and to np.nan, but I get the same error either way. So the question is: what NaN representation do I need so that pd.to_sql handles it correctly?

My restrictions are: Python 2.7, pandas 0.14.1, SQLAlchemy 0.9.8, Postgres 9.2.

  • Similar problem here: "Incorrect decimal value: 'NaN' for column XYZ". XYZ in my case is a DECIMAL column. Commented Jan 13, 2020 at 6:10

1 Answer


The problem is with your pandas version: 0.14.1.

Starting with pandas 0.15, to_sql supports writing NaN values.

You can try upgrading your pandas.
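If upgrading is an option, a minimal sketch of what the upgraded behavior looks like. It uses an in-memory SQLite database as a stand-in for the Postgres target (an assumption made here purely so the example is self-contained; the point being shown is that to_sql writes NaN as SQL NULL), and the table and column names are invented for illustration:

```python
import sqlite3

import numpy as np
import pandas as pd

# In-memory SQLite as a stand-in for the Postgres target; the behavior
# shown (NaN written as SQL NULL) is what pandas >= 0.15 does via to_sql.
conn = sqlite3.connect(":memory:")

# An "integer" column that picked up NaN, so pandas stores it as float64.
df = pd.DataFrame({"n_items": [10, np.nan, 30]})
df.to_sql("measurements", conn, index=False)

rows = conn.execute("SELECT n_items FROM measurements").fetchall()
print(rows)  # the NaN row comes back as NULL, i.e. None
```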

Also, as a workaround for now, you can convert NaN to None like this:

df = df.astype(object).where(pd.notnull(df), None)
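A minimal sketch of that one-liner in isolation (the column name is invented for illustration): casting to object dtype first is what lets the cells hold a genuine Python None instead of having it coerced back to NaN.

```python
import numpy as np
import pandas as pd

# An "integer" column that picked up NaN, so its dtype is float64.
df = pd.DataFrame({"count": [10, np.nan, 30]})

# Cast to object so cells can hold arbitrary Python objects, then keep
# non-null cells as-is and replace every NaN cell with None.
clean = df.astype(object).where(pd.notnull(df), None)

print(clean["count"].tolist())  # [10.0, None, 30.0]
```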

2 Comments

Thank you for your answer. Unfortunately, I cannot upgrade pandas, since it is on a server I use and the administrator does not wish to proceed with an upgrade. I have already tried replacing NaN with None, but it doesn't work; I get the same error.
Please refer to this.
