
Column types assumed as int if first value is a whole number #13

Closed
timkpaine opened this issue Jan 27, 2018 · 4 comments

@timkpaine (Member) commented Jan 27, 2018

If I have a column consisting of values [1.000, 2.5, 3.562, ...], the column is interpreted as integer instead of float. A (bad) workaround is to increment the first value in the column by a negligible amount (e.g. 0.00000001).
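
For illustration, a minimal sketch of that workaround in JavaScript (the column name `price` is made up here):

```js
// Hypothetical data whose first value happens to be a whole number.
const data = { price: [1.000, 2.5, 3.562] };

// The (bad) workaround: nudge the first value so it no longer looks like an
// integer to first-value type inference.
data.price[0] += 0.00000001; // 1.00000001 instead of 1
```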

@tstordyallison

I think in this case the only way you are going to get the correct result is to create the table with a schema, rather than rely on type inference. There's only so far down the rabbit hole we can go...
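
For reference, a rough sketch of the schema approach; the package name, the `worker()`/`table()` call shape, and the column names are assumptions based on later versions of the perspective API, so check the current docs for exact signatures:

```js
const perspective = require("@jpmorganchase/perspective");

// Declare the column types up front instead of relying on inference.
const table = perspective.worker().table({
  price: "float",
  name: "string"
});

// Even though the first price is a whole number, the column stays a float.
table.update([
  { price: 1.0, name: "a" },
  { price: 2.5, name: "b" }
]);
```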

@timkpaine (Member, Author)

The JSON is coming out as 1.000, though. I would expect that behaviour if it looked like {[1, 2.5, 3.4]}, etc., where the first value actually is an int.
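
For context, the trailing `.000` cannot survive parsing on the JS side, because every JSON number becomes the single JavaScript `Number` type:

```js
const parsed = JSON.parse("[1.000, 2.5, 3.562]");

console.log(parsed[0]);                   // 1
console.log(Number.isInteger(parsed[0])); // true -- indistinguishable from an int
console.log(JSON.stringify(parsed));      // "[1,2.5,3.562]"
```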

@tstordyallison

The magic is here:

https://github.com/jpmorganchase/perspective/blob/828a2614e638bd277863495d3f69f7c2f1586a79/packages/perspective/src/js/perspective.js#L37-L62

It tries to guess the types, but it isn't always going to work (especially in JS, where we lack the distinction between an int and a float).
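
Not the actual code at the linked lines, but a simplified sketch of the kind of first-value inference being described, which shows why `[1.000, 2.5, 3.562]` ends up typed as integer:

```js
function inferType(values) {
  const first = values[0];
  if (typeof first === "number") {
    // 1.000 has already been parsed to 1, so Number.isInteger() is true
    // and the whole column gets typed as "integer".
    return Number.isInteger(first) ? "integer" : "float";
  }
  if (typeof first === "boolean") return "boolean";
  return "string";
}

console.log(inferType([1.000, 2.5, 3.562])); // "integer" -- the bug above
console.log(inferType([1.00000001, 2.5]));   // "float"   -- the workaround
```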

@timkpaine (Member, Author)

Closing; the solution is really to use a schema.
