Description
Bug: BaseConfig's max_anystr_length default
Currently, `str` and `bytes` are validated using the `Config` class. By default, it imposes a `max_anystr_length` validation of `2 ** 16`.
The result is that declaring a model:

```py
from pydantic import BaseModel

class Model(BaseModel):
    a: str
```
would be equivalent (with respect to validations) to:

```py
from pydantic import BaseModel, constr

class Model(BaseModel):
    a: constr(max_length=2 ** 16)
```
or:

```py
from pydantic import BaseModel, Schema

class Model(BaseModel):
    a: str = Schema(..., max_length=2 ** 16)
```
(although the JSON schema currently doesn't show the default `max_length`).
Given that `bytes` is made to hold binary data that might be large, I don't think this default constraint makes much sense, especially for `bytes`. I can also imagine a whole book in a `str` would have the same problem.
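The effect being described can be sketched in plain Python. This is an illustration of the constraint logic only, not pydantic's actual implementation; the function name `validate_anystr` is made up for the example:

```python
# Illustration only: a minimal re-creation of a max_anystr_length check,
# not pydantic's actual code.
DEFAULT_MAX_ANYSTR_LENGTH = 2 ** 16  # the current default, 65536

def validate_anystr(value, max_anystr_length=DEFAULT_MAX_ANYSTR_LENGTH):
    """Reject str/bytes values longer than max_anystr_length (None disables the check)."""
    if max_anystr_length is not None and len(value) > max_anystr_length:
        raise ValueError(f"length {len(value)} exceeds maximum of {max_anystr_length}")
    return value

# A string one character over 2 ** 16 fails under the default,
# but passes when the limit is None:
long_text = "x" * (2 ** 16 + 1)
try:
    validate_anystr(long_text)
except ValueError:
    print("rejected under the default")
validate_anystr(long_text, max_anystr_length=None)  # accepted
```

With a `None` default, plain `str` and `bytes` fields would accept values of any length, and length limits would only apply where explicitly requested.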
I would suggest changing the defaults of `min_anystr_length` and `max_anystr_length` to `None`. Is there a more complex rationale for the fixed default that I'm not seeing?
One option is to set the defaults of `Config` to `None` (I vote for this).
The other option is to make the generated JSON Schema match the actual default validation behavior.
But I think it would probably seem counterintuitive to declare a model with a standard `str` or `bytes` and get a constrained version (and a constrained JSON Schema), and to have to add extra code to remove a default constraint while using a "standard" type (unconstrained in Python).
For bugs/questions:
- OS: All
- Python version (`import sys; print(sys.version)`): All
- Pydantic version (`import pydantic; print(pydantic.VERSION)`): `master` branch
Where possible please include a self-contained code snippet describing your bug, question, or where applicable feature request:
PR with test case in 5 min.