Scalar types represent concrete values at the leaves of a query. For example, in the query `{ user { name } }`, the `name` field resolves to a scalar type (in this case, `String`):
There are several built-in scalars, and you can define custom scalars too.
(Enums are also leaf values.) The built-in scalars are:

- `String`, maps to Python's `str`
- `Int`, a signed 32-bit integer, maps to Python's `int`
- `Float`, a signed double-precision floating-point value, maps to Python's `float`
- `Boolean`, `true` or `false`, maps to Python's `bool`
- `ID`, a specialized `String` for representing unique object identifiers
You can create custom scalars for your schema to represent specific types in
your data model. This can be helpful to let clients know what kind of data they
can expect for a particular field.
To define a custom scalar you need to give it a name, plus functions that tell
Strawberry how to serialize and deserialize the type.
For example, here is a custom scalar type that represents a Base64-encoded string:
Note
The `Base16`, `Base32` and `Base64` scalar types are available in
`strawberry.scalars`
To override a scalar with a pendulum-based one, you define `serialize` and
`parse_value` functions just like in the example above; this time, let's group
them in a class. In addition, we'll use a `Union` annotation to combine the
possible input types. Since pendulum isn't typed yet, we have to silence
mypy's errors using `# type: ignore`
Python's `int` has no fixed size, so values beyond 32 bits are common. The
GraphQL spec, however, caps the built-in `Int` scalar at a signed 32-bit
integer.
Larger values will inevitably raise errors. Instead of using strings on the
client as a workaround, you could use the following scalar:
You can adapt your schema to automatically use this scalar for all integers by
using the `scalar_overrides` parameter:
Tip
Only use this override if you expect most of your integers to be 64-bit. Since most GraphQL schemas
follow standardized design patterns and most clients require additional effort to handle all numbers
as strings, it makes more sense to reserve BigInt for numbers that actually exceed the 32-bit limit.
You can achieve this by annotating `BigInt` instead of `int` in the resolvers that handle large Python integers.