Decoding tstzrange(-infinity, infinity) crashes sqlx #3685
One way to solve this problem is to create a wrapper type that does take this into account.

```rust
use sqlx::postgres::{PgTypeInfo, PgValueFormat, PgValueRef, Postgres};
use sqlx::{Decode, Type};
use time::OffsetDateTime;

#[derive(Debug)]
enum PgTimestampTz {
    Infinite,
    NegativeInfinite,
    DateTime(OffsetDateTime),
}

impl Type<Postgres> for PgTimestampTz {
    fn type_info() -> PgTypeInfo {
        <OffsetDateTime as Type<Postgres>>::type_info()
    }
}

impl<'r> Decode<'r, Postgres> for PgTimestampTz {
    fn decode(value: PgValueRef<'r>) -> Result<Self, sqlx::error::BoxDynError> {
        match value.format() {
            PgValueFormat::Text => match value.as_str()? {
                "infinity" => Ok(PgTimestampTz::Infinite),
                "-infinity" => Ok(PgTimestampTz::NegativeInfinite),
                _ => OffsetDateTime::decode(value).map(PgTimestampTz::DateTime),
            },
            PgValueFormat::Binary => match value.as_bytes()? {
                b"\x7F\xFF\xFF\xFF\xFF\xFF\xFF\xFF" => Ok(PgTimestampTz::Infinite),
                b"\x80\x00\x00\x00\x00\x00\x00\x00" => Ok(PgTimestampTz::NegativeInfinite),
                _ => OffsetDateTime::decode(value).map(PgTimestampTz::DateTime),
            },
        }
    }
}
```
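A side note on the magic byte strings in the binary arm (my addition, not part of the original comment): Postgres transmits a binary `timestamptz` as a big-endian `i64` of microseconds since 2000-01-01 00:00:00 UTC, and reserves `i64::MAX`/`i64::MIN` as the `infinity`/`-infinity` sentinels, which is exactly what those two patterns match. A self-contained sketch (plain Rust, no sqlx) of the same classification:

```rust
// Classify an 8-byte binary timestamptz payload by decoding the
// big-endian i64 and checking for the Postgres infinity sentinels.
fn classify(bytes: &[u8; 8]) -> &'static str {
    match i64::from_be_bytes(*bytes) {
        i64::MAX => "infinity",
        i64::MIN => "-infinity",
        _ => "finite",
    }
}

fn main() {
    assert_eq!(classify(b"\x7F\xFF\xFF\xFF\xFF\xFF\xFF\xFF"), "infinity");
    assert_eq!(classify(b"\x80\x00\x00\x00\x00\x00\x00\x00"), "-infinity");
    assert_eq!(classify(&0i64.to_be_bytes()), "finite");
    println!("sentinel check ok");
}
```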
@dns2utf8-novaziun when you say "crash", do you mean the code panics?
Yes, it panics with an error like this:

```
.cargo/registry/src/index.crates.io-6f17d22bba15001f/time-0.3.37/src/primitive_date_time.rs:943:14:
resulting value is out of range
```

I notice that I have the same error with dates that are far in the future, like this:

```
{
    "[\"2024-01-01 00:00:00+00\",\"2050-01-01 00:00:00+00\")",
    "[\"2024-08-01 00:00:00+00\",\"2025-01-01 00:00:00+00\")"
}
```

In sqlx I get the type … Is there a way to tell sqlx that there is no need for the …?
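For what it's worth (a back-of-envelope aside, not from the original thread): the "resulting value is out of range" panic is consistent with the sentinel being fed through the normal decoding path, since `i64::MAX` microseconds past the Postgres epoch of 2000-01-01 lies almost 300,000 years in the future, beyond what `time::OffsetDateTime` can represent:

```rust
fn main() {
    // i64::MAX microseconds converted to years (86_400 s/day, 365.25 days/year).
    let micros = i64::MAX as f64;
    let years = micros / 1_000_000.0 / 86_400.0 / 365.25;
    // Roughly 292,000 years after 2000-01-01 -- far outside the supported
    // date range, hence the out-of-range panic instead of a decode error.
    assert!(years > 290_000.0 && years < 300_000.0);
    println!("~{years:.0} years after the Postgres epoch");
}
```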
I tried the solution from @joeydewaal too, but I cannot get it to compile. I get these two errors:

```
valid_range: row.try_get(8).context("valid_range fail")?,
             ^^^^^^^ the trait `Type<Postgres>` is not implemented for `PgRange<PgTimestampTz>`
ranges: row.try_get(9).context("ranges fail")?,
        ^^^^^^^ the trait `Type<Postgres>` is not implemented for `PgRange<PgTimestampTz>`
```

I extended your code to this:

```rust
use sqlx::postgres::types::Oid;
use sqlx::postgres::PgTypeInfo;
use sqlx::Decode; // brings `decode` into scope for `OffsetDateTime`
use time::OffsetDateTime;
use tracing::info; // or `log::info!`, depending on the logging crate

#[derive(Debug)]
pub enum PgTimestampTz {
    Infinite,
    NegativeInfinite,
    DateTime(OffsetDateTime),
}

impl sqlx::Type<sqlx::Postgres> for PgTimestampTz {
    fn type_info() -> sqlx::postgres::PgTypeInfo {
        <OffsetDateTime as sqlx::Type<sqlx::Postgres>>::type_info()
    }
}

impl sqlx::postgres::PgHasArrayType for PgTimestampTz {
    fn array_type_info() -> sqlx::postgres::PgTypeInfo {
        // PgType::TimestamptzArray
        PgTypeInfo::with_oid(Oid(1185))
    }
}

impl<'r> sqlx::Decode<'r, sqlx::Postgres> for PgTimestampTz {
    fn decode(value: sqlx::postgres::PgValueRef<'r>) -> Result<Self, sqlx::error::BoxDynError> {
        match value.format() {
            sqlx::postgres::PgValueFormat::Text => match value.as_str()? {
                "infinity" => Ok(PgTimestampTz::Infinite),
                "-infinity" => Ok(PgTimestampTz::NegativeInfinite),
                _ => OffsetDateTime::decode(value).map(PgTimestampTz::DateTime),
            },
            sqlx::postgres::PgValueFormat::Binary => {
                let bytes = value.as_bytes()?;
                info!("decoding PgTimestampTz bytes {bytes:#?}");
                match bytes {
                    b"\x7F\xFF\xFF\xFF\xFF\xFF\xFF\xFF" => Ok(PgTimestampTz::Infinite),
                    b"\x80\x00\x00\x00\x00\x00\x00\x00" => Ok(PgTimestampTz::NegativeInfinite),
                    _ => OffsetDateTime::decode(value).map(PgTimestampTz::DateTime),
                }
            }
        }
    }
}
```

Do I assume correctly that I will not be able to implement the trait on …?
I have found these related issues/pull requests
This is also happening with arrays containing any infinities.
Description
sqlx panics when any value in a query result is `infinity` or `-infinity`.
Reproduction steps
Use a query like:
SQLx version
0.8.2
Enabled SQLx features
default-features = false, features = [ "runtime-tokio-rustls", "postgres", "chrono", "time" ]
Database server and version
Postgres 16.3-1.pgdg22.04+1
Operating system
Linux or Docker
Rust version
1.84.0