Detect encoding and decode text response #256

Merged
merged 5 commits on Feb 15, 2018
2 changes: 2 additions & 0 deletions Cargo.toml
@@ -11,6 +11,7 @@ categories = ["web-programming::http-client"]

[dependencies]
bytes = "0.4"
+ encoding_rs = "0.7"
futures = "0.1.15"
hyper = "0.11.9"
hyper-tls = "0.1.2"
@@ -27,6 +28,7 @@ tokio-tls = "0.1"
url = "1.2"
uuid = { version = "0.5", features = ["v4"] }
hyper-proxy = "0.4.0"
+ uchardet = "2.0"
Contributor Author

Can't use chardet because of a license issue: it's LGPL-3.0.


[dev-dependencies]
env_logger = "0.5"
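For context, a minimal standalone sketch (not part of this PR) of how the two new dependencies fit together: uchardet guesses an encoding label from the raw bytes, and encoding_rs turns that label into a decoder, falling back to UTF-8 when detection fails or the label is unknown. The decode_bytes helper and its sample input are made up for illustration.

extern crate encoding_rs;
extern crate uchardet;

use encoding_rs::{Encoding, UTF_8};

// Hypothetical helper mirroring the approach in this PR: detect, look up, decode.
fn decode_bytes(bytes: &[u8]) -> String {
    // uchardet returns a guessed encoding name; fall back to "utf-8" on error.
    let label = uchardet::detect_encoding_name(bytes)
        .unwrap_or_else(|_| "utf-8".to_string());
    // encoding_rs maps the label to a decoder; unknown labels fall back to UTF-8.
    let encoding = Encoding::for_label(label.as_bytes()).unwrap_or(UTF_8);
    // Invalid sequences are replaced with U+FFFD rather than causing an error.
    let (text, _, _) = encoding.decode(bytes);
    text.into_owned()
}

fn main() {
    // Windows-1252 bytes for "café"; detection on such a short input is best-effort.
    println!("{}", decode_bytes(b"caf\xe9"));
}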
2 changes: 2 additions & 0 deletions src/lib.rs
@@ -129,6 +129,7 @@
//! [cookiejar_issue]: https://github.com/seanmonstar/reqwest/issues/14

extern crate bytes;
+ extern crate encoding_rs;
#[macro_use]
extern crate futures;
extern crate hyper;
@@ -150,6 +151,7 @@ extern crate tokio_io;
extern crate tokio_tls;
extern crate url;
extern crate uuid;
+ extern crate uchardet;

pub use hyper::header;
pub use hyper::mime;
11 changes: 8 additions & 3 deletions src/response.rs
@@ -3,9 +3,11 @@ use std::fmt;
use std::io::{self, Read};
use std::time::Duration;

+ use encoding_rs::{Encoding, UTF_8};
use futures::{Async, Poll, Stream};
use serde::de::DeserializeOwned;
use serde_json;
+ use uchardet;

use client::KeepCoreThreadAlive;
use header::Headers;
@@ -180,9 +182,12 @@ impl Response {
let len = self.headers().get::<::header::ContentLength>()
    .map(|ct_len| **ct_len)
    .unwrap_or(0);
- let mut content = String::with_capacity(len as usize);
- self.read_to_string(&mut content).map_err(::error::from)?;
- Ok(content)
+ let mut content = Vec::with_capacity(len as usize);
+ self.read_to_end(&mut content).map_err(::error::from)?;
+ let encoding_name = uchardet::detect_encoding_name(&content).unwrap_or_else(|_| "utf-8".to_string());
+ let encoding = Encoding::for_label(encoding_name.as_bytes()).unwrap_or(UTF_8);
+ let (text, _, _) = encoding.decode(&content);
Contributor Author

I am not sure about this.

Owner

It looks like decode returns a Cow<str>, since it may have detected that the bytes were valid UTF-8 and didn't need to do any copying. So we can inspect the Cow: if it is Cow::Borrowed, we don't need to make a new copy, since the bytes in content were already valid UTF-8. Eliminating this copy becomes a bigger deal the larger the body is.

It seems this could be handled like so:

// a block because of borrow checker
{
    let (text, _, _) = encoding.decode(&content);
    match text {
        Cow::Owned(s) => return Ok(s),
        _ => (),
    }
}
unsafe {
    // decoding returned Cow::Borrowed, meaning these bytes
    // are already valid utf8
    Ok(String::from_utf8_unchecked(content))
}
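For reference, a small standalone sketch (not part of this PR; the Windows-1252 input is just an arbitrary non-UTF-8 example) showing when encoding_rs's decode borrows versus allocates:

extern crate encoding_rs;

use std::borrow::Cow;
use encoding_rs::{UTF_8, WINDOWS_1252};

fn main() {
    // Bytes that are already valid UTF-8: decode() can borrow them, so no copy is made.
    let (text, _, had_errors) = UTF_8.decode(b"hello");
    assert!(!had_errors);
    match text {
        Cow::Borrowed(_) => println!("borrowed: input was already valid UTF-8"),
        Cow::Owned(_) => println!("owned: a new String was allocated"),
    }

    // Non-UTF-8 bytes (Windows-1252 for "café"): decode() must allocate a new String.
    let (text, _, _) = WINDOWS_1252.decode(b"caf\xe9");
    assert_eq!(&*text, "café");
}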

+ Ok(text.to_string())
}

/// Copy the response body into a writer.
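Finally, a usage sketch (not from this PR) of how a caller would exercise the new decoding path, assuming the blocking reqwest::get shortcut and the text() signature shown in the diff; the URL is a placeholder:

extern crate reqwest;

fn main() {
    // Fetch a page whose body may not be UTF-8 (for example a legacy Windows-1252
    // page). With this change, text() reads the raw bytes, sniffs the charset with
    // uchardet, and decodes them to a String via encoding_rs.
    let mut resp = reqwest::get("http://example.com/legacy-page").expect("request failed");
    let body = resp.text().expect("failed to read and decode the body");
    println!("{}", body);
}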