I have been trying to find a good way to create an Express.js-like backend using Rust. Everything seems to be going really well, except that I can't find a good way to connect to and query a SQL Server database.
I tried using Actix and sqlx 0.6.3 (which still has SQL Server support), but I get an error when trying to query the database:
proc macro panicked
message: not implemented: name: unsupported data type DecimalN
I used this code:
use actix_web::{web, App, HttpServer, Responder, get, middleware::Logger};
use sqlx::mssql::MssqlConnection;
use sqlx::Connection;

const DATABASE_URL: &str = "mssql://User:password@server/database";

// Handler that just returns a static response (this part works fine)
async fn example_handler() -> impl Responder {
    "Hello from the endpoint"
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    // Create a single connection
    let mut connection = MssqlConnection::connect(DATABASE_URL)
        .await
        .expect("Failed to connect to the database");

    // Example of using the connection
    let result = sqlx::query!("SELECT * FROM my_table")
        .fetch_all(&mut connection)
        .await
        .expect("Query failed");

    // Print the result or handle it as needed
    println!("{:?}", result);

    // Start the Actix web server
    HttpServer::new(move || {
        App::new()
            .wrap(Logger::default())
            .route("/", web::get().to(example_handler))
    })
    .bind("127.0.0.1:8080")?
    .run()
    .await
}
I tried the above code hoping for a simple response, but got the error shown above. When I just return a static response, I can fetch data from the endpoints; it is only when I try to execute SQL statements that I get the error.
Any help on this subject would be greatly appreciated.
So I spent some time searching the Rust docs and the Internet for answers and finally came up with a solution.
use polars::prelude::*;
use connectorx::prelude::*;

const DATABASE_URL: &str = "mssql://UserName:Password@Server:PORT/DB_NAME";

let source_conn: SourceConn =
    SourceConn::try_from(DATABASE_URL).expect("failed to create connection");

// The connectorx call needs to be wrapped in a blocking task so that the async runtime doesn't panic
let destination: Arrow2Destination = tokio::task::spawn_blocking(move || {
    // `db_query` is an already prepared SQL statement (a String)
    let queries: &[CXQuery; 1] = &[CXQuery::from(&db_query)];
    connectorx::get_arrow2::get_arrow2(&source_conn, None, queries).expect("Query failed")
})
.await
.expect("Failed to join the blocking task");
This way you get an Arrow2Destination value, which can be converted to a DataFrame and then to a String:
// Convert the `destination` variable to a DataFrame
let df: DataFrame = destination.polars().unwrap();

// Convert the DataFrame to a String. This String can later be used to return JSON data
let df_json: String = serde_json::to_string(&df).unwrap();

// Convert the JSON String into an easier-to-handle JSON format.
// stringify() is not a standard Rust function; its body is shown further down in this post
let fmt_data = stringify(&df_json).await;
The JSON String produced by serde_json::to_string() is in a weird, column-oriented format, shown below:
{
  "columns": [
    {
      "bit_settings": "",
      "datatype": "Utf8",
      "name": "ID",
      "values": [
        "F-1",
        "F-2"
      ]
    },
    {
      "bit_settings": "",
      "datatype": "Utf8",
      "name": "Field1",
      "values": [
        "Value 1-1",
        "Value 1-2"
      ]
    },
    {
      "bit_settings": "",
      "datatype": "Utf8",
      "name": "Field2",
      "values": [
        "Value 2-1",
        "Value 2-2"
      ]
    },
    {
      "bit_settings": "",
      "datatype": "Int64",
      "name": "Field3",
      "values": [
        1,
        2
      ]
    }
  ]
}
To convert this format into a more 'classic' JSON style, I wrote my own stringify() function, which creates a new JSON String in the 'normal' row-oriented format. After stringify() the data looks like this:
[
  {
    "ID": "F-1",
    "Field1": "Value 1-1",
    "Field2": "Value 2-1",
    "Field3": 1
  },
  {
    "ID": "F-2",
    "Field1": "Value 1-2",
    "Field2": "Value 2-2",
    "Field3": 2
  }
]
I find it much easier to handle data in this format than in the format returned by serde_json::to_string().
The stringify() function is below:
use serde_json::Value;

pub async fn stringify(data: &str) -> String {
    let json: Value = serde_json::from_str(data).unwrap();
    let columns = json["columns"].as_array().unwrap();

    // Create an empty vector that will hold one map (= one row) per record
    let mut rows: Vec<serde_json::Map<String, Value>> = Vec::new();

    // Iterate over each column
    for column in columns {
        // Get the column name and its values
        let name = column["name"].as_str().unwrap();
        let values = column["values"].as_array().unwrap();

        // Iterate over the values and fill the corresponding row object
        for (j, value) in values.iter().enumerate() {
            // If the row object doesn't exist yet, create it
            if rows.len() <= j {
                rows.push(serde_json::Map::new());
            }

            // Insert the value as-is, so numbers stay numbers and strings stay strings
            rows[j].insert(name.to_string(), value.clone());
        }
    }

    let transformed_json = serde_json::to_string(&rows).unwrap();
    //println!("Transformed Json: {transformed_json}");
    transformed_json
}
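To tie everything back to the actix-web setup from the question, here is a minimal sketch of a handler that runs the connectorx query on a blocking task and returns the stringify() output as JSON. The route name, table name, and connection string are placeholders; adapt them to your own setup (stringify() is the function shown above).

use actix_web::{get, App, HttpResponse, HttpServer, Responder};
use connectorx::prelude::*;
use polars::prelude::*;

const DATABASE_URL: &str = "mssql://UserName:Password@Server:PORT/DB_NAME";

// Placeholder route; returns the query result as row-oriented JSON
#[get("/rows")]
async fn rows_handler() -> impl Responder {
    let source_conn: SourceConn =
        SourceConn::try_from(DATABASE_URL).expect("failed to create connection");
    let db_query = String::from("SELECT * FROM my_table"); // placeholder query

    // Run the blocking connectorx call off the async runtime
    let destination: Arrow2Destination = tokio::task::spawn_blocking(move || {
        let queries = &[CXQuery::from(&db_query)];
        connectorx::get_arrow2::get_arrow2(&source_conn, None, queries).expect("Query failed")
    })
    .await
    .expect("Failed to join the blocking task");

    // Same conversion chain as above: Arrow2Destination -> DataFrame -> JSON String
    let df: DataFrame = destination.polars().unwrap();
    let df_json = serde_json::to_string(&df).unwrap();
    let body = stringify(&df_json).await;

    HttpResponse::Ok()
        .content_type("application/json")
        .body(body)
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| App::new().service(rows_handler))
        .bind("127.0.0.1:8080")?
        .run()
        .await
}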