postgresql scala scalikejdbc

How to save a record in Postgres with a custom type using ScalikeJDBC?


In my case I have an enum type in Postgres:

create type my_type as enum ('my_type_value', ...);

And some tables use it as a column type:

create table A (
...
t my_type,
...
)

In Postgres I can insert a new record into table A like this:

insert into A values(..., 'my_type_value', ...);

ScalikeJDBC generates the correct SQL:

 insert into A (...) values (..., 'my_type_value', ...)

But it fails with an error:

ERROR: column "t" is of type my_type but expression is of type character varying
HINT: You will need to rewrite or cast the expression.
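The cast that the HINT refers to would look like this in plain SQL (column list shortened to just t for illustration; this is not what ScalikeJDBC emits on its own):

    insert into A (t) values ('my_type_value'::my_type);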

I tried to do this:

object MyType extends Enumeration {...}
case class A(..., t: MyType.Value, ...)
object A extends SQLSyntaxSupport[A] {
  def apply(rs: WrappedResultSet): A = A(..., MyType.withName(rs.string("t")), ...)
}

Also, I tried adding implicit conversions to the enum type in the Scala code:

object MyType extends Enumeration {
   implicit def stringToValue...
   implicit def valueToString ...
}
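Roughly something like this (the exact bodies are not important, this is just to show the shape of the conversions):

    object MyType extends Enumeration {
      val MyTypeValue = Value("my_type_value")

      // Illustrative conversions between the enum and plain strings
      implicit def stringToValue(s: String): MyType.Value = withName(s)
      implicit def valueToString(v: MyType.Value): String = v.toString
    }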

But that didn't help either.

The insert code looks like this:

 withSQL {
      insertInto(A).namedValues(
        ...
        A.column.t -> e.t, // e is the entity passed into the insert function
        ...
      )
    }.update().apply()
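For completeness, this runs inside a ScalikeJDBC session in the usual way; for example (create is just an illustrative name for the method that wraps the block above):

    DB.localTx { implicit session =>
      create(e)
    }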

Solution

  • Finally, I solved it. I had to add an implicit ParameterBinderFactory so ScalikeJDBC knows how to bind the value:

     implicit val valueToParameterBinder: ParameterBinderFactory[MyType.Value] = ParameterBinderFactory {
        // java.sql.Types is needed here; binding with Types.OTHER lets Postgres resolve the value as my_type
        value => (stmt, idx) => stmt.setObject(idx, value, Types.OTHER)
      }
    
    object A extends SQLSyntaxSupport[A] {
      def apply(rs: WrappedResultSet): A = A(..., MyType.withName(rs.string("t")), ...) // reading the column back is just a string lookup
    }
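
    For reference, a minimal end-to-end sketch of this approach (the id column, the create method, and the table layout are assumptions; a TypeBinder is added so reads map back to the enum as well):

      import java.sql.Types
      import scalikejdbc._

      object MyType extends Enumeration {
        val MyTypeValue = Value("my_type_value")
      }

      case class A(id: Long, t: MyType.Value)

      object A extends SQLSyntaxSupport[A] {
        // Write side: bind the enum's name with Types.OTHER so Postgres casts it to my_type
        implicit val myTypeParameterBinderFactory: ParameterBinderFactory[MyType.Value] =
          ParameterBinderFactory { value => (stmt, idx) => stmt.setObject(idx, value.toString, Types.OTHER) }

        // Read side: map the column's string representation back to the enum
        implicit val myTypeTypeBinder: TypeBinder[MyType.Value] =
          TypeBinder.string.map(MyType.withName)

        def apply(rs: WrappedResultSet): A = A(rs.long("id"), rs.get[MyType.Value]("t"))

        def create(e: A)(implicit session: DBSession): Unit =
          withSQL {
            insertInto(A).namedValues(A.column.id -> e.id, A.column.t -> e.t)
          }.update().apply()
      }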