In May I created a Java servlet that allows me to query my Hive tables on Cosmos.
Before the migration to Cygnus 0.8.2, my data was pushed to a table named "hostabee", which I can still query from my Java application. Now a table is automatically created for each entity pushed to Cosmos. At first I saw no problem; this even made my project simpler. But I can't query the new tables from my application. Instead I get this error:
java.sql.SQLException: Query returned non-zero code: 10, cause: FAILED: Error in semantic analysis: Line 1:15 Table not found 'guillaume_jourdain_hostabee_hives_a_hive_column'
at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:194)
at HiveBasicClientServlet.doQuery(HiveBasicClientServlet.java:100)
at HiveBasicClientServlet.demo(HiveBasicClientServlet.java:189)
at HiveBasicClientServlet.doGet(HiveBasicClientServlet.java:44)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:701)
or this one:
java.sql.SQLException: Query returned non-zero code: 9, cause: FAILED: Execution Error, return code -101 from shark.execution.SparkTask
at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:194)
at HiveBasicClientServlet.doQuery(HiveBasicClientServlet.java:100)
at HiveBasicClientServlet.demo(HiveBasicClientServlet.java:189)
at HiveBasicClientServlet.doGet(HiveBasicClientServlet.java:44)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:617)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:857)
at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
at java.lang.Thread.run(Thread.java:701)
The table "guillaume_jourdain_hostabee_hives_a_hive_column" exist and i can query it from hive with a ssh connection . Do you have an idea from where my problem come from ? I can show you part of my code if necessary .
The following error:
Error in semantic analysis: Line 1:15 Table not found 'guillaume_jourdain_hostabee_hives_a_hive_column'
is not due to Cygnus 0.8.2 but to a rare behaviour we have experienced with our Global Instance of Cosmos at FIWARE Lab. Fortunately, it is easily fixed by prefixing the table name with the default database name (which is default :)). I.e., if you had something like:
select * from mytable;
Now you have to write:
select * from default.mytable;
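For reference, here is a minimal sketch of how the qualified query could be issued from Java through the Hive JDBC driver (the driver class matches the one in your stack trace; the host, port, credentials and table name are placeholders you must replace with your own):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveDefaultDbExample {
    public static void main(String[] args) throws Exception {
        // HiveServer1 JDBC driver, as in org.apache.hadoop.hive.jdbc.HiveStatement above
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        // Placeholder connection data: replace host, port, user and password with yours
        Connection con = DriverManager.getConnection(
                "jdbc:hive://cosmos.lab.fiware.org:10000/default", "myuser", "mypassword");
        Statement stmt = con.createStatement();
        // The only relevant change: the table name is prefixed with the "default" database
        ResultSet rs = stmt.executeQuery(
                "select * from default.guillaume_jourdain_hostabee_hives_a_hive_column");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        stmt.close();
        con.close();
    }
}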