apache-spark, apache-spark-sql, hivecontext

Comparing two DataFrames in Spark when using HiveContext for one DataFrame and SQLContext for the other


When I load a Hive table into one DataFrame using HiveContext and a DB2 table into another DataFrame using SQLContext, querying across both DataFrames fails: the DB2 table is not detected, while the Hive table is. Is there a common SQLContext that can be used for both?


Solution

  • TL;DR Use the same context for all tables.

    If you need Hive support, use HiveContext (Spark 1.x) or a SparkSession with Hive support enabled (Spark 2.x). Don't create a separate context or session just to connect to a specific data source; tables registered in one context are not visible from another, which is why the cross-query fails. A sketch of the single-session approach follows.
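
    Below is a minimal sketch of that approach in Spark 2.x style, where one SparkSession with Hive support reads both sources. The DB2 JDBC URL, credentials, table names, and the join key `id` are placeholder assumptions; substitute your own.

    import org.apache.spark.sql.SparkSession

    // One SparkSession with Hive support is used for every source.
    val spark = SparkSession.builder()
      .appName("hive-and-db2")
      .enableHiveSupport()          // gives access to the Hive metastore
      .getOrCreate()

    // Hive table read through the shared session's catalog
    // (placeholder database/table names)
    val hiveDf = spark.table("my_hive_db.my_hive_table")

    // DB2 table read through the SAME session via JDBC
    // (placeholder connection details and credentials)
    val db2Df = spark.read
      .format("jdbc")
      .option("url", "jdbc:db2://db2-host:50000/MYDB")
      .option("dbtable", "MYSCHEMA.MY_DB2_TABLE")
      .option("user", "db2user")
      .option("password", "db2password")
      .option("driver", "com.ibm.db2.jcc.DB2Driver")
      .load()

    // Both DataFrames now belong to the same session, so they can be
    // registered as views and compared in a single SQL statement.
    hiveDf.createOrReplaceTempView("hive_side")
    db2Df.createOrReplaceTempView("db2_side")

    // Example comparison: rows present in Hive but missing in DB2,
    // assuming both tables share an "id" column.
    val diff = spark.sql(
      """SELECT h.id
        |FROM hive_side h
        |LEFT ANTI JOIN db2_side d ON h.id = d.id""".stripMargin)

    diff.show()

    On Spark 1.x the same pattern works with a single HiveContext: create one HiveContext, use it both for the Hive table and for the JDBC read, and skip the plain SQLContext entirely.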