I am using Typesafe Config's ConfigFactory to load the configuration for my Scala application. I do not want to bundle the config files into my jar, but instead load them from an external HDFS filesystem. However, I cannot find a simple way to load the config from the FSDataInputStream object I get from Hadoop:
//get HDFS file
val hadoopConfig: Configuration = sc.hadoopConfiguration
val fs: FileSystem = org.apache.hadoop.fs.FileSystem.get(hadoopConfig)
val file: FSDataInputStream = fs.open(new Path("hdfs://SOME_URL/application.conf"))
//read config from hdfs
val config: Config = ConfigFactory.load(file.readUTF())
However, this throws an EOFException. Is there an easy way to convert the FSDataInputStream object into the required java.io.File? I found Converting from FSDataInputStream to FileInputStream, but that would be pretty cumbersome for such a simple task.
Using ConfigFactory.parseReader should work (though I haven't tested it):
import java.io.InputStreamReader

val reader = new InputStreamReader(file)
val config = try {
  ConfigFactory.parseReader(reader)
} finally {
  // closing the reader also closes the underlying FSDataInputStream
  reader.close()
}
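One caveat (not from the original answer, but standard Typesafe Config behaviour): ConfigFactory.parseReader only parses the file; unlike ConfigFactory.load, it does not resolve ${...} substitutions. If your application.conf relies on them, call resolve() on the parsed result:

// apply ${...} substitutions, which ConfigFactory.load would
// otherwise do for you
val resolved: Config = config.resolve()

This keeps the parse and resolve steps explicit, which is useful if you later want to layer the HDFS config over defaults with withFallback before resolving.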