java, maven, intellij-idea, spark-framework

Java Spark Framework, deploy failing


This seems like it should be a trivial task, but I can't quite figure out what I am supposed to do. I am new to Maven and Spark, and after searching around and looking through the docs, I can't figure out how to start my Spark application.

I followed this guide to get set up in IntelliJ: https://sparktutorials.github.io/2015/04/02/setting-up-a-spark-project-with-maven.html

I can run all of the Maven tasks except deploy.


Deploy fails with this error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.7:deploy (default-deploy) on project framework: Deployment failed: repository element was not specified in the POM inside distributionManagement element or in -DaltDeploymentRepository=id::layout::url parameter -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.

I am not sure if that matters or not. Is deploy the task that is intended to start the server? I am not sure.
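For context: `mvn deploy` does not start anything. It uploads the built artifact to a remote Maven repository, which is why it fails here, since the POM has no `<distributionManagement>` section telling it where to upload. If you ever did want deploy to work, the POM would need something like the following sketch (the repository id and URL are placeholders):

```xml
<distributionManagement>
  <repository>
    <!-- placeholder id and URL; point these at your own repository -->
    <id>my-releases</id>
    <url>https://repo.example.com/releases</url>
  </repository>
</distributionManagement>
```

For just running a local Spark application, though, deploy is not the goal you want at all.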

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.krishollenbeck.framework</groupId>
  <artifactId>framework</artifactId>
  <version>1.0</version>
  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
  <dependencies>
    <dependency>
      <groupId>com.sparkjava</groupId>
      <artifactId>spark-core</artifactId>
      <version>2.5</version>
    </dependency>
  </dependencies>
</project>

This is what the docs say:

What about starting the server? The server is automatically started when you do something that requires the server to be started (i.e. declaring a route or setting the port). You can also manually start the server by calling init().

http://sparkjava.com/documentation.html#stopping-the-server

Okay, what does that mean? Normally there is some command or something to start a server.

Question: TL;DR

How do I start the spark server?

Additional, sort-of-off-topic questions:

Is Spark still maintained? Is this a bad framework to use? I am looking for a lightweight Java server. Most of the app logic will be handled client-side; I just need to handle some basic login/CRUD operations on the server side and construct a RESTful API.

Project Structure: (FYI)



Solution

  • RUN your main class from IntelliJ. Or, if you want to run it with Maven, do this:

    mvn exec:java -Dexec.mainClass=my.IakaMain
    

    making sure you replace my.IakaMain with yourpackage.YourClassName.

    Or run it via an IntelliJ run/debug configuration:


    Run and view (note that Spark's default port is 4567, not the usual 80 or 8080):

    http://localhost:4567/hello

    Note: you may see this (annoying but harmless) warning:

    SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
    SLF4J: Defaulting to no-operation (NOP) logger implementation
    SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

    To fix it, add this to your pom.xml:

    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.21</version>
    </dependency>