
Spark

With the Spark plugin, you can create, submit, and monitor your Spark jobs right in the IDE. The plugin features include:

  • The Spark Submit run configuration to build and upload your Spark application to a cluster (a minimal example of such an application is sketched after this list).

  • The Spark monitoring tool window to monitor submitted jobs, view DAG visualizations, and more. This includes jobs submitted with the Spark Submit run configuration and jobs run as EMR steps. If you have the Zeppelin plugin installed, you can also open Spark jobs from Zeppelin notebooks.

  • Integration with other big data tools without leaving the IDE (open Spark applications from AWS EMR, navigate to Spark jobs from Hadoop YARN, view logs in S3 storage).
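
For reference, the application you point the Spark Submit run configuration at is an ordinary Spark program. Below is a minimal PySpark sketch; the application name and input path are placeholders rather than values from this page, and cluster settings such as the master URL and deploy mode are left to the run configuration:

from pyspark.sql import SparkSession

def main():
    # The Spark Submit run configuration supplies the master URL and deploy mode,
    # so the application itself only needs to obtain a SparkSession.
    spark = SparkSession.builder.appName("word-count-example").getOrCreate()

    # Placeholder input path; replace it with your own data location.
    lines = spark.read.text("s3a://example-bucket/input.txt")

    # Classic word count: split lines into words, pair each with 1, sum per word.
    counts = (
        lines.rdd
        .flatMap(lambda row: row.value.split())
        .map(lambda word: (word, 1))
        .reduceByKey(lambda a, b: a + b)
    )

    for word, count in counts.take(20):
        print(word, count)

    spark.stop()

if __name__ == "__main__":
    main()

When you run the Spark Submit configuration, a script like this (or a JAR built from a Scala or Java project) is uploaded to the cluster and executed there, and the resulting job then appears in the Spark monitoring tool window.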

Install the Spark plugin

This functionality relies on the Spark plugin, which you need to install and enable.

  1. Press Control+Alt+S to open the IDE settings and then select Plugins.

  2. Open the Marketplace tab, find the Spark plugin, and click Install (restart the IDE if prompted).

