This 4-day course gives participants a solid grounding in Scala programming basics and Apache Spark. It covers the fundamentals you need to write complex Spark applications: a clear understanding of the limitations of MapReduce and how Spark overcomes them, hands-on experience building applications with RDDs, a thorough understanding of Spark Streaming features, the fundamentals of the Scala programming language and its features, and writing SQL queries with Spark SQL.
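To give a quick feel for two of the topics listed above (RDDs and Spark SQL), here is a minimal, self-contained Scala sketch run in local mode. The object name, sample data, and view name are purely illustrative and are not taken from the course material.

import org.apache.spark.sql.SparkSession

object CourseSampleApp {
  def main(args: Array[String]): Unit = {
    // Local SparkSession, convenient for experimenting on a laptop
    val spark = SparkSession.builder()
      .appName("CourseSampleApp")
      .master("local[*]")
      .getOrCreate()

    // RDD basics: a simple word count over an in-memory collection
    val lines = spark.sparkContext.parallelize(
      Seq("spark with scala", "spark streaming", "spark sql"))
    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach(println)

    // Spark SQL basics: register a DataFrame as a view and query it
    import spark.implicits._
    val people = Seq(("Asha", 31), ("Ravi", 28)).toDF("name", "age")
    people.createOrReplaceTempView("people")
    spark.sql("SELECT name FROM people WHERE age > 30").show()

    spark.stop()
  }
}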
You should know a programming language such as Java, along with SQL and basic networking concepts.
Do you need to know Hadoop for learning Apache Spark with Scala?
No, you do not need to know Hadoop to learn Spark with Scala.
Do I need Hadoop to run Spark?
No, but if you run on a cluster, you will need some form of shared file system (for example, NFS mounted at the same path on each node). If you have this type of filesystem, you can simply deploy Spark in standalone mode.
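As a rough illustration of the standalone setup described above, the sketch below connects to a hypothetical standalone master and reads a file from a path assumed to be NFS-mounted identically on every node. The master URL and file path are placeholders, not values from the course; in practice the master is often supplied via spark-submit instead of being hard-coded.

import org.apache.spark.sql.SparkSession

object StandaloneExample {
  def main(args: Array[String]): Unit = {
    // Connect to a Spark standalone master (no Hadoop/YARN involved).
    // "spark://master-host:7077" and the path below are placeholders.
    val spark = SparkSession.builder()
      .appName("StandaloneExample")
      .master("spark://master-host:7077")
      .getOrCreate()

    // The same path must be visible on every worker node,
    // e.g. via an NFS mount, since there is no HDFS in this setup.
    val logs = spark.read.textFile("/mnt/shared/data/input.txt")
    println(s"Line count: ${logs.count()}")

    spark.stop()
  }
}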
Which other languages can be used to learn Spark?
Apart from Scala, you can learn Spark using Java and Python.
How can I find out which international certifications are available for Spark?