How to use Classes and Objects in Scala programming

Last updated on May 30, 2022
Shakuntala Deskmukh

Scala – Classes & Objects

This blog takes you through how to use classes and objects in Scala programming. A class is a blueprint for objects. Once you define a class, you can create objects from the class blueprint with the keyword new. Through an object you can use all the functionality of the defined class.

The following example demonstrates classes and objects with a Student class, which contains the member variables name and rollNo and the member methods setName() and setRollNo(); all of these are members of the class. The class is the blueprint and the objects are the concrete instances: Student is a class, and Harini, John, and Maria are objects of the Student class, each with its own name and roll number.

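The original diagram is not reproduced here, but the Student class it describes can be sketched directly in Scala. This is a minimal illustration assuming simple String and Int fields; the StudentDemo object and the parameter names are ours, not part of the original:

class Student {
   // Fields named in the description above
   var name: String = ""
   var rollNo: Int = 0

   // Member methods named in the description above
   def setName(newName: String): Unit = {
      name = newName
   }

   def setRollNo(newRollNo: Int): Unit = {
      rollNo = newRollNo
   }
}

object StudentDemo {
   def main(args: Array[String]): Unit = {
      // One of several objects created from the single Student blueprint
      val harini = new Student
      harini.setName("Harini")
      harini.setRollNo(1)
      println(harini.name + " : " + harini.rollNo)
   }
}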

Basic Class

The following is a simple syntax to define a basic class in Scala. The class defines two variables, x and y, and a method, move, which does not return a value. Class variables are called fields of the class, and class functions are called methods.

The class name works as a class constructor, which can take a number of parameters. The code below defines two constructor arguments, xc and yc; both are visible in the whole body of the class.

Syntax

class Point(xc: Int, yc: Int) {
   var x: Int = xc
   var y: Int = yc

   def move(dx: Int, dy: Int): Unit = {
      x = x + dx
      y = y + dy
      println("Point x location : " + x)
      println("Point y location : " + y)
   }
}

As mentioned earlier in this chapter, you can create objects using the keyword new and then access class fields and methods, as shown in the example below.

Example

class Point(val xc: Int, val yc: Int) {
   var x: Int = xc
   var y: Int = yc

   def move(dx: Int, dy: Int): Unit = {
      x = x + dx
      y = y + dy
      println("Point x location : " + x)
      println("Point y location : " + y)
   }
}

object Demo {
   def main(args: Array[String]): Unit = {
      val pt = new Point(10, 20)

      // Move to a new location
      pt.move(10, 10)
   }
}

Save the above program in Demo.scala. The following commands are used to compile and execute it.

Command

> scalac Demo.scala

> scala Demo

Output

Point x location : 20

Point y location : 30
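The example above only calls a method, but because x and y are declared as public vars, they can also be read and reassigned directly on the object. A short sketch reusing the Point class defined above (the values in the comments assume a fresh pt instance):

val pt = new Point(10, 20)
println(pt.x)   // prints 10: fields are publicly readable by default
pt.x = 15       // vars are publicly writable as well
pt.move(5, 5)   // prints "Point x location : 20" and "Point y location : 25"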

Extending a Class

You can extend a base Scala class to create a derived class in the same way you do in Java (using the extends keyword), but there are two restrictions: method overriding requires the override keyword, and only the primary constructor can pass parameters to the base constructor. Let us extend the class above and add one more method.

Example

Let us take the Point class from the previous example and a Location class that inherits from it using the extends keyword. Such an extends clause has two effects: it makes the Location class inherit all non-private members of the Point class, and it makes the type Location a subtype of the type Point. Point is therefore called the superclass and Location the subclass. Extending a class and inheriting all the features of the parent class is called inheritance, but note that Scala allows inheritance from only one class.

Note − The move() method in the Location class does not override the move() method in the Point class, because the two have different signatures: the former takes three arguments while the latter takes two. The subclass method therefore overloads rather than overrides the superclass method.
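For contrast, genuine overriding requires an identical signature and the override keyword, as in this minimal sketch (the Base and Derived names are illustrative, not part of the example below):

class Base {
   def greet(): Unit = println("Hello from Base")
}

class Derived extends Base {
   // Identical signature to the parent method, so Scala
   // requires the override keyword; omitting it is a compile-time error
   override def greet(): Unit = println("Hello from Derived")
}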

Try the following example program to implement inheritance.

class Point(val xc: Int, val yc: Int) {
   var x: Int = xc
   var y: Int = yc

   def move(dx: Int, dy: Int): Unit = {
      x = x + dx
      y = y + dy
      println("Point x location : " + x)
      println("Point y location : " + y)
   }
}

class Location(override val xc: Int, override val yc: Int,
   val zc: Int) extends Point(xc, yc) {
   var z: Int = zc

   def move(dx: Int, dy: Int, dz: Int): Unit = {
      x = x + dx
      y = y + dy
      z = z + dz
      println("Point x location : " + x)
      println("Point y location : " + y)
      println("Point z location : " + z)
   }
}

object Demo {
   def main(args: Array[String]): Unit = {
      val loc = new Location(10, 20, 15)

      // Move to a new location
      loc.move(10, 10, 5)
   }
}

Save the above program in Demo.scala. The following commands are used to compile and execute it.

Command

> scalac Demo.scala

> scala Demo

Output

Point x location : 20

Point y location : 30

Point z location : 20

Implicit Classes

Implicit classes allow implicit conversions via the class's primary constructor when the class is in scope. An implicit class is a class marked with the implicit keyword. This feature was introduced in Scala 2.10.

Syntax − The following is the syntax for implicit classes. An implicit class is always defined inside another object, class, or trait (where all method definitions are allowed), because an implicit class cannot be a top-level class.

Syntax

object <object name> {
   implicit class <class name>(<variable>: <data type>) {
      def <method>(): Unit = {
         ...
      }
   }
}

Example

Let us take an example of an implicit class named IntTimes with a method times(). Here times() contains a loop that executes the given statement the specified number of times. For example, 4 times println("Hello") executes the println("Hello") statement 4 times.

The following program implements this example. It uses two objects, Run and Demo, which we save in separate files with their respective names as follows.

Run.scala − Save the following program in Run.scala.

object Run {
   implicit class IntTimes(x: Int) {
      def times[A](f: => A): Unit = {
         // Recursive helper: evaluate f, then count down to zero
         def loop(current: Int): Unit =
            if (current > 0) {
               f
               loop(current - 1)
            }

         loop(x)
      }
   }
}

Demo.scala − Save the following program in Demo.scala.

import Run._

object Demo {
   def main(args: Array[String]): Unit = {
      4 times println("Hello")
   }
}

The following commands are used to compile and execute these two programs.

Command

> scalac Run.scala

> scalac Demo.scala

> scala Demo

Output

Hello

Hello

Hello

Hello
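The expression 4 times println("Hello") compiles because the implicit class wraps the Int receiver 4 in an IntTimes instance. A sketch of what the compiler effectively rewrites the call into:

new Run.IntTimes(4).times(println("Hello"))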

Note

  • An implicit class must be defined inside another class, object, or trait; it cannot be a top-level class.
  • An implicit class may take only one non-implicit argument in its constructor.
  • There may not be any method, member, or object in scope with the same name as the implicit class. (These rules are illustrated in the sketch after this list.)
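Here is a second, hypothetical implicit class that satisfies all three rules: it is defined inside an object, it takes exactly one non-implicit constructor argument, and nothing else in scope shares its name. The StringUtil object and the emphasize method are ours, purely for illustration:

object StringUtil {
   implicit class Emphasize(s: String) {
      def emphasize(): String = s.toUpperCase + "!"
   }
}

object EmphasizeDemo {
   import StringUtil._

   def main(args: Array[String]): Unit = {
      println("hello".emphasize())   // prints HELLO!
   }
}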

Singleton Objects

Scala is more object-oriented than Java because Scala classes cannot have static members. Instead, Scala has singleton objects. A singleton is a class that can have only one instance, i.e., one object. You create a singleton using the keyword object instead of the keyword class. Since you cannot instantiate a singleton object, you cannot pass parameters to its primary constructor. You have already seen singleton objects in all the previous examples, where Scala's main method lives in one.

The following example program implements a singleton.

Example

class Point(val xc: Int, val yc: Int) {
   var x: Int = xc
   var y: Int = yc

   def move(dx: Int, dy: Int): Unit = {
      x = x + dx
      y = y + dy
   }
}

object Demo {
   def main(args: Array[String]): Unit = {
      val point = new Point(10, 20)

      def printPoint(): Unit = {
         println("Point x location : " + point.x)
         println("Point y location : " + point.y)
      }

      printPoint()
   }
}

Save the above program in Demo.scala. The following commands are used to compile and execute it.

Command

> scalac Demo.scala

> scala Demo

Output

Point x location : 10

Point y location : 20
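Since singleton objects take over the role of Java's static members, constants and utility methods are conventionally placed in an object, often a companion object that shares its class's name. A minimal sketch under that convention (the Circle example is ours, not from the original):

class Circle(val radius: Double)

// The companion object plays the role of Java's static members
object Circle {
   val Pi = 3.14159

   // Callable as Circle.area(c), with no extra instance needed
   def area(c: Circle): Double = Pi * c.radius * c.radius
}

object CircleDemo {
   def main(args: Array[String]): Unit = {
      val c = new Circle(2.0)
      println(Circle.area(c))   // prints 12.56636
   }
}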

 

So, this brings us to the end of this blog. This Tecklearn 'How to use Classes and Objects in Scala Programming' blog helps you with commonly asked questions if you are looking for a job in Apache Spark and Scala or as a Big Data Developer. If you wish to learn Apache Spark and Scala and build a career in the Big Data Hadoop domain, then check out our interactive Apache Spark and Scala Training, which comes with 24*7 support to guide you throughout your learning period. Please find the link for course details:

https://www.tecklearn.com/course/apache-spark-and-scala-certification/

Apache Spark and Scala Training

About the Course

Tecklearn's Spark training lets you master real-time data processing using Spark Streaming, Spark SQL, Spark RDD, and Spark machine learning libraries (Spark MLlib). This Spark certification training helps you master the essential skills of the Apache Spark open-source framework and the Scala programming language, including Spark Streaming, Spark SQL, machine learning programming, GraphX programming, and Spark shell scripting. You will also understand the role of Spark in overcoming the limitations of MapReduce. Upon completion of this online training, you will have a solid understanding of and hands-on experience with Apache Spark.

Why Should you take Apache Spark and Scala Training?

  • The average salary for an Apache Spark developer ranges from approximately $93,486 per year for a Developer to $128,313 per year for a Data Engineer. – Indeed.com
  • Wells Fargo, Microsoft, Capital One, Apple, JPMorgan Chase, and many other MNCs worldwide use Apache Spark across industries.
  • Global Spark market revenue will grow to $4.2 billion by 2022 with a CAGR of 67%. – Marketanalysis.com

What you will Learn in this Course?

Introduction to Scala for Apache Spark

  • What is Scala
  • Why Scala for Spark
  • Scala in other Frameworks
  • Scala REPL
  • Basic Scala Operations
  • Variable Types in Scala
  • Control Structures in Scala
  • Loop, Functions and Procedures
  • Collections in Scala
  • Array Buffer, Map, Tuples, Lists

Functional Programming and OOPs Concepts in Scala

  • Functional Programming
  • Higher Order Functions
  • Anonymous Functions
  • Class in Scala
  • Getters and Setters
  • Custom Getters and Setters
  • Constructors in Scala
  • Singletons
  • Extending a Class using Method Overriding

Introduction to Spark

  • Introduction to Spark
  • How Spark overcomes the drawbacks of MapReduce
  • Concept of In Memory MapReduce
  • Interactive operations on MapReduce
  • Understanding Spark Stack
  • HDFS Revision and Spark Hadoop YARN
  • Overview of Spark and Why it is better than Hadoop
  • Deployment of Spark without Hadoop
  • Cloudera distribution and Spark history server

Basics of Spark

  • Spark Installation guide
  • Spark configuration and memory management
  • Driver Memory Versus Executor Memory
  • Working with Spark Shell
  • Resilient distributed datasets (RDD)
  • Functional programming in Spark and Understanding Architecture of Spark

Playing with Spark RDDs

  • Challenges in Existing Computing Methods
  • Probable Solution and How RDD Solves the Problem
  • What is RDD, Its Operations, Transformations & Actions
  • Data Loading and Saving Through RDDs
  • Key-Value Pair RDDs
  • Other Pair RDDs and Two Pair RDDs
  • RDD Lineage
  • RDD Persistence
  • Using RDD Concepts Write a Wordcount Program
  • Concept of RDD Partitioning and How It Helps Achieve Parallelization
  • Passing Functions to Spark

Writing and Deploying Spark Applications

  • Creating a Spark application using Scala or Java
  • Deploying a Spark application
  • Scala built application
  • Creating application using SBT
  • Deploying application using Maven
  • Web user interface of Spark application
  • A real-world example of Spark and configuring of Spark

Parallel Processing

  • Concept of Spark parallel processing
  • Overview of Spark partitions
  • File Based partitioning of RDDs
  • Concept of HDFS and data locality
  • Technique of parallel operations
  • Comparing coalesce and Repartition and RDD actions

Machine Learning using Spark MLlib

  • Why Machine Learning
  • What is Machine Learning
  • Applications of Machine Learning
  • Face Detection: USE CASE
  • Machine Learning Techniques
  • Introduction to MLlib
  • Features of MLlib and MLlib Tools
  • Various ML algorithms supported by MLlib

Integrating Apache Flume and Apache Kafka

  • Why Kafka, what is Kafka and Kafka architecture
  • Kafka workflow and Configuring Kafka cluster
  • Basic operations and Kafka monitoring tools
  • Integrating Apache Flume and Apache Kafka

Apache Spark Streaming

  • Why Streaming is Necessary
  • What is Spark Streaming
  • Spark Streaming Features
  • Spark Streaming Workflow
  • Streaming Context and DStreams
  • Transformations on DStreams
  • Describe Windowed Operators and Why it is Useful
  • Important Windowed Operators
  • Slice, Window and ReduceByWindow Operators
  • Stateful Operators

Improving Spark Performance

  • Learning about accumulators
  • The common performance issues and troubleshooting the performance problems

DataFrames and Spark SQL

  • Need for Spark SQL
  • What is Spark SQL
  • Spark SQL Architecture
  • SQL Context in Spark SQL
  • User Defined Functions
  • Data Frames and Datasets
  • Interoperating with RDDs
  • JSON and Parquet File Formats
  • Loading Data through Different Sources

Scheduling and Partitioning in Apache Spark

  • Concept of Scheduling and Partitioning in Spark
  • Hash partition and range partition
  • Scheduling applications
  • Static partitioning and dynamic sharing
  • Concept of Fair scheduling
  • Map partition with index and Zip
  • High Availability
  • Single-node Recovery with Local File System and High Order Functions
Got a question for us? Please mention it in the comments section and we will get back to you.

 

 
