Conditional statements and Loop control structures in Scala

Last updated on May 30 2022
Shakuntala Deskmukh

Table of Contents

Conditional statements and Loop control structures in Scala

Scala – IF ELSE Statements

This blog takes you through the conditional statements in Scala programming. The following is the general form of a typical decision-making IF…ELSE structure found in most programming languages.

Flow Chart

The following flow chart illustrates a conditional statement.

if Statement

An ‘if’ statement consists of a Boolean expression followed by one or more statements.

Syntax

The syntax of an ‘if’ statement is as follows.

if(Boolean_expression) {
   // Statements will execute if the Boolean expression is true
}

If the Boolean expression evaluates to true, the block of code inside the ‘if’ expression is executed. If not, the first set of code after the end of the ‘if’ expression (after the closing curly brace) is executed.

Try the following example program to understand conditional expressions (if expressions) in the Scala programming language.

Example

object Demo {
   def main(args: Array[String]) {
      var x = 10;

      if( x < 20 ){
         println("This is if statement");
      }
   }
}

Save the above program in Demo.scala. The following commands are used to compile and execute this program.

Command

>scalac Demo.scala

>scala Demo

Output

This is if statement

If-else Statement

An ‘if’ statement can be followed by an optional else statement, which executes when the Boolean expression is false.

Syntax

The syntax of an if...else is −

if(Boolean_expression){
   //Executes when the Boolean expression is true
} else{
   //Executes when the Boolean expression is false
}

Try the following example program to understand conditional statements (if-else statement) in the Scala programming language.

Example

object Demo {
   def main(args: Array[String]) {
      var x = 30;

      if( x < 20 ){
         println("This is if statement");
      } else {
         println("This is else statement");
      }
   }
}

Save the above program in Demo.scala. The following commands are used to compile and execute this program.

Command

>scalac Demo.scala

>scala Demo

Output

This is else statement
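Unlike in Java, if...else in Scala is an expression that yields a value, so the result can be bound directly to a name. A minimal sketch of the same logic as the example above:

```scala
object Demo {
   def main(args: Array[String]) {
      var x = 30;
      // if/else evaluates to the value of whichever branch runs
      val message = if( x < 20 ) "This is if statement" else "This is else statement";
      println(message);
   }
}
```

Running this prints This is else statement, the same output as the if-else example.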

If-else-if-else Statement

An ‘if’ statement can be followed by an optional ‘else if…else‘ statement, which is very useful for testing various conditions using a single if…else if statement.

When using if, else if, else statements there are a few points to keep in mind.

  • An ‘if’ can have zero or one ‘else’, and it must come after any ‘else if’s.
  • An ‘if’ can have zero to many ‘else if’s, and they must come before the ‘else’.
  • Once an ‘else if’ succeeds, none of the remaining ‘else if’s or ‘else’ will be tested.

Syntax

The syntax of an ‘if…else if…else’ is as follows −

if(Boolean_expression 1){
   //Executes when the Boolean expression 1 is true
} else if(Boolean_expression 2){
   //Executes when the Boolean expression 2 is true
} else if(Boolean_expression 3){
   //Executes when the Boolean expression 3 is true
} else {
   //Executes when none of the above conditions is true
}

Try the following example program to understand conditional statements (if-else-if-else statement) in the Scala programming language.

Example

object Demo {
   def main(args: Array[String]) {
      var x = 30;

      if( x == 10 ){
         println("Value of X is 10");
      } else if( x == 20 ){
         println("Value of X is 20");
      } else if( x == 30 ){
         println("Value of X is 30");
      } else{
         println("This is else statement");
      }
   }
}

Save the above program in Demo.scala. The following commands are used to compile and execute this program.

Command

>scalac Demo.scala

>scala Demo

Output

Value of X is 30

Nested if-else Statement

It is always legal to nest if-else statements, which means you can use one if or else-if statement inside another if or else-if statement.

Syntax

The syntax for a nested if-else is as follows −

if(Boolean_expression 1){
   //Executes when the Boolean expression 1 is true

   if(Boolean_expression 2){
      //Executes when the Boolean expression 2 is true
   }
}

Try the following example program to understand conditional statements (nested-if statement) in the Scala programming language.

Example

object Demo {
   def main(args: Array[String]) {
      var x = 30;
      var y = 10;

      if( x == 30 ){
         if( y == 10 ){
            println("X = 30 and Y = 10");
         }
      }
   }
}

Save the above program in Demo.scala. The following commands are used to compile and execute this program.

Command

>scalac Demo.scala

>scala Demo

Output

X = 30 and Y = 10


Scala – Loop Statements

This section takes you through the loop control structures in the Scala programming language.

There may be a situation when you need to execute a block of code several times. In general, statements are executed sequentially: the first statement in a function is executed first, followed by the second, and so on.

Programming languages provide various control structures that allow for more complicated execution paths.

A loop statement allows us to execute a statement or group of statements multiple times. The following is the general form of a loop statement in most programming languages −

Flow Chart

The Scala programming language provides the following types of loops to handle looping requirements. Click the following links in the table to check their details.

Sr.No − Loop Type & Description

1. while loop − Repeats a statement or group of statements while a given condition is true. It tests the condition before executing the loop body.

2. do-while loop − Like a while statement, except that it tests the condition at the end of the loop body.

3. for loop − Executes a sequence of statements multiple times and abbreviates the code that manages the loop variable.
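As a quick sketch, the three loop types listed above can be compared in one small program (using the same Scala 2 syntax as the other examples in this blog; the object name LoopDemo is just for illustration):

```scala
object LoopDemo {
   def main(args: Array[String]) {
      // while loop: the condition is tested before the body runs
      var a = 0;
      while( a < 3 ){
         println("while iteration: " + a);
         a += 1;
      }

      // do-while loop: the body runs once before the condition is tested
      var b = 10;
      do {
         println("do-while ran once, even though b is not < 3");
         b += 1;
      } while( b < 3 );

      // for loop: iterates over a Range, managing the loop variable for you
      for( i <- 1 to 3 ){
         println("for iteration: " + i);
      }
   }
}
```

Compile and run it the same way as the other examples (scalac followed by scala LoopDemo). Note that the do-while body executes once even though its condition is false from the start.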

Loop Control Statements

Loop control statements change execution from its normal sequence. Scala does not support the break or continue statements the way Java does, but starting with Scala version 2.8 there is a way to break out of loops. Click the following links to check the details.

Sr.No − Control Statement & Description

1. break statement − Terminates the loop statement and transfers execution to the statement immediately following the loop.
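Since Scala has no built-in break keyword, the standard library (from version 2.8 onward) provides scala.util.control.Breaks. A minimal sketch of breaking out of a for loop (the object name BreakDemo is just for illustration):

```scala
import scala.util.control.Breaks._

object BreakDemo {
   def main(args: Array[String]) {
      breakable {
         for( i <- 1 to 10 ){
            if( i == 4 ) break;   // exits the enclosing breakable block
            println("Value of i: " + i);
         }
      }
      println("After the loop");
   }
}
```

Here break is a library method that throws a control-flow exception caught by the surrounding breakable block, not a language keyword.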

The infinite Loop

A loop becomes an infinite loop if its condition never becomes false. In Scala, the while loop is the best way to implement an infinite loop.

The following program implements an infinite loop.

Example

object Demo {
   def main(args: Array[String]) {
      var a = 10;

      // An infinite loop.
      while( true ){
         println( "Value of a: " + a );
      }
   }
}

Save the above program in Demo.scala. The following commands are used to compile and execute this program.

Command

>scalac Demo.scala

>scala Demo

Output

If you execute the above code, it will run in an infinite loop, which you can terminate by pressing the Ctrl + C keys.

Value of a: 10
Value of a: 10
Value of a: 10
Value of a: 10
…………….


So, this brings us to the end of this blog. This Tecklearn ‘Conditional statements and Loop control structures in Scala’ blog helps you with commonly asked questions if you are looking for a job in Apache Spark and Scala and as a Big Data Developer. If you wish to learn Apache Spark and Scala and build a career in the Big Data Hadoop domain, then check out our interactive Apache Spark and Scala Training, which comes with 24*7 support to guide you throughout your learning period. Please find the link for course details:

https://www.tecklearn.com/course/apache-spark-and-scala-certification/

Apache Spark and Scala Training

About the Course

Tecklearn Spark training lets you master real-time data processing using Spark streaming, Spark SQL, Spark RDD and Spark Machine Learning libraries (Spark MLlib). This Spark certification training helps you master the essential skills of the Apache Spark open-source framework and Scala programming language, including Spark Streaming, Spark SQL, machine learning programming, GraphX programming, and Shell Scripting Spark. You will also understand the role of Spark in overcoming the limitations of MapReduce. Upon completion of this online training, you will hold a solid understanding and hands-on experience with Apache Spark.

Why Should you take Apache Spark and Scala Training?

  • The average salary for an Apache Spark developer ranges from approximately $93,486 per year for Developer to $128,313 per year for Data Engineer. – Indeed.com
  • Wells Fargo, Microsoft, Capital One, Apple, JPMorgan Chase & many other MNCs worldwide use Apache Spark across industries.
  • Global Spark market revenue will grow to $4.2 billion by 2022 with a CAGR of 67%. – Marketanalysis.com

What you will Learn in this Course?

Introduction to Scala for Apache Spark

  • What is Scala
  • Why Scala for Spark
  • Scala in other Frameworks
  • Scala REPL
  • Basic Scala Operations
  • Variable Types in Scala
  • Control Structures in Scala
  • Loop, Functions and Procedures
  • Collections in Scala
  • Array Buffer, Map, Tuples, Lists

Functional Programming and OOPs Concepts in Scala

  • Functional Programming
  • Higher Order Functions
  • Anonymous Functions
  • Class in Scala
  • Getters and Setters
  • Custom Getters and Setters
  • Constructors in Scala
  • Singletons
  • Extending a Class using Method Overriding

Introduction to Spark

  • Introduction to Spark
  • How Spark overcomes the drawbacks of MapReduce
  • Concept of In Memory MapReduce
  • Interactive operations on MapReduce
  • Understanding Spark Stack
  • HDFS Revision and Spark Hadoop YARN
  • Overview of Spark and Why it is better than Hadoop
  • Deployment of Spark without Hadoop
  • Cloudera distribution and Spark history server

Basics of Spark

  • Spark Installation guide
  • Spark configuration and memory management
  • Driver Memory Versus Executor Memory
  • Working with Spark Shell
  • Resilient distributed datasets (RDD)
  • Functional programming in Spark and Understanding Architecture of Spark

Playing with Spark RDDs

  • Challenges in Existing Computing Methods
  • Probable Solution and How RDD Solves the Problem
  • What is RDD, It’s Operations, Transformations & Actions Data Loading and Saving Through RDDs
  • Key-Value Pair RDDs
  • Other Pair RDDs and Two Pair RDDs
  • RDD Lineage
  • RDD Persistence
  • Using RDD Concepts Write a Wordcount Program
  • Concept of RDD Partitioning and How It Helps Achieve Parallelization
  • Passing Functions to Spark

Writing and Deploying Spark Applications

  • Creating a Spark application using Scala or Java
  • Deploying a Spark application
  • Scala built application
  • Creating application using SBT
  • Deploying application using Maven
  • Web user interface of Spark application
  • A real-world example of Spark and configuring of Spark

Parallel Processing

  • Concept of Spark parallel processing
  • Overview of Spark partitions
  • File Based partitioning of RDDs
  • Concept of HDFS and data locality
  • Technique of parallel operations
  • Comparing coalesce and Repartition and RDD actions

Machine Learning using Spark MLlib

  • Why Machine Learning
  • What is Machine Learning
  • Applications of Machine Learning
  • Face Detection: USE CASE
  • Machine Learning Techniques
  • Introduction to MLlib
  • Features of MLlib and MLlib Tools
  • Various ML algorithms supported by MLlib

Integrating Apache Flume and Apache Kafka

  • Why Kafka, what is Kafka and Kafka architecture
  • Kafka workflow and Configuring Kafka cluster
  • Basic operations and Kafka monitoring tools
  • Integrating Apache Flume and Apache Kafka

Apache Spark Streaming

  • Why Streaming is Necessary
  • What is Spark Streaming
  • Spark Streaming Features
  • Spark Streaming Workflow
  • Streaming Context and DStreams
  • Transformations on DStreams
  • Describe Windowed Operators and Why it is Useful
  • Important Windowed Operators
  • Slice, Window and ReduceByWindow Operators
  • Stateful Operators

Improving Spark Performance

  • Learning about accumulators
  • The common performance issues and troubleshooting the performance problems

DataFrames and Spark SQL

  • Need for Spark SQL
  • What is Spark SQL
  • Spark SQL Architecture
  • SQL Context in Spark SQL
  • User Defined Functions
  • Data Frames and Datasets
  • Interoperating with RDDs
  • JSON and Parquet File Formats
  • Loading Data through Different Sources

Scheduling and Partitioning in Apache Spark

  • Concept of Scheduling and Partitioning in Spark
  • Hash partition and range partition
  • Scheduling applications
  • Static partitioning and dynamic sharing
  • Concept of Fair scheduling
  • Map partition with index and Zip
  • High Availability
  • Single-node Recovery with Local File System and High Order Functions
Got a question for us? Please mention it in the comments section and we will get back to you.
