Details of Data Types and Basic Literals in Scala

Last updated on May 30, 2022
Shakuntala Deskmukh

Table of Contents

Details of Data Types and Basic Literals in Scala

Scala – Data Types

Scala has all the same data types as Java, with the same memory footprint and precision. The following table gives details of all the data types available in Scala:

1. Byte: 8-bit signed value. Range: -128 to 127
2. Short: 16-bit signed value. Range: -32768 to 32767
3. Int: 32-bit signed value. Range: -2147483648 to 2147483647
4. Long: 64-bit signed value. Range: -9223372036854775808 to 9223372036854775807
5. Float: 32-bit IEEE 754 single-precision float
6. Double: 64-bit IEEE 754 double-precision float
7. Char: 16-bit unsigned Unicode character. Range: U+0000 to U+FFFF
8. String: a sequence of Chars
9. Boolean: either the literal true or the literal false
10. Unit: corresponds to no value
11. Null: null or empty reference
12. Nothing: the subtype of every other type; includes no values
13. Any: the supertype of any type; any object is of type Any
14. AnyRef: the supertype of any reference type

All the data types listed above are objects. There are no primitive types as in Java, which means you can call methods on an Int, a Long, and so on, as the short example below illustrates.
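Here is a minimal sketch of calling methods on numeric literals (the object name and method choices are illustrative, not from the original text):

object NumericMethods {
   def main(args: Array[String]): Unit = {
      // toString, max and abs are all ordinary methods on Int
      println(1.toString)   // "1"
      println(42.max(7))    // 42
      println((-3).abs)     // 3
      // conversion methods exist on the numeric types as well
      println(2.5.toInt)    // 2
   }
}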

Scala Basic Literals

The rules Scala uses for literals are simple and intuitive. This section explains all the basic Scala literals.

Integral Literals

Integer literals are usually of type Int, or of type Long when followed by an L or l suffix. Here are some integer literals:

0
035
21
0xFFFFFFFF
0777L
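Note that 035 and 0777L use the older octal notation (a leading zero), which has since been deprecated and dropped in recent Scala versions; hex literals such as 0xFFFFFFFF remain valid. A minimal sketch showing how the suffix decides the type (the value names are illustrative):

object IntegralLiterals {
   def main(args: Array[String]): Unit = {
      val a = 21          // Int by default
      val b = 0xFFFFFFFF  // Int written in hex; equals -1 as a signed 32-bit value
      val c = 255L        // Long because of the L suffix
      println(s"$a $b $c")
   }
}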

Floating Point Literals

Floating-point literals are of type Float when followed by the suffix F or f, and are of type Double otherwise. Here are some floating-point literals:

0.0
1e30f
3.14159f
1.0e100
.1
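A short sketch showing how the suffix selects Float versus Double (value names are illustrative; the leading-dot form .1 is rejected by recent Scala compilers, so it is omitted here):

object FloatingPointLiterals {
   def main(args: Array[String]): Unit = {
      val a = 0.0       // Double by default
      val b = 1e30f     // Float because of the f suffix
      val c = 3.14159f  // Float
      val d = 1.0e100   // Double with exponent notation
      println(s"$a $b $c $d")
   }
}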

Boolean Literals

The Boolean literals true and false are members of type Boolean.

Symbol Literals

A symbol literal 'x is shorthand for the expression scala.Symbol("x"). Symbol is a case class, which is defined as follows.

package scala

final case class Symbol private (name: String) {
   override def toString: String = "'" + name
}
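A small usage sketch (the names are illustrative). Note that the 'x shorthand was deprecated in Scala 2.13 and removed in Scala 3, where Symbol("x") must be written out:

object SymbolDemo {
   def main(args: Array[String]): Unit = {
      val s = 'mySymbol                  // shorthand for scala.Symbol("mySymbol")
      println(s.name)                    // mySymbol
      println(s == Symbol("mySymbol"))   // true: symbols with the same name are equal
   }
}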

Character Literals

A character literal is a single character enclosed in quotes. The character is either a printable Unicode character or is described by an escape sequence. Here are some character literals:

'a'
'\u0041'
'\n'
'\t'
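A minimal sketch printing these character literals (the object and value names are illustrative):

object CharLiterals {
   def main(args: Array[String]): Unit = {
      val letter: Char = 'a'
      val capitalA: Char = '\u0041'   // 'A' written as a Unicode escape
      println(letter)
      println(capitalA)
      print('x')
      print('\t')                     // tab escape
      println('y')
   }
}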

String Literals

A string literal is a sequence of characters in double quotes. The characters are either printable Unicode characters or are described by escape sequences. Here are some string literals:

"Hello,\nWorld!"

"This string contains a \" character."

Multi-Line Strings

A multi-line string literal is a sequence of characters enclosed in triple quotes """ ... """. The sequence of characters is arbitrary, except that it may contain three or more consecutive quote characters only at the very end.

The characters need not be printable; newlines and other control characters are also permitted. Here is a multi-line string literal:

"""the present string

spans three

lines."""
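A runnable sketch of the multi-line literal above; the standard library's stripMargin is a common companion for keeping source indentation out of the string (the object name is illustrative):

object MultiLineDemo {
   def main(args: Array[String]): Unit = {
      val text = """the present string
spans three
lines."""
      println(text)
      // stripMargin drops everything up to and including the leading |
      val tidy = """|the present string
                    |spans three
                    |lines.""".stripMargin
      println(tidy)
   }
}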

Null Values

The null value is of type scala.Null and is thus compatible with every reference type. It denotes a reference value which refers to a special "null" object.
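A quick sketch of that compatibility (illustrative names): null can be assigned to any reference type, but not to a value type such as Int:

object NullDemo {
   def main(args: Array[String]): Unit = {
      val s: String = null   // compiles: String is a reference type
      // val n: Int = null   // does not compile: Int is a value type
      println(s == null)     // true
   }
}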

Escape Sequences

The following escape sequences are recognized in character and string literals.

Escape Sequence   Unicode   Description
\b                \u0008    backspace BS
\t                \u0009    horizontal tab HT
\n                \u000a    linefeed LF
\f                \u000c    formfeed FF
\r                \u000d    carriage return CR
\"                \u0022    double quote "
\'                \u0027    single quote '
\\                \u005c    backslash \

A character with a Unicode code point between 0 and 255 may also be represented by an octal escape, i.e., a backslash '\' followed by a sequence of up to three octal digits (note that octal escapes are deprecated in recent Scala versions). The following example shows a few escape sequence characters:

Example

object Test {
   def main(args: Array[String]): Unit = {
      println("Hello\tWorld\n\n")
   }
}

When the above code is compiled and executed, it produces the following result:

Output

Hello   World

 

So, this brings us to the end of this blog. This Tecklearn 'Details of Data Types and Basic Literals in Scala' blog helps you with commonly asked questions if you are looking for a job in Apache Spark and Scala or as a Big Data Developer. If you wish to learn Apache Spark and Scala and build a career in the Big Data Hadoop domain, then check out our interactive Apache Spark and Scala Training, which comes with 24*7 support to guide you throughout your learning period. Please find the link for course details:

https://www.tecklearn.com/course/apache-spark-and-scala-certification/

Apache Spark and Scala Training

About the Course

Tecklearn Spark training lets you master real-time data processing using Spark streaming, Spark SQL, Spark RDD and Spark Machine Learning libraries (Spark MLlib). This Spark certification training helps you master the essential skills of the Apache Spark open-source framework and Scala programming language, including Spark Streaming, Spark SQL, machine learning programming, GraphX programming, and Shell Scripting Spark. You will also understand the role of Spark in overcoming the limitations of MapReduce. Upon completion of this online training, you will hold a solid understanding and hands-on experience with Apache Spark.

Why Should you take Apache Spark and Scala Training?

  • The average salary for an Apache Spark developer ranges from approximately $93,486 per year for a Developer to $128,313 per year for a Data Engineer. – Indeed.com
  • Wells Fargo, Microsoft, Capital One, Apple, JPMorgan Chase & many other MNCs worldwide use Apache Spark across industries.
  • Global Spark market revenue will grow to $4.2 billion by 2022 with a CAGR of 67%. – Marketanalysis.com

What you will Learn in this Course?

Introduction to Scala for Apache Spark

  • What is Scala
  • Why Scala for Spark
  • Scala in other Frameworks
  • Scala REPL
  • Basic Scala Operations
  • Variable Types in Scala
  • Control Structures in Scala
  • Loop, Functions and Procedures
  • Collections in Scala
  • Array Buffer, Map, Tuples, Lists

Functional Programming and OOPs Concepts in Scala

  • Functional Programming
  • Higher Order Functions
  • Anonymous Functions
  • Class in Scala
  • Getters and Setters
  • Custom Getters and Setters
  • Constructors in Scala
  • Singletons
  • Extending a Class using Method Overriding

Introduction to Spark

  • Introduction to Spark
  • How Spark overcomes the drawbacks of MapReduce
  • Concept of In Memory MapReduce
  • Interactive operations on MapReduce
  • Understanding Spark Stack
  • HDFS Revision and Spark Hadoop YARN
  • Overview of Spark and Why it is better than Hadoop
  • Deployment of Spark without Hadoop
  • Cloudera distribution and Spark history server

Basics of Spark

  • Spark Installation guide
  • Spark configuration and memory management
  • Driver Memory Versus Executor Memory
  • Working with Spark Shell
  • Resilient distributed datasets (RDD)
  • Functional programming in Spark and Understanding Architecture of Spark

Playing with Spark RDDs

  • Challenges in Existing Computing Methods
  • Probable Solution and How RDD Solves the Problem
  • What is RDD, its Operations, Transformations & Actions
  • Data Loading and Saving Through RDDs
  • Key-Value Pair RDDs
  • Other Pair RDDs and Two Pair RDDs
  • RDD Lineage
  • RDD Persistence
  • Using RDD Concepts Write a Wordcount Program
  • Concept of RDD Partitioning and How It Helps Achieve Parallelization
  • Passing Functions to Spark

Writing and Deploying Spark Applications

  • Creating a Spark application using Scala or Java
  • Deploying a Spark application
  • Scala built application
  • Creating application using SBT
  • Deploying application using Maven
  • Web user interface of Spark application
  • A real-world example of Spark and configuring of Spark

Parallel Processing

  • Concept of Spark parallel processing
  • Overview of Spark partitions
  • File Based partitioning of RDDs
  • Concept of HDFS and data locality
  • Technique of parallel operations
  • Comparing coalesce and Repartition and RDD actions

Machine Learning using Spark MLlib

  • Why Machine Learning
  • What is Machine Learning
  • Applications of Machine Learning
  • Face Detection: USE CASE
  • Machine Learning Techniques
  • Introduction to MLlib
  • Features of MLlib and MLlib Tools
  • Various ML algorithms supported by MLlib

Integrating Apache Flume and Apache Kafka

  • Why Kafka, what is Kafka and Kafka architecture
  • Kafka workflow and Configuring Kafka cluster
  • Basic operations and Kafka monitoring tools
  • Integrating Apache Flume and Apache Kafka

Apache Spark Streaming

  • Why Streaming is Necessary
  • What is Spark Streaming
  • Spark Streaming Features
  • Spark Streaming Workflow
  • Streaming Context and DStreams
  • Transformations on DStreams
  • Describe Windowed Operators and Why it is Useful
  • Important Windowed Operators
  • Slice, Window and ReduceByWindow Operators
  • Stateful Operators

Improving Spark Performance

  • Learning about accumulators
  • The common performance issues and troubleshooting the performance problems

DataFrames and Spark SQL

  • Need for Spark SQL
  • What is Spark SQL
  • Spark SQL Architecture
  • SQL Context in Spark SQL
  • User Defined Functions
  • Data Frames and Datasets
  • Interoperating with RDDs
  • JSON and Parquet File Formats
  • Loading Data through Different Sources

Scheduling and Partitioning in Apache Spark

  • Concept of Scheduling and Partitioning in Spark
  • Hash partition and range partition
  • Scheduling applications
  • Static partitioning and dynamic sharing
  • Concept of Fair scheduling
  • Map partition with index and Zip
  • High Availability
  • Single-node Recovery with Local File System and High Order Functions
Got a question for us? Please mention it in the comments section and we will get back to you.

 

 
