diff --git a/pyspark/Colab and PySpark.html b/pyspark/Colab and PySpark.html
deleted file mode 100644
index 42fca06..0000000
--- a/pyspark/Colab and PySpark.html
+++ /dev/null
@@ -1,16416 +0,0 @@

Colab and PySpark
Introduction to Google Colab and PySpark


Objective

The objective of this notebook is to:

  • Give a proper understanding of the different PySpark functions available.
  • Provide a short introduction to Google Colab, as that is the platform on which this notebook is written.

    Once you complete this notebook, you should be able to write PySpark programs efficiently. The ideal way to use this is by going through the examples given and then trying them on Colab. At the end, there are a few hands-on questions which you can use to evaluate yourself.


    Prerequisites

  • Although some theory about PySpark and big data will be given in this notebook, I recommend that everyone read more about it and gain a deeper understanding of how the functions are executed and of the relevance of big data today.


  • A good understanding of Python will be an added bonus.


    Notes from the author

    This tutorial was made using Google Colab, so the code you see here is meant to run in a Colab notebook.
    It goes through basic PySpark functions and gives a short introduction on how to use Colab.
    If you want to view my Colab notebook for this particular tutorial, you can view it here. The viewing experience and readability are much better there.
    If you want to try things out with this notebook as a base, feel free to download it from my repo here and then use it with Jupyter Notebook.


    Big data, PySpark and Colaboratory


    Big data


    Big data usually means data of such huge volume that normal data storage solutions cannot efficiently store and process it. In this era, data is being generated at an absurd rate. Data is collected for every movement a person makes. The bulk of big data comes from three primary sources:

    1. Social data
    2. Machine data
    3. Transactional data

    Some common examples of the sources of such data include internet searches, Facebook posts, doorbell cams, smartwatches, online shopping history, etc. Every action creates data; it is just a matter of whether there is a way to collect it or not. But what's interesting is that, out of all the data collected, not even 5% of it is being used fully. There is a huge demand for big data professionals in the industry. Even though the number of graduates with a specialization in big data is rising, the problem is that they don't have practical knowledge about big data scenarios, which leads to bad architectures and inefficient methods of processing data.


    If you are interested in knowing more about the landscape and the technologies involved, here is an article which I found really interesting!


    PySpark


    If you are working in the field of big data, you must have definitely heard of Spark. If you look at the Apache Spark website, you will see that it is described as a "lightning-fast unified analytics engine". PySpark is a flavour of Spark used for processing and analysing massive volumes of data. If you are familiar with Python and have tried it for huge datasets, you should know that the execution time can get ridiculous. Enter PySpark!


    Imagine your data resides in a distributed manner at different places. If you try bringing your data to one point and executing your code there, not only would that be inefficient, it would also cause memory issues. Now suppose your code goes to the data rather than the data coming to your code. This avoids unnecessary data movement, which in turn decreases the running time.


    PySpark is the Python API of Spark, which means it can do almost all the things Python can: machine learning (ML) pipelines, exploratory data analysis (at scale), ETLs for data platforms, and much more! And all of them in a distributed manner. One of the best parts of PySpark is that if you are already familiar with Python, it's really easy to learn.


    Apart from PySpark, there is another language called Scala used for big data processing. Scala is frequently over 10 times faster than Python and is native for Hadoop, as it is based on the JVM. But PySpark is being adopted at a fast rate because of its ease of use, easier learning curve and ML capabilities.


    I will briefly explain how a PySpark job works, but I strongly recommend you read more about the architecture and how everything works. Now, before I get into it, let me go over some basic jargon first:


    A cluster is a set of loosely or tightly connected computers that work together so that they can be viewed as a single system.


    Hadoop is an open-source, scalable, and fault-tolerant framework written in Java. It efficiently processes large volumes of data on a cluster of commodity hardware. Hadoop is not only a storage system but also a platform for large-scale data storage and processing.


    HDFS (Hadoop Distributed File System) is one of the world's most reliable storage systems. It is Hadoop's file system, designed for storing very large files on a cluster of commodity hardware.


    MapReduce is a data processing framework with two phases: Map and Reduce. The map procedure performs filtering and sorting, and the reduce method performs a summary operation. It usually runs on a Hadoop cluster.
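    To make the two phases concrete, here is a minimal pure-Python sketch of a MapReduce-style word count. This is only an illustration of the shape of a MapReduce job (no Hadoop involved); the shuffle step that a real framework performs between the two phases is simulated here with a dictionary:

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (key, value) pair for every word in the line
    return [(word, 1) for word in line.split()]

def reducer(key, values):
    # Reduce phase: summarize all the values collected for one key
    return key, sum(values)

lines = ["spark is fast", "hadoop is reliable", "spark is distributed"]

# "Shuffle": group the mapped pairs by key, as the framework would
groups = defaultdict(list)
for line in lines:
    for key, value in mapper(line):
        groups[key].append(value)

counts = dict(reducer(k, v) for k, v in groups.items())
print(counts)  # {'spark': 2, 'is': 3, 'fast': 1, 'hadoop': 1, 'reliable': 1, 'distributed': 1}
```

    A real Hadoop job does the same thing, except the mappers and reducers run in parallel on different machines and the shuffle moves data across the cluster.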


    Transformations refer to the operations applied on a dataset to create a new dataset. filter, groupBy and map are examples of transformations.


    Actions refer to operations which instruct Spark to perform a computation and send the result back to the driver. collect is an example of an action.


    Alright! Now that that's out of the way, let me explain how a Spark job runs. In simple terms, each time you submit a PySpark job, the code gets internally converted into a MapReduce program and gets executed in the Java Virtual Machine. Now one of the thoughts popping into your mind might be:
    So the code gets converted into a MapReduce program. Wouldn't that mean MapReduce is faster than PySpark?
    Well, the answer is a big NO. This is what makes Spark jobs special. Spark is capable of handling a massive amount of data at a time in its distributed environment. It does this through in-memory processing, which is what makes it almost 100 times faster than Hadoop. Another factor which makes it fast is Lazy Evaluation. Spark delays its evaluation as much as it can. Each time you submit a job, Spark creates an action plan for how to execute the code, and then does nothing. Finally, when you ask for the result (i.e., call an action), it executes the plan, which is basically all the transformations you have mentioned in your code. That's basically the gist of it.


    Now lastly, I want to talk about one more thing. Spark mainly consists of four modules:

    1. Spark SQL - lets you write Spark programs using SQL-like queries.
    2. Spark Streaming - an extension of the core Spark API that enables scalable, high-throughput, fault-tolerant stream processing of live data streams. Used heavily in processing social media data.
    3. Spark MLlib - the machine learning component of Spark. It helps train ML models on massive datasets with very high efficiency.
    4. Spark GraphX - the graph processing component of Spark. It enables users to view data both as graphs and as collections without data movement or duplication.

    Hopefully this image gives a better idea of what I am talking about: (image: Spark Modules)


    Source: Datanami


    Colaboratory


    In the words of Google:
    Colaboratory, or “Colab” for short, is a product from Google Research. Colab allows anybody to write and execute arbitrary python code through the browser, and is especially well suited to machine learning, data analysis and education. More technically, Colab is a hosted Jupyter notebook service that requires no setup to use, while providing free access to computing resources including GPUs.


    The reason why I used Colab is its shareability and free GPU. Yeah, you read that right. A FREE GPU! Additionally, it makes it convenient to use other Google services: it saves to Google Drive, and all the services are closely integrated. I recommend you go through the official overview documentation if you want to know more about it. If you have more questions about Colab, please refer to this link.


    While using a Colab notebook, you will need an active internet connection to keep a session alive. If you lose the connection, you will have to download the datasets again.


    Jupyter notebook basics


    Code cells

    In [1]:
    2*3
    Out[1]:
    6
    In [2]:
    import pandas as pd
    In [3]:
    print("Hello!")
    Hello!

    Text cells


    Hello world!


    Access to the shell

    In [4]:
    ls
    Backup DAGs/
    Colab_and_PySpark.ipynb
    Reference/
    ~$ctory Audit_AS-IS Process_03142019.docx
    In [5]:
    pwd
    Out[5]:
    u'/Users/jcele1/Downloads'


    Install Spark

    In [ ]:
    !apt-get update
    !apt-get install openjdk-8-jdk-headless -qq > /dev/null
    !wget -q http://archive.apache.org/dist/spark/spark-2.4.4/spark-2.4.4-bin-hadoop2.7.tgz
    !tar xf spark-2.4.4-bin-hadoop2.7.tgz
    !pip install -q findspark
    /bin/sh: apt-get: command not found
    /bin/sh: apt-get: command not found
    In [ ]:
    import os
    os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
    os.environ["SPARK_HOME"] = "/content/spark-2.4.4-bin-hadoop2.7"
    In [ ]:
    !ls
    In [0]:
    import findspark
    findspark.init()
    from pyspark import SparkContext

    sc = SparkContext.getOrCreate()
    sc
    Out[0]:

    SparkContext


    Spark UI

    Version: v2.4.4
    Master: local[*]
    AppName: pyspark-shell
    In [0]:
    import pyspark
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.getOrCreate()
    spark
    Out[0]:

    SparkSession - in-memory


    SparkContext


    Spark UI

    Version: v2.4.4
    Master: local[*]
    AppName: pyspark-shell

    Loading Dataset

    In [0]:
    # Downloading and preprocessing Chicago's Reported Crime Data
    !wget https://data.cityofchicago.org/api/views/w98m-zvie/rows.csv?accessType=DOWNLOAD
    --2020-01-22 04:20:04--  https://data.cityofchicago.org/api/views/w98m-zvie/rows.csv?accessType=DOWNLOAD
    Resolving data.cityofchicago.org (data.cityofchicago.org)... 52.206.140.199, 52.206.68.26, 52.206.140.205
    Connecting to data.cityofchicago.org (data.cityofchicago.org)|52.206.140.199|:443... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: unspecified [text/csv]
    Saving to: ‘rows.csv?accessType=DOWNLOAD’

    rows.csv?accessType     [              <=>   ]  58.56M  3.32MB/s    in 18s

    2020-01-22 04:20:23 (3.29 MB/s) - ‘rows.csv?accessType=DOWNLOAD’ saved [61404826]
    In [0]:
    !ls
    'rows.csv?accessType=DOWNLOAD'	 spark-2.4.4-bin-hadoop2.7
     sample_data			 spark-2.4.4-bin-hadoop2.7.tgz
    In [0]:
    # Renaming the downloaded file
    !mv rows.csv?accessType=DOWNLOAD reported-crimes.csv
    In [0]:
    df = spark.read.csv('reported-crimes.csv', header=True)
    df.show(5)
    +--------+-----------+--------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    |      ID|Case Number|                Date|               Block|IUCR|        Primary Type|         Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|X Coordinate|Y Coordinate|Year|          Updated On|Latitude|Longitude|Location|
    +--------+-----------+--------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    |11955940|   JD121140|07/31/2019 09:00:...|  026XX S HALSTED ST|1154|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|  SMALL RETAIL STORE| false|   false|0913|     009|  11|            60|      11|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    |11956035|   JD121288|09/10/2019 12:01:...|070XX S STONY ISL...|1582|OFFENSE INVOLVING...|   CHILD PORNOGRAPHY|           RESIDENCE| false|   false|0332|     003|   5|            43|      17|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    |11956045|   JD121237|12/15/2019 12:00:...|      069XX S ADA ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|0734|     007|   6|            67|      05|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    |11956146|   JD121344|12/31/2019 12:00:...|   016XX S THROOP ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|1233|     012|  25|            31|      05|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    |11956304|   JD121233|11/27/2019 10:35:...|   002XX W GOETHE ST|0810|               THEFT|           OVER $500|           RESIDENCE| false|   false|1821|     018|   2|             8|      06|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    +--------+-----------+--------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    only showing top 5 rows

    Working with the DataFrame API


    Viewing a DataFrame


    In Spark, you have a couple of options to view a DataFrame (DF).

    1. df.take(3) will return a list of three Row objects.
    2. df.collect() will get all of the data from the entire DataFrame. Be careful when using it: if you have a large dataset, running collect can easily crash the driver node.
    3. If you want Spark to print your DataFrame in a nice format, try df.show() with the number of rows as a parameter.

    N.B.: The limit function returns a new DataFrame by taking the first n rows.
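    As a rough plain-Python analogy (no Spark required), assuming a list `rows` stands in for the DataFrame's distributed rows, the calls differ mainly in how much data reaches the driver:

```python
rows = list(range(1000))  # stand-in for a DataFrame's rows; purely illustrative

# df.take(3): ship only the first 3 rows to the driver, as a Python list
take_3 = rows[:3]

# df.limit(10): build a *new* DataFrame of the first 10 rows (still lazy in real Spark)
limited = rows[:10]

# df.collect(): ship EVERY row to the driver -- risky for large datasets
collected = list(rows)

assert take_3 == [0, 1, 2]
assert len(limited) == 10
assert len(collected) == 1000
```

    The key difference is that take and collect are actions returning local Python objects, while limit is a transformation returning another (lazy) DataFrame.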


    Schema of a DataFrame

    In [0]:
    df.dtypes
    Out[0]:
    [('ID', 'string'),
     ('Case Number', 'string'),
     ('Date', 'string'),
     ('Block', 'string'),
     ('IUCR', 'string'),
     ('Primary Type', 'string'),
     ('Description', 'string'),
     ('Location Description', 'string'),
     ('Arrest', 'string'),
     ('Domestic', 'string'),
     ('Beat', 'string'),
     ('District', 'string'),
     ('Ward', 'string'),
     ('Community Area', 'string'),
     ('FBI Code', 'string'),
     ('X Coordinate', 'string'),
     ('Y Coordinate', 'string'),
     ('Year', 'string'),
     ('Updated On', 'string'),
     ('Latitude', 'string'),
     ('Longitude', 'string'),
     ('Location', 'string')]
    In [0]:
    df.printSchema()
    root
     |-- ID: string (nullable = true)
     |-- Case Number: string (nullable = true)
     |-- Date: string (nullable = true)
     |-- Block: string (nullable = true)
     |-- IUCR: string (nullable = true)
     |-- Primary Type: string (nullable = true)
     |-- Description: string (nullable = true)
     |-- Location Description: string (nullable = true)
     |-- Arrest: string (nullable = true)
     |-- Domestic: string (nullable = true)
     |-- Beat: string (nullable = true)
     |-- District: string (nullable = true)
     |-- Ward: string (nullable = true)
     |-- Community Area: string (nullable = true)
     |-- FBI Code: string (nullable = true)
     |-- X Coordinate: string (nullable = true)
     |-- Y Coordinate: string (nullable = true)
     |-- Year: string (nullable = true)
     |-- Updated On: string (nullable = true)
     |-- Latitude: string (nullable = true)
     |-- Longitude: string (nullable = true)
     |-- Location: string (nullable = true)
    In [0]:
    # Defining a schema
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType, BooleanType, DoubleType, IntegerType
    df.columns
    Out[0]:
    ['ID',
     'Case Number',
     'Date',
     'Block',
     'IUCR',
     'Primary Type',
     'Description',
     'Location Description',
     'Arrest',
     'Domestic',
     'Beat',
     'District',
     'Ward',
     'Community Area',
     'FBI Code',
     'X Coordinate',
     'Y Coordinate',
     'Year',
     'Updated On',
     'Latitude',
     'Longitude',
     'Location']
    In [0]:
    labels = [
         ('ID', StringType()),
         ('Case Number', StringType()),
         ('Date', TimestampType()),
         ('Block', StringType()),
         ('IUCR', StringType()),
         ('Primary Type', StringType()),
         ('Description', StringType()),
         ('Location Description', StringType()),
         ('Arrest', StringType()),
         ('Domestic', BooleanType()),
         ('Beat', StringType()),
         ('District', StringType()),
         ('Ward', StringType()),
         ('Community Area', StringType()),
         ('FBI Code', StringType()),
         ('X Coordinate', StringType()),
         ('Y Coordinate', StringType()),
         ('Year', IntegerType()),
         ('Updated On', StringType()),
         ('Latitude', DoubleType()),
         ('Longitude', DoubleType()),
         ('Location', StringType()),
         ('Historical Wards 2003-2015', StringType()),
         ('Zip Codes', StringType()),
         ('Community Areas', StringType()),
         ('Census Tracts', StringType()),
         ('Wards', StringType()),
         ('Boundaries - ZIP Codes', StringType()),
         ('Police Districts', StringType()),
         ('Police Beats', StringType())
    ]
    In [0]:
    schema = StructType([StructField(x[0], x[1], True) for x in labels])
    schema
    Out[0]:
    StructType(List(StructField(ID,StringType,true),StructField(Case Number,StringType,true),StructField(Date,TimestampType,true),StructField(Block,StringType,true),StructField(IUCR,StringType,true),StructField(Primary Type,StringType,true),StructField(Description,StringType,true),StructField(Location Description,StringType,true),StructField(Arrest,StringType,true),StructField(Domestic,BooleanType,true),StructField(Beat,StringType,true),StructField(District,StringType,true),StructField(Ward,StringType,true),StructField(Community Area,StringType,true),StructField(FBI Code,StringType,true),StructField(X Coordinate,StringType,true),StructField(Y Coordinate,StringType,true),StructField(Year,IntegerType,true),StructField(Updated On,StringType,true),StructField(Latitude,DoubleType,true),StructField(Longitude,DoubleType,true),StructField(Location,StringType,true),StructField(Historical Wards 2003-2015,StringType,true),StructField(Zip Codes,StringType,true),StructField(Community Areas,StringType,true),StructField(Census Tracts,StringType,true),StructField(Wards,StringType,true),StructField(Boundaries - ZIP Codes,StringType,true),StructField(Police Districts,StringType,true),StructField(Police Beats,StringType,true)))
    In [0]:
    df = spark.read.csv('reported-crimes.csv', schema=schema)
    df.printSchema()
    # The schema comes out just as we defined it!
    root
     |-- ID: string (nullable = true)
     |-- Case Number: string (nullable = true)
     |-- Date: timestamp (nullable = true)
     |-- Block: string (nullable = true)
     |-- IUCR: string (nullable = true)
     |-- Primary Type: string (nullable = true)
     |-- Description: string (nullable = true)
     |-- Location Description: string (nullable = true)
     |-- Arrest: string (nullable = true)
     |-- Domestic: boolean (nullable = true)
     |-- Beat: string (nullable = true)
     |-- District: string (nullable = true)
     |-- Ward: string (nullable = true)
     |-- Community Area: string (nullable = true)
     |-- FBI Code: string (nullable = true)
     |-- X Coordinate: string (nullable = true)
     |-- Y Coordinate: string (nullable = true)
     |-- Year: integer (nullable = true)
     |-- Updated On: string (nullable = true)
     |-- Latitude: double (nullable = true)
     |-- Longitude: double (nullable = true)
     |-- Location: string (nullable = true)
     |-- Historical Wards 2003-2015: string (nullable = true)
     |-- Zip Codes: string (nullable = true)
     |-- Community Areas: string (nullable = true)
     |-- Census Tracts: string (nullable = true)
     |-- Wards: string (nullable = true)
     |-- Boundaries - ZIP Codes: string (nullable = true)
     |-- Police Districts: string (nullable = true)
     |-- Police Beats: string (nullable = true)
    In [0]:
    df.show()
    # Everything comes out as null, which means the schema we defined didn't match the data.
    +----+-----------+----+-----+----+------------+-----------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+----------+--------+---------+--------+--------------------------+---------+---------------+-------------+-----+----------------------+----------------+------------+
    |  ID|Case Number|Date|Block|IUCR|Primary Type|Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|X Coordinate|Y Coordinate|Year|Updated On|Latitude|Longitude|Location|Historical Wards 2003-2015|Zip Codes|Community Areas|Census Tracts|Wards|Boundaries - ZIP Codes|Police Districts|Police Beats|
    +----+-----------+----+-----+----+------------+-----------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+----------+--------+---------+--------+--------------------------+---------+---------------+-------------+-----+----------------------+----------------+------------+
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    |null|       null|null| null|null|        null|       null|                null|  null|    null|null|    null|null|          null|    null|        null|        null|null|      null|    null|     null|    null|                      null|     null|           null|         null| null|                  null|            null|        null|
    +----+-----------+----+-----+----+------------+-----------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+----------+--------+---------+--------+--------------------------+---------+---------------+-------------+-----+----------------------+----------------+------------+
    only showing top 20 rows
    In [0]:
# So let's just stick with the inferred schema for now, casting the Date column to a timestamp on the way
    -from pyspark.sql.functions import col, to_timestamp
    -df = spark.read.csv('reported-crimes.csv',header=True).withColumn('Date',to_timestamp(col('Date'),'MM/dd/yyyy hh:mm:ss a'))
    -df.show(5)
    +--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    -|      ID|Case Number|               Date|               Block|IUCR|        Primary Type|         Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|X Coordinate|Y Coordinate|Year|          Updated On|Latitude|Longitude|Location|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    -|11955940|   JD121140|2019-07-31 09:00:00|  026XX S HALSTED ST|1154|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|  SMALL RETAIL STORE| false|   false|0913|     009|  11|            60|      11|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -|11956035|   JD121288|2019-09-10 00:01:00|070XX S STONY ISL...|1582|OFFENSE INVOLVING...|   CHILD PORNOGRAPHY|           RESIDENCE| false|   false|0332|     003|   5|            43|      17|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -|11956045|   JD121237|2019-12-15 12:00:00|      069XX S ADA ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|0734|     007|   6|            67|      05|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -|11956146|   JD121344|2019-12-31 00:00:00|   016XX S THROOP ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|1233|     012|  25|            31|      05|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -|11956304|   JD121233|2019-11-27 10:35:00|   002XX W GOETHE ST|0810|               THEFT|           OVER $500|           RESIDENCE| false|   false|1821|     018|   2|             8|      06|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    -only showing top 5 rows
    Working with columns

    In [0]:
    df.Block
    Out[0]:
    Column<b'Block'>
    In [0]:
    df['Block']
    Out[0]:
    Column<b'Block'>

    NOTE:


We can't always use dot notation: it breaks when a column name clashes with a reserved word or with an existing attribute or method of the DataFrame class, or when the name contains spaces.

    In [0]:
    df.select(col('Block')).show()
    +--------------------+
    -|               Block|
    -+--------------------+
    -|  026XX S HALSTED ST|
    -|070XX S STONY ISL...|
    -|      069XX S ADA ST|
    -|   016XX S THROOP ST|
    -|   002XX W GOETHE ST|
    -|083XX S MUSKEGON AVE|
    -|   050XX W NELSON ST|
    -|     011XX W 83RD ST|
    -|  081XX W ADDISON ST|
    -| 031XX S PRAIRIE AVE|
    -|023XX N MILWAUKEE...|
    -| 0000X E CHESTNUT ST|
    -| 0000X N LATROBE AVE|
    -|    104XX S AVENUE J|
    -|025XX S CALIFORNI...|
    -|     021XX E 70TH ST|
    -|082XX S JEFFERY BLVD|
    -|  076XX S CICERO AVE|
    -|     016XX W LAKE ST|
    -| 018XX W DIVISION ST|
    -+--------------------+
    -only showing top 20 rows
    In [0]:
    df.select(df.Block).show()
    +--------------------+
    -|               Block|
    -+--------------------+
    -|  026XX S HALSTED ST|
    -|070XX S STONY ISL...|
    -|      069XX S ADA ST|
    -|   016XX S THROOP ST|
    -|   002XX W GOETHE ST|
    -|083XX S MUSKEGON AVE|
    -|   050XX W NELSON ST|
    -|     011XX W 83RD ST|
    -|  081XX W ADDISON ST|
    -| 031XX S PRAIRIE AVE|
    -|023XX N MILWAUKEE...|
    -| 0000X E CHESTNUT ST|
    -| 0000X N LATROBE AVE|
    -|    104XX S AVENUE J|
    -|025XX S CALIFORNI...|
    -|     021XX E 70TH ST|
    -|082XX S JEFFERY BLVD|
    -|  076XX S CICERO AVE|
    -|     016XX W LAKE ST|
    -| 018XX W DIVISION ST|
    -+--------------------+
    -only showing top 20 rows
    In [0]:
    df.select('Block','Description').show()
    +--------------------+--------------------+
    -|               Block|         Description|
    -+--------------------+--------------------+
    -|  026XX S HALSTED ST|FINANCIAL IDENTIT...|
    -|070XX S STONY ISL...|   CHILD PORNOGRAPHY|
    -|      069XX S ADA ST|      FORCIBLE ENTRY|
    -|   016XX S THROOP ST|      FORCIBLE ENTRY|
    -|   002XX W GOETHE ST|           OVER $500|
    -|083XX S MUSKEGON AVE|      $500 AND UNDER|
    -|   050XX W NELSON ST|HARASSMENT BY TEL...|
    -|     011XX W 83RD ST|SEXUAL EXPLOITATI...|
    -|  081XX W ADDISON ST|FINANCIAL IDENTIT...|
    -| 031XX S PRAIRIE AVE|ILLEGAL USE CASH ...|
    -|023XX N MILWAUKEE...|           OVER $500|
    -| 0000X E CHESTNUT ST|       FROM BUILDING|
    -| 0000X N LATROBE AVE|         TO PROPERTY|
    -|    104XX S AVENUE J|HARASSMENT BY TEL...|
    -|025XX S CALIFORNI...|AGG CRIM SEX ABUS...|
    -|     021XX E 70TH ST|UNLAWFUL POSS OF ...|
    -|082XX S JEFFERY BLVD|PRO EMP HANDS NO/...|
    -|  076XX S CICERO AVE|      $500 AND UNDER|
    -|     016XX W LAKE ST|DOMESTIC BATTERY ...|
    -| 018XX W DIVISION ST|      UNLAWFUL ENTRY|
    -+--------------------+--------------------+
    -only showing top 20 rows
    In [0]:
    #Adding a column in PySpark
    -# We are adding a column called 'One' at the end
    -from pyspark.sql.functions import lit
    -df = df.withColumn('One',lit(1))
    -df.show(5)
    +--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+---+
    -|      ID|Case Number|               Date|               Block|IUCR|        Primary Type|         Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|X Coordinate|Y Coordinate|Year|          Updated On|Latitude|Longitude|Location|One|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+---+
    -|11955940|   JD121140|2019-07-31 09:00:00|  026XX S HALSTED ST|1154|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|  SMALL RETAIL STORE| false|   false|0913|     009|  11|            60|      11|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|  1|
    -|11956035|   JD121288|2019-09-10 00:01:00|070XX S STONY ISL...|1582|OFFENSE INVOLVING...|   CHILD PORNOGRAPHY|           RESIDENCE| false|   false|0332|     003|   5|            43|      17|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|  1|
    -|11956045|   JD121237|2019-12-15 12:00:00|      069XX S ADA ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|0734|     007|   6|            67|      05|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|  1|
    -|11956146|   JD121344|2019-12-31 00:00:00|   016XX S THROOP ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|1233|     012|  25|            31|      05|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|  1|
    -|11956304|   JD121233|2019-11-27 10:35:00|   002XX W GOETHE ST|0810|               THEFT|           OVER $500|           RESIDENCE| false|   false|1821|     018|   2|             8|      06|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|  1|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+---+
    -only showing top 5 rows
    In [0]:
    #Renaming a column in PySpark
    -df = df.withColumnRenamed('One', 'Test')
    -df.show()
    +--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+----+
    -|      ID|Case Number|               Date|               Block|IUCR|        Primary Type|         Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|X Coordinate|Y Coordinate|Year|          Updated On|    Latitude|    Longitude|            Location|Test|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+----+
    -|11955940|   JD121140|2019-07-31 09:00:00|  026XX S HALSTED ST|1154|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|  SMALL RETAIL STORE| false|   false|0913|     009|  11|            60|      11|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956035|   JD121288|2019-09-10 00:01:00|070XX S STONY ISL...|1582|OFFENSE INVOLVING...|   CHILD PORNOGRAPHY|           RESIDENCE| false|   false|0332|     003|   5|            43|      17|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956045|   JD121237|2019-12-15 12:00:00|      069XX S ADA ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|0734|     007|   6|            67|      05|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956146|   JD121344|2019-12-31 00:00:00|   016XX S THROOP ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|1233|     012|  25|            31|      05|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956304|   JD121233|2019-11-27 10:35:00|   002XX W GOETHE ST|0810|               THEFT|           OVER $500|           RESIDENCE| false|   false|1821|     018|   2|             8|      06|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956126|   JD121418|2019-12-23 06:00:00|083XX S MUSKEGON AVE|0820|               THEFT|      $500 AND UNDER|              STREET| false|   false|0423|     004|   7|            46|      06|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956221|   JD121487|2019-09-13 17:00:00|   050XX W NELSON ST|2825|       OTHER OFFENSE|HARASSMENT BY TEL...|           RESIDENCE| false|   false|2521|     025|  31|            19|      26|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956049|   JD121196|2019-12-19 19:07:00|     011XX W 83RD ST|1544|         SEX OFFENSE|SEXUAL EXPLOITATI...|           RESIDENCE| false|   false|0613|     006|  21|            71|      17|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956468|   JD121650|2019-06-01 08:00:00|  081XX W ADDISON ST|1153|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|           APARTMENT| false|   false|1631|     016|  38|            17|      11|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11955888|   JD121051|2019-12-01 00:01:00| 031XX S PRAIRIE AVE|1152|  DECEPTIVE PRACTICE|ILLEGAL USE CASH ...|           RESIDENCE| false|   false|0211|     002|   4|            35|      11|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956127|   JD121334|2019-12-14 22:00:00|023XX N MILWAUKEE...|0810|               THEFT|           OVER $500|               OTHER| false|   false|1414|     014|   1|            22|      06|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956215|   JD121376|2019-12-12 19:00:00| 0000X E CHESTNUT ST|0890|               THEFT|       FROM BUILDING|          RESTAURANT| false|   false|1833|     018|  42|             8|      06|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956004|   JD121156|2019-11-15 12:00:00| 0000X N LATROBE AVE|1310|     CRIMINAL DAMAGE|         TO PROPERTY|           RESIDENCE| false|   false|1522|     015|  28|            25|      14|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956129|   JD121379|2019-11-03 15:00:00|    104XX S AVENUE J|2825|       OTHER OFFENSE|HARASSMENT BY TEL...|               OTHER| false|   false|0432|     004|  10|            52|      26|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11956041|   JD121266|2019-12-21 14:18:00|025XX S CALIFORNI...|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           APARTMENT| false|   false|1033|     010|  12|            30|      17|        null|        null|2019|01/20/2020 03:52:...|        null|         null|                null|   1|
    -|11935337|   JC563784|2019-12-29 02:25:00|     021XX E 70TH ST|143A|   WEAPONS VIOLATION|UNLAWFUL POSS OF ...|               ALLEY|  true|   false|0331|     003|   5|            43|      15|     1191863|     1858976|2019|01/20/2020 03:49:...|41.768020578|-87.572284866|(41.768020578, -8...|   1|
    -|11927831|   JC553936|2019-12-20 13:15:00|082XX S JEFFERY BLVD|0545|             ASSAULT|PRO EMP HANDS NO/...|              STREET|  true|   false|0414|     004|   8|            46|     08A|     1190954|     1850799|2019|01/20/2020 03:49:...|41.745604221|-87.575880606|(41.745604221, -8...|   1|
    -|11927034|   JC549451|2019-12-16 19:15:00|  076XX S CICERO AVE|0820|               THEFT|      $500 AND UNDER|  SMALL RETAIL STORE| false|   false|0833|     008|  18|            65|      06|     1145727|     1853720|2019|01/20/2020 03:49:...|41.754592961|-87.741528537|(41.754592961, -8...|   1|
    -|11926969|   JC553305|2019-12-19 21:30:00|     016XX W LAKE ST|0486|             BATTERY|DOMESTIC BATTERY ...|           RESIDENCE|  true|   false|1224|     012|  27|            28|     08B|     1165379|     1901492|2019|01/20/2020 03:49:...|41.885291047| -87.66815409|(41.885291047, -8...|   1|
    -|11926797|   JC553193|2019-12-19 19:04:00| 018XX W DIVISION ST|0620|            BURGLARY|      UNLAWFUL ENTRY|           RESIDENCE|  true|   false|1212|     012|   1|            24|      05|     1164005|     1908028|2019|01/20/2020 03:49:...|41.903255445|-87.673014935|(41.903255445, -8...|   1|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+----+
    -only showing top 20 rows
    In [0]:
    #Group By a column in PySpark
    -df.groupBy('Year').count().show(5)
    +----+------+
    -|Year| count|
    -+----+------+
    -|2019|257625|
    -+----+------+
    In [0]:
#Removing columns in PySpark
    -df = df.drop('Test')
    -df.show(5)
    +--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    -|      ID|Case Number|               Date|               Block|IUCR|        Primary Type|         Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|X Coordinate|Y Coordinate|Year|          Updated On|Latitude|Longitude|Location|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    -|11955940|   JD121140|2019-07-31 09:00:00|  026XX S HALSTED ST|1154|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|  SMALL RETAIL STORE| false|   false|0913|     009|  11|            60|      11|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -|11956035|   JD121288|2019-09-10 00:01:00|070XX S STONY ISL...|1582|OFFENSE INVOLVING...|   CHILD PORNOGRAPHY|           RESIDENCE| false|   false|0332|     003|   5|            43|      17|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -|11956045|   JD121237|2019-12-15 12:00:00|      069XX S ADA ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|0734|     007|   6|            67|      05|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -|11956146|   JD121344|2019-12-31 00:00:00|   016XX S THROOP ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|1233|     012|  25|            31|      05|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -|11956304|   JD121233|2019-11-27 10:35:00|   002XX W GOETHE ST|0810|               THEFT|           OVER $500|           RESIDENCE| false|   false|1821|     018|   2|             8|      06|        null|        null|2019|01/20/2020 03:52:...|    null|     null|    null|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+--------+---------+--------+
    -only showing top 5 rows


    Working with Rows

    In [0]:
    # Filtering rows in PySpark
    -df.filter(col('Date')<'2019-06-01').show()
    +--------+-----------+-------------------+--------------------+----+-------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+
    -|      ID|Case Number|               Date|               Block|IUCR|       Primary Type|         Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|X Coordinate|Y Coordinate|Year|          Updated On|    Latitude|    Longitude|            Location|
    -+--------+-----------+-------------------+--------------------+----+-------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+
    -|11652463|   JC221068|2019-04-11 22:30:00| 012XX W GUNNISON ST|0325|            ROBBERY| VEHICULAR HIJACKING|              STREET|  true|   false|2033|     020|  46|             3|      03|     1167294|     1932462|2019|01/20/2020 03:49:...|41.970233435|-87.660229117|(41.970233435, -8...|
    -|11580399|   JC126732|2019-01-23 13:05:00| 033XX W FILLMORE ST|2093|          NARCOTICS|FOUND SUSPECT NAR...|POLICE FACILITY/V...|  true|   false|1134|     011|  24|            29|      18|     1154228|     1895173|2019|01/19/2020 03:47:...|41.868180939|-87.709271389|(41.868180939, -8...|
    -|11953660|   JD118414|2019-03-01 14:10:00|     013XX E 62ND ST|1153| DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|                null| false|   false|0314|     003|  20|            42|      11|     1185906|     1864152|2019|01/19/2020 03:47:...|41.782366539| -87.59395659|(41.782366539, -8...|
    -|11955048|   JD120087|2019-01-29 12:00:00|  046XX N PAULINA ST|0890|              THEFT|       FROM BUILDING|               OTHER| false|   false|1912|     019|  47|             3|      06|        null|        null|2019|01/19/2020 03:49:...|        null|         null|                null|
    -|11953654|   JD118405|2019-03-01 12:50:00|     008XX E 38TH PL|1153| DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|                null| false|   false|0212|     002|   4|            36|      11|     1182247|     1879773|2019|01/19/2020 03:47:...| 41.82531739|-87.606887309|(41.82531739, -87...|
    -|11955104|   JD120139|2019-05-15 16:00:00|     035XX W 71ST PL|1563|        SEX OFFENSE|CRIMINAL SEXUAL A...|           RESIDENCE| false|    true|0831|     008|  17|            66|      17|        null|        null|2019|01/19/2020 03:49:...|        null|         null|                null|
    -|11581568|   JC126573|2019-01-23 11:00:00| 033XX W FILLMORE ST|2093|          NARCOTICS|FOUND SUSPECT NAR...|POLICE FACILITY/V...|  true|   false|1134|     011|  24|            29|      18|     1154228|     1895173|2019|01/19/2020 03:47:...|41.868180939|-87.709271389|(41.868180939, -8...|
    -|11953335|   JD117914|2019-01-10 21:30:00| 004XX N MC CLURG CT|1330|  CRIMINAL TRESPASS|             TO LAND|    RESIDENCE-GARAGE| false|   false|1834|     018|  42|             8|      26|     1179120|     1903342|2019|01/19/2020 03:47:...|41.890064174|-87.617638563|(41.890064174, -8...|
    -|11946410|   JC151544|2019-02-13 20:49:00|     043XX S TROY ST|2024|          NARCOTICS| POSS: HEROIN(WHITE)|           RESIDENCE|  true|   false|0922|     009|  15|            58|      18|     1156071|     1875567|2019|01/18/2020 03:44:...|41.814342797|-87.703033557|(41.814342797, -8...|
    -|11677953|   JC235447|2019-04-23 14:46:00|040XX W VAN BUREN ST|2014|          NARCOTICS|MANU/DELIVER: HER...|     VACANT LOT/LAND|  true|   false|1132|     011|  28|            26|      18|     1149531|     1897718|2019|01/18/2020 03:44:...|41.875257119| -87.72644908|(41.875257119, -8...|
    -|11672673|   JC231408|2019-04-20 09:42:00| 085XX S ABERDEEN ST|1812|          NARCOTICS|POSS: CANNABIS MO...|           RESIDENCE|  true|   false|0613|     006|  21|            71|      18|     1170476|     1848176|2019|01/18/2020 03:44:...| 41.73887651|-87.650991641|(41.73887651, -87...|
    -|11672370|   JC232204|2019-04-20 22:45:00|     011XX W 90TH ST|1812|          NARCOTICS|POSS: CANNABIS MO...|           APARTMENT|  true|   false|2222|     022|  21|            73|      18|     1170443|     1845120|2019|01/18/2020 03:44:...|41.730491134|-87.651201287|(41.730491134, -8...|
    -|11671344|   JC235165|2019-04-23 13:17:00| 053XX W CHICAGO AVE|2017|          NARCOTICS|  MANU/DELIVER:CRACK|               ALLEY|  true|   false|1524|     015|  37|            25|      18|     1140417|     1904813|2019|01/18/2020 03:44:...|41.894898651| -87.75973838|(41.894898651, -8...|
    -|11666457|   JC228439|2019-04-17 18:34:14|054XX S CHRISTIAN...|1811|          NARCOTICS|POSS: CANNABIS 30...|           RESIDENCE|  true|   false|0822|     008|  14|            63|      18|     1154972|     1868333|2019|01/18/2020 03:44:...|41.794513753| -87.70725813|(41.794513753, -8...|
    -|11951328|   JD115421|2019-05-01 09:00:00|  001XX E ONTARIO ST|1153| DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|COMMERCIAL / BUSI...| false|   false|1834|     018|  42|             8|      11|     1177462|     1904523|2019|01/17/2020 03:45:...|41.893342683| -87.62369154|(41.893342683, -8...|
    -|11953844|   JD118256|2019-02-16 12:00:00|   062XX S KOLIN AVE|0281|CRIM SEXUAL ASSAULT|      NON-AGGRAVATED|           APARTMENT| false|    true|0813|     008|  23|            65|      02|        null|        null|2019|01/17/2020 03:48:...|        null|         null|                null|
    -|11950821|   JC550653|2019-04-04 08:00:00|  079XX S KARLOV AVE|1154| DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|           RESIDENCE| false|   false|0834|     008|  18|            70|      11|     1150460|     1851422|2019|01/17/2020 03:45:...|41.748196202|-87.724242888|(41.748196202, -8...|
    -|11951415|   JD115614|2019-01-13 11:00:00|115XX S WENTWORTH...|1153| DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|           APARTMENT| false|   false|0522|     005|  34|            53|      11|     1177005|     1828396|2019|01/17/2020 03:45:...|41.684452919|-87.627664687|(41.684452919, -8...|
    -|11951493|   JD115459|2019-05-14 22:00:00|033XX W ARTHINGTO...|1563|        SEX OFFENSE|CRIMINAL SEXUAL A...|           RESIDENCE| false|    true|1134|     011|  24|            29|      17|     1154345|     1895847|2019|01/17/2020 03:45:...|41.870028132|-87.708823855|(41.870028132, -8...|
    -|11951147|   JD115168|2019-04-16 00:01:00|   005XX W MONROE ST|1120| DECEPTIVE PRACTICE|             FORGERY|                BANK| false|   false|0121|     001|  42|            28|      10|     1172908|     1899834|2019|01/17/2020 03:45:...|41.880577917|-87.640555606|(41.880577917, -8...|
    -+--------+-----------+-------------------+--------------------+----+-------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+
    -only showing top 20 rows
    In [0]:
    #Get Unique Rows in PySpark
    -df.select('Year').distinct().show()
    +----+
    -|Year|
    -+----+
    -|2019|
    -+----+
    In [0]:
    # Sort Rows in PySpark
    -df.orderBy('Date').show()
    +--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+
    -|      ID|Case Number|               Date|               Block|IUCR|        Primary Type|         Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|X Coordinate|Y Coordinate|Year|          Updated On|    Latitude|    Longitude|            Location|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+
    -|11895528|   JC515003|2019-01-01 00:00:00|  059XX W LELAND AVE|1153|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|                null| false|   false|1622|     016|  45|            15|      11|        null|        null|2019|11/19/2019 03:57:...|        null|         null|                null|
    -|11940493|   JD102895|2019-01-01 00:00:00|012XX S KOMENSKY AVE|141A|   WEAPONS VIOLATION|UNLAWFUL USE HANDGUN|RESIDENCE PORCH/H...| false|   false|1011|     010|  24|            29|      15|     1149582|     1893991|2019|01/08/2020 03:47:...|41.865028808|-87.726358587|(41.865028808, -8...|
    -|11937662|   JC566620|2019-01-01 00:00:00|  019XX N HAMLIN AVE|1582|OFFENSE INVOLVING...|   CHILD PORNOGRAPHY|SCHOOL, PRIVATE, ...| false|   false|2535|     025|  26|            22|      17|     1150737|     1912590|2019|01/03/2020 03:56:...|41.916043916|-87.721631784|(41.916043916, -8...|
    -|11739188|   JC326320|2019-01-01 00:00:00|010XX N RIDGEWAY AVE|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           APARTMENT| false|    true|1112|     011|  27|            23|      17|     1151192|     1906670|2019|12/18/2019 03:47:...|41.899789956|-87.720115618|(41.899789956, -8...|
    -|11911175|   JC534087|2019-01-01 00:00:00|    002XX E 121ST ST|1153|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|           RESIDENCE| false|   false|0532|     005|   9|            53|      11|     1180018|     1824734|2019|12/07/2019 03:47:...|41.674335599|-87.616746527|(41.674335599, -8...|
    -|11906205|   JC528532|2019-01-01 00:00:00|069XX S WOODLAWN AVE|1754|OFFENSE INVOLVING...|AGG SEX ASSLT OF ...|           RESIDENCE| false|    true|0321|     003|   5|            69|      02|     1185599|     1859302|2019|12/04/2019 03:49:...|41.769064952|-87.595234737|(41.769064952, -8...|
    -|11954981|   JD119994|2019-01-01 00:00:00| 073XX W FARWELL AVE|1750|OFFENSE INVOLVING...|         CHILD ABUSE|           RESIDENCE| false|    true|1611|     016|  41|             9|     08B|        null|        null|2019|01/19/2020 03:49:...|        null|         null|                null|
    -|11889579|   JC507635|2019-01-01 00:00:00|082XX S WENTWORTH...|1562|         SEX OFFENSE|AGG CRIMINAL SEXU...|           RESIDENCE| false|   false|0622|     006|  21|            44|      17|     1176419|     1850436|2019|11/15/2019 03:57:...|41.744946864|-87.629149986|(41.744946864, -8...|
    -|11878985|   JC494834|2019-01-01 00:00:00|   067XX W GRAND AVE|0810|               THEFT|           OVER $500|               OTHER| false|   false|2512|     025|  36|            18|      06|     1131156|     1915217|2019|11/06/2019 03:54:...|41.923613437|-87.793511665|(41.923613437, -8...|
    -|11560259|   JC109129|2019-01-01 00:00:00|  049XX N ALBANY AVE|1750|OFFENSE INVOLVING...|         CHILD ABUSE|           APARTMENT| false|   false|1713|     017|  33|            14|     08B|     1154815|     1932723|2019|10/30/2019 03:53:...|  41.9712095|-87.706108126|(41.9712095, -87....|
    -|11861735|   JC473834|2019-01-01 00:00:00| 108XX S SANGAMON ST|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           RESIDENCE| false|    true|2234|     022|  34|            75|      17|     1171912|     1832941|2019|10/18/2019 04:02:...|41.697038087|-87.646175932|(41.697038087, -8...|
    -|11752917|   JC342582|2019-01-01 00:00:00|     028XX E 76TH ST|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           APARTMENT| false|    true|0421|     004|   7|            43|      17|     1196262|     1855497|2019|10/15/2019 04:01:...| 41.75836602|-87.556276032|(41.75836602, -87...|
    -|11838300|   JC441914|2019-01-01 00:00:00|031XX W ARTHINGTO...|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           RESIDENCE| false|    true|1134|     011|  24|            27|      17|     1155548|     1895868|2019|09/26/2019 04:16:...|41.870061657|-87.704406691|(41.870061657, -8...|
    -|11752915|   JC342546|2019-01-01 00:00:00|063XX S STONY ISL...|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           APARTMENT| false|    true|0314|     003|  20|            42|      17|     1187962|     1863227|2019|09/25/2019 03:50:...|41.779779505|-87.586448286|(41.779779505, -8...|
    -|11739161|   JC326172|2019-01-01 00:00:00| 131XX S LANGLEY AVE|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           APARTMENT| false|    true|0533|     005|   9|            54|      17|     1183247|     1818205|2019|09/20/2019 03:53:...| 41.65634477|-87.605129962|(41.65634477, -87...|
    -|11739167|   JC326145|2019-01-01 00:00:00|  024XX W CARMEN AVE|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           APARTMENT| false|    true|2031|     020|  40|             4|      17|     1159097|     1933800|2019|09/13/2019 04:06:...|41.974077735|-87.690332986|(41.974077735, -8...|
    -|11808448|   JC409430|2019-01-01 00:00:00|016XX E HYDE PARK...|1153|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|               OTHER| false|   false|0222|     002|   5|            39|      11|     1188001|     1871507|2019|08/30/2019 03:57:...|41.802499513|-87.586041451|(41.802499513, -8...|
    -|11803895|   JC404008|2019-01-01 00:00:00| 070XX S CALUMET AVE|1153|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|           RESIDENCE| false|   false|0322|     003|   6|            69|      11|     1179694|     1858501|2019|08/28/2019 04:08:...|41.767003943|-87.616903894|(41.767003943, -8...|
    -|11785377|   JC381800|2019-01-01 00:00:00| 011XX N KOSTNER AVE|1310|     CRIMINAL DAMAGE|         TO PROPERTY|           APARTMENT| false|   false|1111|     011|  37|            23|      14|     1146837|     1907072|2019|08/09/2019 04:09:...|41.900977393|-87.736101441|(41.900977393, -8...|
    -|11777719|   JC372752|2019-01-01 00:00:00|  036XX S HALSTED ST|1120|  DECEPTIVE PRACTICE|             FORGERY|           RESIDENCE| false|   false|0915|     009|  11|            60|      10|     1171564|     1880789|2019|08/03/2019 04:02:...|41.828346576|-87.646050351|(41.828346576, -8...|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+------------+------------+----+--------------------+------------+-------------+--------------------+
    -only showing top 20 rows
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Append rows in PySpark.
    -one_day = spark.read.csv('reported-crimes.csv',header=True).withColumn('Date',to_timestamp(col('Date'),'MM/dd/yyyy hh:mm:ss a')).filter(col('Date')==lit('2019-07-30'))
    -df.filter(col('Date')==lit('2019-07-30')).count()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    Out[0]:
    - - - - -
    -
    10
    -
    - -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    df.union(one_day).filter(col('Date')==lit('2019-07-30')).count()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    Out[0]:
    - - - - -
    -
    20
    -
    - -
    - -
    -
    - -
    -
    -
    -
    -

    Result:

    -

    As you can see here, 10 crimes were committed on 2019-07-30, and after the union there are 20 records.
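`DataFrame.union()` behaves like Python list concatenation rather than a set union: it appends the rows of the second DataFrame and keeps duplicates. A pure-Python sketch of that behaviour (the row values are made up for illustration):

```python
# Stand-ins for the rows matching the 2019-07-30 filter (values are illustrative)
df_rows = ['row1', 'row2', 'row3']
one_day_rows = ['row1', 'row2', 'row3']

# union() appends rows; it does NOT deduplicate (use distinct() for that)
combined = df_rows + one_day_rows
print(len(combined))  # 6
```

This is why the count doubles after the union above: every matching row now appears twice.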

    -
    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    -# Top 10 reported crimes by Primary Type, in descending order of occurrence
    -df.groupBy("Primary Type").count().orderBy('count',ascending=False).show(10)
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +-------------------+-----+
    -|       Primary Type|count|
    -+-------------------+-----+
    -|              THEFT|62137|
    -|            BATTERY|49425|
    -|    CRIMINAL DAMAGE|26638|
    -|            ASSAULT|20584|
    -| DECEPTIVE PRACTICE|17674|
    -|      OTHER OFFENSE|16543|
    -|          NARCOTICS|13946|
    -|           BURGLARY| 9590|
    -|MOTOR VEHICLE THEFT| 8968|
    -|            ROBBERY| 7985|
    -+-------------------+-----+
    -only showing top 10 rows
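Conceptually, `groupBy('Primary Type').count().orderBy('count', ascending=False)` is a count-by-value followed by a sort on frequency. A pure-Python analogy using `collections.Counter` (the sample values are made up):

```python
from collections import Counter

# Illustrative sample of a 'Primary Type' column
types = ['THEFT', 'BATTERY', 'THEFT', 'ASSAULT', 'THEFT', 'BATTERY']

# most_common(n) ~ groupBy + count + orderBy(desc) + show(n)
top2 = Counter(types).most_common(2)
print(top2)  # [('THEFT', 3), ('BATTERY', 2)]
```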
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    -

    Hands-on Questions 🤚 !

    -
    -
    -
    -
    -
    -
    -

    What percentage of reported crimes resulted in an arrest?

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Answer
    -
    - -
    -
    -
    - -
    -
    -
    -
    -

    What are the top 3 locations for reported crimes?

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Answer
    -
    - -
    -
    -
    - -
    -
    -
    -
    -

    -

    Functions

    -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Functions available in PySpark
    -from pyspark.sql import functions
    -print(dir(functions))
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    ['Column', 'DataFrame', 'DataType', 'PandasUDFType', 'PythonEvalType', 'SparkContext', 'StringType', 'UserDefinedFunction', '__all__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__spec__', '_binary_mathfunctions', '_collect_list_doc', '_collect_set_doc', '_create_binary_mathfunction', '_create_column_from_literal', '_create_function', '_create_udf', '_create_window_function', '_functions', '_functions_1_4', '_functions_1_6', '_functions_2_1', '_functions_2_4', '_functions_deprecated', '_lit_doc', '_message', '_string_functions', '_test', '_to_java_column', '_to_seq', '_window_functions', '_wrap_deprecated_function', 'abs', 'acos', 'add_months', 'approxCountDistinct', 'approx_count_distinct', 'array', 'array_contains', 'array_distinct', 'array_except', 'array_intersect', 'array_join', 'array_max', 'array_min', 'array_position', 'array_remove', 'array_repeat', 'array_sort', 'array_union', 'arrays_overlap', 'arrays_zip', 'asc', 'asc_nulls_first', 'asc_nulls_last', 'ascii', 'asin', 'atan', 'atan2', 'avg', 'base64', 'basestring', 'bin', 'bitwiseNOT', 'blacklist', 'broadcast', 'bround', 'cbrt', 'ceil', 'coalesce', 'col', 'collect_list', 'collect_set', 'column', 'concat', 'concat_ws', 'conv', 'corr', 'cos', 'cosh', 'count', 'countDistinct', 'covar_pop', 'covar_samp', 'crc32', 'create_map', 'cume_dist', 'current_date', 'current_timestamp', 'date_add', 'date_format', 'date_sub', 'date_trunc', 'datediff', 'dayofmonth', 'dayofweek', 'dayofyear', 'decode', 'degrees', 'dense_rank', 'desc', 'desc_nulls_first', 'desc_nulls_last', 'element_at', 'encode', 'exp', 'explode', 'explode_outer', 'expm1', 'expr', 'factorial', 'first', 'flatten', 'floor', 'format_number', 'format_string', 'from_json', 'from_unixtime', 'from_utc_timestamp', 'functools', 'get_json_object', 'greatest', 'grouping', 'grouping_id', 'hash', 'hex', 'hour', 'hypot', 'ignore_unicode_prefix', 'initcap', 'input_file_name', 'instr', 'isnan', 'isnull', 
'json_tuple', 'kurtosis', 'lag', 'last', 'last_day', 'lead', 'least', 'length', 'levenshtein', 'lit', 'locate', 'log', 'log10', 'log1p', 'log2', 'lower', 'lpad', 'ltrim', 'map_concat', 'map_from_arrays', 'map_from_entries', 'map_keys', 'map_values', 'max', 'md5', 'mean', 'min', 'minute', 'monotonically_increasing_id', 'month', 'months_between', 'nanvl', 'next_day', 'ntile', 'pandas_udf', 'percent_rank', 'posexplode', 'posexplode_outer', 'pow', 'quarter', 'radians', 'rand', 'randn', 'rank', 'regexp_extract', 'regexp_replace', 'repeat', 'reverse', 'rint', 'round', 'row_number', 'rpad', 'rtrim', 'schema_of_json', 'second', 'sequence', 'sha1', 'sha2', 'shiftLeft', 'shiftRight', 'shiftRightUnsigned', 'shuffle', 'signum', 'sin', 'since', 'sinh', 'size', 'skewness', 'slice', 'sort_array', 'soundex', 'spark_partition_id', 'split', 'sqrt', 'stddev', 'stddev_pop', 'stddev_samp', 'struct', 'substring', 'substring_index', 'sum', 'sumDistinct', 'sys', 'tan', 'tanh', 'toDegrees', 'toRadians', 'to_date', 'to_json', 'to_timestamp', 'to_utc_timestamp', 'translate', 'trim', 'trunc', 'udf', 'unbase64', 'unhex', 'unix_timestamp', 'upper', 'var_pop', 'var_samp', 'variance', 'warnings', 'weekofyear', 'when', 'window', 'year']
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    -

    String Functions

    -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Loading the data
    -from pyspark.sql.functions import col, to_timestamp
    -df = spark.read.csv('reported-crimes.csv',header=True).withColumn('Date',to_timestamp(col('Date'),'MM/dd/yyyy hh:mm:ss a'))
    -
    - -
    -
    -
    - -
    -
    -
    -
    -

    Display the Primary Type column in lowercase and uppercase, and show its first 4 characters

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    from pyspark.sql.functions import col,lower, upper, substring
    -help(substring)
    -df.select(lower(col('Primary Type')),upper(col('Primary Type')),substring(col('Primary Type'),1,4)).show(5)
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    Help on function substring in module pyspark.sql.functions:
    -
    -substring(str, pos, len)
    -    Substring starts at `pos` and is of length `len` when str is String type or
    -    returns the slice of byte array that starts at `pos` in byte and is of length `len`
    -    when str is Binary type.
    -    
    -    .. note:: The position is not zero based, but 1 based index.
    -    
    -    >>> df = spark.createDataFrame([('abcd',)], ['s',])
    -    >>> df.select(substring(df.s, 1, 2).alias('s')).collect()
    -    [Row(s='ab')]
    -    
    -    .. versionadded:: 1.5
    -
    -+--------------------+--------------------+-----------------------------+
    -| lower(Primary Type)| upper(Primary Type)|substring(Primary Type, 1, 4)|
    -+--------------------+--------------------+-----------------------------+
    -|  deceptive practice|  DECEPTIVE PRACTICE|                         DECE|
    -|offense involving...|OFFENSE INVOLVING...|                         OFFE|
    -|            burglary|            BURGLARY|                         BURG|
    -|            burglary|            BURGLARY|                         BURG|
    -|               theft|               THEFT|                         THEF|
    -+--------------------+--------------------+-----------------------------+
    -only showing top 5 rows
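As the help text notes, Spark's `substring` uses a 1-based start position, unlike Python's 0-based slicing. A small pure-Python comparison:

```python
s = 'DECEPTIVE PRACTICE'

# Spark: substring(col('Primary Type'), 1, 4) -> start at position 1 (1-based), length 4
# The equivalent Python slice is 0-based:
first4 = s[0:4]
print(first4)  # 'DECE'
```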
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    -

    Numeric functions

    -
    -
    -
    -
    -
    -
    -

    Show the oldest date and the most recent date

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    from pyspark.sql.functions import min, max
    -df.select(min(col('Date')), max(col('Date'))).show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +-------------------+-------------------+
    -|          min(Date)|          max(Date)|
    -+-------------------+-------------------+
    -|2019-01-01 00:00:00|2019-12-31 23:55:00|
    -+-------------------+-------------------+
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    -

    Date

    -
    -
    -
    -
    -
    -
    -

    What is 3 days earlier than the oldest date and 3 days later than the most recent date?

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    from pyspark.sql.functions import date_add, date_sub
    -df.select(date_add(max(col('Date')),3), date_sub(min(col('Date')),3)).show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +----------------------+----------------------+
    -|date_add(max(Date), 3)|date_sub(min(Date), 3)|
    -+----------------------+----------------------+
    -|            2020-01-03|            2018-12-29|
    -+----------------------+----------------------+
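`date_add` and `date_sub` shift a date by whole days, analogous to `datetime.timedelta` in plain Python:

```python
from datetime import date, timedelta

# The min/max dates from the dataset shown above
newest = date(2019, 12, 31)
oldest = date(2019, 1, 1)

print(newest + timedelta(days=3))  # 2020-01-03, matching date_add(max(Date), 3)
print(oldest - timedelta(days=3))  # 2018-12-29, matching date_sub(min(Date), 3)
```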
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    -

    Working with Dates

    -
    -
    -
    - -
    -
    -
    In [0]:
    -
    -
    -
    from pyspark.sql.functions import to_date, to_timestamp, lit
    -df = spark.createDataFrame([('2019-12-25 13:30:00',)], ['Christmas'])
    -df.show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +-------------------+
    -|          Christmas|
    -+-------------------+
    -|2019-12-25 13:30:00|
    -+-------------------+
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    df.select(to_date(col('Christmas'),'yyyy-MM-dd HH:mm:ss'), to_timestamp(col('Christmas'),'yyyy-MM-dd HH:mm:ss')).show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +-------------------------------------------+------------------------------------------------+
    -|to_date(`Christmas`, 'yyyy-MM-dd HH:mm:ss')|to_timestamp(`Christmas`, 'yyyy-MM-dd HH:mm:ss')|
    -+-------------------------------------------+------------------------------------------------+
    -|                                 2019-12-25|                             2019-12-25 13:30:00|
    -+-------------------------------------------+------------------------------------------------+
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    df = spark.createDataFrame([('25/Dec/2019 13:30:00',)], ['Christmas'])
    -df.select(to_date(col('Christmas'),'dd/MMM/yyyy HH:mm:ss'), to_timestamp(col('Christmas'),'dd/MMM/yyyy HH:mm:ss')).show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +--------------------------------------------+-------------------------------------------------+
    -|to_date(`Christmas`, 'dd/MMM/yyyy HH:mm:ss')|to_timestamp(`Christmas`, 'dd/MMM/yyyy HH:mm:ss')|
    -+--------------------------------------------+-------------------------------------------------+
    -|                                  2019-12-25|                              2019-12-25 13:30:00|
    -+--------------------------------------------+-------------------------------------------------+
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    df = spark.createDataFrame([('12/25/2019 01:30:00 PM',)], ['Christmas'])
    -df.show(1)
    -df.show(1, truncate = False)
    -df.select(to_date(col('Christmas'),'MM/dd/yyyy hh:mm:ss aa'), to_timestamp(col('Christmas'),'MM/dd/yyyy hh:mm:ss aa')).show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +--------------------+
    -|           Christmas|
    -+--------------------+
    -|12/25/2019 01:30:...|
    -+--------------------+
    -
    -+----------------------+
    -|Christmas             |
    -+----------------------+
    -|12/25/2019 01:30:00 PM|
    -+----------------------+
    -
    -+----------------------------------------------+---------------------------------------------------+
    -|to_date(`Christmas`, 'MM/dd/yyyy hh:mm:ss aa')|to_timestamp(`Christmas`, 'MM/dd/yyyy hh:mm:ss aa')|
    -+----------------------------------------------+---------------------------------------------------+
    -|                                    2019-12-25|                                2019-12-25 13:30:00|
    -+----------------------------------------------+---------------------------------------------------+
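Spark's date patterns map closely onto Python's `strptime` directives, which can be a quick way to sanity-check a format string. A pure-Python equivalent of the last example (note Spark uses `MM` for month and `mm` for minute, while Python uses `%m` and `%M`):

```python
from datetime import datetime

# Spark pattern 'MM/dd/yyyy hh:mm:ss aa' roughly corresponds to '%m/%d/%Y %I:%M:%S %p'
dt = datetime.strptime('12/25/2019 01:30:00 PM', '%m/%d/%Y %I:%M:%S %p')
print(dt)  # 2019-12-25 13:30:00
```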
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    -

    Working with joins

    -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Loading the data
    -from pyspark.sql.functions import col, to_timestamp
    -df = spark.read.csv('reported-crimes.csv',header=True).withColumn('Date',to_timestamp(col('Date'),'MM/dd/yyyy hh:mm:ss a'))
    -
    - -
    -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Downloading the police station data
    -!wget -O police-station.csv https://data.cityofchicago.org/api/views/z8bn-74gv/rows.csv?accessType=DOWNLOAD
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    --2020-01-22 04:30:02--  https://data.cityofchicago.org/api/views/z8bn-74gv/rows.csv?accessType=DOWNLOAD
    -Resolving data.cityofchicago.org (data.cityofchicago.org)... 52.206.140.205, 52.206.68.26, 52.206.140.199
    -Connecting to data.cityofchicago.org (data.cityofchicago.org)|52.206.140.205|:443... connected.
    -HTTP request sent, awaiting response... 200 OK
    -Length: unspecified [text/csv]
    -Saving to: ‘police-station.csv’
    -
    -police-station.csv      [ <=>                ]   5.57K  --.-KB/s    in 0s      
    -
    -2020-01-22 04:30:03 (562 MB/s) - ‘police-station.csv’ saved [5699]
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    ps = spark.read.csv("police-station.csv", header=True)
    -ps.show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +--------------------+-----------------+--------------------+-------+-----+-----+--------------------+------------+------------+------------+------------+------------+-----------+------------+--------------------+
    -|            DISTRICT|    DISTRICT NAME|             ADDRESS|   CITY|STATE|  ZIP|             WEBSITE|       PHONE|         FAX|         TTY|X COORDINATE|Y COORDINATE|   LATITUDE|   LONGITUDE|            LOCATION|
    -+--------------------+-----------------+--------------------+-------+-----+-----+--------------------+------------+------------+------------+------------+------------+-----------+------------+--------------------+
    -|                   1|          Central|     1718 S State St|Chicago|   IL|60616|http://home.chica...|312-745-4290|312-745-3694|312-745-3693| 1176569.052| 1891771.704|41.85837259|-87.62735617|(41.8583725929, -...|
    -|                   6|          Gresham|   7808 S Halsted St|Chicago|   IL|60620|http://home.chica...|312-745-3617|312-745-3649|312-745-3639| 1172283.013| 1853022.646|41.75213684|-87.64422891|(41.7521368378, -...|
    -|                  11|         Harrison|  3151 W Harrison St|Chicago|   IL|60612|http://home.chica...|312-746-8386|312-746-4281|312-746-5151| 1155244.069| 1897148.755|41.87358229|-87.70548813|(41.8735822883, -...|
    -|                  16|   Jefferson Park|5151 N Milwaukee Ave|Chicago|   IL|60630|http://home.chica...|312-742-4480|312-742-4421|312-742-4423| 1138480.758| 1933660.473|41.97409445|-87.76614884|(41.9740944511, -...|
    -|        Headquarters|     Headquarters| 3510 S Michigan Ave|Chicago|   IL|60653|http://home.chica...|        null|        null|        null| 1177731.401| 1881697.404|41.83070169|-87.62339535|(41.8307016873, -...|
    -|                  24|      Rogers Park|     6464 N Clark St|Chicago|   IL|60626|http://home.chica...|312-744-5907|312-744-6928|312-744-7603| 1164193.588| 1943199.401|41.99976348|-87.67132429|(41.9997634842, -...|
    -|                   2|        Wentworth|5101 S Wentworth Ave|Chicago|   IL|60609|http://home.chica...|312-747-8366|312-747-5396|312-747-6656| 1175864.837| 1871153.753|41.80181109|-87.63056018|(41.8018110912, -...|
    -|                   7|        Englewood|      1438 W 63rd St|Chicago|   IL|60636|http://home.chica...|312-747-8223|312-747-6558|312-747-6652| 1167659.235| 1863005.522|41.77963154|-87.66088702|(41.7796315359, -...|
    -|                  25|    Grand Central|    5555 W Grand Ave|Chicago|   IL|60639|http://home.chica...|312-746-8605|312-746-4353|312-746-8383| 1138770.871| 1913442.439|41.91860889|-87.76557448|(41.9186088912, -...|
    -|                  10|            Ogden|    3315 W Ogden Ave|Chicago|   IL|60623|http://home.chica...|312-747-7511|312-747-7429|312-747-7471| 1154500.753| 1890985.501|41.85668453|-87.70838196|(41.8566845327, -...|
    -|                  15|           Austin|   5701 W Madison St|Chicago|   IL|60644|http://home.chica...|312-743-1440|312-743-1366|312-743-1485| 1138148.815| 1899399.078|41.88008346|-87.76819989|(41.8800834614, -...|
    -|                   3|   Grand Crossing|7040 S Cottage Gr...|Chicago|   IL|60637|http://home.chica...|312-747-8201|312-747-5479|312-747-9168| 1182739.183| 1858317.732|41.76643089|-87.60574786|(41.7664308925, -...|
    -|                  14|      Shakespeare|2150 N California...|Chicago|   IL|60647|http://home.chica...|312-744-8250|312-744-2422|312-744-8260| 1157304.426| 1914481.521|41.92110332|-87.69745182|(41.9211033246, -...|
    -|                   8|     Chicago Lawn|      3420 W 63rd St|Chicago|   IL|60629|http://home.chica...|312-747-8730|312-747-8545|312-747-8116| 1154575.242| 1862672.049|41.77898719|-87.70886382|(41.778987189, -8...|
    -|                   4|    South Chicago|     2255 E 103rd St|Chicago|   IL|60617|http://home.chica...|312-747-7581|312-747-5276|312-747-9169| 1193131.299| 1837090.265|41.70793329|-87.56834912|(41.7079332906, -...|
    -|                  20|          Lincoln|  5400 N Lincoln Ave|Chicago|   IL|60625|http://home.chica...|312-742-8714|312-742-8803|312-742-8841| 1158399.146| 1935788.826|41.97954951|-87.69284451|(41.9795495131, -...|
    -|                  18|       Near North|  1160 N Larrabee St|Chicago|   IL|60610|http://home.chica...|312-742-5870|312-742-5771|312-742-5773| 1172080.029| 1908086.527|41.90324165|-87.64335214|(41.9032416531, -...|
    -|                  12|        Near West|1412 S Blue Islan...|   null| null| null|                null|        null|        null|        null|        null|        null|       null|        null|                null|
    -|",Chicago,IL,6060...| -87.6569725149)"|                null|   null| null| null|                null|        null|        null|        null|        null|        null|       null|        null|                null|
    -|                   9|          Deering|   3120 S Halsted St|Chicago|   IL|60608|http://home.chica...|312-747-8227|312-747-5329|312-747-9172|  1171440.24| 1884085.224|41.83739443|-87.64640771|(41.8373944311, -...|
    -+--------------------+-----------------+--------------------+-------+-----+-----+--------------------+------------+------------+------------+------------+------------+-----------+------------+--------------------+
    -only showing top 20 rows
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    The reported crimes dataset has only the district number. Add the district name by joining with the police station dataset.
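Conceptually, a left outer join keeps every row from the left (crimes) DataFrame and fills in nulls where no police station matches. A pure-Python sketch using a dict lookup (the sample values are made up, including the unmatched district `'031'`):

```python
# Illustrative lookup table: padded district -> district name
stations = {'009': 'Deering', '012': 'Near West'}

crimes = [('11955940', '009'), ('11956146', '012'), ('99999999', '031')]

# left_outer join ~ keep every crime; None (null) when the key has no match
joined = [(case_id, district, stations.get(district)) for case_id, district in crimes]
print(joined[-1])  # ('99999999', '031', None)
```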

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Caching the crimes dataset to speed up repeated queries. Since Spark evaluates lazily,
    -# we run an action (count) so the cache is actually materialized.
    -df.cache()
    -df.count()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    Out[0]:
    - - - - -
    -
    257625
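Because Spark is lazy, `.cache()` alone does nothing until an action such as `.count()` forces evaluation; after that, later queries reuse the cached data. A rough pure-Python analogy with `functools.lru_cache` (all names here are illustrative):

```python
from functools import lru_cache

calls = []

@lru_cache(maxsize=None)
def load_dataset():
    calls.append(1)        # the expensive work runs only on the first call
    return 'crimes-data'

load_dataset()             # like running df.count() after df.cache(): materializes
load_dataset()             # served from the cache, no recomputation
print(len(calls))  # 1
```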
    -
    - -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    ps.select(col('DISTRICT')).distinct().show()
    -df.select(col('District')).distinct().show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +------------+
    -|    DISTRICT|
    -+------------+
    -|           7|
    -|          15|
    -|          11|
    -|           3|
    -|           8|
    -|          22|
    -|          16|
    -|           5|
    -|          18|
    -|          17|
    -|           6|
    -|          19|
    -|          25|
    -|Headquarters|
    -|          24|
    -|           9|
    -|           1|
    -|          20|
    -|          10|
    -|           4|
    -+------------+
    -only showing top 20 rows
    -
    -+--------+
    -|District|
    -+--------+
    -|     009|
    -|     012|
    -|     024|
    -|     031|
    -|     015|
    -|     006|
    -|     019|
    -|     020|
    -|     011|
    -|     025|
    -|     003|
    -|     005|
    -|     016|
    -|     018|
    -|     008|
    -|     022|
    -|     001|
    -|     014|
    -|     010|
    -|     004|
    -+--------+
    -only showing top 20 rows
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Left-padding the police-station DISTRICT with zeros to length 3, in order to match the crimes data's District format
    -from pyspark.sql.functions import lpad
    -ps = ps.withColumn('Format_district',lpad(col('DISTRICT'),3,'0'))
    -ps.select(col('Format_district')).show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +---------------+
    -|Format_district|
    -+---------------+
    -|            001|
    -|            006|
    -|            011|
    -|            016|
    -|            Hea|
    -|            024|
    -|            002|
    -|            007|
    -|            025|
    -|            010|
    -|            015|
    -|            003|
    -|            014|
    -|            008|
    -|            004|
    -|            020|
    -|            018|
    -|            012|
    -|            ",C|
    -|            009|
    -+---------------+
    -only showing top 20 rows
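`lpad(col, 3, '0')` left-pads shorter values to length 3 and truncates longer ones to 3 characters, which is why `Headquarters` shows up as `Hea` above. A pure-Python sketch of that behaviour:

```python
def lpad3(value: str) -> str:
    # Emulate Spark's lpad(col, 3, '0'): pad on the left with '0' to
    # length 3, then truncate to 3 characters if the input was longer.
    return value.rjust(3, '0')[:3]

print(lpad3('9'))             # '009'
print(lpad3('Headquarters'))  # 'Hea' -- lpad truncates, it never just pads
```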
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    # Executing the left outer join and dropping some columns to keep the output readable
    -df.join(ps, df.District == ps.Format_district, 'left_outer').drop(
    - 'ADDRESS',
    - 'CITY',
    - 'STATE',
    - 'ZIP',
    - 'WEBSITE',
    - 'PHONE',
    - 'FAX',
    - 'TTY',
    - 'X COORDINATE',
    - 'Y COORDINATE',
    - 'LATITUDE',
    - 'LONGITUDE',
    - 'LOCATION').show()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    +--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+----+--------------------+--------+--------------+---------------+
    -|      ID|Case Number|               Date|               Block|IUCR|        Primary Type|         Description|Location Description|Arrest|Domestic|Beat|District|Ward|Community Area|FBI Code|Year|          Updated On|DISTRICT| DISTRICT NAME|Format_district|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+----+--------------------+--------+--------------+---------------+
    -|11955940|   JD121140|2019-07-31 09:00:00|  026XX S HALSTED ST|1154|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|  SMALL RETAIL STORE| false|   false|0913|     009|  11|            60|      11|2019|01/20/2020 03:52:...|       9|       Deering|            009|
    -|11956035|   JD121288|2019-09-10 00:01:00|070XX S STONY ISL...|1582|OFFENSE INVOLVING...|   CHILD PORNOGRAPHY|           RESIDENCE| false|   false|0332|     003|   5|            43|      17|2019|01/20/2020 03:52:...|       3|Grand Crossing|            003|
    -|11956045|   JD121237|2019-12-15 12:00:00|      069XX S ADA ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|0734|     007|   6|            67|      05|2019|01/20/2020 03:52:...|       7|     Englewood|            007|
    -|11956146|   JD121344|2019-12-31 00:00:00|   016XX S THROOP ST|0610|            BURGLARY|      FORCIBLE ENTRY|           RESIDENCE| false|   false|1233|     012|  25|            31|      05|2019|01/20/2020 03:52:...|      12|     Near West|            012|
    -|11956304|   JD121233|2019-11-27 10:35:00|   002XX W GOETHE ST|0810|               THEFT|           OVER $500|           RESIDENCE| false|   false|1821|     018|   2|             8|      06|2019|01/20/2020 03:52:...|      18|    Near North|            018|
    -|11956126|   JD121418|2019-12-23 06:00:00|083XX S MUSKEGON AVE|0820|               THEFT|      $500 AND UNDER|              STREET| false|   false|0423|     004|   7|            46|      06|2019|01/20/2020 03:52:...|       4| South Chicago|            004|
    -|11956221|   JD121487|2019-09-13 17:00:00|   050XX W NELSON ST|2825|       OTHER OFFENSE|HARASSMENT BY TEL...|           RESIDENCE| false|   false|2521|     025|  31|            19|      26|2019|01/20/2020 03:52:...|      25| Grand Central|            025|
    -|11956049|   JD121196|2019-12-19 19:07:00|     011XX W 83RD ST|1544|         SEX OFFENSE|SEXUAL EXPLOITATI...|           RESIDENCE| false|   false|0613|     006|  21|            71|      17|2019|01/20/2020 03:52:...|       6|       Gresham|            006|
    -|11956468|   JD121650|2019-06-01 08:00:00|  081XX W ADDISON ST|1153|  DECEPTIVE PRACTICE|FINANCIAL IDENTIT...|           APARTMENT| false|   false|1631|     016|  38|            17|      11|2019|01/20/2020 03:52:...|      16|Jefferson Park|            016|
    -|11955888|   JD121051|2019-12-01 00:01:00| 031XX S PRAIRIE AVE|1152|  DECEPTIVE PRACTICE|ILLEGAL USE CASH ...|           RESIDENCE| false|   false|0211|     002|   4|            35|      11|2019|01/20/2020 03:52:...|       2|     Wentworth|            002|
    -|11956127|   JD121334|2019-12-14 22:00:00|023XX N MILWAUKEE...|0810|               THEFT|           OVER $500|               OTHER| false|   false|1414|     014|   1|            22|      06|2019|01/20/2020 03:52:...|      14|   Shakespeare|            014|
    -|11956215|   JD121376|2019-12-12 19:00:00| 0000X E CHESTNUT ST|0890|               THEFT|       FROM BUILDING|          RESTAURANT| false|   false|1833|     018|  42|             8|      06|2019|01/20/2020 03:52:...|      18|    Near North|            018|
    -|11956004|   JD121156|2019-11-15 12:00:00| 0000X N LATROBE AVE|1310|     CRIMINAL DAMAGE|         TO PROPERTY|           RESIDENCE| false|   false|1522|     015|  28|            25|      14|2019|01/20/2020 03:52:...|      15|        Austin|            015|
    -|11956129|   JD121379|2019-11-03 15:00:00|    104XX S AVENUE J|2825|       OTHER OFFENSE|HARASSMENT BY TEL...|               OTHER| false|   false|0432|     004|  10|            52|      26|2019|01/20/2020 03:52:...|       4| South Chicago|            004|
    -|11956041|   JD121266|2019-12-21 14:18:00|025XX S CALIFORNI...|1752|OFFENSE INVOLVING...|AGG CRIM SEX ABUS...|           APARTMENT| false|   false|1033|     010|  12|            30|      17|2019|01/20/2020 03:52:...|      10|         Ogden|            010|
    -|11935337|   JC563784|2019-12-29 02:25:00|     021XX E 70TH ST|143A|   WEAPONS VIOLATION|UNLAWFUL POSS OF ...|               ALLEY|  true|   false|0331|     003|   5|            43|      15|2019|01/20/2020 03:49:...|       3|Grand Crossing|            003|
    -|11927831|   JC553936|2019-12-20 13:15:00|082XX S JEFFERY BLVD|0545|             ASSAULT|PRO EMP HANDS NO/...|              STREET|  true|   false|0414|     004|   8|            46|     08A|2019|01/20/2020 03:49:...|       4| South Chicago|            004|
    -|11927034|   JC549451|2019-12-16 19:15:00|  076XX S CICERO AVE|0820|               THEFT|      $500 AND UNDER|  SMALL RETAIL STORE| false|   false|0833|     008|  18|            65|      06|2019|01/20/2020 03:49:...|       8|  Chicago Lawn|            008|
    -|11926969|   JC553305|2019-12-19 21:30:00|     016XX W LAKE ST|0486|             BATTERY|DOMESTIC BATTERY ...|           RESIDENCE|  true|   false|1224|     012|  27|            28|     08B|2019|01/20/2020 03:49:...|      12|     Near West|            012|
    -|11926797|   JC553193|2019-12-19 19:04:00| 018XX W DIVISION ST|0620|            BURGLARY|      UNLAWFUL ENTRY|           RESIDENCE|  true|   false|1212|     012|   1|            24|      05|2019|01/20/2020 03:49:...|      12|     Near West|            012|
    -+--------+-----------+-------------------+--------------------+----+--------------------+--------------------+--------------------+------+--------+----+--------+----+--------------+--------+----+--------------------+--------+--------------+---------------+
    -only showing top 20 rows
    -
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    -

    Hands-on again!

    -
    -
    -
    -
    -
    -
    -

    What is the most frequently reported non-criminal activity?

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
     
    -
    - -
    -
    -
    - -
    -
    -
    -
    -

    Find the day of the week with the most reported crime.

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
     
    -
    - -
    -
    -
    - -
    -
    -
    -
    -

    Using a bar chart, plot which day of the week has the most reported crimes.

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    from pyspark.sql.functions import date_format, col
    -# Collect once and reuse, instead of triggering the groupBy job twice
    -dow_counts = df.groupBy(date_format(col('Date'),'E')).count().collect()
    -dow = [x[0] for x in dow_counts]
    -print(dow)
    -cnt = [x[1] for x in dow_counts]
    -print(cnt)
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    ['Sun', 'Mon', 'Thu', 'Sat', 'Wed', 'Tue', 'Fri']
    -[20036, 21041, 20701, 21786, 20652, 21331, 22147]
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    import pandas as pd
    -import matplotlib.pyplot as plt
    -
    -cp = pd.DataFrame({'Day_of_week':dow, 'Count':cnt})
    -cp.head()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    Out[0]:
    - - - -
    -
    - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
    Day_of_weekCount
    0Sun20036
    1Mon21041
    2Thu20701
    3Sat21786
    4Wed20652
    -
    -
    - -
    - -
    -
    - -
    -
    -
    -
    In [0]:
    -
    -
    -
    cp.sort_values('Count', ascending=False).plot(kind='bar', color= 'red', x='Day_of_week', y='Count')
    -plt.xlabel("Day of week")
    -plt.ylabel("Number of reported crimes")
    -plt.title("No.of reported crimes per day")
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    Out[0]:
    - - - - -
    -
    Text(0.5, 1.0, 'No.of reported crimes per day')
    -
    - -
    - -
    - -
    - - - - -
    - -
    - -
    - -
    -
    - -
    -
    -
    -
    -

    -

    RDD

    -
    -
    -
    -
    -
    -
    -

    With map, you define a function and then apply it record by record. flatMap returns a new RDD by first applying a function to all of the elements in the RDD and then flattening the result. filter returns a new RDD containing only the elements that satisfy a condition. With reduce, we take neighboring elements and produce a single combined result. For example, say you have a set of numbers: you can reduce this to its sum by providing a function that takes two values as input and reduces them to one.
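    The four operations above can be sketched in a few lines. This is a minimal, self-contained example that creates its own local SparkContext for illustration (an assumption — in the notebook, `sc` already exists):

    ```python
    from pyspark.sql import SparkSession

    # Local session purely for illustration; in the notebook `sc` already exists.
    spark = SparkSession.builder.master("local[1]").appName("rdd-ops").getOrCreate()
    sc = spark.sparkContext

    nums = sc.parallelize([1, 2, 3, 4, 5])

    squared = nums.map(lambda x: x * x).collect()        # apply record by record
    pairs = nums.flatMap(lambda x: (x, -x)).collect()    # one record in, many out, flattened
    evens = nums.filter(lambda x: x % 2 == 0).collect()  # keep only matching elements
    total = nums.reduce(lambda a, b: a + b)              # combine pairs down to one value

    print(squared, pairs, evens, total)
    spark.stop()
    ```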

    -
    -

    Some of the reasons you would use a dataframe over RDD are:

    -

      -
    1. Its ability to represent data as rows and columns, though this also means it can only hold structured and semi-structured data.
    2. It allows processing data in different formats (Avro, CSV, JSON) and storage systems (HDFS, Hive tables, MySQL).
    3. Its superior job optimization capability.
    4. The DataFrame API is very easy to use.
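    A common way to get these benefits is to convert an existing RDD into a DataFrame with `toDF`. The sketch below uses made-up police-station rows and its own local session (both assumptions for the example — in the notebook you could apply the same call to the RDD parsed from the CSV):

    ```python
    from pyspark.sql import SparkSession

    # Local session for illustration; the notebook already has one running.
    spark = SparkSession.builder.master("local[1]").appName("rdd-to-df").getOrCreate()

    # Sample tuples standing in for parsed CSV lines.
    rows = [("7", "Englewood", "1438 W 63rd St", "60636"),
            ("11", "Harrison", "3151 W Harrison St", "60612")]
    rdd = spark.sparkContext.parallelize(rows)

    # toDF turns an RDD of tuples into a DataFrame with named columns,
    # unlocking the optimizer and the column-based API.
    df = rdd.toDF(["DISTRICT", "DISTRICT NAME", "ADDRESS", "ZIP"])
    n_englewood = df.filter(df["DISTRICT"] == "7").count()
    print(n_englewood)
    spark.stop()
    ```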
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    psrdd = sc.textFile('police-station.csv')
    -print(psrdd.first())
    -ps_header = psrdd.first()
    -ps_rest = psrdd.filter(lambda line: line!=ps_header)
    -print(ps_rest.first())
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    - - -
    -
    DISTRICT,DISTRICT NAME,ADDRESS,CITY,STATE,ZIP,WEBSITE,PHONE,FAX,TTY,X COORDINATE,Y COORDINATE,LATITUDE,LONGITUDE,LOCATION
    -1,Central,1718 S State St,Chicago,IL,60616,http://home.chicagopolice.org/community/districts/1st-district-central/,312-745-4290,312-745-3694,312-745-3693,1176569.052,1891771.704,41.85837259,-87.62735617,"(41.8583725929, -87.627356171)"
    -
    -
    -
    - -
    -
    - -
    -
    -
    -
    -

    How many police stations are there?

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    ps_rest.map(lambda line: line.split(",")).count()
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    Out[0]:
    - - - - -
    -
    24
    -
    - -
    - -
    -
    - -
    -
    -
    -
    -

    Display the District ID, District name, Address and Zip for the police station with District ID 7

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    # District is column  0
    -(ps_rest.filter(lambda line: line.split(",")[0]=='7').
    - map(lambda line: (line.split(",")[0],
    -    line.split(",")[1],
    -    line.split(",")[2],
    -    line.split(",")[5])).collect())
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    Out[0]:
    - - - - -
    -
    [('7', 'Englewood', '1438 W 63rd St', '60636')]
    -
    - -
    - -
    -
    - -
    -
    -
    -
    -

    Police stations 10 and 11 are geographically close to each other. Display the District ID, District name, address and zip code

    - -
    -
    -
    -
    -
    -
    In [0]:
    -
    -
    -
    # District is column  0
    -(ps_rest.filter(lambda line: line.split(",")[0] in ['10', '11']).
    - map(lambda line: (line.split(",")[0],
    -    line.split(",")[1],
    -    line.split(",")[2],
    -    line.split(",")[5])).collect())
    -
    - -
    -
    -
    - -
    -
    - - -
    - -
    Out[0]:
    - - - - -
    -
    [('11', 'Harrison', '3151 W Harrison St', '60612'),
    - ('10', 'Ogden', '3315 W Ogden Ave', '60623')]
    -
    - -
    - -
    -
    - -
    -
    -
    -
    -

    -

    User-Defined Functions (UDF)

    -
    -
    -
    -
    -
    -
    -

    PySpark User-Defined Functions (UDFs) let you apply your own Python code at scale across a DataFrame. They come in handy more often than you might imagine, but beware: their performance is lower than that of built-in PySpark functions. You can view examples of how UDFs work here. What I will give in this section is some theory on how they work, and why they are slower.

    -

    When you run a UDF in PySpark, each executor spawns a Python process, and data is serialised and deserialised between each executor and its Python process. This adds significant overhead to Spark jobs and makes UDFs less efficient than using built-in DataFrame functions. Apart from this, you might also hit memory issues while using UDFs: the Python worker consumes a large amount of off-heap memory, which often exceeds memoryOverhead and fails your job. Keeping these in mind, I wouldn't recommend using them, but at the end of the day, it's your choice.
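    As a minimal sketch of what a UDF looks like next to the equivalent built-in (the local session and sample data are assumptions for illustration):

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf, upper, col
    from pyspark.sql.types import StringType

    # Local session for illustration only; the notebook already has `spark`.
    spark = SparkSession.builder.master("local[1]").appName("udf-demo").getOrCreate()

    df = spark.createDataFrame([("theft",), ("battery",)], ["primary_type"])

    # A plain Python function wrapped as a UDF: every row round-trips
    # through a Python worker process.
    shout = udf(lambda s: s.upper(), StringType())
    with_udf = df.withColumn("loud", shout(col("primary_type")))

    # The equivalent built-in avoids that round-trip entirely, so prefer
    # it whenever one exists.
    with_builtin = df.withColumn("loud", upper(col("primary_type")))

    result = [r["loud"] for r in with_udf.collect()]
    print(result)
    spark.stop()
    ```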

    - -
    -
    -
    -
    -
    -
    -

    -

    Common Questions

    -
    -
    -
    -
    -
    -
    -

    Recommended IDE

    I personally prefer PyCharm for coding in Python/PySpark. It's based on IntelliJ IDEA, so it has a lot of features! The main advantage I have felt is the ease of installing PySpark and other packages. You can customize it with themes and plugins, and it enhances productivity while coding through features like code suggestions, Local VCS, etc.

    - -
    -
    -
    -
    -
    -
    -

    Submitting a spark job

    The usual Python syntax for running a script is: python <file_name>.py <arg1> <arg2> ...
    -But when you submit a Spark job, you have to use spark-submit to run the application.

    -

    Here is a simple example of a spark-submit command:
    -spark-submit filename.py --named_argument 'argument value'
    -Here, named_argument is an argument that you are reading from inside your script.
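    The notebook doesn't show how the script reads that argument; one common approach is Python's argparse, sketched here (the explicit argument list stands in for the real command line, where you would call `parse_args()` with no arguments so it reads `sys.argv`):

    ```python
    import argparse

    # Parse the --named_argument flag passed via spark-submit, e.g.:
    #   spark-submit filename.py --named_argument 'argument value'
    parser = argparse.ArgumentParser()
    parser.add_argument("--named_argument", required=True)

    # Explicit list for illustration; a real script would use parser.parse_args().
    args = parser.parse_args(["--named_argument", "argument value"])
    print(args.named_argument)
    ```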

    -

    There are other options you can pass in the command, like:
    ---py-files, which helps you pass Python files for your script to import,
    ---files, which helps pass other files like txt or config files,
    ---deploy-mode, which tells whether to run your driver on the cluster or locally, and
    ---conf, which helps pass different configurations, like memoryOverhead, dynamicAllocation etc.

    -

    There is an entire page in spark documentation dedicated to this. I highly recommend you go through it once.

    - -
    -
    -
    -
    -
    - - - - - -