Envision

List of Abbreviations - LaTeX

This is a short post about generating a list of abbreviations for your document with LaTeX. While there are many packages available for this, I'm going to use the glossaries package. You can find the user manual of the glossaries package here.

Here is the basic example.

\documentclass[a4paper,12pt]{report} %We are creating a report

\usepackage[automake]{glossaries} %Load glossaries package
\makeglossaries

%Here we define a set of example acronyms
\newglossaryentry{Aaa}{name={AAA},description={First abbreviation}}
\newglossaryentry{Bbb}{name={BBB},description={Second abbreviation}}
\newglossaryentry{Ccc}{name={CCC},description={Third abbreviation}}

\begin{document}

\printglossary[title={List of Abbreviations}] %Generate List of Abbreviations

\chapter{Sample Chapter}

Here we are using the first abbreviation \gls{Aaa}. This is our second abbreviation \gls{Bbb}. The last one is \gls{Ccc}.

\end{document}

The above code snippet produces the following output.

List of Abbreviations

Sample Chapter

Note that "automake" parameter is not necessary to generate List of Abbreviations if you are using MikTex to compile Latex.

You can pass different options to the glossaries package load command to achieve various customizations. A few of them are listed below; you can find more details in the user manual.

  1. Remove the dot at the end of each description - nopostdot
  2. Remove the page numbers listed after each abbreviation - nonumberlist
  3. Remove the vertical gap between letter groups - nogroupskip
  4. Add "List of Abbreviations" to the Table of Contents - toc
Further, if you want to change the line spacing between entries in the abbreviation list, you can use \singlespacing, \onehalfspacing or \doublespacing (provided by the setspace package) according to your requirement.

Here is the complete example.



\documentclass[a4paper,12pt]{report} %We are creating a report

\usepackage{setspace} %Provides \singlespacing, \onehalfspacing and \doublespacing
\usepackage[nopostdot,nogroupskip,style=super,nonumberlist,toc,automake]{glossaries} %Load glossaries package

\makeglossaries

%Here we define a set of example acronyms
\newglossaryentry{Aaa}{name={AAA},description={First abbreviation}}
\newglossaryentry{Bbb}{name={BBB},description={Second abbreviation}}
\newglossaryentry{Ccc}{name={CCC},description={Third abbreviation}}

\begin{document}

\tableofcontents

\onehalfspacing
\printglossary[title={List of Abbreviations}] %Generate List of Abbreviations


\chapter{Sample Chapter}

Here we are using the first abbreviation \gls{Aaa}. This is our second abbreviation \gls{Bbb}. The last one is \gls{Ccc}.

\end{document}

Here is the final output.

Table of Contents
List of Abbreviations

Sample Chapter

Hope this quick guide helps you.

Happy writing!

Grad-CAM Implementation in pycaffe

You can find the code discussed in this post in this git repository.

This post discusses how to implement the Gradient-weighted Class Activation Mapping (Grad-CAM) approach described in the paper Grad-CAM: Why did you say that? Visual Explanations from Deep Networks via Gradient-based Localization.

Grad-CAM is a technique that makes Convolutional Neural Network (CNN) based models more interpretable by visualizing the input regions the model looks at while making a prediction.

Grad-CAM model architecture

I'm not going to go deep into the paper here; for a detailed explanation, please refer to the paper.
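
In brief, Grad-CAM weights the feature maps of a chosen convolution layer by how important they are for the target class and sums them up. Using the notation of the paper, the importance weights are obtained by global-average-pooling the gradients of the class score, and the heatmap is the ReLU of the weighted combination:

\[
\alpha_k^c \;=\; \frac{1}{Z}\sum_i \sum_j \frac{\partial y^c}{\partial A^k_{ij}},
\qquad
L^c_{\text{Grad-CAM}} \;=\; \mathrm{ReLU}\!\left(\sum_k \alpha_k^c A^k\right)
\]

where y^c is the score for class c before the softmax, A^k is the k-th feature map of the chosen convolution layer, and Z is the number of spatial locations in that feature map. The code later in this post follows this recipe: compute the gradients, average them over the spatial dimensions, and form a weighted sum of the activation maps.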

You can find different implementations of this technique in Keras, Torch+Caffe, and TensorFlow.
However, I was not able to find a pycaffe implementation of Grad-CAM on the web. As Caffe's Python interface (pycaffe) is commonly used to develop CNN-based classification models, it would be useful to have a pycaffe implementation as well.

If you are looking for a quick solution to interpret your Caffe classification model, this post is for you!

Install

If you are completely new to Caffe, refer to the official Caffe page for installation instructions and some tutorials. As we are going to use the Python interface to Caffe (pycaffe), make sure you install pycaffe as well. All the required instructions are given on the Caffe website.

Implementation

For this implementation I'm using a pretrained image classification model from the community-contributed Caffe Model Zoo.

For this example, I will use the BVLC reference CaffeNet model, which is trained to classify images into 1000 ImageNet classes. To download the model and the auxiliary ImageNet data, go to the folder where you installed Caffe, e.g. C:\Caffe, and run:

 ./scripts/download_model_binary.py models/bvlc_reference_caffenet
./data/ilsvrc12/get_ilsvrc_aux.sh

Then let's write the gradCAM.py script:

import numpy as np
import caffe
import cv2 #OpenCV is used later to resize, colour and save the heatmap

#load the model
net = caffe.Net('---path to caffe installation folder---/models/bvlc_reference_caffenet/deploy.prototxt',
                '---path to caffe installation folder---/models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel',
                caffe.TEST)

# load input and preprocess it
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_mean('data', np.load('--path to caffe installation folder--/python/caffe/imagenet/ilsvrc_2012_mean.npy').mean(1).mean(1))
transformer.set_transpose('data', (2,0,1))
transformer.set_channel_swap('data', (2,1,0))
transformer.set_raw_scale('data', 255.0)

#We reshape the image as we classify only one image
net.blobs['data'].reshape(1,3,227,227)

#load the image to the data layer of the model
im = caffe.io.load_image('--path to caffe installation folder--/examples/images/cat.jpg')
net.blobs['data'].data[...] = transformer.preprocess('data', im)

#classify the image
out = net.forward()

#predicted class
print (out['prob'].argmax())
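
The predicted class is printed as a raw ILSVRC class index. If you also ran data/ilsvrc12/get_ilsvrc_aux.sh above, the synset_words.txt file with the human-readable labels should be available, and you can optionally map the index to a label as follows (a small sketch; adjust the path to your setup):

#Optional: map the predicted class index to a human-readable ILSVRC label
#Assumes get_ilsvrc_aux.sh has downloaded data/ilsvrc12/synset_words.txt
labels_file = '--path to caffe installation folder--/data/ilsvrc12/synset_words.txt'
labels = [line.strip() for line in open(labels_file)]
print(labels[out['prob'].argmax()])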

Next we have to calculate the gradient of the predicted class score w.r.t. the convolution layer of interest. This is the tricky part. The Caffe framework provides a built-in function,

 net.backward()

to calculate gradients of the network. However, if you study the documentation of the backward() function, you will see that by default it calculates gradients of the loss w.r.t. the input layer (commonly called the 'data' layer in Caffe).

To implement Grad-CAM we need the gradients of the layer just before the softmax layer with respect to a convolution layer, preferably the last convolution layer. To achieve this you have to modify the deploy.prototxt file: remove the softmax layer and add the following line just after the model name. Note that once the softmax layer is removed, the network's output is the 'fc8' layer, which is why the snippet below reads the class scores from out['fc8'].

 force_backward: true
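
To make the placement concrete, the top of the modified deploy.prototxt would look roughly like this for the reference CaffeNet model (a sketch; only the force_backward line is added, and the final softmax layer block at the bottom of the file is deleted):

name: "CaffeNet"
force_backward: true
# ...the rest of the original layer definitions, with the final softmax layer removed...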

Then, using the following code snippet, we can derive the Grad-CAM heatmap.


final_layer = "fc8" #output layer whose gradients are being calculated
image_size = (227,227) #input image size
feature_map_shape = (13, 13) #size of the feature map generated by 'conv5'
layer_name = 'conv5' #convolution layer of interest
category_index = out['fc8'].argmax() #index of the predicted class; set this manually to get the saliency map for any other class of interest

#Make the loss value class specific    
label = np.zeros(net.blobs[final_layer].data.shape)
label[0, category_index] = 1

imdiff = net.backward(diffs=['data', layer_name], **{net.outputs[0]: label})
gradients = imdiff[layer_name] #gradients of the loss value/ predicted class score w.r.t conv5 layer

#Normalizing gradients for better visualization
gradients = gradients/(np.sqrt(np.mean(np.square(gradients)))+1e-5)
gradients = gradients[0,:,:,:]

print("Gradients Calculated")

activations = net.blobs[layer_name].data[0, :, :, :] 

#Calculating importance of each activation map
weights = np.mean(gradients, axis=(1, 2))

cam = np.ones(feature_map_shape, dtype=np.float32)

for i, w in enumerate(weights):
    cam += w * activations[i, :, :]    

#Let's visualize Grad-CAM
cam = cv2.resize(cam, image_size)
cam = np.maximum(cam, 0)
heatmap = cam / np.max(cam)
cam = cv2.applyColorMap(np.uint8(255 * heatmap), cv2.COLORMAP_JET) 

#We are going to overlay the saliency map on the image
new_image = cv2.imread('--path to caffe installation folder--/examples/images/cat.jpg')
new_image = cv2.resize(new_image, image_size)

cam = np.float32(cam) + np.float32(new_image)
cam = 255 * cam / np.max(cam)
cam = np.uint8(cam)

#Finally saving the result
cv2.imwrite("gradcam.jpg", cam) 

That's it. If everything goes smoothly you will get the following result.



Input Image

Grad-CAM image

Hope this will be helpful. If you need any clarification, please feel free to comment below; I'm happy to help.






Data Pre-Processing with R

In this post I discuss how we can pre-process data using the R language. I'm using RStudio for the data analysis.

For this analysis I'm using the freely available Ta-Feng supermarket data set. You can download the data set here, and a description of the data set can be found here.

First of all we have to set our working directory. Let's assume that we are going to use a folder named "R_Work_Space" on the Desktop as our working directory. Then we can set the working directory as follows:

 setwd("{Path to Desktop}/Desktop/R_Work_Space")  

Then let's load our data set as follows.

 suppermarket_dataset <- read.csv(file="SupperMarketData.csv",header=TRUE,sep=",")  

You can view the loaded data set with the "View" command.

 View(suppermarket_dataset)  

You may see the data set as follows in RStudio.


Let's start pre-processing.

First of all, let's see the type of each attribute in the data set. This will be useful for our future analysis. Use the "str()" function in R for this purpose.

 str(suppermarket_dataset)  

The output will be a description as follows:


Here it doesn't make sense for Customer ID and Product Subclass to be integer values, as those fields are categorical identifiers with distinct values. We should convert those fields to factors. The following code segment does the job.

 suppermarket_dataset$Customer.ID <- as.factor(suppermarket_dataset$Customer.ID)  
 suppermarket_dataset$Product.Subclass <- as.factor(suppermarket_dataset$Product.Subclass)  

Again use "str( )" function to verify your conversion.

Next, for our analysis we need the day of the week. Using R we can add a new column named "Day" to our data set, holding the day of the week derived from the value in the "Date" column.

 suppermarket_dataset$Day <- as.factor(weekdays(as.Date(suppermarket_dataset$Date, "%m/%d/%Y")))  

Use "View" command to view the data set and you can now see that a new column called "Day" has been added to the data set.

If we had the "Time" information and if we want to analysis data hour wise, we can extract hour information from time using "hour" in "lubridate" package. For that first we have to install "lubridate" package using install.packages( ).

 install.packages("lubridate")  

Next step is loading the installed package.

 library(lubridate)  

Now we can call "hour" function as below to derive hour from the "Time" information.

 suppermarket_dataset$Hour <- hour(strptime(suppermarket_dataset$Time, "%H:%M:%S"))  

Please note that to use the above command, the time should be in the international standard notation (HH:MM:SS, e.g. 18:45:30).

Then we have to calculate the total amount spent by each customer in each transaction. We can calculate that as follows:

 suppermarket_dataset$Total_Amount <- suppermarket_dataset$Amount*suppermarket_dataset$Sales.price  

In our data set we have a column called "Assest" which will not be used in our analysis. Therefore, let's remove that column (it is the 8th column):

 suppermarket_dataset <- suppermarket_dataset[,-c(8)]  

Let's assume that in our analysis we want to exclude purchases made by people who belong to the "Below 25" age category, which is represented by the letter "A" in the "Age" column.

 suppermarket_dataset <- suppermarket_dataset[suppermarket_dataset$Age != "A",]  

We can use the above code segment to remove the records that belong to the "Below 25" age category.

Now we are done with pre-processing our data set. We should save the new data set for future use. We can write it to a .csv file using the following command.

 write.csv(file="Final_Suppermarket_Dataset.csv", x=suppermarket_dataset)  

Done for the day!

See you in the next post, which will discuss how to do a descriptive analysis of this data set.

Implementing a server failover feature

Suppose you have a client-server application and you want to automatically switch to a standby/backup server when the primary server is unavailable due to either a failure or a scheduled shutdown. In this post I show how to achieve that.

We can keep the primary and secondary URLs in a configuration file and load them into a list when the system starts. For demonstration purposes I have hard-coded the list of URLs.

We can achieve the failover easily by checking the response code sent by the server. I believe the code is self-explanatory.

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;

public class Envision {

    public static void main(String[] args) throws MalformedURLException {
        HttpURLConnection httpURLConnection = null;
        InputStream inputStream = null;
        BufferedReader bufferedReader = null;
        String result = "";
        String inStr = null;
        List<URL> urls = new ArrayList<URL>();
        urls.add(new URL("http://example_primary.com"));
        urls.add(new URL("http://example_secondary.com"));


        try {
            // Try each server in order and stop at the first one that responds successfully
            for (int i = 0; i < urls.size(); i++) {

                try {
                    httpURLConnection = (HttpURLConnection) urls.get(i).openConnection();
                    inputStream = httpURLConnection.getInputStream();
                    bufferedReader = new BufferedReader(new InputStreamReader(inputStream));

                    int responseCode = httpURLConnection.getResponseCode();

                    if (responseCode == 200) {
                        while ((inStr = bufferedReader.readLine()) != null) {
                            result = result + inStr;
                        }
                        break;
                    }
                } catch (Exception e) {
                    // This server is unavailable; log the error and try the next URL in the list
                    System.out.println("Error: While requesting data from server " + urls.get(i) + " " + e.getMessage());

                }
            }

            System.out.println(">>>>Res: " + result);

        } catch (Exception e) {
            System.out.println("Error: While requesting data from server" + e.getMessage());

        }
    }
}

Happy Coding!

How to create a batch file to run a Java program?

I demonstrated how to create an .exe using Maven in my previous post. Another way of distributing software is as a runtime package that includes a batch file.

What is a batch file? 
A batch file is a type of script file which contains a series of instructions to be executed in turn. These are used to automate frequently performed tasks.  

You can write a batch file to compile the code, create the JAR file and run the program. But in this post I mostly focus on creating the runtime package, and I assume that you already have the .jar file. You can use a tool like Maven or Ant to build it.

The following image shows the folder structure of my run time.
Folder Structure of the runtime
Here Blog-1.0-SNAPSHOT.jar is the JAR file of my program, which is the same one I used in the previous post, so it requires the jdom2 library. One thing you should notice is that if you build the program with Maven and don't use a plugin to bundle the dependencies into your JAR file, the JAR will not include the other .jars you use. Therefore, when creating the runtime you have to add those libraries to the /lib folder. Also, if your program requires any config files, you can place them in the /config folder. Likewise, if you have any log files, write the program so that they are placed in the /log folder.

Then let's create the start.bat file. What you have to do is very easy: just place the following lines in a text file and save it with the .bat extension.


title=XML Reader
java -Xmx256m -DAlert=true -classpath .;Blog-1.0-SNAPSHOT.jar;.\lib\jdom-2.0.5.jar Envision


pause

As you can see, I have given the name of the JAR file, the libraries used by the program (if you have many libraries, list them separated by semicolons) and, at the end, the name of the main class.

It's really easy to make a runtime package, isn't it?

Happy coding! 

How to create an .exe for a Java program using Maven?

Usually software is distributed as an executable file, so as programmers we need to create an .exe file for our programs. In this post I will show you how to achieve this easily with Maven (a build automation tool used primarily for Java projects).

A Windows executable can be created by using a combination of two Maven plugins: the Maven Shade plugin and the launch4j plugin.

Here is my program. It reads an .xml file and writes its content to the standard output. In order to read the .xml file we have to use a library; here I have used jdom2. In developing software we typically depend on many libraries to avoid reinventing the wheel, and if we are building our project with Maven we can declare those dependencies in the pom.xml file.


import org.jdom2.Document;
import org.jdom2.Element;
import org.jdom2.JDOMException;
import org.jdom2.input.SAXBuilder;

import java.io.File;
import java.io.IOException;
import java.util.List;

public class Envision {
    public static void main(String[] args){
        File xmlFile = new File("D:\\example.xml");
        SAXBuilder builder = new SAXBuilder();
        try {
            Document document = builder.build(xmlFile);
            Element rootNode = document.getRootElement();
            List<Element> books = rootNode.getChildren("book");

            System.out.println("This is my book store");

            for (int l = 0; l < books.size(); l++) {

                Element book = books.get(l);
                System.out.println("Name :"+book.getChildText("name")+"     Author :"+book.getChildText("author"));
                System.out.println("----------------------------------------------------------------------");

            }



        } catch (IOException io) {
            System.out.println(io.getMessage());
        } catch (JDOMException jdomex) {
            System.out.println(jdomex.getMessage());
        }
    }
}

Here is my pom.xml file. The Maven Shade plugin is used to package all the program's dependencies into a runnable JAR file, and the launch4j plugin creates the .exe with vendor information and a nice icon too.

I have added the exe.ico in src/main/resources. 



<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>Blog</groupId>
    <artifactId>Blog</artifactId>
    <version>1.0</version>

    <dependencies>
        <dependency>
            <groupId>org.jdom</groupId>
            <artifactId>jdom2</artifactId>
            <version>2.0.5</version>
        </dependency>
        <dependency>
            <groupId>com.vaadin.external.google</groupId>
            <artifactId>android-json</artifactId>
            <version>0.0.20131108.vaadin1</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>1.7.1</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <shadedArtifactAttached>true</shadedArtifactAttached>
                    <shadedClassifierName>shaded</shadedClassifierName>
                    <transformers>
                        <transformer
                                implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                            <mainClass>Envision</mainClass>
                        </transformer>
                    </transformers>
                </configuration>
            </plugin>
            <plugin>
                <groupId>com.akathist.maven.plugins.launch4j</groupId>
                <artifactId>launch4j-maven-plugin</artifactId>
                <version>1.5.1</version>
                <executions>

                    <!-- Command-line exe -->
                    <execution>
                        <id>l4j-cli</id>
                        <phase>package</phase>
                        <goals>
                            <goal>launch4j</goal>
                        </goals>
                        <configuration>
                            <headerType>console</headerType>
                            <outfile>target/envision.exe</outfile>
                            <!-- Wrap the shaded (dependency-bundled) jar produced by the Shade plugin -->
                            <jar>target/${project.artifactId}-${project.version}-shaded.jar</jar>
                            <errTitle>App Err</errTitle>
                            <classPath>
                                <mainClass>Envision</mainClass>
                            </classPath>
                            <icon>src/main/resources/exe.ico</icon>
                            <jre>
                                <minVersion>1.5.0</minVersion>
                            </jre>
                            <versionInfo>
                               <fileVersion>1.0.0.0</fileVersion>
                               <txtFileVersion>${project.version}</txtFileVersion>
                               <fileDescription>${project.name}</fileDescription>
                               <copyright>2014 envision.com</copyright>
                               <productVersion>1.0.0.0</productVersion>
                               <txtProductVersion>1.0.0.0</txtProductVersion>
                               <productName>${project.name}</productName>
                               <companyName>Envision</companyName>
                               <internalName>envision</internalName>
                               <originalFilename>envision.exe</originalFilename>
                            </versionInfo>
                        </configuration>
                    </execution>
                </executions>
            </plugin>

        </plugins>
    </build>


</project>

After configuring the pom.xml file, just run mvn install to get the .exe file.

That's it. Check the target folder in your project folder to find the .exe.

Hope this would help you :) 

Happy Coding! 

Masks...!


"Please let me be myself....!" Have you ever heard your soul is yelling like this? After being fed up of pretending : presenting yourself as someone else to the world.
Every one of us has played this game, maybe when you are with your crush, with your boss or with someone you want to impress. You feel like it's not the real you that they see. Yet you don't want to change it, because you are afraid of losing their attention. You hide yourself behind a "MASK".

The reasons may differ: to impress someone, to avoid being neglected or to hide your own emotions, but most of the time you use a mask. So do I. Sometimes we have a collection of masks to put on based on our immediate environment, or rather to meet the different masks out there. Now if you feel "this is not my story", well, that is fantastic! But to be honest, that is rarely the case.

Life is almost a masquerade party where everyone is wearing a mask and doing pretty crazy stuff. They know they are secure behind the MASK. We need this feeling of security because most of us worry about others more than about ourselves. We always worry about what others would think of us, how they would react or whether they would laugh at us. Ultimately, your heart gets heavy with the fear of being rejected, ignored and criticized. That fear forces you to put on a MASK. You try to make sure that you present yourself in a way that suits society.

Sometimes society itself forces us to hide our true inner selves. Expressing your true opinion on something may be destructive, or your views might not be compatible with others'. Maybe things are going wrong for you, yet you have to put a smile on your face, because no one is there to care. In each of these cases we have to wear a mask.

Sometimes we use masks neither out of fear of being rejected nor because of the expectations of society, but for our own benefit. It's an open secret that politicians pretend in front of the public to retain their power. And not only them: we too, in our day-to-day lives, hide our true selves in order to gain benefits.

Whatever the reason we wear masks, we add layers and accessories that lend more credibility to the costume. Gradually the costume becomes heavier and heavier until it reaches a point where you are living someone else's life.

We end up with mental and physical exhaustion. We get burnt out from the effort of trying to maintain a facade. We break up with ourselves - the most valuable relationship on this planet. In the end, we will have a life full of suffering, or at least a life that is not enjoyable. We lose our authenticity. Our own thoughts, views and ideas, which could have changed the world or at least our own lives, would be buried with our dead bodies.
So get real. Let the world identify you as YOU. You are another masterpiece on this earth.