Unlock Insights to Boost User Experience Online

FORMCEPT has achieved a pioneering position in extracting in-depth insights from piles of data. A telling testimony to our influence in the data analytics solutions space is our recent collaboration with ESPN Cricinfo to deliver a data analytics solution, built on our patent-pending platform, timed to coincide with the Cricket World Cup 2015. The solution fits well with the popularity of ESPN Cricinfo, one of the leading providers of high-value cricket analysis, news and trends, and a trusted authority on the game.

The data points are arranged in visually pleasing tables and charts for easy readability. The ability to extract insights from the site delivers a superior level of engagement with web visitors, irrespective of how closely they follow the game of cricket.

Player Profile Analysis

Records – Team-wise Wins

Some of the ways in which it provides an engaging data analytics experience to different categories of website visitors are listed below-

1. A Team Manager

a. Single View – A team manager can easily correlate past performance and gauge a player’s form over a period stretching back 10-15 years.

b. Decision Making – It helps him make an insightful decision on whether to select, retain or drop a player based on the player’s strengths and performance.

c. All-inclusive – The holistic data platform from FORMCEPT allows a well-rounded view of how a player performs in different formats of the game. So if you are a team manager and want to see whether a player is fit for the IPL, you can check out his T20 record.

2. A Player

a. Visually Attractive – A player can gain invaluable performance insights in a highly interactive tabular and graphical format.

b. Performance Analysis – A batsman can conduct a meticulous self-analysis with the help of a detailed breakdown. This helps him carry out an all-inclusive, data-backed SWOT analysis and improve his game based on the website’s insights.

c. Up-to-date – The platform collates data in near real time, allowing a player to see the latest figures in an easy-to-read format and carry out a competitive analysis.

d. Takeaways – Over a prolonged duration, it becomes difficult to track a player’s strong points, weak points, or the teams and players he is susceptible to. The impactful visualizations help him glean useful data on his recent performance trends and take corrective action for the future.

3. End Users (fans)

a. Engrossing – As a fan, you quickly get interesting facts about your favorite player or team. For any cricketer, you get insights on various aspects of his performance.

b. Authority – You come across as a knowledgeable authority on the game of cricket when you share visually appealing stats over social media with your friends and acquaintances.

c. Interesting Data Cuts – Fans get practically innumerable ways to look at data filtered by parameters such as match format, player, opposition, country, ground and year, for both batting and bowling.

4. ESPN Cricinfo site

a. User Loyalty – Because website visitors spend a lot of time playing around with the interactive platform, the chances of sales conversion and higher business revenues are greater than for its competitors.

b. Competitive Differentiator – The site gains a competitive upper hand from the platform’s enhanced repeat-visit potential and captivating content.

c. Rewarding User Experience – The platform provides a highly appealing and immersive UI/UX for website visitors, giving the site a distinct appeal.

Here is a glimpse of the Insights Interface-

Sachin Tendulkar and Opposition

Sachin Tendulkar – Pace vs Spin

Sachin Tendulkar at World Cup

The ESPNCricinfo site successfully merges ESPN’s proven cricket expertise with FORMCEPT’s technical acumen. The outcome is a visually stunning data analytics and insights platform that increases the value of the information consumed by website visitors.


Nolan Scheduler

How often have you come across requirements that demand tasks to be performed repeatedly at a defined interval? Yes, I am talking about a scheduler, but a simple yet powerful one that justifies its name- it just schedules. That is what Nolan Scheduler is all about.

Kuldeep, a champion Clojurist, wrote the library and it is now an important part of the FORMCEPT platform. It schedules all the jobs within the platform and keeps users up to date with the job status.

Email Scheduler

Let’s take the example of an email scheduler that needs to read emails from an email account, say GMail, and do so periodically. This is a classic use case for a scheduler. So, here is how you can schedule your email reader job-

Step-1: Pick the function to schedule

In this case, we can create a simple function that reads all the unread emails from the specified GMail account. Here is my namespace with the function read-email-

(ns fcgmail.core
  ^{:author "Anuj" :doc "FORMCEPT GMail Reader"}
  (:require [clojure-mail.core :as mcore]
            [clojure-mail.message :as m]
            ;; logging alias used below (assuming clojure.tools.logging)
            [clojure.tools.logging :as log]))

; GMail Store Connection
(def ^:private gstore (atom nil))

(defn- read-msg
  "Reads the message and returns the subject and body"
  [msg]
  {:subject (msg :subject)
   :body (-> (filter
               #(and (:content-type %)
                     (.startsWith (:content-type %) "TEXT/PLAIN"))
               (msg :body))
             first :body)})
             
(defn read-email
  "Reads unread emails and marks them as read"
  [uri email pwd]
  (try
    (reset! gstore (mcore/gen-store email pwd))
    (let [msgs (mcore/unread-messages @gstore :inbox)
          fcmsgs (map #(read-msg (m/read-message %)) msgs)]
      (doseq [fcmsg fcmsgs]
        (log/info (str "Retrieved: " fcmsg))
        ; Do whatever you want with the message
        ))
    (catch Exception e (log/error (str "Failed: " (.getMessage e))))
    (finally
      (do (mcore/mark-all-read @gstore :inbox)
          (mcore/close-store @gstore)))))

It uses the clojure-mail project to connect to GMail and read the messages. I will keep that explanation for the next blog, but I encourage readers to go ahead and take a look at that project as well.

Step-2: Schedule

Now comes the most interesting part. This is how you can schedule your target function, i.e. read-email in this example-

(ns fcgmail.core
  (:require [nolan.core :as n]))

; Create Scheduler
(defonce sc (n/get-mem-scheduler))

; Schedule (add-schedule returns a schedule ID)
(def scid (n/add-schedule sc "R//PT30S" #(read-email uri email pwd)))

That is it :-) – Your scheduled function will be called every 30 seconds, as per the repeating-interval syntax of ISO 8601. The function add-schedule returns a schedule ID which can be used later to expire a schedule, stopping all further executions and removing it from the schedule store, as shown below-

; Expire the schedule using its ID
(n/expire sc scid)
; Check expiry status
(n/expired? sc scid)
; Should return true

By default, the library comes with a built-in in-memory scheduler, but you can extend the ScheduleStore protocol to plug in a store of your choice. Please give it a try.


GDF Graph Loader for TinkerPop 2.x

Recently, we came across .gdf files, a CSV-like format for graphs primarily used by GUESS. Although the GDF file format is supported by Gephi, it was still missing from TinkerPop, one of the widely used graph computing frameworks.

Today, we are happy to release gdfpop, an open-source implementation of a GDF file reader for TinkerPop 2.x, under the Apache License, Version 2.0. It allows you to directly import .gdf files into FORMCEPT’s FactorDB storage engine, which is compliant with the TinkerPop 2.x Blueprints APIs.

gdfpop APIs

gdfpop provides a method GDFReader.inputGraph that takes in an existing com.tinkerpop.blueprints.Graph instance and an input stream to the GDF file. There are three optional parameters-

  1. buf: Buffer size for BatchGraph. See BatchGraph for more details.
  2. quote: The quote character used for the values. The default is a double quote.
  3. eidp: Edge property to be used as an ID

The implementation handles all the missing values, datatypes, default values and quotes gracefully. Here is a sample .gdf file that can be loaded via gdfpop-

nodedef>name VARCHAR,label VARCHAR2,class INT, visible BOOLEAN default false,color VARCHAR,width FLOAT,height DOUBLE
a,'Hello "world" !',1,true,'114,116,177',10.10,20.24567
b,'Well, this is',2, ,'219,116,251',10.98,10.986123
c,'A correct 'GDF' file',,,, ,
edgedef>node1 VARCHAR,node2 VARCHAR,directed BOOLEAN,color VARCHAR, weight LONG default 100
a, b,true,' 114,116,177',
b,c ,false,'219,116,251 ',300
c, a  , ,,

Example

For example, consider the following graph, taken from the default TinkerPop implementation-

gdfpop

It has 6 vertices and 6 edges, with each vertex having properties such as label and age and each edge having a weight. The only change we made to convert it into a GDF file is that the property name has been renamed to label, because name is used as the node/vertex ID in GDF. See the GDF File Format for all the possible properties of a vertex. The GDF file corresponding to the above graph is shown below-

nodedef>name VARCHAR,label VARCHAR,age INT,lang VARCHAR
1,marko,29,
2,vadas,27,
3,lop,,java
4,josh,32,
5,ripple,,java
6,peter,35,
edgedef>node1 VARCHAR,node2 VARCHAR,name VARCHAR,label VARCHAR,weight FLOAT
1,2,7,knows,0.5
1,4,8,knows,1.0
1,3,9,created,0.4
4,5,10,created,1.0
4,3,11,created,0.4
6,3,12,created,0.2

Although the GDF specification does not define an ID for edges, you can ask gdfpop to use a specific edge property as the edge ID using the eidp parameter.

Using gdfpop

Consider that an example.gdf file with the above vertices and edges is provided as input and you wish to use all the awesomeness of the TinkerPop 2.x stack on it. To do so, follow these steps-

Step-1: Build gdfpop

Currently, gdfpop is not available on Maven Central, so you will have to pick the latest release or build it from source using the following command-

mvn clean compile install

Once Maven builds gdfpop, it will be available in your local Maven repository and ready to be integrated with your existing code base using the following Maven dependency-

<dependency>
	<groupId>org.formcept</groupId>
	<artifactId>gdfpop</artifactId>
	<version>0.2.0</version>
</dependency>

Step-2: Load GDF files

Now, you can use the org.formcept.gdfpop.GDFReader functions to process and load the above example.gdf file as shown below-

// initialize
Graph graph = new TinkerGraph();
// load the gdf file (quote character: double quote, edge ID property: "name")
GDFReader.inputGraph(graph, new FileInputStream(new File("example.gdf")), "\"", "name");
// write it out as GraphSON
GraphSONWriter.outputGraph(graph, System.out);

The above code snippet will create a TinkerGraph, load it with all the vertices and edges defined in the example.gdf file, and dump the loaded graph in GraphSON format so that we can easily verify it. For example, here is a JSON dump from a sample run of the above code-

{
    "mode": "NORMAL",
    "vertices": [{
        "name": "3",
        "label": "lop",
        "lang": "java",
        "_id": "3",
        "_type": "vertex"
    }, {
        "age": 27,
        "name": "2",
        "label": "vadas",
        "_id": "2",
        "_type": "vertex"
    }, {
        "age": 29,
        "name": "1",
        "label": "marko",
        "_id": "1",
        "_type": "vertex"
    }, {
        "age": 35,
        "name": "6",
        "label": "peter",
        "_id": "6",
        "_type": "vertex"
    }, {
        "name": "5",
        "label": "ripple",
        "lang": "java",
        "_id": "5",
        "_type": "vertex"
    }, {
        "age": 32,
        "name": "4",
        "label": "josh",
        "_id": "4",
        "_type": "vertex"
    }],
    "edges": [{
        "weight": 1.0,
        "node1": "4",
        "name": "10",
        "node2": "5",
        "_id": "10",
        "_type": "edge",
        "_outV": "4",
        "_inV": "5",
        "_label": "created"
    }, {
        "weight": 0.5,
        "node1": "1",
        "name": "7",
        "node2": "2",
        "_id": "7",
        "_type": "edge",
        "_outV": "1",
        "_inV": "2",
        "_label": "knows"
    }, {
        "weight": 0.4,
        "node1": "1",
        "name": "9",
        "node2": "3",
        "_id": "9",
        "_type": "edge",
        "_outV": "1",
        "_inV": "3",
        "_label": "created"
    }, {
        "weight": 1.0,
        "node1": "1",
        "name": "8",
        "node2": "4",
        "_id": "8",
        "_type": "edge",
        "_outV": "1",
        "_inV": "4",
        "_label": "knows"
    }, {
        "weight": 0.4,
        "node1": "4",
        "name": "11",
        "node2": "3",
        "_id": "11",
        "_type": "edge",
        "_outV": "4",
        "_inV": "3",
        "_label": "created"
    }, {
        "weight": 0.2,
        "node1": "6",
        "name": "12",
        "node2": "3",
        "_id": "12",
        "_type": "edge",
        "_outV": "6",
        "_inV": "3",
        "_label": "created"
    }]
}

Notice that it has the 6 vertices and 6 edges that were defined in the example.gdf file earlier.
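
Since the loaded graph is a regular Blueprints graph, you can also run Gremlin traversals over it. Below is a minimal, hypothetical sketch (not part of gdfpop) that continues from the graph instance created above; it assumes the gremlin-java 2.x module is on the classpath and uses com.tinkerpop.gremlin.java.GremlinPipeline and com.tinkerpop.blueprints.Vertex-

// Hypothetical follow-up: traverse the loaded graph with Gremlin (TinkerPop 2.x)
GremlinPipeline<Vertex, Object> names =
        new GremlinPipeline<Vertex, Object>(graph.getVertex("1")) // start at vertex "1" (marko)
                .out("knows")        // follow outgoing "knows" edges
                .property("label");  // the person name is stored under "label" in this GDF file
for (Object name : names) {
    System.out.println(name);        // expected output: vadas, josh
}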

Currently, gdfpop is compatible only with the TinkerPop 2.x implementation. Going forward, we may look into providing a plug-in for TinkerPop 3.x as well, based on the interest of the community. Feel free to give us a shout at gdfpop.

References

  1. GDF: A CSV Like Format For Graphs – http://datascholars.com/post/2013/03/09/gdf/
  2. GUESS: The Graph Exploration System – http://guess.wikispot.org/The_GUESS_.gdf_format
  3. Gephi: The Open Graph Viz Platform – http://gephi.github.io/
  4. TinkerPop: An Open Source Graph Computing Framework – http://www.tinkerpop.com/
  5. gdfpop: Open source GDF File Reader for TinkerPop 2.x – https://github.com/formcept/gdfpop
  6. Apache License, Version 2.0 – http://www.apache.org/licenses/LICENSE-2.0.html
  7. GraphSON Reader and Writer Library – https://github.com/tinkerpop/blueprints/wiki/GraphSON-Reader-and-Writer-Library

Gen-next of resumes: From standard text to visual infographics

Earlier this week, veteran HR executive Lee E. Miller noted in his column for NJ.com how visual resumes will dominate the next big wave in the recruitment industry. With recruiters starting to see more visual resumes, candidates are considering taking that path and catching the attention of recruiters by turning to infographics over textual resumes.

Recruiters, who have welcomed the idea of visual resumes, believe that acceptance is going to increase across the recruitment industry, as the innovation and creativity involved reduce the effort at the recruiter’s end to quite an extent. Stunning visuals are often showcased to impress HR managers and stand out from the crowd.

Resume Intent @FORMCEPT

“It is easier to absorb visual content as vision rate of humans is very high and over 90% of visual information that is captured gets stored in the brain.”

Images are easily captured by the human brain and are retained for longer periods of time. To deliver a lasting impression on HR managers, FORMCEPT offers infographics that illustrate candidates’ profiles as a visual summary of skills, experience, achievements, education and interests. This is how a resume infographic looks-

Visual Resume Infographics

Over and above this, FORMCEPT provides advanced analytics options for recruiters to query and explore multiple resumes and also compare them side by side. For more details, please contact us.


Data Analysis should be your Compass

Imagine that you are going from a well-known location, Point A, to an unknown location, Point B. Along your journey, you refer to a GPS-based navigation system and decide how to proceed in a particular direction. In this scenario, there are two possibilities:

GPS Scenario

  1. You might know how to reach Point C optimally (even though the GPS may be suggesting a longer route via Point X) and then rely on the GPS system to reach the destination, i.e. Point B.
  2. You might blindly follow the GPS-based navigation system to take you to the destination (Point B) through Point X, which it thinks at that time is the best possible route for you.

While you are on your way, you might change your course due to traffic jams or road blocks. In that case, the online navigation system will re-calculate the route, pick up from where you are and start guiding you again.

In fact, navigation systems have become intelligent enough to find out whether there is a traffic jam at certain places and provide alternate, efficient routes, all in real time. In addition, they are non-intrusive and provide the driver with complete freedom to follow the navigation system or change course- the navigation system adapts to the change.

The navigation system provides you with insights on the traffic and route data, and you, as the decision maker, take that input and act on it.

So, how is this relevant in the context of business? Consider a typical organization where the CXOs know the current state of the business (Point A) and are eager to accomplish business goals (Point B) faster. They have enough data collected internally (knowledge) and are progressing towards the goal (Point B). In the context of business, Point B might be any of these, depending on the CXO’s level within the company-

  1. Increase revenue by x%
  2. Increase product features as per market demand
  3. Reduce costs by y%
  4. Increase the customer base by z%
  5. Reduce inventory costs, etc.

Company Compass

What is missing is a data-driven analysis platform (the GPS navigation system) that can guide them to reach the desired destination faster and with existing resources.

The reason they need a platform rather than an application is that one application may not be the silver bullet for all requirements. An organization needs more than one application, custom built for the business, using the available data and resources to solve a particular business problem, and the data-driven analysis platform should inherently support that. The platform should be agile, so that it can support multiple applications and adapt to business requirements by doing all the heavy lifting of the repetitive and common tasks related to data analysis. In other words, it should quickly re-calculate the optimal path to the destination as and when there is a deviation from the earlier suggested path.

Can the current traditional Business Intelligence systems do that? It is challenging, because traditional BI systems are designed to work on structured data and are monolithic by nature. Moreover, the rate at which data is being generated these days is much higher, and the data is mostly unstructured. A platform that can capture, store and analyze such data should

  • Treat unstructured data with the same rigour as structured data
  • Provide quick insights as and when they are required (on-demand/real-time)
  • Understand context, i.e. put forth the possible strategies to reach the desired destination and, based on the choice made by the decision maker, assist them optimally

The FORMCEPT Big Data platform is designed just for that. It enables enterprises

  • To gain business insights faster by leveraging the available data
  • To respond faster to the ever changing Business Intelligence requirements
  • To make “Dark Data” extinct by leveraging the historical data of the organization

FORMCEPT Data Analysis Flow

FORMCEPT uses proprietary Data Folding℠ techniques to discover the relations and patterns that exist across datasets and generates fact-based unified views. What this means for the business is that different business units can create their own virtual data in the form of unified data views and can write their own cognitive, data-driven applications for their business problems.

For example, an e-commerce company’s marketing department can build its own Influencer Application, which understands customers holistically based not only on transactional data but also on public data like social media, blogs, etc. Based on this application, they can target a product promotion campaign effectively, thereby increasing revenue and the customer base.

To learn more about FORMCEPT and how it can solve your business problem, please write to contactus@formcept.com


Big Data Tech Conclave 2013 – Part-2

In the previous blog, we discussed how FORMCEPT addresses the “Data Infrastructure Issues” using its MECBOT platform. In this blog, we will take you through two real customer use cases and show how enterprises can leverage MECBOT to solve their business problems.

Use Case 1: Loyalty Analysis and Targeted Promotional Campaign

Data Sources: Bank statements and bills (PDF documents); public data sources, like geolocation, region, country, etc.
Goal: To segment the customers based on loyalty and target a promotional campaign on a specific set of products

Following are the basic requirements for this use case-

  • Deploy a scalable data analysis platform for storage and analysis of documents *
  • Extract facts, like- account numbers, transactions, etc. from these documents
  • Enrich the content using the location data
  • Identify transaction patterns from the data and come up with a loyalty model
  • Validate the model
  • Represent the results such that key stakeholders can explore the results and initiate a targeted promotional campaign

* One of the key factors for the underlying Data Infrastructure



Big Data Tech Conclave 2013 – Part-1

Leaders from around the world gathered at the “Big Data Tech Conclave 2013 Winter Edition”, marking the success of the event held on the 6th and 7th of December 2013 in Bangalore. FORMCEPT was associated with the global conclave as an endorsing partner.

Big Data Tech Conclave Winter Edition

The two-day event hosted back-to-back inspiring sessions around the deluge called Big Data. It was well attended, and eminent personalities from the industry shared their knowledge and experience with the audience.

In this blog, FORMCEPT would like to share the key takeaways from the event.

Big Data Tech Conclave

On the first day of the event, there was one thing common across all the talks- “Data Infrastructure Issues”. It is a broad term for the issues related to data capturing (structured and/or unstructured), storage, analysis, delivery and visualization.

Most of the talks forced us to think: do enterprises need to worry about the “Data Infrastructure Issues”, and that too all of them, or do they just need to worry about solving their business problem? It made us think: when we buy a fridge or an AC, do we ever ask about the compressor or any of the electronic systems being used? If not, then why can’t we ease the pain for enterprises in a similar way for their Data Infrastructure issues?

Looking at the current scenario of data infrastructure, it is evident that traditional technologies are slowly being replaced by upcoming technologies and the gap between human expertise and the technology is increasing at a rapid pace. This is jeopardizing the adoption of data analysis, especially Big Data analysis, in most enterprises due to the lack of a robust Data Infrastructure. On the other hand, if you ask the CXOs, they definitely want to adopt it, as they are aware of the competition that is taking advantage of emerging data analysis techniques.

FORMCEPT addresses this with MECBOT, a unified analytics platform built on top of state-of-the-art open-source technologies like Hadoop, HBase, Storm and Spark. Enterprises are now taking advantage of MECBOT, which does all the heavy lifting around data and makes it available on demand as well as in real time. Enterprises can focus on their business problem rather than worrying about the “Data Infrastructure Issues”. MECBOT also allows enterprises to develop data-driven applications faster and scale them on demand using their existing skill set.

To know more about FORMCEPT and MECBOT, please write to contactus@formcept.com


FORMCEPT featured at TechCrunch Bangalore

For the first time ever, the TechCrunch International City event arrived in India and was held in the tech hub Bangalore, spanning two days (November 14-15, 2013).

FORMCEPT was featured among 50 startups selected from hundreds of entries for Pitch Presentations. The event showcased these startups launching their products before a live and online audience, including a panel of 50 investors and expert judges.

We are proud to be among the chosen few to demonstrate our product MECBOT on the TechCrunch platform.

Tech Crunch India

TechCrunch is a leading technology media property, dedicated to obsessively profiling startups, reviewing new Internet products, and breaking tech news. TechCrunch Bangalore focused on encouraging upcoming Indian startups to have a ground-breaking impact on the global stage.


FORMCEPT features in NASSCOM Emerge 50

FORMCEPT has been recognized as one of the “NASSCOM Emerge 50” companies of India for 2013 in the Emerge Start-up category.

The NASSCOM Emerge 50 Awards annually recognize the top 50 highly innovative and agile emerging and start-up companies that are foraying into untapped territories and redefining the way IT can make a difference. In the start-up category, the award recognizes companies that are innovative and growing at a rapid pace within 3 years of existence.

About NASSCOM
NASSCOM is a global trade body with over 1200 members, of which over 250 are global companies from the US, UK, EU, Japan and China. With “Emerge 50”, NASSCOM lends support to upcoming companies by providing a platform to showcase their potential.


FORMCEPT’s approach to Telematics

“Connected cars have the potential to dramatically reduce the 1.2 million traffic deaths that occur worldwide each year.”
– The National Highway Traffic Safety Administration, June 2013

Since the first successful installation of a car radio in 1930, telecommunications technology has evolved significantly. Our vehicles now provide an enhanced and safe experience by embedding the latest technologies and gadgets. Rapidly evolving mobile technology is driving the next phase of innovation in vehicles and becoming an integral part of a system that provides benefits like infotainment, assistance and navigation facilities on the go.

With these fruitful business opportunities, the automotive industry is gearing up to the challenge, and it comes as no surprise that by 2022 the connected car market is projected to be worth a staggering $422 billion.

Source: Machina Research, 2013

This is the age of smartphones, and we are slowly progressing towards “smart cars”, which will soon become an everyday term.
